
Commit

modified: dsm/__init__.py
	modified:   dsm/datasets.py
	modified:   dsm/dsm_api.py
	modified:   dsm/dsm_torch.py
	modified:   dsm/losses.py
	modified:   dsm/utilities.py
chiragnagpal committed Nov 1, 2020
1 parent d90b58b commit 41cd52e
Showing 6 changed files with 262 additions and 116 deletions.
45 changes: 34 additions & 11 deletions dsm/__init__.py
@@ -18,6 +18,17 @@
# If not, see <https://www.gnu.org/licenses/>.

"""
[![Build Status](https://travis-ci.org/chiragnagpal/DeepSurvivalMachines.svg?\
branch=master)](https://travis-ci.org/chiragnagpal/DeepSurvivalMachines)
&nbsp;&nbsp;&nbsp;\
[![License: GPL v3](https://img.shields.io/badge/License-GPLv3-blue.svg)]\
(https://www.gnu.org/licenses/gpl-3.0)
&nbsp;&nbsp;&nbsp;\
[![GitHub Repo stars](https://img.shields.io/github/stars/autonlab/Deep\
SurvivalMachines?style=social)](https://github.com/autonlab/DeepSurvival\
Machines)
Python package `dsm` provides an API to train the Deep Survival Machines
and associated models for problems in survival analysis. The underlying model
is implemented in `pytorch`.
@@ -54,11 +65,6 @@
parametric distributions. The parameters of these mixture distributions as
well as the mixing weights are modelled using Neural Networks.
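The mixture formulation described above can be sketched in a few lines (a minimal numpy illustration of a mixture of parametric survival distributions, not the package's implementation; the Weibull components, weights, and function names here are assumptions for illustration):

```python
import numpy as np

def weibull_survival(t, shape, scale):
    # Survival function of one Weibull component: S(t) = exp(-(t/scale)^shape)
    return np.exp(-np.power(t / scale, shape))

def mixture_survival(t, weights, shapes, scales):
    # Mixture survival: S(t) = sum_k w_k * S_k(t), with weights summing to 1.
    # In the model, both the component parameters and the mixing weights
    # would be produced by a neural network; here they are fixed toy values.
    comps = np.stack([weibull_survival(t, k, s) for k, s in zip(shapes, scales)])
    return np.dot(weights, comps)

t = np.linspace(0.0, 10.0, 5)
w = np.array([0.6, 0.4])  # mixing weights (softmax outputs in the model)
S = mixture_survival(t, w, shapes=[1.5, 0.8], scales=[4.0, 7.0])
```

Because each component is a valid survival function and the weights sum to one, the mixture starts at 1 and decreases monotonically.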
#### Usage Example
>>> from dsm import DeepSurvivalMachines
>>> model = DeepSurvivalMachines()
>>> model.fit()
>>> model.predict_risk()
Deep Recurrent Survival Machines
--------------------------------
@@ -71,9 +77,10 @@
data like vital signs, degradation monitoring signals in predictive
maintenance. **DRSM** allows the learnt representations at each time step to
involve historical context from previous time steps. **DRSM** implementation in
`dsm` is carried out through an easy to use API that accepts lists of data
streams and corresponding failure times. The module automatically takes care of
appropriate batching and padding of variable length sequences.
`dsm` is carried out through an easy to use API,
`DeepRecurrentSurvivalMachines` that accepts lists of data streams and
corresponding failure times. The module automatically takes care of appropriate
batching and padding of variable length sequences.
.. warning:: Not Implemented Yet!
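The batching and padding behaviour described above can be sketched as follows (a stand-alone numpy illustration of padding variable-length data streams into one batch; the helper name `pad_streams` and the NaN pad value are assumptions, not the package's internal code):

```python
import numpy as np

def pad_streams(streams, pad_value=np.nan):
    # Pad a list of variable-length (T_i, D) arrays into one (N, T_max, D)
    # batch, filling the tail of shorter sequences with pad_value.
    max_len = max(s.shape[0] for s in streams)
    dim = streams[0].shape[1]
    out = np.full((len(streams), max_len, dim), pad_value)
    for i, s in enumerate(streams):
        out[i, : s.shape[0]] = s
    return out

# Two hypothetical patients with 3 and 5 visits of 2 covariates each.
streams = [np.ones((3, 2)), np.ones((5, 2))]
batch = pad_streams(streams)
```

A recurrent model can then mask out the padded positions when computing the loss; in `pytorch`, utilities such as `torch.nn.utils.rnn.pad_sequence` provide the same padding step.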
@@ -90,6 +97,21 @@
.. warning:: Not Implemented Yet!
Example Usage
-------------
>>> from dsm import DeepSurvivalMachines
>>> from dsm import datasets
>>> # load the SUPPORT dataset.
>>> x, t, e = datasets.load_dataset('SUPPORT')
>>> # instantiate a DeepSurvivalMachines model.
>>> model = DeepSurvivalMachines()
>>> # fit the model to the dataset.
>>> model.fit(x, t, e)
>>> # estimate the predicted risks at the time horizon.
>>> model.predict_risk(x, 10)
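The risk estimate in the last step can be read as the complement of the survival function at the chosen horizon (a toy numpy sketch of the convention R(t*) = 1 - S(t*); the exponential curve and horizon here are illustrative assumptions, not the package's output):

```python
import numpy as np

def risk_at_horizon(survival_fn, horizon):
    # One common convention: risk of the event by time t* is 1 - S(t*).
    return 1.0 - survival_fn(horizon)

S = lambda t: np.exp(-t / 8.0)  # toy exponential survival curve
risk = risk_at_horizon(S, 10.0)  # horizon mirrors the `10` in the example
```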
References
----------
@@ -143,13 +165,14 @@
<img style="float: right;" width ="200px" src="https://www.cmu.edu/brand/downloads/assets/images/wordmarks-600x600-min.jpg">
<img style="float: right;padding-top:50px" src="https://www.autonlab.org/user/themes/auton/images/AutonLogo.png">
<img style="float: right;" width ="200px" src="https://www.cmu.edu/brand/\
downloads/assets/images/wordmarks-600x600-min.jpg">
<img style="float: right;padding-top:50px" src="https://www.autonlab.org/\
user/themes/auton/images/AutonLogo.png">
<br><br><br><br><br>
<br><br><br><br><br>
"""

from dsm.dsm_api import DeepSurvivalMachines, DeepRecurrentSurvivalMachines
31 changes: 30 additions & 1 deletion dsm/datasets.py
Original file line number Diff line number Diff line change
@@ -114,7 +114,6 @@ def _load_pbc_dataset(sequential):
else, returns collapsed results for each time step. To train
recurrent neural models you would typically use True.
References
----------
[1] Fleming, Thomas R., and David P. Harrington. Counting processes and
@@ -193,6 +192,36 @@ def _load_support_dataset():
def load_dataset(dataset='SUPPORT', **kwargs):
"""Helper function to load datasets to test Survival Analysis models.
Currently implemented datasets include:
**SUPPORT**: This dataset comes from the Vanderbilt University study
to estimate survival for seriously ill hospitalized adults [1].
(Refer to http://biostat.mc.vanderbilt.edu/wiki/Main/SupportDesc
for the original data source.)
**PBC**: The Primary biliary cirrhosis dataset [2] is a well-known
dataset for evaluating survival analysis models with time-dependent
covariates.
**FRAMINGHAM**: This dataset is a subset of 4,434 participants of the
well-known, ongoing Framingham Heart Study [3], which studies the epidemiology
of hypertensive and arteriosclerotic cardiovascular disease. It is a popular
dataset for longitudinal survival analysis with time-dependent covariates.
References
----------
[1] Knaus WA, Harrell FE, Lynn J et al. (1995): The SUPPORT prognostic
model: Objective estimates of survival for seriously ill hospitalized
adults. Annals of Internal Medicine 122:191-203.
[2] Fleming, Thomas R., and David P. Harrington. Counting processes and
survival analysis. Vol. 169. John Wiley & Sons, 2011.
[3] Dawber, Thomas R., Gilcin F. Meadors, and Felix E. Moore Jr.
"Epidemiological approaches to heart disease: the Framingham Study."
American Journal of Public Health and the Nations Health 41.3 (1951).
Parameters
----------
dataset: str
