🔀 Merge branch 'cluster_active_subglacial_lakes' into crossover_tracks (#149)

Closes #149. Find active subglacial lake points using unsupervised clustering.
weiji14 committed Sep 15, 2020
2 parents a661023 + 9142e87 commit a2eac51
Showing 24 changed files with 2,298 additions and 753 deletions.
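The commit's headline change finds active subglacial lake points with unsupervised clustering. Density-based methods such as DBSCAN are a common choice for grouping point data like this; the sketch below is a toy DBSCAN in plain Python for illustration only, not the repository's implementation (which operates on the ICESat-2 height-change point data).

```python
from math import hypot


def toy_dbscan(points, eps, min_samples):
    """Minimal DBSCAN: return a cluster label per point (-1 = noise)."""
    labels = [None] * len(points)

    def neighbours(i):
        xi, yi = points[i]
        return [j for j, (xj, yj) in enumerate(points)
                if hypot(xi - xj, yi - yj) <= eps]

    cluster_id = 0
    for i in range(len(points)):
        if labels[i] is not None:
            continue
        seeds = neighbours(i)
        if len(seeds) < min_samples:
            labels[i] = -1  # provisionally noise; may become a border point later
            continue
        labels[i] = cluster_id
        queue = list(seeds)
        while queue:
            j = queue.pop()
            if labels[j] == -1:
                labels[j] = cluster_id  # noise reached from a core point => border
            if labels[j] is not None:
                continue
            labels[j] = cluster_id
            grown = neighbours(j)
            if len(grown) >= min_samples:  # j is itself a core point; expand
                queue.extend(grown)
        cluster_id += 1
    return labels


pts = [(0, 0), (0.5, 0), (0, 0.5), (10, 10), (10.5, 10), (10, 10.5), (50, 50)]
print(toy_dbscan(pts, eps=1.0, min_samples=3))  # [0, 0, 0, 1, 1, 1, -1]
```

Two dense clumps become clusters 0 and 1, while the isolated point is flagged as noise — the same property that lets a clustering step separate genuine lake drainage/fill signals from scattered outliers.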
6 changes: 4 additions & 2 deletions .github/workflows/python-app.yml
@@ -2,9 +2,11 @@ name: Test DeepIceDrain

on:
  push:
-    branches: [ master ]
+    branches:
+      - master
  pull_request:
-    branches: [ master ]
+    branches:
+      - "**"

jobs:
  test:
33 changes: 26 additions & 7 deletions README.md
@@ -11,7 +11,7 @@ in Antarctica using remote sensing and machine learning.

![ICESat-2 ATL11 rate of height change over time in Antarctica 2018-10-14 to 2020-05-13](https://user-images.githubusercontent.com/23487320/90118294-2601ff80-ddac-11ea-8b93-7bc9b15f2be0.png)

-![DeepIceDrain Pipeline](https://yuml.me/diagram/scruffy;dir:LR/class/[Land-Ice-Elevation|atl06_play.ipynb]->[Convert|atl06_to_atl11.ipynb],[Convert]->[Ice-Sheet-H(t)-Series|atl11_play.ipynb],[Ice-Sheet-H(t)-Series]->[Height-Change-over-Time-(dhdt)|atlxi_dhdt.ipynb])
+![DeepIceDrain Pipeline](https://yuml.me/diagram/scruffy;dir:LR/class/[Land-Ice-Elevation|atl06_play.ipynb]->[Convert|atl06_to_atl11.ipynb],[Convert]->[Ice-Sheet-H(t)-Series|atl11_play.ipynb],[Ice-Sheet-H(t)-Series]->[Height-Change-over-Time-(dhdt)|atlxi_dhdt.ipynb],[Height-Change-over-Time-(dhdt)]->[Subglacial-Lake-Finder|atlxi_lake.ipynb])

# Getting started

@@ -57,7 +57,7 @@ To just try out the scripts, download the `environment.yml` file from the repository
conda env create --name deepicedrain --file environment.yml
pip install git+https://github.com/weiji14/deepicedrain.git

-### Advanced
+### Intermediate

To help out with development, start by cloning this [repo-url](/../../)

@@ -77,11 +77,6 @@ Then install the python libraries listed in the `pyproject.toml`/`poetry.lock` file

poetry install

-If you have a [CUDA](https://en.wikipedia.org/wiki/CUDA)-capable GPU,
-you can also install the optional "cuda" packages to accelerate some calculations.
-
-    poetry install --extras cuda
-
Finally, double-check that the libraries have been installed.

poetry show
@@ -93,6 +88,30 @@ Finally, double-check that the libraries have been installed.

jupyter labextension list # ensure that extensions are installed


+### Advanced
+
+This is for those who want full reproducibility of the conda environment,
+and more computing power by using Graphics Processing Units (GPUs).
+
+Make an explicit conda lock file
+(only needed when creating a new conda environment or refreshing an existing one).
+
+    conda env create -f environment.yml
+    conda list --explicit > environment-linux-64.lock
+
+Create or install a virtual environment from the conda lock file.
+See also https://docs.conda.io/projects/conda/en/latest/user-guide/tasks/manage-environments.html#building-identical-conda-environments.
+
+    conda create --name deepicedrain --file environment-linux-64.lock
+    conda install --name deepicedrain --file environment-linux-64.lock
+
+If you have a [CUDA](https://en.wikipedia.org/wiki/CUDA)-capable GPU,
+you can also install the optional "cuda" packages to accelerate some calculations.
+
+    poetry install --extras cuda
+

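Once the optional extras are installed, it helps to confirm the GPU stack actually works before running the heavier notebooks. A hypothetical sanity check, assuming the "cuda" extras include `cupy` (verify the actual package list against the project's `pyproject.toml`):

```python
def cuda_available() -> bool:
    """Report whether a CUDA-capable GPU is reachable through cupy."""
    try:
        import cupy  # assumed to be provided by the optional "cuda" extras

        return cupy.cuda.runtime.getDeviceCount() > 0
    except Exception:  # ImportError, cudaErrorNoDevice, driver mismatch, etc.
        return False


print("CUDA available:", cuda_available())
```

The function degrades gracefully: on a CPU-only machine (or without the extras installed) it simply returns `False` instead of raising.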
## Running jupyter lab

conda activate deepicedrain
Expand Down
164 changes: 164 additions & 0 deletions antarctic_subglacial_lakes.geojson


2 changes: 1 addition & 1 deletion atl11_play.ipynb
@@ -303,7 +303,7 @@
"# Do the actual computation to find data points within region of interest\n",
"placename: str = \"kamb\" # Select Kamb Ice Stream region\n",
"region: deepicedrain.Region = regions[placename]\n",
"ds_subset: xr.Dataset = region.subset(ds=ds)\n",
"ds_subset: xr.Dataset = region.subset(data=ds)\n",
"ds_subset = ds_subset.unify_chunks()\n",
"ds_subset = ds_subset.compute()"
]
2 changes: 1 addition & 1 deletion atl11_play.py
@@ -181,7 +181,7 @@
# Do the actual computation to find data points within region of interest
placename: str = "kamb" # Select Kamb Ice Stream region
region: deepicedrain.Region = regions[placename]
-ds_subset: xr.Dataset = region.subset(ds=ds)
+ds_subset: xr.Dataset = region.subset(data=ds)
ds_subset = ds_subset.unify_chunks()
ds_subset = ds_subset.compute()
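Both `atl11_play` changes are the same one-keyword rename: `region.subset(ds=ds)` becomes `region.subset(data=ds)`. A toy illustration of such a bounding-box `subset` call follows; the `Region` field names and the plain-dict data model are assumptions made for the sketch, whereas the real `deepicedrain.Region` filters `xarray` objects.

```python
from dataclasses import dataclass


@dataclass
class Region:
    """Toy stand-in for deepicedrain.Region (field names are assumptions)."""

    name: str
    xmin: float
    xmax: float
    ymin: float
    ymax: float

    def subset(self, data: dict) -> dict:
        """Keep only the points that fall inside this bounding box.

        Here `data` is a dict of equal-length lists keyed by "x"/"y";
        the real method filters an xarray.Dataset instead.
        """
        keep = [
            i
            for i, (x, y) in enumerate(zip(data["x"], data["y"]))
            if self.xmin <= x <= self.xmax and self.ymin <= y <= self.ymax
        ]
        return {key: [values[i] for i in keep] for key, values in data.items()}


kamb = Region(name="Kamb Ice Stream", xmin=0.0, xmax=10.0, ymin=0.0, ymax=10.0)
points = {"x": [1.0, 50.0, 5.0], "y": [2.0, 3.0, 50.0], "height": [0.1, 0.2, 0.3]}
subset = kamb.subset(data=points)  # keyword is data=, not ds=, after this commit
print(subset)  # {'x': [1.0], 'y': [2.0], 'height': [0.1]}
```

Passing the argument by keyword is exactly why the rename touched every call site: positional callers would have kept working unchanged.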


0 comments on commit a2eac51
