Repository for Use Case 5, "Deep Image Annotation", containing:
- models based on the EDDL/ECVL libraries;
- models based on the PyTorch-Lightning library.
Previous models, used during development, and all of the PyTorch models can be found in the release "v1_dev", available for download.
- `conda_envs`: YAML files for creating the environments based on the EDDL/ECVL (cuDNN version) and PyTorch-Lightning libraries
- `demo`, `demo_src`: code to be used in demos
- `src`: source code
- `src/eddl`: EDDL-based models
- `src/PyTorch-Lightning`: PyTorch (Lightning) models
- `src/preproc`: code for pre-processing the IU-CHEST and the MIMIC-CXR datasets
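Creating an environment from one of the YAML files might look like the following. This is only a sketch: the YAML file name and the environment name are assumptions, not taken from the repository; use the actual files in `conda_envs`.

```shell
# Hypothetical YAML file name: substitute the actual file in conda_envs/.
conda env create -f conda_envs/eddl_ecvl_cudnn.yml
# Activate the environment using the name declared inside the YAML file.
conda activate <env-name-from-yaml>
```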
- First, preprocess the datasets:
  - For the IU-CHEST dataset, use the `Makefile.mk` in `src/preproc`.
  - For the MIMIC-CXR dataset, use the Jupyter notebook `src/preproc/NB_mimic_ds.ipynb`.
- Set up the experiments using the Jupyter notebooks `src/eddl/0_experiments.ipynb` and `src/eddl/0_experiments_mimic.ipynb` for, respectively, the IU-CHEST and the MIMIC-CXR datasets. Some experiments are already provided in the two notebooks.
- To run the experiments defined in the notebooks above, use the two makefiles `src/eddl/Makefile.mk` and `src/eddl/Makefile_MIMIC.mk` for, respectively, the IU-CHEST and the MIMIC-CXR datasets.

Once an experiment is configured, run the scripts in this order: `1_train_cnn.py`, `2_train_rnn.py`, `3_test_rnn.py`.
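The three-script sequence can be wrapped in a small driver. This is only a sketch: the script names come from the repository, but any command-line arguments the scripts expect (dataset paths, experiment identifiers) are not shown here and would need to be added.

```python
import subprocess
import sys

# The training/testing scripts from src/eddl, in the order given above.
SCRIPT_SEQUENCE = ["1_train_cnn.py", "2_train_rnn.py", "3_test_rnn.py"]

def run_experiment(script_dir="src/eddl"):
    """Run CNN training, RNN training, and RNN testing in order,
    stopping at the first script that exits with a non-zero code."""
    for script in SCRIPT_SEQUENCE:
        result = subprocess.run([sys.executable, f"{script_dir}/{script}"])
        if result.returncode != 0:
            raise RuntimeError(f"{script} failed with code {result.returncode}")
```

Running the scripts through a driver like this makes it easy to stop the pipeline early when a stage fails, instead of wasting time training the RNN on top of a broken CNN checkpoint.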
Franco Alberto Cardillo, ILC-CNR, francoalberto.cardillo@ilc.cnr.it