This repository is a demonstrator for using streaming deep learning to perform super-resolution on DestinE climate data. It leverages a ResNet model with 8x super-resolution capability, applied to climate datasets streamed directly from the Earth DataHub service. The repository also provides a web application for visualizing inference results and generating GeoTIFF files from both the low-resolution and high-resolution outputs.
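As a point of reference, the sketch below shows what an 8x ResNet-style super-resolution network can look like in PyTorch. It is only a minimal illustration, not the network defined in models/models.py; all layer sizes, block counts, and names are assumptions.

```python
import torch
import torch.nn as nn

class ResidualBlock(nn.Module):
    """Two 3x3 convolutions with a skip connection."""
    def __init__(self, channels: int = 64):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, 3, padding=1),
        )

    def forward(self, x):
        return x + self.body(x)

class SRResNet8x(nn.Module):
    """Illustrative ResNet-style super-resolution network with 8x upscaling."""
    def __init__(self, in_channels: int = 1, channels: int = 64, n_blocks: int = 8):
        super().__init__()
        self.head = nn.Conv2d(in_channels, channels, 3, padding=1)
        self.blocks = nn.Sequential(*[ResidualBlock(channels) for _ in range(n_blocks)])
        # Three PixelShuffle(2) stages give a 2 * 2 * 2 = 8x upscaling factor.
        upsample = []
        for _ in range(3):
            upsample += [
                nn.Conv2d(channels, channels * 4, 3, padding=1),
                nn.PixelShuffle(2),
                nn.ReLU(inplace=True),
            ]
        self.upsample = nn.Sequential(*upsample)
        self.tail = nn.Conv2d(channels, in_channels, 3, padding=1)

    def forward(self, lr):
        x = self.head(lr)
        x = self.blocks(x) + x
        x = self.upsample(x)
        return self.tail(x)

# Example: a 1-channel (t2m) 32x32 patch becomes a 256x256 field.
hr = SRResNet8x()(torch.randn(1, 1, 32, 32))
print(hr.shape)  # torch.Size([1, 1, 256, 256])
```

The model is trained on the following DestinE datasets: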
- Low Resolution (LR): Climate Digital Twin (DT) temperature at 2 meters (t2m), IFS-NEMO model, hourly data on single levels.
- Ground Truth (HR): high-resolution Climate Digital Twin temperature at 2 meters (t2m), IFS-NEMO model, hourly data on single levels.
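For a sense of how the streaming access works in practice, here is a minimal sketch that opens a Climate DT Zarr store from the Earth DataHub with xarray and selects t2m. The URL is a placeholder and the storage options are an assumption based on common DestinE usage; check the Earth DataHub catalogue for the exact dataset paths, and set up your credentials first (see the authentication section below).

```python
import xarray as xr

# Placeholder URL: take the exact Climate DT (IFS-NEMO, hourly, single levels)
# dataset path from the Earth DataHub catalogue.
url = "https://cacheb.dcms.destine.eu/<climate-dt-dataset>.zarr"

# Opening the Zarr store is lazy: only the chunks touched by a selection are streamed.
# trust_env=True lets the HTTP client pick up the credentials stored in ~/.netrc.
ds = xr.open_dataset(
    url,
    engine="zarr",
    chunks={},
    storage_options={"client_kwargs": {"trust_env": True}},
)

# Example timestamp: one hourly 2 m temperature field.
t2m = ds["t2m"].sel(time="2024-01-01T00:00")
print(t2m)
```

Because only the requested chunks are downloaded, training and inference can run directly against the streamed archive without a local copy of the full dataset.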
Install Python: download and install Miniconda.

wget https://repo.anaconda.com/miniconda/Miniconda3-latest-Linux-x86_64.sh
sh Miniconda3-latest-Linux-x86_64.sh
Clone the repository:

git clone git@github.com:tetaud-sebastien/DestinE_eXchange_SR.git
cd DestinE_eXchange_SR
Install the required packages. Create the Python environment:

conda create --name env python==3.12

Activate the environment:

conda activate env

Install the Python packages:

pip install -r requirements.txt
Install the COG file generation dependencies:
- gdal_translate -> Installation on Linux
.
├── README.md
├── app.py
├── assets
│   └── banner.svg
├── auth
│   └── cacheb-authentication.py
├── cfg
│   └── config.yaml
├── data
│   ├── datasets.py
│   └── loaders.py
├── inference.py
├── models
│   └── models.py
├── notebook.ipynb
├── train.py
├── trainer.py
└── utils
    └── general.py
Before training your model, open the main Jupyter notebook (notebook.ipynb) and run the following cells to set up your DestinE credentials:
%%capture cap
%run auth/cacheb-authentication.py

Then, in a second cell, extract the printed credentials and append them to your ~/.netrc file:

output_1 = cap.stdout.split('}\n')
token = output_1[-1][0:-1]
from pathlib import Path
with open(Path.home() / ".netrc", "a") as fp:
    fp.write(token)
You can train the model either in the notebook (notebook.ipynb) or with the training script. The training script takes a configuration file as input, from which the training parameters are parsed (TODO). You can also run the script directly using the following command:
python train.py
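The exact schema of cfg/config.yaml is defined by the repository. Purely as a hedged sketch of the mechanism (the key names below are illustrative assumptions, not the real schema), the training parameters could be loaded like this:

```python
import yaml  # PyYAML

# Illustrative only: the actual keys are defined in cfg/config.yaml.
with open("cfg/config.yaml") as f:
    cfg = yaml.safe_load(f)

# Hypothetical parameter names, used here purely for illustration.
batch_size = cfg.get("batch_size", 16)
learning_rate = cfg.get("learning_rate", 1e-4)
max_epochs = cfg.get("max_epochs", 50)
print(f"batch_size={batch_size}, lr={learning_rate}, epochs={max_epochs}")
```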
You can access TensorBoard by running the following command from the root directory:
tensorboard --logdir .
To launch the visualization web application, provide the path to the trained model checkpoint and its training configuration:

python app.py --model_path your/model/path.pt --config_path your/training/config.json
Example:
python app.py --model_path lightning_logs/version_0/checkpoints/best-val-ssim-epoch=49-val_ssim=0.54.pt --config_path lightning_logs/version_0/config.json
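For context on the GeoTIFF/COG outputs mentioned above, the following is a rough, assumed illustration of how a super-resolved field could be written to GeoTIFF with rasterio and then converted to a Cloud Optimized GeoTIFF with gdal_translate. File names, grid extent, and CRS are placeholders, and this is not the actual code used by app.py.

```python
import subprocess
import numpy as np
import rasterio
from rasterio.transform import from_bounds

# Hypothetical super-resolved field on a regular lat/lon grid.
sr_field = np.random.rand(256, 256).astype("float32")
transform = from_bounds(west=-10.0, south=35.0, east=5.0, north=50.0,
                        width=sr_field.shape[1], height=sr_field.shape[0])

# Write the array as a single-band GeoTIFF.
with rasterio.open("sr_t2m.tif", "w", driver="GTiff",
                   height=sr_field.shape[0], width=sr_field.shape[1],
                   count=1, dtype="float32", crs="EPSG:4326",
                   transform=transform) as dst:
    dst.write(sr_field, 1)

# Convert to a Cloud Optimized GeoTIFF with gdal_translate (GDAL >= 3.1).
subprocess.run(["gdal_translate", "-of", "COG", "sr_t2m.tif", "sr_t2m_cog.tif"], check=True)
```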
- Expand to N Parameters: Although the current setup processes only one variable (t2m), the trainer will be extended to handle multiple parameters simultaneously.
- Multiple Sources: The architecture is flexible enough to incorporate data from various sources, allowing for a multi-source approach to super-resolution tasks in climate modeling.
Feel free to ask questions at the following email address: sebastien.tetaud@esa.int, or open a ticket.