Official code, datasets and checkpoints for "Timer: Generative Pre-trained Transformers Are Large Time Series Models" (ICML 2024)
"AnyGraph: Graph Foundation Model in the Wild"
"OpenCity: Open Spatio-Temporal Foundation Models for Traffic Prediction"
The code repository for "Model Spider: Learning to Rank Pre-Trained Models Efficiently"
HiCFoundation is a generalizable Hi-C foundation model for chromatin architecture, single-cell and multi-omics analysis across species.