A framework for large scale recommendation algorithms.
[NeurIPS 2023] Michelangelo: Conditional 3D Shape Generation based on Shape-Image-Text Aligned Latent Representation
InternEvo is an open-source, lightweight training framework that aims to support model pre-training without extensive dependencies.
Repository for Project Insight: NLP as a Service
Sparse inference for transformer-based LLMs
Federated Learning Utilities and Tools for Experimentation
The official code for "TEMPO: Prompt-based Generative Pre-trained Transformer for Time Series Forecasting (ICLR 2024)". TEMPO (v1.0) is one of the first open-source time series foundation models for forecasting.
Retrieval-based Voice Conversion (RVC) implemented with Hugging Face Transformers.
[TMI 2023] XBound-Former: Toward Cross-scale Boundary Modeling in Transformers
Official repository for the paper "ALERT: A Comprehensive Benchmark for Assessing Large Language Models’ Safety through Red Teaming"
An open-source community implementation of the model from the "Differential Transformer" paper by Microsoft (a sketch of the core mechanism appears after this list).
Symbolic music generation taking inspiration from NLP and human composition process
NewsAgent is an enterprise-grade news aggregation agent designed to fetch, query, and summarize news from multiple sources at scale.
Public repo for the paper: "Modeling Intensification for Sign Language Generation: A Computational Approach" by Mert Inan*, Yang Zhong*, Sabit Hassan*, Lorna Quandt, Malihe Alikhani
CHARacter-awaRE Diffusion: Multilingual Character-Aware Encoders for Font-Aware Diffusers That Can Actually Spell
✨ Solve the multi-dimensional multiple knapsack problem using state-of-the-art reinforcement learning algorithms and transformers
Zeta implementation of "Rethinking Attention: Exploring Shallow Feed-Forward Neural Networks as an Alternative to Attention Layers in Transformers" (see the second sketch after this list)
The official fork of the THoR Chain-of-Thought framework, enhanced and adapted for Emotion Cause Analysis (ECAC-2024)
Annotated implementation of vanilla Transformers that guides readers through all the ambiguities.
A Transformer implementation that is easy to understand and customize.
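The core idea behind the Differential Transformer entry above is differential attention: two softmax attention maps are computed from separate query/key projections and subtracted, so noise patterns common to both cancel out. Below is a minimal single-head sketch, not the repository's code; the class name `DiffAttention`, the initialization of the scalar `lam`, and the projection layout are illustrative assumptions, and the paper's full multi-head version re-parameterizes λ per layer and adds normalization.

```python
import math
import torch
import torch.nn as nn
import torch.nn.functional as F

class DiffAttention(nn.Module):
    """Single-head differential attention (illustrative sketch):
    the score map is the difference of two softmax attention maps."""

    def __init__(self, d_model: int, d_head: int):
        super().__init__()
        # Two independent Q/K projections; one shared V projection.
        self.q_proj = nn.Linear(d_model, 2 * d_head, bias=False)
        self.k_proj = nn.Linear(d_model, 2 * d_head, bias=False)
        self.v_proj = nn.Linear(d_model, d_head, bias=False)
        self.lam = nn.Parameter(torch.tensor(0.5))  # assumed init; the paper re-parameterizes lambda
        self.scale = 1.0 / math.sqrt(d_head)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq, d_model)
        q1, q2 = self.q_proj(x).chunk(2, dim=-1)
        k1, k2 = self.k_proj(x).chunk(2, dim=-1)
        v = self.v_proj(x)
        a1 = F.softmax(q1 @ k1.transpose(-1, -2) * self.scale, dim=-1)
        a2 = F.softmax(q2 @ k2.transpose(-1, -2) * self.scale, dim=-1)
        # Subtracting the second map cancels attention noise shared by both.
        return (a1 - self.lam * a2) @ v
```

For example, `DiffAttention(d_model=64, d_head=32)` maps a `(batch, seq, 64)` input to a `(batch, seq, 32)` output.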
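The "Rethinking Attention" entry above replaces attention layers with shallow feed-forward networks. The sketch below shows one simple token-mixing stand-in for that idea; note that the paper itself trains its FFN replacements by knowledge distillation from a full Transformer, and the fixed `seq_len`, the layer sizes, and the class name `FFNTokenMixer` are assumptions for illustration.

```python
import torch
import torch.nn as nn

class FFNTokenMixer(nn.Module):
    """Mixes information across a fixed-length sequence with a shallow
    feed-forward network instead of self-attention (illustrative sketch).
    The sequence length must be fixed because mixing weights are learned
    per position rather than computed from the input."""

    def __init__(self, seq_len: int, hidden: int):
        super().__init__()
        self.mix = nn.Sequential(
            nn.Linear(seq_len, hidden),
            nn.ReLU(),
            nn.Linear(hidden, seq_len),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq, d_model); apply the FFN along the sequence axis.
        return self.mix(x.transpose(1, 2)).transpose(1, 2)
```

Unlike attention, the mixing pattern here is input-independent, which is why the paper relies on distillation to recover attention-like behavior.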