Highlights
- Pro
Pinned
- torchdistill: A coding-free framework built on PyTorch for reproducible deep learning studies. 🏆 25 knowledge distillation methods presented at CVPR, ICLR, ECCV, NeurIPS, ICCV, etc. are implemented so far. 🎁 Train… (a generic sketch of the vanilla distillation loss follows this list)
- sc2-benchmark: [TMLR] "SC2 Benchmark: Supervised Compression for Split Computing"
- supervised-compression: [WACV 2022] "Supervised Compression for Resource-Constrained Edge Computing Systems"
- hnd-ghnd-object-detectors: [ICPR 2020] "Neural Compression and Filtering for Edge-assisted Real-time Object Detection in Challenged Networks" and [ACM MobiCom EMDL 2020] "Split Computing for Complex Object Detectors: Challen…"
- head-network-distillation: [IEEE Access] "Head Network Distillation: Splitting Distilled Deep Neural Networks for Resource-constrained Edge Computing Systems" and [ACM MobiCom HotEdgeVideo 2019] "Distilled Split Deep Neural …"
- guess-blind-entities (Java): [JCDL WOSP 2020] "Citations Beyond Self Citations: Identifying Authors, Affiliations, and Nationalities in Scientific Papers"
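Since several of these repositories center on knowledge distillation, here is a minimal, generic sketch of the vanilla distillation objective (soft-target KL divergence plus hard-label cross-entropy). This is an illustration only, not torchdistill's API; the temperature `T` and mixing weight `alpha` are hypothetical hyperparameters chosen for the example.

```python
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    """Vanilla knowledge distillation loss (Hinton et al., 2015):
    weighted sum of temperature-scaled soft-target KL divergence
    and standard hard-label cross-entropy."""
    # Soft targets: KL divergence between temperature-softened distributions,
    # rescaled by T^2 to keep gradient magnitudes comparable.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    # Hard targets: ordinary cross-entropy against ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard
```

In practice the teacher runs in eval mode with gradients disabled, and this loss is applied per mini-batch while training the student.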