peterlipan/Awesome-Medical-Pretraining
Awesome-Medical-Pretraining

A literature review of unsupervised/self-supervised pretraining on medical datasets.

| Name | Title | Modalities | Paper | Code |
|------|-------|------------|-------|------|
| UniMedI | Unified Medical Image Pre-training in Language-Guided Common Semantic Space | CT, X-ray, Text | ICLR'24 (OpenReview) | N/A |
| LVM-Med | LVM-Med: Learning Large-Scale Self-Supervised Vision Models for Medical Imaging via Second-order Graph Matching | MRI, CT, X-ray, Dermoscopy, Endoscopy, Retinal Image, Ultrasound, Cell Image | NeurIPS'23 | Link |
| SAMv2 | SAMv2: A Unified Framework for Learning Appearance, Semantic and Cross-Modality Anatomical Embeddings | Radiology | arXiv (2023) | Link |
| GVSL | Geometric Visual Similarity Learning in 3D Medical Image Self-supervised Pre-training | CT, MRI | CVPR'23 | Link |
| MedKLIP | MedKLIP: Medical Knowledge Enhanced Language-Image Pre-Training for X-ray Diagnosis | X-ray, Text | ICCV'23 | Link |
| LIMITR | LIMITR: Leveraging Local Information for Medical Image-Text Representation | X-ray, Text | ICCV'23 | Link |
| Alice | Anatomical Invariance Modeling and Semantic Alignment for Self-supervised Learning in 3D Medical Image Analysis | CT | ICCV'23 | Link |
| PRIOR | PRIOR: Prototype Representation Joint Learning from Medical Images and Reports | Image, Text | ICCV'23 | Link |
| Med-VLP | Towards Unifying Medical Vision-and-Language Pre-training via Soft Prompts | Image, Text | ICCV'23 | Link |
| MRM | MRM: Masked Relation Modeling for Medical Image Pre-Training with Genetics | Image, Genetics | ICCV'23 | Link |
| M3AE | Multi-Modal Masked Autoencoders for Medical Vision-and-Language Pre-Training | Image, Text | MICCAI'22 | Link |
| SAM | SAM: Self-Supervised Learning of Pixel-Wise Anatomical Embeddings in Radiological Images | Radiology | TMI (2022) | Link |
