
limJhyeok/models_from_scratch


Deep Learning Models from scratch with PyTorch


This repository contains implementations of various foundational deep learning models using PyTorch. The purpose of this project is twofold:

  1. Master PyTorch: Develop and refine skills in building deep learning models from scratch using PyTorch.
  2. Understand Core Architectures: Gain an in-depth understanding of CNN (Convolutional Neural Network) and Transformer architectures, which form the foundation of modern deep learning.

Implemented Models

| Model | Description | Colab |
| --- | --- | --- |
| LeNet | One of the earliest convolutional neural networks, introduced by Yann LeCun. A foundational model for image classification tasks. | Open in Colab |
| VGG Network | Known for its simplicity and use of very deep networks composed of 3x3 convolution layers. Explores the role of network depth in improving accuracy. | Open in Colab |
| GoogLeNet (Inception) | Introduced the Inception module, allowing for efficient computation with fewer parameters. Demonstrates the importance of designing models with computational efficiency in mind. | Open in Colab |
| ResNet | Introduced the concept of residual connections, addressing the vanishing gradient problem in deep networks. Paved the way for very deep architectures with hundreds of layers. | Open in Colab |
| EfficientNet | Scales models effectively across width, depth, and resolution using a compound scaling method. Combines efficiency and performance, achieving state-of-the-art results. | Open in Colab |
| Transformer | Revolutionized deep learning by replacing RNNs with self-attention mechanisms. Forms the backbone of many modern models like BERT and GPT. | Open in Colab |
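To illustrate the residual connection idea mentioned for ResNet, here is a minimal PyTorch sketch (an illustration only, not this repository's actual implementation; the `ResidualBlock` name and fixed channel count are assumptions for brevity):

```python
import torch
import torch.nn as nn


class ResidualBlock(nn.Module):
    """A minimal residual block: two 3x3 convolutions plus a skip connection."""

    def __init__(self, channels: int):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, kernel_size=3, padding=1, bias=False)
        self.bn1 = nn.BatchNorm2d(channels)
        self.conv2 = nn.Conv2d(channels, channels, kernel_size=3, padding=1, bias=False)
        self.bn2 = nn.BatchNorm2d(channels)
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        out = self.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        # The skip connection adds the input back, so gradients can flow
        # directly through the identity path in very deep stacks.
        return self.relu(out + x)
```

Because the addition requires matching shapes, real ResNet variants insert a 1x1 convolution on the skip path whenever the channel count or spatial resolution changes.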

Why These Models?

These models were selected to cover the evolution of deep learning architectures:

  • LeNet introduces CNN basics.
  • VGG and GoogLeNet explore architectural innovations for improving depth and efficiency.
  • ResNet solves the challenges of training very deep networks.
  • EfficientNet optimizes model scaling.
  • Transformer breaks away from traditional CNNs and RNNs, showcasing the versatility of attention mechanisms.
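The self-attention mechanism that sets the Transformer apart can be sketched in a few lines of PyTorch (a minimal illustration, not this repository's code; the function name and tensor shapes are assumptions):

```python
import math

import torch


def scaled_dot_product_attention(
    q: torch.Tensor, k: torch.Tensor, v: torch.Tensor
) -> torch.Tensor:
    """Compute softmax(QK^T / sqrt(d)) V for tensors of shape (..., seq, d)."""
    d = q.size(-1)
    # Similarity of every query with every key, scaled to keep softmax stable.
    scores = q @ k.transpose(-2, -1) / math.sqrt(d)
    weights = scores.softmax(dim=-1)
    # Each output position is a weighted average of the value vectors.
    return weights @ v
```

Unlike an RNN, every output position attends to every input position in one step, which is what lets Transformers parallelize over the sequence dimension.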
