# MLP Neural Network for MNIST in Rust (From Scratch)

This project implements a multilayer perceptron (MLP) neural network in Rust from scratch. It includes forward and backward propagation for different modules such as `Linear`, `Sigmoid`, and `CrossEntropyLoss`.
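To illustrate the forward/backward pattern the modules follow, here is a minimal std-only sketch of a `Sigmoid` layer over plain `Vec<f64>`. The names and the caching strategy are illustrative assumptions; the actual project operates on `ndarray` arrays.

```rust
// Hypothetical sketch of a forward/backward module using plain Vec<f64>;
// the real project works with ndarray types instead.
struct Sigmoid {
    output: Vec<f64>, // activations cached during forward, reused in backward
}

impl Sigmoid {
    fn new() -> Self {
        Sigmoid { output: Vec::new() }
    }

    // forward: s(x) = 1 / (1 + e^(-x))
    fn forward(&mut self, input: &[f64]) -> Vec<f64> {
        self.output = input.iter().map(|x| 1.0 / (1.0 + (-x).exp())).collect();
        self.output.clone()
    }

    // backward: ds/dx = s(x) * (1 - s(x)), chained with the upstream gradient
    fn backward(&self, grad_output: &[f64]) -> Vec<f64> {
        self.output
            .iter()
            .zip(grad_output)
            .map(|(s, g)| g * s * (1.0 - s))
            .collect()
    }
}

fn main() {
    let mut layer = Sigmoid::new();
    let out = layer.forward(&[0.0, 2.0]);
    println!("{:?}", out); // sigmoid(0) = 0.5
    let grad = layer.backward(&[1.0, 1.0]);
    println!("{:?}", grad); // at x = 0: 0.5 * (1 - 0.5) = 0.25
}
```

Caching the forward output lets the backward pass compute the local derivative without re-evaluating the activation.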

## Dataset

Download the MNIST dataset (CSV format) from Kaggle and place the files in the `data` folder.

## Features

- **Training progress tracking**: training and validation loss/accuracy are monitored using the `kdam` crate.
- **Efficient matrix operations**: layers and optimizers leverage the `ndarray` crate.
- **CSV data handling**: the dataset is loaded using the `polars` crate.
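Each MNIST CSV row holds a label followed by 784 pixel intensities in 0–255. A std-only sketch of parsing and normalizing one row (the project itself uses `polars` for this; the function name here is illustrative):

```rust
// Parse one MNIST CSV row: "label,p0,p1,..." with pixels in 0..=255.
// Hypothetical helper; the project loads the full file via polars.
fn parse_row(line: &str) -> Option<(u8, Vec<f64>)> {
    let mut fields = line.split(',');
    let label: u8 = fields.next()?.trim().parse().ok()?;
    let pixels: Vec<f64> = fields
        .map(|p| p.trim().parse::<f64>().map(|v| v / 255.0)) // scale to [0, 1]
        .collect::<Result<_, _>>()
        .ok()?;
    Some((label, pixels))
}

fn main() {
    // A shortened row for illustration (real rows have 784 pixel columns).
    let (label, pixels) = parse_row("5,0,128,255").unwrap();
    println!("label = {label}, pixels = {pixels:?}");
}
```

Normalizing pixels into [0, 1] keeps the sigmoid activations away from their saturated regions early in training.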

## Usage

To see the available options, run:

```shell
cargo run --release -- --help
```

```text
Usage: simple-dnn-mnist [OPTIONS]

Options:
  -t, --train <TRAIN>                                  [default: data/mnist_train.csv]
  -v, --validation <VALIDATION>                        [default: data/mnist_test.csv]
      --training-batch-size <TRAINING_BATCH_SIZE>      [default: 32]
      --validation-batch-size <VALIDATION_BATCH_SIZE>  [default: 64]
      --lr <LR>                                        [default: 1e-2]
  -e, --epochs <EPOCHS>                                [default: 100]
  -h, --help                                           Print help
  -V, --version                                        Print version
```

Use the following command to train the model:

```shell
cargo run --release -- -t data/mnist_train.csv -v data/mnist_test.csv -e 100 --lr 1e-2
```
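The `--lr` flag sets the learning rate for gradient descent. A minimal sketch of the per-parameter update rule (the function name is illustrative; the project applies the same rule through its `ndarray`-backed optimizer):

```rust
// Plain SGD update: w <- w - lr * dL/dw, applied element-wise.
// Illustrative std-only version of the optimizer step.
fn sgd_step(weights: &mut [f64], grads: &[f64], lr: f64) {
    for (w, g) in weights.iter_mut().zip(grads) {
        *w -= lr * g; // move against the gradient
    }
}

fn main() {
    let mut w = vec![0.5, -0.3];
    sgd_step(&mut w, &[1.0, -2.0], 1e-2);
    println!("{:?}", w); // w ≈ [0.49, -0.28]
}
```

With the default `--lr 1e-2`, each batch nudges every weight by one hundredth of its gradient.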

