Implementations of neural networks in Python for classification of the MNIST dataset.
Updated Nov 24, 2024 - Jupyter Notebook
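As a reference point for what repositories under this topic typically implement, here is a minimal softmax classifier sketch in NumPy. The softmax function itself is standard; the array shapes, variable names, and single-layer setup are illustrative assumptions, not taken from any of the listed projects.

```python
import numpy as np

def softmax(z):
    """Numerically stable softmax over the last axis."""
    # Subtract the row-wise max so exp() never overflows.
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

# Illustrative shapes only: a batch of 32 flattened 28x28 MNIST-like images
# pushed through a single linear layer with 10 output classes.
rng = np.random.default_rng(0)
x = rng.random((32, 784))           # fake input batch
W = rng.normal(0, 0.01, (784, 10))  # output-layer weights
b = np.zeros(10)                    # biases
probs = softmax(x @ W + b)          # (32, 10); each row sums to 1
print(probs.sum(axis=1))            # each entry ~1.0
```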
An implementation of a coordinate descent method accelerated by a universal meta-algorithm with efficient amortised per-iteration complexity, plus experiments with a sparse SoftMax function in which the proposed method outperforms FGM.
Assignment 1: Build a neural network using softmax as the activation function.
JavaFX application for a convolutional network to perform image classification using a Softmax Output Layer, Back Propagation, Gradient Descent, Partial Derivatives, Matrix Flattening, Matrix Unfolding, Concurrent Tasks, a Performance Histogram, and a Confusion Matrix.
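For a softmax output layer trained with cross-entropy, the back-propagation and partial-derivative terms mentioned above reduce to the well-known `probabilities - one_hot(labels)` gradient with respect to the logits. A short NumPy sketch of that step and one gradient-descent update follows; it is written in Python rather than Java, and the shapes and names are illustrative assumptions, not taken from the JavaFX project.

```python
import numpy as np

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def softmax_xent_grad(logits, labels):
    """Gradient of mean cross-entropy w.r.t. the logits of a softmax layer.

    For softmax + cross-entropy the combined derivative simplifies to
    (probabilities - one_hot(labels)) / batch_size.
    """
    probs = softmax(logits)                    # (batch, classes)
    one_hot = np.eye(logits.shape[1])[labels]  # (batch, classes)
    return (probs - one_hot) / logits.shape[0]

# Illustrative gradient-descent step on an output weight matrix W.
rng = np.random.default_rng(0)
h = rng.random((32, 128))               # hidden activations (assumed shape)
W = rng.normal(0, 0.01, (128, 10))      # output-layer weights
labels = rng.integers(0, 10, size=32)   # fake class labels
grad_logits = softmax_xent_grad(h @ W, labels)
W -= 0.1 * (h.T @ grad_logits)          # one SGD update
```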