Building Convolutional Neural Networks From Scratch using NumPy
Updated Jun 19, 2023 - Python
QReLU and m-QReLU: Two novel quantum activation functions for Deep Learning in TensorFlow, Keras, and PyTorch
Super-resolves images by 3x using a CNN
A library for building feed-forward neural networks, convolutional nets, linear regression, and logistic regression models.
A neural network with a single hidden layer that predicts which clothing item is shown in a Fashion-MNIST image
A classifier to differentiate between Cat and Non-Cat Images
Neural Network from scratch without any machine learning libraries
Trained on the MNIST dataset, this model predicts handwritten digits.
Building Convolutional Neural Networks from Scratch (a minimal NumPy convolution sketch appears after this list)
A fully connected neural network using mini-batch gradient descent and softmax for classification on the MNIST dataset
Backward pass of the ReLU activation function for a neural network (a minimal NumPy sketch appears after this list).
Gesture recognition with a CNN built using my own neural network library.
A feed-forward neural network that classifies Facebook posts into low, moderate, or high like counts; backpropagation is implemented with a decaying learning rate.
Channel-wise partial convolutions for hardware-aware applications
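Several entries above describe building convolutional networks from scratch in NumPy. As referenced in the list, here is a minimal sketch of the core operation, a naive "valid" cross-correlation of one single-channel image with one kernel; the function name conv2d_naive and the setup are assumptions for illustration, not code from any listed repository.

```python
import numpy as np

def conv2d_naive(image, kernel, stride=1):
    """Naive 'valid' 2D cross-correlation of a single-channel image with one kernel."""
    H, W = image.shape
    kH, kW = kernel.shape
    out_h = (H - kH) // stride + 1
    out_w = (W - kW) // stride + 1
    out = np.zeros((out_h, out_w))
    for i in range(out_h):
        for j in range(out_w):
            # Multiply the current patch element-wise with the kernel and sum
            patch = image[i * stride:i * stride + kH, j * stride:j * stride + kW]
            out[i, j] = np.sum(patch * kernel)
    return out

# Usage example (made-up data): 5x5 image, 3x3 kernel -> 3x3 feature map
img = np.arange(25, dtype=float).reshape(5, 5)
k = np.array([[1., 0., -1.],
              [1., 0., -1.],
              [1., 0., -1.]])
print(conv2d_naive(img, k).shape)  # (3, 3)
```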
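Since this topic page centers on ReLU layers, here is a minimal NumPy sketch of the ReLU forward and backward pass referenced in the list above. The relu_forward/relu_backward names and the cache convention are assumptions for illustration, not code from any listed repository.

```python
import numpy as np

def relu_forward(x):
    """Forward pass: element-wise max(0, x). Returns the output and a cache for backprop."""
    out = np.maximum(0, x)
    cache = x
    return out, cache

def relu_backward(dout, cache):
    """Backward pass: pass the upstream gradient only where the input was positive."""
    x = cache
    dx = dout * (x > 0)
    return dx

# Usage example with made-up data
x = np.array([[-1.0, 2.0], [3.0, -4.0]])
out, cache = relu_forward(x)
dout = np.ones_like(x)           # pretend upstream gradient
dx = relu_backward(dout, cache)  # [[0., 1.], [1., 0.]]
```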