# gelu
Here are 2 public repositories matching this topic.
"The 'Activation Functions' project repository contains implementations of various activation functions commonly used in neural networks. "
python algorithms swish gaussian artificial-neural-networks prelu artificial-intelligence-algorithms maxout sigmoid tanh elu-activation selu relu activation-functions gelu sigmoid-activation relu-activation tanh-activation leakyrelu sigmoid-activation-function
Updated Mar 18, 2024 - Python
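For context on what such implementations typically contain: GELU (Gaussian Error Linear Unit) is defined as GELU(x) = x · Φ(x), where Φ is the standard normal CDF. Below is a minimal, dependency-free Python sketch of the exact form and the common tanh approximation; it is illustrative only, not code from the repository above.

```python
import math

def gelu_exact(x: float) -> float:
    # Exact GELU: x * Phi(x), with Phi the standard normal CDF.
    return 0.5 * x * (1.0 + math.erf(x / math.sqrt(2.0)))

def gelu_tanh(x: float) -> float:
    # Tanh approximation from Hendrycks & Gimpel (2016).
    return 0.5 * x * (1.0 + math.tanh(math.sqrt(2.0 / math.pi) * (x + 0.044715 * x**3)))

print(gelu_exact(1.0), gelu_tanh(1.0))  # both close to 0.8413
```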
A PyTorch implementation of normalization-free LLMs that investigates entropic behavior to identify desirable activation functions (a short usage sketch follows this entry).
pythia leaky-relu relu privacy-preserving-machine-learning pytorch-implementation gelu gpt-2 model-optimization transformers-models normalization-free-training llm-inference llm-evaluation llm-architecture private-inference entropy-collapse attention-we
Updated Nov 2, 2024 - Python
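Since that repository is PyTorch-based, here is a minimal sketch, assuming only stock PyTorch (1.12 or newer for the `approximate` argument), of how GELU is typically invoked; the tensor and values are illustrative and not taken from the repository's code.

```python
import torch
import torch.nn.functional as F

x = torch.linspace(-3.0, 3.0, steps=7)

# Exact (erf-based) GELU, the default in PyTorch.
y_exact = F.gelu(x)

# Faster tanh approximation, as used e.g. in GPT-2-style MLP blocks.
y_tanh = F.gelu(x, approximate="tanh")

# The two variants differ only slightly.
print((y_exact - y_tanh).abs().max())
```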