# activators

Artificial Neural Network activation functions:

  1. Rectified Linear Unit (ReLU)
  2. Logistic Sigmoid
  3. Hyperbolic Tangent
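
For reference, these are the standard definitions of the three functions and their derivatives (the library's exact numerical behavior, e.g. at the ReLU kink `x = 0`, is not documented here and may differ):

```latex
\mathrm{ReLU}(x) = \max(0, x),
\qquad
\mathrm{ReLU}'(x) = \begin{cases} 1 & x > 0 \\ 0 & x \le 0 \end{cases}
```

```latex
\sigma(x) = \frac{1}{1 + e^{-x}},
\qquad
\sigma'(x) = \sigma(x)\,\bigl(1 - \sigma(x)\bigr)
```

```latex
\tanh(x) = \frac{e^{x} - e^{-x}}{e^{x} + e^{-x}},
\qquad
\tanh'(x) = 1 - \tanh^{2}(x)
```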

## Installing

From the NuGet Package Manager Console:

```
PM> Install-Package ann.activators.koryakinp
```
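NuGet packages can generally also be installed from the .NET CLI; assuming this package is published on nuget.org, the equivalent command would be:

```
dotnet add package ann.activators.koryakinp
```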

## Example

```csharp
var activator = ActivatorFactory.Produce(ActivatorType.Relu);
var x = activator.CalculateValue(3.5);       // activation value at 3.5
var dx = activator.CalculateDeriviative(3.5); // derivative at 3.5
```

Assuming a standard ReLU, `x` evaluates to `3.5` (since `max(0, 3.5) = 3.5`) and `dx` to `1` (the slope of ReLU for positive inputs).

## Authors

Pavel Koryakin (koryakinp@koryakinp.com)

## License

This project is licensed under the MIT License; see the LICENSE.md file for details.