
Gradient descent feature scaling

Abstract

One way to speed up gradient descent is to put every feature on a comparable range. There are two common techniques for this: feature scaling and mean normalization. The two can be combined with the following formula:

$$x_i := \frac{x_i - \mu_i}{s_i}$$

where $\mu_i$ is the mean of feature $i$, $s_i = \max(x_i) - \min(x_i)$ (the range of the feature) or $s_i = \sigma_i$ (its standard deviation), and $i$ is the index of the feature.

This repository contains a vectorized implementation of the technique described above; a sketch of the idea follows.
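
As a minimal sketch of what such a vectorized update can look like (the repo's actual code and language are not shown here, and the `normalize_features` name is hypothetical), the formula can be applied to every feature at once with NumPy broadcasting:

```python
import numpy as np

def normalize_features(X):
    """Scale each feature (column) of X with x_i := (x_i - mu_i) / s_i.

    mu_i is the per-feature mean; s_i is the per-feature range
    (max - min) here, with the standard deviation as the alternative.
    """
    mu = X.mean(axis=0)                 # mu_i: mean of each feature
    s = X.max(axis=0) - X.min(axis=0)   # s_i: range of each feature
    # s = X.std(axis=0)                 # alternative: standard deviation
    return (X - mu) / s                 # broadcast over all rows at once

# Example: three samples, two features on very different scales
X = np.array([[2104.0, 3.0],
              [1600.0, 3.0],
              [2400.0, 4.0]])
print(normalize_features(X))
```

With either choice of $s_i$, every column ends up centered near zero and in a similar range, which is what lets gradient descent take well-conditioned steps.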
