# KNN-visualization

The K-Nearest Neighbors (KNN) algorithm is one of the simplest, easiest-to-implement, and yet effective supervised machine learning algorithms. It is called a lazy learner algorithm because it does not learn from the training set immediately; instead, it stores the dataset and, at the time of classification, performs its computation on the stored data.

Its advantages:

- No training period: KNN is called a lazy learner (instance-based learning). It does not learn anything during a training phase and derives no discriminative function from the training data.
- Since KNN requires no training before making predictions, new data can be added seamlessly without affecting the accuracy of the algorithm.
- KNN is very easy to implement. Only two parameters are required: the value of K and the distance function (see the sketch after this list).

Here we will be using Euclidean distance to calculate the distance of a new data point from each point in our training dataset.
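A minimal from-scratch sketch of this idea is shown below, assuming NumPy; the helper names `euclidean` and `knn_predict` are illustrative and not taken from this repository.

```python
import numpy as np
from collections import Counter

def euclidean(a, b):
    # Euclidean distance: square root of the sum of squared feature differences
    return np.sqrt(np.sum((a - b) ** 2))

def knn_predict(X_train, y_train, x_new, k=3):
    # distance from the new point to every stored training point
    distances = [euclidean(x, x_new) for x in X_train]
    # indices of the k closest training points
    nearest = np.argsort(distances)[:k]
    # majority vote among their labels
    return Counter(y_train[i] for i in nearest).most_common(1)[0][0]

# Toy usage: two clusters, predict the label of a point near the first one.
X_train = np.array([[1.0, 2.0], [2.0, 3.0], [8.0, 8.0], [9.0, 7.0]])
y_train = np.array([0, 0, 1, 1])
print(knn_predict(X_train, y_train, np.array([1.5, 2.5]), k=3))  # -> 0
```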
