K-Nearest Neighbors is one of the most widely used algorithms in classification problems. Implementing it from scratch is a great programming exercise and can give us plenty of insight into data processing, learning optimization, and statistical learning in general. That's what I'll try to accomplish in this article, and it will hopefully lead us to a second part discussing PCA.

If you are familiar with libraries such as scikit-learn, you know how easy it can be to implement a machine learning algorithm. But there's a lot happening behind a classification or regression algorithm, and understanding how those algorithms work is essential to building a truly efficient model that works properly. Also, implementing ML algorithms from scratch is one of the most effective ways to really learn what Artificial Intelligence is about. …
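To make the "what's happening behind the scenes" point concrete, here is a minimal from-scratch KNN sketch in NumPy. The function and variable names are my own, not from the article; this is just one straightforward way to express the distance-then-vote idea:

```python
import numpy as np
from collections import Counter

def knn_predict(X_train, y_train, x, k=3):
    """Classify point x by majority vote among its k nearest training points."""
    # 1. Euclidean distance from x to every training point
    dists = np.linalg.norm(X_train - x, axis=1)
    # 2. Labels of the k closest points
    nearest = y_train[np.argsort(dists)[:k]]
    # 3. Majority vote decides the class
    return Counter(nearest).most_common(1)[0][0]

# Toy data: two small clusters with labels 0 and 1
X_train = np.array([[0.0, 0.0], [0.2, 0.1], [1.0, 1.0], [0.9, 1.1]])
y_train = np.array([0, 0, 1, 1])
print(knn_predict(X_train, y_train, np.array([0.1, 0.0])))  # → 0
print(knn_predict(X_train, y_train, np.array([1.0, 0.9])))  # → 1
```

Those few lines are essentially what a library call hides from you: a distance metric, a neighbor search, and a voting rule, each of which is a design choice you can only reason about once you've seen it spelled out.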

If you have searched for Neural Networks, Deep Learning, Machine Learning, or anything that has to do with Artificial Intelligence, you've probably heard about the Perceptron. The famous algorithm created by Frank Rosenblatt in 1958 is a landmark in Neural Networks and, consequently, Deep Learning, and understanding the way it works (i.e., the math behind it) is essential to go a step further into more complex AI models, particularly Neural Networks.

We can say that all a perceptron does is apply some transformations (mathematical functions) to a set of inputs to represent them in a meaningful way in space (actually, that's all any machine learning algorithm does, but keep that between us). Besides the transformations, we need one more step to make things work: the *learning step.* The learning step checks whether our transformation is working well. For that, we need the answer to our problem, so we can compare whether our actual output is correct (or close to the correct answer). …
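The two ingredients above (a transformation of the inputs, plus a learning step that compares the output with the known answer) can be sketched in a few lines of NumPy. The class and parameter names here are illustrative, not from the article:

```python
import numpy as np

class Perceptron:
    """Minimal perceptron sketch: weighted sum + step function, with error-driven updates."""

    def __init__(self, n_features, lr=0.1):
        self.w = np.zeros(n_features)  # weights of the linear transformation
        self.b = 0.0                   # bias term
        self.lr = lr                   # learning rate

    def predict(self, x):
        # The "transformation": a linear combination of the inputs, then a step function
        return 1 if np.dot(self.w, x) + self.b >= 0 else 0

    def fit(self, X, y, epochs=10):
        # The "learning step": compare each output with the known answer
        # and nudge the weights in the direction that reduces the error
        for _ in range(epochs):
            for xi, yi in zip(X, y):
                error = yi - self.predict(xi)
                self.w += self.lr * error * xi
                self.b += self.lr * error

# Learn a logical AND, a classic linearly separable problem
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 0, 0, 1])
p = Perceptron(2)
p.fit(X, y)
print([p.predict(xi) for xi in X])  # → [0, 0, 0, 1]
```

Note that the update rule only fires when `error` is nonzero, which is exactly the "check if our transformation is working fine" idea: correct predictions leave the weights untouched.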
