KNN Classification

Published: May 2, 2026

Divya Sachan
KNN Classification / Classifier

Picture this: you're at a crowded Indian wedding, trying to figure out which side of the family a guest belongs to. You don't ask for their ID; you just look at who they're sitting with. If they're surrounded by the groom's cousins cracking jokes, they're probably part of the Barat. If they're huddled with the aunts discussing the catering, they're likely from the bride's side. Judging someone by their immediate neighbors is exactly how the K-Nearest Neighbors (KNN) algorithm works. It is perhaps the most "human" algorithm in data science, relying on the simple intuition that similar things tend to stay close together.


Euclidean Distance

KNN measures "closeness" with a distance metric; the usual choice is the Euclidean distance, which generalizes naturally to k dimensions.
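For two points p = (p_1, ..., p_k) and q = (q_1, ..., q_k), the distance in k dimensions is:

d(p, q) = \sqrt{\sum_{i=1}^{k} (p_i - q_i)^2}

With k = 2 this reduces to the familiar 2D formula used in the examples below.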

To classify an unknown point X, we find the K data points nearest to X (K is generally chosen to be an odd number to avoid ties), check the class of each of those K neighbors, and then use a majority vote to decide the class of X. This is the standard (unweighted) KNN classifier.

Let's say we have the following data points:

Point  X  Y  Class
A      1  2  🟥 Red
B      2  3  🟥 Red
C      3  1  🟦 Blue
D      6  5  🟥 Red
E      7  7  🟥 Red
F      8  6  🟥 Red
G      3  4  🟦 Blue
H      4  3  🟦 Blue
I      5  2  🟦 Blue
J      6  3  🟥 Red
X      4  4  ❓ Unknown

Now we want to classify point X = (4, 4) using K = 5

📏 Step 1: Calculate Euclidean Distances from X(4,4)

Distance = \sqrt{(x_2 - x_1)^2 + (y_2 - y_1)^2}

Point  Coordinates  Distance to X(4,4)  Class
A      (1,2)        3.61                🟥
B      (2,3)        2.24                🟥
C      (3,1)        3.16                🟦
D      (6,5)        2.24                🟥
E      (7,7)        4.24                🟥
F      (8,6)        4.47                🟥
G      (3,4)        1.00                🟦
H      (4,3)        1.00                🟦
I      (5,2)        2.24                🟦
J      (6,3)        2.24                🟥
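If you want to verify the distance column yourself, a few lines of Python reproduce it (the dictionary below simply restates the table of points):

```python
from math import dist  # Euclidean distance between two points (Python 3.8+)

points = {
    "A": (1, 2), "B": (2, 3), "C": (3, 1), "D": (6, 5), "E": (7, 7),
    "F": (8, 6), "G": (3, 4), "H": (4, 3), "I": (5, 2), "J": (6, 3),
}
x = (4, 4)

# Print each point's Euclidean distance to X(4,4)
for name, p in points.items():
    print(name, round(dist(p, x), 2))  # A 3.61, B 2.24, C 3.16, ...
```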

🔍 Step 2: Choose the K (5) Nearest Neighbors

Sorted by distance:

  1. G (1.0) 🟦
  2. H (1.0) 🟦
  3. B (2.24) 🟥
  4. D (2.24) 🟥
  5. I (2.24) 🟦

(Note: J is also at distance 2.24. With four points tied at that distance, only three can fill the remaining spots; the tie is broken arbitrarily here by keeping B, D, and I.)

🧠 Step 3: Voting (Majority Class)

From 5 nearest neighbors:

  • πŸŸ₯ Red β†’ 2 (B, D)
  • 🟦 Blue β†’ 3 (G, H, I)

βœ… Majority = Blue, so
πŸ‘‰ X(4,4) is classified as 🟦 Blue
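Putting the three steps together, here is a minimal plain-Python sketch of the standard KNN classifier, using the example data above (the variable names are just illustrative):

```python
from collections import Counter
from math import dist

# Training data from the example: point -> ((x, y), class)
data = {
    "A": ((1, 2), "Red"),  "B": ((2, 3), "Red"),  "C": ((3, 1), "Blue"),
    "D": ((6, 5), "Red"),  "E": ((7, 7), "Red"),  "F": ((8, 6), "Red"),
    "G": ((3, 4), "Blue"), "H": ((4, 3), "Blue"), "I": ((5, 2), "Blue"),
    "J": ((6, 3), "Red"),
}

def knn_classify(x, data, k=5):
    # Step 1: distance from x to every training point
    distances = [(dist(p, x), label) for p, label in data.values()]
    # Step 2: keep the k nearest neighbors
    nearest = sorted(distances)[:k]
    # Step 3: majority vote among their classes
    votes = Counter(label for _, label in nearest)
    return votes.most_common(1)[0][0]

print(knn_classify((4, 4), data))  # -> "Blue"
```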

🎯 Weighted KNN Classifier

Unlike the normal KNN where all neighbors have equal vote, in Weighted KNN, closer neighbors have more influence in deciding the class of the new point.

πŸ’‘ Weight is usually inverse of the distance
β€” smaller distance β†’ higher weight.
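A common concrete choice, and the one used in the worked example below, is to weight each neighbor by the inverse of its squared distance. A minimal sketch (the function name and the eps guard are purely illustrative):

```python
def inverse_distance_weight(d: float, power: int = 2, eps: float = 1e-9) -> float:
    """Weight of a neighbor at distance d: 1 / d**power.

    eps keeps the weight finite if a neighbor sits exactly on the query point.
    """
    return 1.0 / (d ** power + eps)
```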

🧪 Example: Weighted KNN

Point  X  Y  Class
A      2  3  🟥 Red
B      3  2  🟦 Blue
C      4  3  🟥 Red
D      5  1  🟦 Blue
E      3  5  🟥 Red
F      5  4  🟦 Blue
X      4  2  ❓ Unknown

👉 Classify point X = (4, 2) with K = 3

📏 Step 1: Compute Euclidean Distance to X(4,2)

Point  Coordinates  Distance to X(4,2)  Class
A      (2,3)        2.24                🟥
B      (3,2)        1.00                🟦
C      (4,3)        1.00                🟥
D      (5,1)        1.41                🟦
E      (3,5)        3.16                🟥
F      (5,4)        2.24                🟦

Step 2: Pick K = 3 Nearest Neighbors

Sorted by distance:

  1. B (1.0) 🟦
  2. C (1.0) 🟥
  3. D (1.41) 🟦

Step 3: Assign Weights

w_i = \frac{1}{(d_i)^2}

Point  Distance  Class  Weight
B      1.00      🟦     1.00
C      1.00      🟥     1.00
D      1.41      🟦     0.50

Step 4: Weighted Voting

  • 🟦 Blue: 1.00 (B) + 0.50 (D) = 1.50
  • 🟥 Red: 1.00 (C) = 1.00

✅ Weighted majority = Blue

👉 So, X(4,2) is classified as 🟦 Blue
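As with the plain classifier, this result can be checked with a short self-contained Python sketch that mirrors the tables above and the 1/d² weighting from Step 3 (again, names are illustrative):

```python
from collections import defaultdict
from math import dist

# Training data from the weighted example: point -> ((x, y), class)
data = {
    "A": ((2, 3), "Red"),  "B": ((3, 2), "Blue"), "C": ((4, 3), "Red"),
    "D": ((5, 1), "Blue"), "E": ((3, 5), "Red"),  "F": ((5, 4), "Blue"),
}

def weighted_knn_classify(x, data, k=3, eps=1e-9):
    # Step 1: distance from x to every training point, sorted ascending
    distances = sorted((dist(p, x), label) for p, label in data.values())
    # Step 2: keep the k nearest neighbors
    nearest = distances[:k]
    # Steps 3-4: weight each vote by 1 / d^2 and sum the weights per class
    scores = defaultdict(float)
    for d, label in nearest:
        scores[label] += 1.0 / (d ** 2 + eps)
    return max(scores, key=scores.get)

print(weighted_knn_classify((4, 2), data))  # -> "Blue"
```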