K-Nearest Neighbors (KNN) Classification
Picture this: you're at a crowded Indian wedding, trying to figure out which side of the family a guest belongs to. You don't ask for their ID; you just look at who they're sitting with. If they're surrounded by the groom's cousins cracking jokes, they're probably part of the baraat. If they're huddled with the aunts discussing the catering, they're likely from the bride's side. Judging someone by their immediate neighbors is exactly how the K-Nearest Neighbors (KNN) algorithm works. It is perhaps the most "human" algorithm in data science, relying on the simple intuition that similar things tend to stay close together.

Euclidean Distance
KNN needs a notion of "closeness", and the most common choice is Euclidean (straight-line) distance. For two points p = (p₁, …, pₖ) and q = (q₁, …, qₖ) in K dimensions:

d(p, q) = √((p₁ − q₁)² + (p₂ − q₂)² + … + (pₖ − qₖ)²)
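As a quick illustration, here is a minimal Python sketch of this formula (the standard library also ships it as math.dist):

```python
import math

def euclidean_distance(p, q):
    """Straight-line distance between two points with the same number of dimensions."""
    return math.sqrt(sum((pi - qi) ** 2 for pi, qi in zip(p, q)))

print(euclidean_distance((4, 4), (2, 3)))  # 2.236..., the X-to-B distance used below
print(math.dist((4, 4), (2, 3)))           # same result via the standard library
```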
Take the K data points nearest to X (K is usually chosen to be odd so the vote cannot end in a perfect tie), look at the class of each of those K neighbors, and assign X to the majority class among them. This is the standard (unweighted) KNN classifier.
Let's say we have the following data points:

| Point | X | Y | Class |
|-------|---|---|-------|
| A | 1 | 2 | Red |
| B | 2 | 3 | Red |
| C | 3 | 1 | Blue |
| D | 6 | 5 | Red |
| E | 7 | 7 | Red |
| F | 8 | 6 | Red |
| G | 3 | 4 | Blue |
| H | 4 | 3 | Blue |
| I | 5 | 2 | Blue |
| J | 6 | 3 | Red |
| X | 4 | 4 | Unknown |
Now we want to classify point X = (4, 4) using K = 5.
Step 1: Calculate Euclidean Distances from X(4,4)
| Point | Coordinates | Distance to X(4,4) | Class |
|-------|-------------|--------------------|-------|
| A | (1,2) | 3.61 | Red |
| B | (2,3) | 2.24 | Red |
| C | (3,1) | 3.16 | Blue |
| D | (6,5) | 2.24 | Red |
| E | (7,7) | 4.24 | Red |
| F | (8,6) | 4.47 | Red |
| G | (3,4) | 1.00 | Blue |
| H | (4,3) | 1.00 | Blue |
| I | (5,2) | 2.24 | Blue |
| J | (6,3) | 2.24 | Red |
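As a sanity check on the table, the same distances can be computed in one pass; the dictionary layout here is just an assumption of this illustration:

```python
import math

points = {"A": (1, 2), "B": (2, 3), "C": (3, 1), "D": (6, 5), "E": (7, 7),
          "F": (8, 6), "G": (3, 4), "H": (4, 3), "I": (5, 2), "J": (6, 3)}
x = (4, 4)

# math.dist computes the Euclidean distance between two points.
for name, p in points.items():
    print(name, round(math.dist(p, x), 2))
```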
Step 2: Choose the K = 5 Nearest Neighbors
Sorted by distance:
- G (1.00) Blue
- H (1.00) Blue
- B (2.24) Red
- D (2.24) Red
- I (2.24) Blue

Note that B, D, I, and J are all tied at distance 2.24, so only three of the four can make the cut; here we keep B, D, and I. The tie-break matters: keeping J instead of I would flip the vote to Red, which is why implementations need a deterministic rule (e.g., table order) for equidistant neighbors.
Step 3: Voting (Majority Class)
From the 5 nearest neighbors:
- Red → 2 votes (B, D)
- Blue → 3 votes (G, H, I)

Majority = Blue, so X(4,4) is classified as Blue.
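Putting the three steps together, a from-scratch sketch of the plain KNN vote might look like the following; the data layout and the tie-break (Python's stable sort keeps table order among equal distances, so B, D, and I win out over J) are choices of this example, not part of the algorithm:

```python
import math
from collections import Counter

def knn_classify(train, x, k):
    """train is a list of ((x, y), label) pairs; returns the majority label
    among the k training points closest to x."""
    nearest = sorted(train, key=lambda item: math.dist(item[0], x))[:k]
    votes = Counter(label for _, label in nearest)
    return votes.most_common(1)[0][0]

train = [((1, 2), "Red"), ((2, 3), "Red"), ((3, 1), "Blue"), ((6, 5), "Red"),
         ((7, 7), "Red"), ((8, 6), "Red"), ((3, 4), "Blue"), ((4, 3), "Blue"),
         ((5, 2), "Blue"), ((6, 3), "Red")]
print(knn_classify(train, (4, 4), k=5))  # Blue
```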
Weighted KNN Classifier
Unlike standard KNN, where every neighbor gets an equal vote, in Weighted KNN closer neighbors have more influence on the class of the new point.
The weight is usually the inverse of the distance (w = 1/d) or the inverse of the squared distance (w = 1/d²): the smaller the distance, the higher the weight. The example below uses w = 1/d².
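In code, that weighting rule is a one-liner; the small epsilon guarding against a neighbor at distance zero is an assumption of this sketch:

```python
def weight(distance, eps=1e-9):
    # Inverse squared distance: the closer the neighbor, the larger its vote.
    return 1.0 / (distance ** 2 + eps)
```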
Example: Weighted KNN
| Point | X | Y | Class |
|-------|---|---|-------|
| A | 2 | 3 | Red |
| B | 3 | 2 | Blue |
| C | 4 | 3 | Red |
| D | 5 | 1 | Blue |
| E | 3 | 5 | Red |
| F | 5 | 4 | Blue |
| X | 4 | 2 | Unknown |
Classify point X = (4, 2) with K = 3.
Step 1: Compute Euclidean Distances to X(4,2)
| Point | Coordinates | Distance to X(4,2) | Class |
|-------|-------------|--------------------|-------|
| A | (2,3) | 2.24 | Red |
| B | (3,2) | 1.00 | Blue |
| C | (4,3) | 1.00 | Red |
| D | (5,1) | 1.41 | Blue |
| E | (3,5) | 3.16 | Red |
| F | (5,4) | 2.24 | Blue |
Step 2: Pick the K = 3 Nearest Neighbors
Sorted by distance:
- B (1.00) Blue
- C (1.00) Red
- D (1.41) Blue
Step 3: Assign Weights (w = 1/d²)

| Point | Distance | Class | Weight |
|-------|----------|-------|--------|
| B | 1.00 | Blue | 1/1.00² = 1.00 |
| C | 1.00 | Red | 1/1.00² = 1.00 |
| D | 1.41 | Blue | 1/1.41² ≈ 0.50 |
Step 4: Weighted Voting
- Blue: 1.00 (B) + 0.50 (D) = 1.50
- Red: 1.00 (C) = 1.00

Weighted majority = Blue, so X(4,2) is classified as Blue.
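A from-scratch sketch of the whole weighted vote, using the same 1/d² rule (for comparison, scikit-learn's KNeighborsClassifier(weights='distance') behaves similarly but weights by 1/d, which gives the same answer on this data):

```python
import math
from collections import defaultdict

def weighted_knn_classify(train, x, k):
    """train is a list of ((x, y), label) pairs; each of the k nearest
    neighbors votes with weight 1/d^2 instead of a flat 1."""
    nearest = sorted(train, key=lambda item: math.dist(item[0], x))[:k]
    scores = defaultdict(float)
    for point, label in nearest:
        d = math.dist(point, x)
        scores[label] += 1.0 / (d ** 2 + 1e-9)  # epsilon guards against d == 0
    return max(scores, key=scores.get)

train = [((2, 3), "Red"), ((3, 2), "Blue"), ((4, 3), "Red"),
         ((5, 1), "Blue"), ((3, 5), "Red"), ((5, 4), "Blue")]
print(weighted_knn_classify(train, (4, 2), k=3))  # Blue (score 1.50 vs 1.00)
```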

