Perceptron

PUBLISHED: MAY 2, 2026 · 2 MIN READ


Abhijeet Singh Rajput, Author


Think of the Perceptron as a simple decision-maker—it looks at inputs, weighs their importance, and makes a yes-or-no judgment. It was one of the first steps toward building intelligent systems and still plays a crucial role in understanding how neural networks work today. By learning how to draw a boundary between different classes of data, the perceptron forms the foundation for more advanced models used in real-world applications.

📘 Definition:

A Perceptron is an artificial neuron in which the activation function is a threshold function.
[Figure: Perceptron model diagram showing input features, weights, summation, and step activation function producing a binary output]

📌 Terminology:

  • x_1, x_2, ..., x_n = input signals
  • w_1, w_2, ..., w_n = associated weights
  • x_0 = bias input (typically set to 1)
  • w_0 = bias weight (adjusts the activation threshold)
  • Σ = weighted sum of the inputs
  • f = activation function (determines the output)
  • O = final output signal

A neuron is called a perceptron if its output is given by the following function:

O(x_1, x_2, \ldots, x_n) = \begin{cases} 1 & \text{if } w_0 + w_1 x_1 + \cdots + w_n x_n > 0 \\ -1 & \text{otherwise} \end{cases}

This is a step function: if the weighted sum exceeds the threshold, the neuron "fires" (outputs 1); otherwise it outputs −1.
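The decision rule above can be sketched in a few lines of Python. This is a minimal illustration, not from the article itself; the function name and example weights are chosen for demonstration (here w_0 = −1.5 acts as the threshold):

```python
# Minimal sketch of a perceptron's decision rule.
# Outputs +1 when the weighted sum (including the bias weight w0) is
# positive, and -1 otherwise, matching the threshold function above.

def predict(weights, x):
    # weights[0] is the bias weight w0; x holds the inputs x1..xn
    total = weights[0] + sum(w * xi for w, xi in zip(weights[1:], x))
    return 1 if total > 0 else -1

print(predict([-1.5, 1.0, 1.0], [1, 1]))  # -1.5 + 1 + 1 = 0.5 > 0, so 1
print(predict([-1.5, 1.0, 1.0], [0, 1]))  # -1.5 + 0 + 1 = -0.5, so -1
```

With these illustrative weights the perceptron computes a logical AND of two binary inputs: the sum only crosses zero when both inputs are 1.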

Perceptron Learning Algorithm

In the algorithm, we use the following notations

| Symbol | Description |
| --- | --- |
| n | Number of input variables |
| y_j(t) | Output for input vector x_j at iteration t |
| x_j | The n-dimensional input vector (j-th training sample) |
| d_j | Desired output for input x_j |
| x_ji | Value of the i-th feature in the j-th input |
| x_j0 | Bias input (always 1) |
| w_i | Weight for the i-th input variable |
| w_i(t) | Weight for the i-th input at the t-th iteration |
| s | Number of training samples |

🔷 Algorithm Steps

🔹 Step 1: Initialization

Initialize the weights:

w_0, w_1, \ldots, w_n
  • Can be initialized to 0 or small random values.
  • Also, initialize a threshold (bias term).

🔹 Step 2: Training (for each training sample)

For each example j in the training set, perform the following steps over the input x_j and desired output d_j:

  1. Compute the output
    This is the predicted output of the perceptron after processing input x_j using the current weights at iteration t. The perceptron computes this using:
y_j(t) = f\left(\sum_{i=0}^{n} w_i(t) \cdot x_{ji}\right) = f\left(w_0(t) \cdot x_{j0} + w_1(t) \cdot x_{j1} + \cdots + w_n(t) \cdot x_{jn}\right)
  2. Update weights for each w_i using:
w_i(t+1) = w_i(t) + \left(d_j - y_j(t)\right) \cdot x_{ji}, \quad \text{for } i = 0, 1, \ldots, n
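The update rule in Step 2 can be sketched as follows. This is an illustrative snippet (function and variable names are my own), assuming the inputs are passed with the bias term prepended (x_j0 = 1):

```python
# Sketch of one perceptron weight update (Step 2):
#   w_i(t+1) = w_i(t) + (d_j - y_j(t)) * x_ji   for i = 0..n
# If the prediction y matches the desired output d, (d - y) is 0 and the
# weights are unchanged; on a misclassification, each weight shifts in
# proportion to its input.

def update(weights, x, d, y):
    return [w + (d - y) * xi for w, xi in zip(weights, x)]

# Misclassified sample: desired d = 1, predicted y = -1, so (d - y) = 2
w = update([0.0, 0.0, 0.0], [1, 1, 1], d=1, y=-1)
print(w)  # [2.0, 2.0, 2.0]

# Correctly classified sample: no change
print(update(w, [1, 0, 1], d=1, y=1))  # [2.0, 2.0, 2.0]
```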

🔹 Step 3: Repeat Until Convergence

  • Repeat Step 2 until either:
    • The average error per iteration is less than a predefined threshold:
\frac{1}{s}\sum_{j=1}^{s}\left|d_j - y_j(t)\right| < \text{error threshold}
  • OR
    • A maximum number of iterations is reached.
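Putting the three steps together, here is an end-to-end sketch of the learning algorithm trained on the AND function with ±1 labels. The dataset and all names are illustrative choices, not from the article; AND is linearly separable, so the perceptron convergence theorem guarantees the loop terminates:

```python
# Full perceptron training loop: initialize weights to 0 (Step 1),
# apply the compute/update rule per sample (Step 2), and stop when the
# average error falls below a threshold or max_iters is hit (Step 3).

def step(z):
    # Threshold activation: +1 if the weighted sum is positive, else -1
    return 1 if z > 0 else -1

def train(samples, n_features, max_iters=100, err_threshold=0.0):
    w = [0.0] * (n_features + 1)        # w0 (bias) .. wn, initialized to 0
    for _ in range(max_iters):
        total_err = 0.0
        for x, d in samples:
            xb = [1] + list(x)          # prepend bias input x_j0 = 1
            y = step(sum(wi * xi for wi, xi in zip(w, xb)))
            w = [wi + (d - y) * xi for wi, xi in zip(w, xb)]
            total_err += abs(d - y)
        if total_err / len(samples) <= err_threshold:
            break                       # average error below threshold
    return w

# Toy dataset: logical AND with -1/+1 labels
data = [((0, 0), -1), ((0, 1), -1), ((1, 0), -1), ((1, 1), 1)]
w = train(data, n_features=2)
print([step(w[0] + w[1] * x1 + w[2] * x2) for (x1, x2), _ in data])
# [-1, -1, -1, 1]
```

Note that this only works because AND is linearly separable; on data like XOR, the loop would run until max_iters without ever reaching zero error, which is why the maximum-iterations cap in Step 3 matters in practice.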