OR Gate Model using Perceptron
Let's mathematically demonstrate how a perceptron can implement the OR gate. The OR gate outputs 1 if at least one of its inputs is 1. We'll choose weights and a bias for the perceptron and verify that it reproduces the OR truth table.
Problem Description (OR Gate):
The OR gate has the following truth table:

| x1 | x2 | y |
|----|----|---|
| 0  | 0  | 0 |
| 0  | 1  | 1 |
| 1  | 0  | 1 |
| 1  | 1  | 1 |
Perceptron Model:
The Perceptron model predicts:
- If (w_1 · x1 + w_2 · x2 + b > 0), then (y' = 1).
- If (w_1 · x1 + w_2 · x2 + b <= 0), then (y' = 0).
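The decision rule above can be sketched as a small Python function (the function name and argument layout are illustrative, not from the original):

```python
def perceptron_predict(x1, x2, w1, w2, b):
    # Step activation: output 1 only when the weighted sum is positive.
    weighted_sum = w1 * x1 + w2 * x2 + b
    return 1 if weighted_sum > 0 else 0
```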
Weights and Bias:
- Set both weights (w_1) and (w_2) to 1.
- Set the bias (b) to -0.5.
Calculations:
For each row in the OR truth table:
- Row 1 (x1=0, x2=0):
  - (w_1 · x1 + w_2 · x2 + b = 0 + 0 - 0.5 = -0.5)
  - Since (-0.5 <= 0), (y' = 0).
- Row 2 (x1=0, x2=1):
  - (w_1 · x1 + w_2 · x2 + b = 0 + 1 - 0.5 = 0.5)
  - Since (0.5 > 0), (y' = 1).
- Row 3 (x1=1, x2=0):
  - (w_1 · x1 + w_2 · x2 + b = 1 + 0 - 0.5 = 0.5)
  - Since (0.5 > 0), (y' = 1).
- Row 4 (x1=1, x2=1):
  - (w_1 · x1 + w_2 · x2 + b = 1 + 1 - 0.5 = 1.5)
  - Since (1.5 > 0), (y' = 1).
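The four row-by-row checks above can be reproduced with a short loop (a minimal sketch; variable names are illustrative):

```python
w1, w2, b = 1, 1, -0.5  # weights and bias chosen above

# OR truth table as ((x1, x2), expected output) pairs
or_table = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]

for (x1, x2), target in or_table:
    weighted_sum = w1 * x1 + w2 * x2 + b
    y_pred = 1 if weighted_sum > 0 else 0
    print(f"x1={x1}, x2={x2}: sum={weighted_sum:+.1f} -> y'={y_pred} (expected {target})")
```

Each printed row matches the hand calculation for the corresponding truth-table entry.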
Conclusion:
The perceptron model for the OR gate computes the weighted sum (x1 + x2 - 0.5) and outputs (y' = 1) when it is positive, and (y' = 0) otherwise. This reproduces the OR truth table on all four rows.
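The introduction mentions training a perceptron; the hand-picked weights above can also be learned from the truth table with the classic perceptron learning rule. A sketch under assumed hyperparameters (the learning rate, epoch count, and function name are illustrative, not from the original):

```python
def train_perceptron(data, lr=0.1, epochs=20):
    """Perceptron learning rule: nudge weights toward each misclassified target."""
    w1 = w2 = b = 0.0
    for _ in range(epochs):
        for (x1, x2), target in data:
            y_pred = 1 if w1 * x1 + w2 * x2 + b > 0 else 0
            err = target - y_pred          # 0 when correct, +/-1 when wrong
            w1 += lr * err * x1
            w2 += lr * err * x2
            b += lr * err
    return w1, w2, b

or_table = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]
w1, w2, b = train_perceptron(or_table)
```

Because OR is linearly separable, the rule converges; the learned weights need not equal (1, 1, -0.5) exactly, but they define an equivalent separating line.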