In: Computer Science
Using the four feature vectors below in the order listed (the first vector is [0, 1, 0, 1]), with a bias input of constant one, and assuming the random initial weights are [0.1, -0.6, 0.3, -0.7], calculate the next four weight updates using the perceptron learning algorithm. Assume the learning rate alpha is equal to 0.2.
| X | Y | Z | Bias | Class |
|---|---|---|------|-------|
| 0 | 1 | 0 | 1    | A(+1) |
| 1 | 0 | 0 | 1    | A(+1) |
| 1 | 1 | 1 | 1    | A(+1) |
| 0 | 0 | 0 | 1    | B(-1) |
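The four requested iterations can be worked out directly. Assuming a threshold theta = 0 (the question does not specify one) and the update rule w = w + alpha * t * x applied only on misclassification, a short sketch of the hand calculation:

```python
import numpy as np

# Training data: columns are [X, Y, Z, Bias]; bias input is fixed at 1.
X = np.array([[0, 1, 0, 1],
              [1, 0, 0, 1],
              [1, 1, 1, 1],
              [0, 0, 0, 1]], dtype=float)
t = np.array([1, 1, 1, -1], dtype=float)   # class A -> +1, class B -> -1

w = np.array([0.1, -0.6, 0.3, -0.7])       # given initial weights
alpha = 0.2                                # given learning rate

for i in range(4):
    yin = np.dot(X[i], w)
    y = 1 if yin > 0 else -1               # assumed threshold: theta = 0
    if y != t[i]:
        w = w + alpha * t[i] * X[i]        # update only on misclassification
    print(f'iteration {i + 1}: yin = {yin:+.2f}, y = {y:+d}, '
          f'w = {np.round(w, 2)}')
```

Under these assumptions the first three vectors are misclassified (yin = -1.3, -0.4, -0.1, each giving y = -1 against a target of +1), so the weights move to [0.1, -0.4, 0.3, -0.5], then [0.3, -0.4, 0.3, -0.3], then [0.5, -0.2, 0.5, -0.1]; the fourth vector is classified correctly (yin = -0.1, y = -1 = target), so after four iterations the weights remain [0.5, -0.2, 0.5, -0.1].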
Please find the code below.
Perceptron.ipynb
import numpy as np

R = int(input('Please input the no. of TRAINING DATA - '))
C = int(input('Please input the no. of NEURONS IN INPUT LAYER - '))

x = []
t = []
print('Please input TRAINING DATA & OUTPUT CLASS ASSOCIATED WITH IT - ')
for i in range(R):
    temp = list(map(float, input('X' + str(i + 1) + ' : ').split()))
    temp.append(1.0)                      # append the constant bias input
    x.append(temp)
    t.append(float(input('T' + str(i + 1) + ' : ')))
x = np.array(x)
t = np.array(t)

w = np.array(list(map(float, input('Please input value of WEIGHTS & BIAS : ').split())))
alpha = float(input('Please input the LEARNING RATE VALUE - '))
theta = float(input('Please input the THETA - '))

# Track the weights recorded after each sample in the previous and current
# epochs; training stops when a full epoch produces no weight change.
prev_w = np.random.random((R, C + 1))
curr_w = np.random.random((R, C + 1))
ep = 1
while not np.array_equal(prev_w, curr_w):
    print('\n******** EPOCH -', ep, '********')
    ep += 1
    prev_w = np.array(curr_w)
    for i in range(R):
        yin = np.dot(x[i], w)
        # apply the bipolar step activation with threshold theta
        if yin > theta:
            y = 1
        elif yin < -theta:
            y = -1
        else:
            y = 0
        # update the weights only when the sample is misclassified
        delw = np.zeros(C + 1, dtype=float)
        if y != t[i]:
            delw = alpha * t[i] * x[i]
        w = w + delw
        curr_w[i] = w
    print('NEW WEIGHTS & BIAS - \n', curr_w)

print('\nFINAL WEIGHTS & BIAS - ', w)
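Because the script above reads everything interactively, here is a minimal non-interactive sketch of the same training loop run on the four vectors from the question (again assuming theta = 0), which shows the weights it converges to:

```python
import numpy as np

X = np.array([[0, 1, 0, 1],
              [1, 0, 0, 1],
              [1, 1, 1, 1],
              [0, 0, 0, 1]], dtype=float)
t = np.array([1, 1, 1, -1], dtype=float)
w = np.array([0.1, -0.6, 0.3, -0.7])
alpha, theta = 0.2, 0.0

# Repeat epochs until a full pass over the data makes no weight change.
changed = True
while changed:
    changed = False
    for xi, ti in zip(X, t):
        yin = np.dot(xi, w)
        y = 1 if yin > theta else (-1 if yin < -theta else 0)
        if y != ti:
            w = w + alpha * ti * xi
            changed = True

print(np.round(w, 2))   # converged weights
```

Tracing this by hand, training reaches a consistent set of weights of approximately [0.5, 0.2, 0.5, -0.1], which classifies all four vectors correctly.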
If you have any doubts, please ask in the comments.