A perceptron with a unipolar step function has two inputs with weights w1 = 0.5 and w2 = -0.2, and a threshold theta = 0.3 (theta can therefore be treated as the weight on an extra input that is always set to -1). The perceptron is trained using the learning rule ∆w = η (d − y) x, where x is the input vector, w is the weight vector, d is the desired output, y is the actual output, and η is the learning rate. What are the new values of the weights and threshold after one step of training with the input vector x = [0, 1]^T and desired output d = 1, using a learning rate η = 0.5?
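A minimal sketch of the single training step, assuming the usual convention that the unipolar step outputs 1 when the net input (with theta folded in as the weight on the constant -1 input) is at least 0, and 0 otherwise:

```python
# One perceptron training step for this problem (sketch).
# Assumption: unipolar step returns 1 if net >= 0, else 0.
# Theta is treated as the weight on a constant -1 input.

eta = 0.5                      # learning rate
w = [0.5, -0.2, 0.3]           # [w1, w2, theta]
x = [0.0, 1.0, -1.0]           # input vector extended with the constant -1 input
d = 1                          # desired output

# Forward pass: net input and unipolar step activation
net = sum(wi * xi for wi, xi in zip(w, x))
y = 1 if net >= 0 else 0

# Learning rule: delta_w = eta * (d - y) * x, applied component-wise
w = [wi + eta * (d - y) * xi for wi, xi in zip(w, x)]

print("net =", net, "y =", y)
print("new w1 =", w[0], "new w2 =", w[1], "new theta =", w[2])
```

Under that sign convention the net input for x = [0, 1]^T is negative, so y = 0 and d − y = 1; the step then leaves w1 unchanged, adds η to w2, and subtracts η from theta.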