How does the weight update equation affect the convergence of Hebb's learning rule, the perceptron learning rule, and ADALINE?
Solution: The Hebbian learning rule specifies how much the weight of the connection between two units should be increased or decreased in proportion to the product of their activations. Because this update contains no error-correcting term, the Hebbian rule works well only as long as all the input patterns are orthogonal or uncorrelated; correlated patterns interfere with one another and the stored associations degrade.
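A minimal sketch of this idea, assuming a single linear unit whose output is clamped to the target during training (the names hebb_train and eta, and the use of NumPy, are illustrative assumptions, not from the original):

import numpy as np

def hebb_train(patterns, targets, eta=1.0):
    # patterns: (Q, n) array of inputs; targets: (Q,) desired outputs
    w = np.zeros(patterns.shape[1])
    for p, t in zip(patterns, targets):
        w += eta * t * p  # weight change proportional to the product of activations
    return w

# With orthogonal inputs the stored associations are recalled exactly:
p1, p2 = np.array([1.0, 0.0]), np.array([0.0, 1.0])
w = hebb_train(np.array([p1, p2]), np.array([1.0, -1.0]))
print(np.dot(w, p1), np.dot(w, p2))  # 1.0 -1.0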
The perceptron learning rule is an example of supervised training, in which the learning rule is provided with a set of examples of proper network behaviour:
{p1, t1}, {p2, t2}, ..., {pQ, tQ},
where pQ is an input to the network and tQ is the corresponding target output. As each input is applied to the network, the network output is compared to the target. The learning rule then adjusts the weights and biases of the network in order to move the network output closer to the target; by the perceptron convergence theorem, this error-driven update reaches a correct weight set in a finite number of steps whenever the training pairs are linearly separable.
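A minimal sketch of this training loop, assuming a single hard-limit unit with 0/1 targets (the name perceptron_train and the epoch count are illustrative assumptions, not from the original):

import numpy as np

def perceptron_train(patterns, targets, epochs=20):
    # patterns: (Q, n) inputs pq; targets: (Q,) values tq in {0, 1}
    w, b = np.zeros(patterns.shape[1]), 0.0
    for _ in range(epochs):
        for p, t in zip(patterns, targets):
            a = 1.0 if np.dot(w, p) + b >= 0 else 0.0  # hard-limit output
            e = t - a      # compare the network output to the target
            w += e * p     # adjust weights to move the output closer to the target
            b += e         # adjust the bias the same way
    return w, b

# Example: the rule converges on the linearly separable AND function
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
w, b = perceptron_train(X, np.array([0.0, 0.0, 0.0, 1.0]))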
ADALINE defines an error function that measures performance in terms of the weights, inputs, outputs, and desired outputs. It takes the derivative of this function with respect to the weights and modifies them so that the error decreases, i.e. it performs gradient descent on the squared error; with a sufficiently small learning rate this update converges to the minimum mean-squared-error weights.
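A minimal sketch of this gradient step (the Widrow-Hoff/LMS form), assuming the error function e^2 with e = t - (w.p + b) and a hypothetical learning rate alpha; the name adaline_train is illustrative, not from the original:

import numpy as np

def adaline_train(patterns, targets, alpha=0.05, epochs=50):
    # Minimise the squared error e^2, e = t - (w.p + b), by gradient descent
    w, b = np.zeros(patterns.shape[1]), 0.0
    for _ in range(epochs):
        for p, t in zip(patterns, targets):
            a = np.dot(w, p) + b    # linear output (no hard limit)
            e = t - a               # error against the desired output
            w += 2 * alpha * e * p  # -d(e^2)/dw = 2*e*p, so this step decreases e^2
            b += 2 * alpha * e      # same gradient step for the bias
    return w, b

# Example: fits targets generated by t = 2*p + 1
X = np.array([[0.0], [1.0], [2.0]])
w, b = adaline_train(X, np.array([1.0, 3.0, 5.0]))

Unlike the perceptron, the update here uses the linear output rather than the hard-limited one, which is what makes the error function differentiable and the gradient step well defined.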