Solution 1: The number of neurons in the output layer depends on the number of classes the inputs are classified into. In this case the number of classes is 10, since each digit 0-9 gets its own class, so the output layer needs 10 neurons.
Solution 2: The number of input neurons depends on the number of input features. Here the input is a 16 x 16 image, i.e. a matrix containing exactly 16 x 16 = 256 pixel values, one per pixel. Therefore the network needs 256 input neurons. A small sketch of this flattening step is shown below.
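As a minimal illustration (assuming the image is available as a NumPy array with pixel intensities in [0, 1]; the array here is random dummy data, not part of the original problem):

```python
import numpy as np

# Hypothetical 16 x 16 grayscale digit image (random dummy pixel intensities).
image = np.random.rand(16, 16)

# Flattening the matrix yields one value per pixel, i.e. one input neuron each.
input_vector = image.flatten()
print(input_vector.shape)  # (256,) -> 256 input neurons
```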
Solution 3: Total number of neurons = 256 (input) + 256 (hidden) + 10 (output) = 522. There is typically one bias term per hidden and output neuron, i.e. 256 + 10 = 266 biases; their values are learned during training and depend on your initialization and on how strongly you regularize the network.
Solution 4: In a fully connected network the links between consecutive layers are summed, not multiplied, so the total number of links (weights) is 256 x 256 + 256 x 10 = 65,536 + 2,560 = 68,096. A small counting sketch follows.
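A minimal sketch of the counting logic, assuming the 256 -> 256 -> 10 fully connected layout described above (the list name `layer_sizes` is just an illustrative choice):

```python
# Layer sizes for the fully connected 256 -> 256 -> 10 network described above.
layer_sizes = [256, 256, 10]

# Weights (links) connect every neuron in one layer to every neuron in the next,
# so the counts between consecutive layers are summed, not multiplied.
weights = sum(a * b for a, b in zip(layer_sizes, layer_sizes[1:]))

# One bias per hidden and output neuron.
biases = sum(layer_sizes[1:])

print("neurons:", sum(layer_sizes))  # 522
print("weights:", weights)           # 256*256 + 256*10 = 68,096
print("biases:", biases)             # 266
```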
Solution 5: For the hidden layer, use the ReLU activation function, since it helps avoid the vanishing gradient problem and is cheap to compute. Use softmax as the activation in the output layer, so the 10 outputs form a probability distribution over the digit classes and the most likely digit can be read off directly.
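A minimal NumPy sketch of a single forward pass through this 256 -> 256 -> 10 network, with ReLU in the hidden layer and softmax at the output (the random weight initialization and the 0.01 scaling factor are illustrative assumptions, not specified in the problem):

```python
import numpy as np

def relu(x):
    # ReLU keeps positive activations and zeroes out the rest.
    return np.maximum(0, x)

def softmax(x):
    # Subtract the max for numerical stability before exponentiating.
    e = np.exp(x - np.max(x))
    return e / e.sum()

rng = np.random.default_rng(0)

# Randomly initialized weights and biases for the 256 -> 256 -> 10 layout.
W1, b1 = rng.standard_normal((256, 256)) * 0.01, np.zeros(256)
W2, b2 = rng.standard_normal((256, 10)) * 0.01, np.zeros(10)

x = rng.random(256)            # a flattened 16 x 16 image
h = relu(x @ W1 + b1)          # hidden layer with ReLU activation
p = softmax(h @ W2 + b2)       # output layer: probabilities over the 10 digits

print(p.shape, p.sum())        # (10,) and probabilities summing to ~1.0
```

In practice the weights would be learned by training (e.g. with backpropagation); the sketch only shows how the two activation functions fit into the layer structure.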