In: Computer Science
Machine learning / neural networks question:
Which one of the following statements about neural networks is true? (Select the
single best answer, and explain why each option is true or false.)
(A) We always train neural networks by optimising a convex cost
function.
(B) Neural networks are more robust to outliers than support vector
machines.
(C) Neural networks always output values between 0 and 1.
(D) A neural network with a large number of parameters can often make better
use of big training data than a support vector machine.
(A) FALSE
The cost function J(W, b) of a neural network is non-convex. (In more detail: if you permute the neurons in a hidden layer and apply the same permutation to the weights of the adjacent layers, the function the network computes, and therefore the loss, is unchanged. So any minimum with non-identical weights cannot be unique, because permuting the weights gives another minimum with the same value. If J were convex, the midpoint of two such equally good points would have to satisfy J((w1 + w2)/2) <= (J(w1) + J(w2))/2, but mixing the two permuted weight settings generally gives a strictly larger loss, so J cannot be convex.)
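To make the permutation argument concrete, here is a minimal numpy sketch (the random toy data, the tiny 3-4-1 tanh network and the squared-error loss are all assumed purely for illustration): permuting the hidden units leaves the loss unchanged, while averaging the two equivalent weight settings usually gives a larger loss, which a convex function would not allow.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup: 50 random samples, a tiny 3-4-1 tanh network, squared-error loss J.
X = rng.normal(size=(50, 3))
y = rng.normal(size=(50, 1))
W1 = rng.normal(size=(3, 4))   # input -> 4 hidden units
W2 = rng.normal(size=(4, 1))   # hidden -> output

def J(W1, W2):
    h = np.tanh(X @ W1)
    return np.mean((h @ W2 - y) ** 2)

# Permute the hidden units and apply the same permutation to both weight
# matrices: the network computes the same function, so the loss is unchanged.
perm = np.array([2, 0, 3, 1])
a = J(W1, W2)
b = J(W1[:, perm], W2[perm, :])
print(a, b)                    # identical values (permutation symmetry)

# If J were convex, the midpoint of these two equally good weight settings would
# have to satisfy J(midpoint) <= (a + b) / 2.  Mixing the permuted weights
# usually yields a clearly larger loss instead, violating that inequality.
mid = J((W1 + W1[:, perm]) / 2, (W2 + W2[perm, :]) / 2)
print(mid)
```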
(B) FALSE
Neural networks are generally not more robust to outliers than SVMs. An SVM's max-margin objective with hinge loss means the decision boundary is determined by the support vectors: correctly classified points far from the margin have no influence at all, and a misclassified outlier is penalised only linearly. A neural network trained with, say, squared error can be pulled strongly by a single extreme point, and it also typically needs a much larger dataset and more tuning to work well, so the SVM tends to be the more outlier-robust of the two.
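One way to see the loss-function side of this is sketched below with plain numpy: for a single badly misclassified point with true label +1, the SVM hinge loss grows only linearly in how wrong the score is, while a squared-error loss (one possible choice for a network) grows quadratically, so an extreme outlier can dominate the latter. This is an illustrative comparison only; a network trained with cross-entropy behaves more gently than squared error.

```python
import numpy as np

# A single training point with true label +1 whose predicted score f(x) is
# increasingly wrong (more and more negative).
scores = np.array([-1.0, -5.0, -20.0])

hinge = np.maximum(0.0, 1.0 - 1.0 * scores)   # SVM hinge loss: grows linearly
squared = (1.0 - scores) ** 2                  # squared error: grows quadratically

print(hinge)     # [ 2.  6. 21.]
print(squared)   # [  4.  36. 441.]
```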
(C) FALSE
The output range depends on the activation used in the last layer, not on neural networks in general. Classification networks usually end in a softmax layer (or a sigmoid layer for binary classification), and those do give outputs between 0 and 1 (softmax has the added property that its outputs sum to 1). But a network with a linear output layer, as is typical for regression, or with a ReLU output can produce values well outside [0, 1], so the word "always" makes the statement false.
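A small numpy sketch of how the output range depends purely on the final activation (the pre-activation values below are made up):

```python
import numpy as np

z = np.array([-3.0, 0.5, 4.0])   # made-up pre-activations of an output layer

sigmoid = 1.0 / (1.0 + np.exp(-z))      # each value in (0, 1): binary classification
softmax = np.exp(z) / np.exp(z).sum()   # in (0, 1) and sums to 1: multi-class
linear  = z                             # unbounded: typical regression output
relu    = np.maximum(z, 0.0)            # in [0, inf), not confined to [0, 1]

print(sigmoid, softmax, linear, relu)
```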
(D) TRUE
A neural network with many parameters has high capacity, and its performance typically keeps improving as the training set grows; it can also be trained on very large datasets with minibatch SGD. A kernel SVM, by contrast, scales poorly with the number of training examples (the kernel matrix grows quadratically in n), so a large network can usually exploit big training data better. This makes (D) the single best answer.
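A back-of-the-envelope sketch of the scaling point (all numbers below are illustrative assumptions): naively, a kernel SVM works with the n-by-n Gram matrix, whereas minibatch training of a network touches only one batch of examples per step.

```python
# Illustrative numbers only (float64 storage assumed).
n = 1_000_000                     # training examples
gram_bytes = n * n * 8            # naively storing the n x n kernel (Gram) matrix
batch_bytes = 256 * 1000 * 8      # one minibatch: 256 examples x 1000 features

print(gram_bytes / 1e12, "TB for the Gram matrix")   # ~8 TB
print(batch_bytes / 1e6, "MB per SGD step")          # ~2 MB
```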