Which of the following statements are true? Briefly explain your answer.
1. Training a k-nearest-neighbors classifier takes less computational time than testing it.
2. The more training examples, the more accurate the prediction of a k-nearest-neighbors.
3. k-nearest-neighbors cannot be used for regression.
4. A k-nearest-neighbors is sensitive to the number of features.
1. Training a k-nearest-neighbors classifier takes less computational time than testing it.
True:
A k-nearest-neighbors classifier does essentially no work at training time: it simply stores the training examples. All of the computation happens at test time, when each query point must be compared (by distance) to every stored example, so prediction cost grows with both the number of training examples and the number of features.
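To make the asymmetry concrete, here is a minimal brute-force sketch (the class name BruteForceKNN and its details are illustrative, not part of the question): fit only memorizes the data, while predict computes a distance to every stored example.

import numpy as np

class BruteForceKNN:
    """Illustrative brute-force k-nearest-neighbors classifier."""

    def __init__(self, k=3):
        self.k = k

    def fit(self, X, y):
        # "Training" just memorizes the data -- no model is built.
        self.X_train = np.asarray(X, dtype=float)
        self.y_train = np.asarray(y)
        return self

    def predict(self, X):
        preds = []
        for x in np.asarray(X, dtype=float):
            # Testing does the real work: one distance per stored example,
            # i.e. O(N * d) per query for N examples with d features.
            dists = np.linalg.norm(self.X_train - x, axis=1)
            nearest = np.argsort(dists)[:self.k]
            labels, counts = np.unique(self.y_train[nearest], return_counts=True)
            preds.append(labels[np.argmax(counts)])  # majority vote
        return np.array(preds)

clf = BruteForceKNN(k=3).fit([[0, 0], [1, 1], [5, 5], [6, 6]], [0, 0, 1, 1])
print(clf.predict([[0.5, 0.5], [5.5, 5.5]]))  # -> [0 1]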
2. The more training examples, the more accurate the prediction of a k-nearest-neighbors.
False
More training data usually helps: with more examples, the nearest neighbors of a query tend to be closer, and more similar inputs usually mean more similar outputs. But higher accuracy is not guaranteed. Noisy or mislabeled examples can mislead the vote, for a fixed k the gains eventually level off, and every added example also makes prediction slower.
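One rough way to probe this claim empirically (a sketch, not a definitive experiment) is to compute a learning curve with scikit-learn; the synthetic dataset, the label-noise level flip_y, and k=5 below are assumptions chosen for illustration.

import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import learning_curve
from sklearn.neighbors import KNeighborsClassifier

# Synthetic data with deliberate label noise (flip_y) to show that more
# examples do not automatically translate into higher accuracy.
X, y = make_classification(n_samples=2000, n_features=10, flip_y=0.2,
                           random_state=0)

sizes, train_scores, val_scores = learning_curve(
    KNeighborsClassifier(n_neighbors=5), X, y,
    train_sizes=np.linspace(0.1, 1.0, 5), cv=5)

# Validation accuracy often rises with more examples, but with noisy labels
# and a fixed k it can plateau, which is why the blanket claim is false.
for n, score in zip(sizes, val_scores.mean(axis=1)):
    print(f"{n:5d} training examples -> mean CV accuracy {score:.3f}")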
3. k-nearest-neighbors cannot be used for regression.
False (k-nearest-neighbors can also be used for regression, e.g., by predicting the average of the target values of the k nearest neighbors).
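As a quick illustration, here is a minimal scikit-learn sketch of kNN regression; the sine-wave data and k=5 are made up for this example.

import numpy as np
from sklearn.neighbors import KNeighborsRegressor

rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(200, 1))
y = np.sin(X).ravel() + rng.normal(scale=0.1, size=200)

# The prediction is the mean of the k nearest neighbors' target values.
reg = KNeighborsRegressor(n_neighbors=5).fit(X, y)
print(reg.predict([[2.5], [7.0]]))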
4. A k-nearest-neighbors is sensitive to the number of features.
True
Given data with N unique features, the feature vector has length N, where entry i is that data point's value for feature i, so each example can be thought of as a point in R^N. As N grows, distances in R^N become less informative (the curse of dimensionality), and irrelevant features contribute to the distance just as much as relevant ones, so the set of nearest neighbors, and hence the prediction, can change substantially with the number of features.
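A rough sketch of this sensitivity, under the assumption of a synthetic dataset where only 5 of the features are informative (these choices are illustrative, not from the question):

import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

for n_noise in (0, 20, 100):
    # Same 5 informative features, padded with uninformative noise features.
    X, y = make_classification(n_samples=500, n_features=5 + n_noise,
                               n_informative=5, n_redundant=0,
                               random_state=0)
    acc = cross_val_score(KNeighborsClassifier(n_neighbors=5), X, y, cv=5).mean()
    # With many uninformative features, Euclidean distance is dominated by
    # the noise dimensions and accuracy typically drops.
    print(f"{5 + n_noise:3d} total features -> mean CV accuracy {acc:.3f}")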