Machine learning AdaBoost question:
Assume that the weak learners are a finite set of decision stumps. We then train an AdaBoost classifier. Can the boosting algorithm select the same weak classifier more than once? Explain.
A weak classifier is one that performs only slightly better than random guessing. If the weak learners are a finite set of decision stumps, then yes, the boosting algorithm can select the same weak classifier more than once. During the boosting process, a weak classifier is first trained on the training data in the usual way:
Because a weak classifier has limited power, it cannot classify all training samples correctly. The central idea behind boosting is to assign larger weights to the (difficult) training samples that the first classifier misclassified, and then train the next classifier on the reweighted data. Because of this weighting scheme, the next classifier trained in this way will correctly classify some of the difficult training samples that the first one got wrong.
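For reference, the standard AdaBoost update (not spelled out in the answer above, but it is the scheme being described): at round t the selected stump h_t with weighted error eps_t receives the vote alpha_t = (1/2) * ln((1 - eps_t) / eps_t), and each sample weight is updated as w_i <- w_i * exp(-alpha_t * y_i * h_t(x_i)) and then renormalized, so misclassified samples (those with y_i * h_t(x_i) = -1) gain weight.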
Difficult training samples keep large weights, so repeating this weighted learning procedure produces classifiers that concentrate on the hardest samples. Since the pool of decision stumps is finite while boosting may run for many more rounds than there are stumps, the weight distribution can shift back to a state where a previously chosen stump again has the lowest weighted error. (With exact reweighting, the stump chosen at round t has weighted error exactly 1/2 on the updated weights, so it will not be reselected in the very next round, but it can be selected again in a later round.) Therefore AdaBoost can choose the same weak classifier more than once; its votes simply accumulate in the final weighted combination.
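To make this concrete, below is a minimal, self-contained sketch of AdaBoost run over an enumerated pool of decision stumps. The toy 1-D data, the helper names (make_stumps, stump_predict, adaboost), and the round count are illustrative assumptions, not part of the question; it is a sketch of the idea, not a definitive implementation.

import numpy as np

def make_stumps(X):
    # Enumerate a finite stump pool: threshold each feature at midpoints
    # between consecutive observed values, with both output signs.
    stumps = []
    for j in range(X.shape[1]):
        values = np.unique(X[:, j])
        thresholds = (values[:-1] + values[1:]) / 2.0
        for t in thresholds:
            for sign in (+1, -1):
                stumps.append((j, t, sign))   # predict sign where X[:, j] > t
    return stumps

def stump_predict(stump, X):
    j, t, sign = stump
    return np.where(X[:, j] > t, sign, -sign)

def adaboost(X, y, stumps, n_rounds=10):
    n = len(y)
    w = np.full(n, 1.0 / n)                  # start with uniform weights
    chosen, alphas = [], []
    for _ in range(n_rounds):
        # pick the stump with the smallest weighted error under current weights
        errors = [np.sum(w * (stump_predict(s, X) != y)) for s in stumps]
        best = int(np.argmin(errors))
        err = max(errors[best], 1e-10)
        if err >= 0.5:                       # nothing better than chance: stop
            break
        alpha = 0.5 * np.log((1 - err) / err)
        pred = stump_predict(stumps[best], X)
        w = w * np.exp(-alpha * y * pred)    # up-weight misclassified samples
        w /= w.sum()
        chosen.append(best)
        alphas.append(alpha)
    return chosen, alphas

# Tiny 1-D example (assumed data) where no single stump is perfect,
# so boosting has to combine several stumps over multiple rounds.
X = np.array([[1.], [2.], [3.], [4.], [5.], [6.]])
y = np.array([+1, +1, -1, -1, +1, +1])
stumps = make_stumps(X)
chosen, alphas = adaboost(X, y, stumps, n_rounds=10)
print("Stump indices chosen per round:", chosen)
# a repeated index here means the same stump was selected in more than one round

# The final classifier is the weighted vote of all selected stumps,
# so a re-selected stump simply contributes its alpha again.
F = sum(a * stump_predict(stumps[i], X) for a, i in zip(alphas, chosen))
print("Boosted predictions:", np.sign(F))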