Assume that the weak learners are a finite set of decision stumps. We then train an AdaBoost classifier.
Can the boosting algorithm select the same weak classifier more than once? Explain.
First, let us understand what weak learners are. A weak classifier is simply a classifier that performs poorly on its own, but still better than random guessing. If we use decision stumps as weak learners, then the algorithm can indeed select the same weak classifier more than once.
During the process of boosting, a weak classifier is first trained on the training samples in the usual way.
Because the weak classifier has limited power, it cannot classify all the training samples correctly. The fundamental idea behind boosting is to assign large weights to the (difficult) training samples that the first classifier misclassified, and to train the second classifier using those weights. Thanks to this weighting scheme, the second classifier will correctly classify some of the difficult training samples.
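For reference, one common textbook form of this reweighting (the symbols $\varepsilon_t$, $\alpha_t$, and $h_t$ are standard notation, not given in the question itself) is:

$$w_i \leftarrow w_i \exp\!\big(-\alpha_t\, y_i\, h_t(x_i)\big), \qquad \alpha_t = \tfrac{1}{2}\ln\frac{1-\varepsilon_t}{\varepsilon_t},$$

where $\varepsilon_t$ is the weighted error of the round-$t$ classifier $h_t$. A misclassified sample has $y_i h_t(x_i) = -1$, so its weight is multiplied by $e^{\alpha_t} > 1$, making it count more in the next round.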
Since difficult training samples carry large weights, repeating this weighted learning procedure eventually produces a combined classifier that can correctly handle even the most difficult samples. Now note that the pool of decision stumps here is finite, while the number of boosting rounds can be arbitrarily large: after the weights are updated, the reweighted data may again be classified best by a stump that was already chosen in an earlier round, and AdaBoost simply selects it again (with a new coefficient $\alpha_t$). So yes, AdaBoost can select the same weak classifier more than once.
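To make this concrete, here is a minimal sketch of AdaBoost over a deliberately small, finite pool of stumps. This is an illustrative toy implementation, not a library one; the function names (stump_predict, adaboost), the stump pool, and the data are all made up for the demonstration. Running it prints the index of the stump chosen each round, and repeated indices show the same stump being selected again.

```python
# Minimal AdaBoost sketch over a finite pool of decision stumps (toy example).
import numpy as np

def stump_predict(x, feature, threshold, sign):
    """Decision stump: predicts +1/-1 by thresholding a single feature."""
    return sign * np.where(x[:, feature] <= threshold, 1, -1)

def adaboost(X, y, stumps, n_rounds):
    """Run AdaBoost; return the chosen stump index and alpha per round."""
    n = len(y)
    w = np.full(n, 1.0 / n)              # uniform initial sample weights
    chosen, alphas = [], []
    for _ in range(n_rounds):
        # Pick the stump with the smallest *weighted* error this round.
        errors = [np.sum(w * (stump_predict(X, *s) != y)) for s in stumps]
        t = int(np.argmin(errors))
        eps = max(errors[t], 1e-12)      # guard against division by zero
        alpha = 0.5 * np.log((1 - eps) / eps)
        # Upweight misclassified samples, downweight correct ones, renormalize.
        pred = stump_predict(X, *stumps[t])
        w *= np.exp(-alpha * y * pred)
        w /= w.sum()
        chosen.append(t)
        alphas.append(alpha)
    return chosen, alphas

# Toy 1-D data and a small finite stump pool: (feature, threshold, sign).
X = np.array([[1.0], [2.0], [3.0], [4.0], [5.0], [6.0]])
y = np.array([1, 1, -1, 1, -1, -1])
stumps = [(0, 2.5, 1), (0, 3.5, 1), (0, 4.5, 1)]
chosen, _ = adaboost(X, y, stumps, n_rounds=8)
print(chosen)  # repeated indices = the same stump selected more than once
```

On this data the printed sequence contains repeats (e.g. stump 0 is re-selected after the reweighting makes it the best choice again), which is exactly the behavior the question asks about.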