AdaBoost ins and outs?

I am not a technical person, and I am trying to implement image classification. In an article, I came across the AdaBoost algorithm, which was applied after the bag-of-features step on video keyframes. Can someone explain in layman's terms what AdaBoost does, and what its input and output are? And can someone point me to code for it?

+3


source


2 answers


First, it would be nice if you could link to or name the article you are referring to.

AdaBoost is a meta-classification algorithm, as it combines several classifiers called weak learners. These weak learners are often very simple, e.g. they classify the data based on a single feature only, and they perform just slightly better than random guessing.

In image classification, AdaBoost takes as input a set of images (with corresponding labels indicating which class each sample belongs to) and a set of weak learners. AdaBoost then finds the weak learner with the lowest error rate (i.e., the best performance) on the data. All correctly classified data samples are now given a lower weight, as they have become less important, while the misclassified samples are given a higher weight. AdaBoost then starts a new round and selects the best weak learner based on the newly weighted data. In other words, it finds a new weak learner that is better at classifying the samples that the previously selected weak learners could not handle.
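As a rough illustration, here is a minimal Python sketch of one such boosting round. The choice of scikit-learn decision stumps as the weak learners and of labels in {-1, +1} are my assumptions, not something the article prescribes:

    import numpy as np
    from sklearn.tree import DecisionTreeClassifier

    def boosting_round(X, y, sample_weight):
        """One AdaBoost round: fit the best stump on the weighted data,
        then shift weight towards the misclassified samples."""
        # A decision stump: a weak learner that splits on a single feature.
        stump = DecisionTreeClassifier(max_depth=1)
        stump.fit(X, y, sample_weight=sample_weight)

        pred = stump.predict(X)
        miss = pred != y                      # which samples were misclassified
        err = np.sum(sample_weight[miss]) / np.sum(sample_weight)
        err = np.clip(err, 1e-10, 1 - 1e-10)  # avoid log(0) in the edge cases

        # Importance of this weak learner: lower error -> higher weight.
        alpha = 0.5 * np.log((1.0 - err) / err)

        # Re-weight the samples: correct ones shrink, misclassified ones grow.
        sample_weight = sample_weight * np.exp(-alpha * y * pred)
        sample_weight = sample_weight / sample_weight.sum()

        return stump, alpha, sample_weight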



The algorithm keeps selecting weak learners like this for a given number of iterations. The output consists of the group of selected weak learners. The learned classifier can now classify new images based on a majority vote of all the weak classifiers in the group (often the weak classifiers themselves are also weighted, based on the error rate they achieved).
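Continuing the sketch above (it reuses boosting_round and the numpy import; the function names and the number of rounds are illustrative), the selection loop and the weighted majority vote could look like this:

    def adaboost_train(X, y, n_rounds=50):
        """Repeatedly select the best weak learner on the re-weighted data."""
        w = np.full(len(y), 1.0 / len(y))  # start with uniform sample weights
        ensemble = []
        for _ in range(n_rounds):
            stump, alpha, w = boosting_round(X, y, w)
            ensemble.append((alpha, stump))
        return ensemble

    def adaboost_predict(ensemble, X):
        """Classify by the weighted majority vote of the selected weak learners."""
        votes = sum(alpha * stump.predict(X) for alpha, stump in ensemble)
        return np.sign(votes)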

You might want to take a look at software that has already implemented AdaBoost, like WEKA or the computer vision library OpenCV.
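scikit-learn also ships a ready-made implementation; here is a minimal usage example (the synthetic dataset and the parameters are just for illustration):

    from sklearn.datasets import make_classification
    from sklearn.ensemble import AdaBoostClassifier

    # Synthetic two-class data standing in for extracted image features.
    X, y = make_classification(n_samples=200, n_features=20, random_state=0)

    clf = AdaBoostClassifier(n_estimators=50, random_state=0).fit(X, y)
    print(clf.score(X, y))  # accuracy of the boosted ensemble on the training data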

+4


source


AdaBoost takes a bunch of weak classifiers and combines them to form a strong classifier. The output is a sequence of weights w_i for the weak classifiers, which are combined in a weighted sum to form a single strong classifier. There are many intermediate results in the algorithm, but the weights themselves are perhaps the most important.
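In symbols (my notation, with h_i denoting the weak classifiers), the final strong classifier is the sign of that weighted sum: F(x) = sign( sum_i w_i * h_i(x) ).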



Although it was not originally derived this way, AdaBoost is equivalent to fitting a model to the training set in a forward stagewise fashion, adding one weak classifier at each step and using the exponential loss function L(y, f(x)) = exp(-y * f(x)), where f(.) is our classifier. Seen this way, some aspects of the algorithm become clearer. The exponential loss function is often used for classification problems, and for good reason.
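A small numeric illustration (the labels and scores below are made up): exp(-y * f(x)) is small when f(x) agrees in sign with y and grows quickly when it does not, so confident mistakes are penalized hardest:

    import numpy as np

    y = np.array([+1, +1, -1, -1])        # true labels
    f = np.array([2.0, -0.5, -1.5, 0.1])  # classifier scores f(x)

    loss = np.exp(-y * f)                 # exponential loss per sample
    print(loss.round(3))  # [0.135 1.649 0.223 1.105] -- wrong-signed scores cost more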

+1


source






