• Short for Adaptive Boosting, AdaBoost is another ensemble algorithm.

Differences from random forests

  • In random forests, we build complete trees each time, but in AdaBoost each tree consists of only a single node and two leaves (called a stump).
  • Another difference from RF is that in AdaBoost, stumps do not have an equal vote on the final prediction: each stump's vote is weighted by how well it classifies the training data (its "amount of say").
  • In RF, trees are built independently of each other. In AdaBoost, the order matters: the errors of each stump affect how the succeeding stump is built, because misclassified samples are given larger weights (see the sketch after this list).
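
As a rough sketch of how these pieces fit together, here is a minimal AdaBoost implementation with decision stumps in Python/NumPy. It assumes labels in {-1, +1}; the helper names (fit_stump, adaboost_fit, n_rounds) are illustrative, not from any library.

```python
import numpy as np

def fit_stump(X, y, w):
    """Pick the (feature, threshold, polarity) with the lowest weighted error."""
    best, best_err = None, np.inf
    for j in range(X.shape[1]):
        for thr in np.unique(X[:, j]):
            for polarity in (1, -1):
                pred = np.where(polarity * (X[:, j] - thr) > 0, 1, -1)
                err = np.sum(w * (pred != y))
                if err < best_err:
                    best_err, best = err, (j, thr, polarity)
    return best, best_err

def stump_predict(stump, X):
    j, thr, polarity = stump
    return np.where(polarity * (X[:, j] - thr) > 0, 1, -1)

def adaboost_fit(X, y, n_rounds=50):
    n = len(y)
    w = np.full(n, 1.0 / n)                       # start with equal sample weights
    stumps, alphas = [], []
    for _ in range(n_rounds):
        stump, err = fit_stump(X, y, w)           # each stump depends on current weights
        err = max(err, 1e-10)                     # guard against division by zero
        alpha = 0.5 * np.log((1 - err) / err)     # stump's "amount of say"
        pred = stump_predict(stump, X)
        w *= np.exp(-alpha * y * pred)            # up-weight misclassified samples
        w /= w.sum()
        stumps.append(stump)
        alphas.append(alpha)
    return stumps, alphas

def adaboost_predict(stumps, alphas, X):
    # Weighted vote: stumps with larger alpha contribute more to the final prediction.
    scores = sum(a * stump_predict(s, X) for s, a in zip(stumps, alphas))
    return np.sign(scores)
```

The weight update is where the sequential nature shows up: samples a stump gets wrong are up-weighted, so the next stump is fit with those samples counting for more, unlike RF, where every tree sees the data with equal (bootstrap-sampled) importance.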