What is the difference between Adaboost vs Gradient boost?

AdaBoost (Adaptive Boosting) is a simple boosting technique that predates modern algorithms like GBM and its offshoots. Where GBM typically builds shallow decision trees as its weak learners, AdaBoost goes a step further and uses stumps, decision trees with just a single split. Both algorithms grow their trees sequentially, but they learn from prior mistakes in different ways: GBM fits each new tree to the residuals (the negative gradient of the loss) of the current ensemble, while AdaBoost re-weights the training samples so that examples misclassified by earlier stumps get more attention in the next round. They also combine their trees differently. GBM scales every tree's contribution by the same learning rate, while AdaBoost weights each stump's vote by its accuracy, so stumps that classify the data more accurately carry more influence in the final prediction.
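
The contrast is easy to see with scikit-learn's implementations of the two algorithms. The sketch below is a minimal illustration, assuming a recent scikit-learn release and a synthetic dataset; the parameter values shown are just the library defaults, not tuned settings.

```python
# Minimal sketch: AdaBoost (accuracy-weighted stumps) vs. gradient boosting
# (residual-fitting trees with a shared learning rate).
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier, GradientBoostingClassifier
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# AdaBoost: sequential depth-1 stumps by default; each stump's vote is
# weighted by its accuracy, and misclassified samples are up-weighted
# before the next stump is fit.
ada = AdaBoostClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)

# Gradient boosting: sequential shallow trees (depth 3 by default) fit to
# the pseudo-residuals of the loss; every tree's output is shrunk by the
# same learning_rate rather than an accuracy-based weight.
gbm = GradientBoostingClassifier(n_estimators=100, learning_rate=0.1,
                                 random_state=0).fit(X_train, y_train)

print("AdaBoost test accuracy:", ada.score(X_test, y_test))
print("GBM test accuracy:     ", gbm.score(X_test, y_test))
```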