In some classification contexts, it may be of more interest to obtain predicted probabilities of class membership rather than simply the labels themselves.
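As a minimal sketch (assuming scikit-learn; the dataset and model below are purely illustrative), most classifiers expose probabilities through predict_proba alongside the hard labels from predict:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier

# Toy data for illustration; any fitted sklearn classifier works the same way.
X, y = make_classification(n_samples=200, random_state=0)
clf = GradientBoostingClassifier(random_state=0).fit(X, y)

print(clf.predict(X[:3]))        # hard class labels
print(clf.predict_proba(X[:3]))  # one probability column per class; rows sum to 1
```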
The “gradient” in Gradient Boosting Machine refers to the concept of gradient descent: each new tree is fit to the negative gradient of the loss function with respect to the current ensemble's predictions, so adding the tree amounts to taking a gradient-descent step in function space.
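To make the connection concrete, here is a minimal sketch of the boosting loop, assuming squared-error loss (where the negative gradient is simply the residual) and scikit-learn's plain regression trees; the step count, learning rate, and depth are illustrative:

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.tree import DecisionTreeRegressor

X, y = make_regression(n_samples=200, noise=10.0, random_state=0)

learning_rate = 0.1
pred = np.zeros_like(y, dtype=float)  # start from a constant (zero) model
for _ in range(100):
    # Negative gradient of squared-error loss w.r.t. predictions = residuals.
    residuals = y - pred
    tree = DecisionTreeRegressor(max_depth=3, random_state=0).fit(X, residuals)
    # Take a small gradient-descent step in function space.
    pred += learning_rate * tree.predict(X)

print("training MSE:", np.mean((y - pred) ** 2))
```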
XGBoost is a modern, optimized implementation of Gradient Boosting Machine; it works largely the same as the standard GBM, adding regularization and engineering optimizations such as parallelized tree construction.
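A minimal usage sketch, assuming the xgboost package is installed (its scikit-learn-style wrapper exposes the same core knobs as a standard GBM; the parameter values here are illustrative):

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from xgboost import XGBClassifier

X, y = make_classification(n_samples=500, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Same core hyper-parameters as a standard GBM: trees, learning rate, depth.
model = XGBClassifier(n_estimators=200, learning_rate=0.1, max_depth=3)
model.fit(X_tr, y_tr)
print("test accuracy:", model.score(X_te, y_te))
```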
AdaBoost (Adaptive Boosting) is a simple boosting technique that is a predecessor of modern algorithms such as GBM and its offshoots; it re-weights the training examples so that each successive weak learner focuses on the examples the previous ones misclassified.
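A minimal sketch with scikit-learn, using depth-1 trees ("stumps") as the weak learners; the sizes are illustrative:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=300, random_state=0)
stump = DecisionTreeClassifier(max_depth=1)
# Note: the keyword is `estimator` in scikit-learn >= 1.2
# (older versions call it `base_estimator`).
ada = AdaBoostClassifier(estimator=stump, n_estimators=100, random_state=0)
ada.fit(X, y)
print("training accuracy:", ada.score(X, y))
```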
For any decision-tree based method, feature importance can be measured in a couple of ways; the two most common are impurity-based importance (the mean decrease in impurity across all splits on a feature) and permutation importance (the drop in score when a feature's values are randomly shuffled).
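Both measures are available in scikit-learn; a minimal sketch with an illustrative random forest:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance

X, y = make_classification(n_samples=300, n_features=5, random_state=0)
rf = RandomForestClassifier(random_state=0).fit(X, y)

# 1) Impurity-based importance: average impurity reduction per feature.
print(rf.feature_importances_)

# 2) Permutation importance: accuracy drop when a feature's values are shuffled.
result = permutation_importance(rf, X, y, n_repeats=10, random_state=0)
print(result.importances_mean)
```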
In Random Forest, decision trees are constructed independently (each on a bootstrap sample of the data), and after all trees are created their results are aggregated: by averaging for regression or by majority vote for classification.
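The independence of the trees can be seen directly in scikit-learn, where each fitted tree can be queried on its own; a minimal sketch with illustrative data:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

X, y = make_classification(n_samples=300, random_state=0)
rf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)

# Each tree votes independently; the forest aggregates them (scikit-learn
# averages the trees' class probabilities, which for hard labels behaves
# like a majority vote).
tree_preds = np.array([tree.predict(X[:1])[0] for tree in rf.estimators_])
print("share of trees voting for class 1:", tree_preds.mean())
print("forest prediction:", rf.predict(X[:1]))
```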
Advantages: High accuracy
Disadvantages: Computationally demanding and requires time spent on hyper-parameter tuning
Key hyper-parameters for a GBM are the number of trees, the learning rate, and the maximum tree depth
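A minimal tuning sketch for these three hyper-parameters, assuming scikit-learn's GradientBoostingClassifier; the grid values are illustrative, not recommendations:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import GridSearchCV

X, y = make_classification(n_samples=300, random_state=0)
param_grid = {
    "n_estimators": [100, 200],    # number of trees
    "learning_rate": [0.05, 0.1],  # shrinkage applied to each tree
    "max_depth": [2, 3],           # maximum depth of each tree
}
search = GridSearchCV(GradientBoostingClassifier(random_state=0), param_grid, cv=3)
search.fit(X, y)
print(search.best_params_)
```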
A decision tree serves as the building block of most bagging and boosting algorithms. Trees are typically grown greedily, choosing at each node the split that maximizes information gain (equivalently, that most reduces an impurity measure such as entropy or Gini impurity); regression trees instead minimize squared error.
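In scikit-learn the split criterion is a constructor argument; a minimal sketch comparing the two common classification criteria on illustrative data:

```python
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=300, random_state=0)
# "entropy" grows the tree by maximizing information gain;
# "gini" (the default) minimizes Gini impurity instead.
tree_ig = DecisionTreeClassifier(criterion="entropy", random_state=0).fit(X, y)
tree_gini = DecisionTreeClassifier(criterion="gini", random_state=0).fit(X, y)
print(tree_ig.get_depth(), tree_gini.get_depth())
```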