A decision tree serves as the building block of most bagging and boosting algorithms and is typically grown greedily, splitting each node on the feature that maximizes information gain (or, equivalently, minimizes impurity). Bagging constructs an ensemble of such trees, fitting each one to a bootstrap sample of the original data, and aggregates their outputs into a single prediction for each observation: a majority vote for classification or an average for regression. Random Forest is a specific bagging method that, in addition to bootstrap sampling, considers only a random subset of the features at each split, which decorrelates the trees and tends to improve the aggregated prediction.
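As a rough illustration of the mechanics described above, here is a minimal sketch that builds a bagged ensemble of trees by hand. It assumes scikit-learn is available; the synthetic dataset, the ensemble size of 25, and the `max_features="sqrt"` setting (the Random-Forest-style per-split feature subset) are all illustrative choices, not prescriptions.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=500, n_features=10, random_state=0)

n_trees = 25
trees = []
for _ in range(n_trees):
    # Bootstrap sample: draw n rows with replacement from the original data.
    idx = rng.integers(0, len(X), size=len(X))
    tree = DecisionTreeClassifier(
        max_features="sqrt"  # random feature subset per split: the Random Forest twist
    )
    tree.fit(X[idx], y[idx])
    trees.append(tree)

# Aggregate: majority vote across all trees for each observation
# (for binary 0/1 labels, a mean vote >= 0.5 is a majority).
votes = np.stack([t.predict(X) for t in trees])  # shape (n_trees, n_samples)
ensemble_pred = (votes.mean(axis=0) >= 0.5).astype(int)
```

In practice, `sklearn.ensemble.RandomForestClassifier` packages exactly this loop (bootstrap sampling, per-split feature subsets, and vote aggregation) behind a single estimator; the hand-rolled version is shown only to make the bootstrap-then-aggregate structure explicit.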