AIML.com


What are the best ways to safeguard against overfitting a GBM?

Tuning the number of trees jointly with the learning rate is the primary way to keep a GBM at an appropriate complexity: a smaller learning rate shrinks each tree's contribution, so the ensemble fits the training data more gradually and is less prone to chasing noise. Constraining the individual trees, for example by limiting maximum depth, further guards against overfitting by keeping each base learner weak. Finally, derivatives such as XGBoost add explicit regularization terms (L1/L2 penalties on leaf weights) designed for exactly this purpose.
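These safeguards can be sketched with scikit-learn's `GradientBoostingRegressor`. The specific hyperparameter values below are illustrative, not prescriptive, and the early-stopping and subsampling settings are additional common safeguards assumed here beyond what the answer above names:

```python
from sklearn.datasets import make_regression
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split

# Synthetic regression data stands in for a real dataset.
X, y = make_regression(n_samples=500, n_features=10, noise=10.0, random_state=0)
X_train, X_val, y_train, y_val = train_test_split(X, y, random_state=0)

gbm = GradientBoostingRegressor(
    n_estimators=1000,        # upper bound on trees; early stopping may use fewer
    learning_rate=0.05,       # smaller steps mean each tree contributes less
    max_depth=3,              # shallow trees keep each base learner weak
    subsample=0.8,            # stochastic boosting adds a regularizing effect
    validation_fraction=0.1,  # held-out slice monitored for early stopping
    n_iter_no_change=10,      # stop if the validation score stalls for 10 rounds
    random_state=0,
)
gbm.fit(X_train, y_train)

# n_estimators_ reports how many trees were actually fit before stopping.
print(gbm.n_estimators_, "trees fit;", "validation R^2 =", gbm.score(X_val, y_val))
```

In practice these knobs are searched together (e.g. with cross-validated grid or random search), since halving the learning rate typically requires roughly doubling the tree budget to reach the same training fit.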
