– What is Underfitting?
– What is Overfitting?
– What is the Bias-Variance tradeoff?
Underfitting occurs when a machine learning model is too simple to capture the underlying patterns in the data. Several strategies can help mitigate it:
- Increase the complexity of the model: A model that is too simple may lack the capacity to represent the data. Adding more layers or neurons, or raising the degree of a polynomial, gives the model room to learn more complex patterns.
- Reduce the regularization strength: Regularization is used to prevent overfitting, but if it is too strong it constrains the model into underfitting. Lowering the regularization strength lets the model flex enough to capture the patterns in the data.
- Add more relevant features: If the model lacks access to informative features, it cannot capture the underlying patterns. Engineering or adding relevant features gives it more signal to work with.
- Specify an appropriate functional form: If the relationship between the predictors and the target is complex, reflect that in the model specification, e.g. include higher-order terms in a regression.
- Increase the amount of data: With too few examples, the underlying signal can be drowned out by noise. More training data gives the model a clearer view of the patterns and can improve its performance.
- Change the model architecture: If a particular model architecture is not able to capture the patterns in the data, changing the model architecture can be an effective way to mitigate underfitting.
- Train for longer: If the model is still underfitting after trying the above strategies, additional training iterations may let it converge on the patterns in the data.
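The first two strategies can be illustrated with a minimal NumPy sketch (illustrative only; the data and degrees are invented for this example). A degree-1 polynomial is fit to data generated from a quadratic process, so it underfits; raising the model's capacity to degree 2 lets it capture the true pattern, which shows up as a sharp drop in training error:

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(-3, 3, 200)
y = x**2 + rng.normal(scale=0.3, size=x.shape)  # quadratic ground truth + noise

def fit_mse(degree):
    """Fit a polynomial of the given degree and return its training MSE."""
    coeffs = np.polyfit(x, y, degree)
    pred = np.polyval(coeffs, x)
    return np.mean((y - pred) ** 2)

mse_linear = fit_mse(1)     # too simple: a line cannot bend to follow x**2
mse_quadratic = fit_mse(2)  # matches the data-generating process

print(f"degree 1 MSE: {mse_linear:.3f}")
print(f"degree 2 MSE: {mse_quadratic:.3f}")
```

The linear model's error stays near the variance of the unexplained quadratic term, while the degree-2 model's error shrinks to roughly the noise level, the signature of underfitting being resolved by added capacity.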
Note that mitigating underfitting is a balancing act: increasing model complexity or reducing regularization can tip the model into overfitting instead. Monitor the model's performance on both the training and validation data to ensure it is doing neither.
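That monitoring can be sketched as a simple hold-out comparison (again a toy NumPy example with invented data and degrees, not a prescribed recipe). An underfit model shows high error on both splits; an overfit one shows low training error but worse validation error:

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.uniform(-3, 3, 120)
y = x**2 + rng.normal(scale=0.5, size=x.shape)

# simple hold-out split: first 80 points for training, the rest for validation
x_train, y_train = x[:80], y[:80]
x_val, y_val = x[80:], y[80:]

def mse(coeffs, xs, ys):
    """Mean squared error of a fitted polynomial on the given split."""
    return np.mean((ys - np.polyval(coeffs, xs)) ** 2)

results = {}
for degree in (1, 2, 9):  # underfit, well-specified, over-complex
    coeffs = np.polyfit(x_train, y_train, degree)
    results[degree] = (mse(coeffs, x_train, y_train), mse(coeffs, x_val, y_val))
    train_mse, val_mse = results[degree]
    print(f"degree {degree}: train MSE {train_mse:.3f}, val MSE {val_mse:.3f}")
```

Reading both columns together is the point: degree 1 is high on both (underfitting), while degree 9 drives training error below degree 2's without a matching gain on validation, the gap that signals overfitting.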