### What is Regression?

Regression is the task of predicting a continuous numerical value (e.g., wage, selling price).

Linear models are a class of models in which a response variable is linearly related to one or more predictors.

Linear regression is a statistical technique that relates the mean, or expected value, of a continuous response variable to a weighted combination of one or more independent predictor variables.
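As a minimal sketch of this idea, the snippet below fits a linear regression by ordinary least squares with NumPy on hypothetical data (wage predicted from two made-up predictors); the variable names and true coefficients are illustrative assumptions, not from the text.

```python
import numpy as np

# Hypothetical data: a response generated from a weighted combination
# of two predictors plus noise (true weights: intercept 30, slopes 2.0 and 1.5).
rng = np.random.default_rng(0)
X = rng.uniform(0, 20, size=(100, 2))
y = 30 + 2.0 * X[:, 0] + 1.5 * X[:, 1] + rng.normal(0, 1, size=100)

# Add an intercept column and solve the least-squares problem.
X1 = np.column_stack([np.ones(len(X)), X])
beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
print(beta)  # estimates close to the true weights [30, 2.0, 1.5]
```

The fitted `beta` is the weight vector: the model's predicted mean response is the weighted combination `X1 @ beta`.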

Some of the assumptions of the linear regression model include independence, normality, constant variance (homoscedasticity), and linearity.

Linear regression minimizes the sum of squared differences between the actual values of the response and the values predicted by the model.
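Minimizing the squared differences has a closed-form solution, the normal equations (XᵀX)b = Xᵀy. As a quick sketch on synthetic data (the data itself is an assumption for illustration), the closed-form answer matches NumPy's least-squares solver:

```python
import numpy as np

rng = np.random.default_rng(1)
X = np.column_stack([np.ones(50), rng.normal(size=50)])
y = X @ np.array([1.0, 2.0]) + rng.normal(0, 0.1, size=50)

# Closed-form minimizer of ||y - X b||^2 via the normal equations.
beta_normal = np.linalg.solve(X.T @ X, X.T @ y)

# The same minimizer from a numerical least-squares routine.
beta_lstsq, *_ = np.linalg.lstsq(X, y, rcond=None)

assert np.allclose(beta_normal, beta_lstsq)
```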

Common ways to evaluate a linear regression model include the global F-test, R-squared, MSE, MAE, RMSE, and information criteria (AIC, BIC).
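Several of these error measures can be computed directly from the residuals; the sketch below does so by hand with NumPy on made-up predictions (the numbers are illustrative assumptions):

```python
import numpy as np

y_true = np.array([3.0, 5.0, 7.0, 9.0])   # actual response values
y_pred = np.array([2.8, 5.1, 7.3, 8.9])   # model predictions

resid = y_true - y_pred
mse = np.mean(resid ** 2)                  # mean squared error
rmse = np.sqrt(mse)                        # root mean squared error
mae = np.mean(np.abs(resid))               # mean absolute error
# R-squared: 1 minus residual sum of squares over total sum of squares
r2 = 1 - np.sum(resid ** 2) / np.sum((y_true - y_true.mean()) ** 2)
```

MSE and RMSE penalize large errors more heavily than MAE, while R-squared expresses fit as the proportion of response variance explained.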

Regularization involves adding a penalty for complexity to the model objective function to improve a model’s generalization performance.

L1 regularization, or LASSO (Least Absolute Shrinkage and Selection Operator), is a form of regularization in which the penalty is based on the absolute magnitude of the coefficients.

L2, or Ridge regularization, is a form of regularization in which the penalty is based on the squared magnitude of the coefficients.

If a primary interest is automatic variable selection, only LASSO can perform it: the L1 penalty can shrink coefficients exactly to zero, whereas the L2 penalty only shrinks them toward zero.
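The contrast between the two penalties can be sketched with scikit-learn (assumed installed) on synthetic data where only the first two of five predictors actually matter; the data and penalty strengths are illustrative assumptions:

```python
import numpy as np
from sklearn.linear_model import Lasso, Ridge

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
# Only the first two predictors influence the response; the rest are noise.
y = 3.0 * X[:, 0] - 2.0 * X[:, 1] + rng.normal(0, 0.5, size=200)

lasso = Lasso(alpha=0.1).fit(X, y)   # L1 penalty
ridge = Ridge(alpha=1.0).fit(X, y)   # L2 penalty

print(np.round(lasso.coef_, 2))  # irrelevant coefficients shrunk to exactly 0
print(np.round(ridge.coef_, 2))  # small but nonzero coefficients everywhere
```

The zeroed LASSO coefficients amount to automatic variable selection; ridge keeps every predictor in the model, just with smaller weights.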