While SVM is used most often in classification scenarios, it can be extended to regression by allowing the user to specify a maximum margin of error for any observation. Support Vector Regression (SVR) finds the optimal hyperplane in the feature space such that as many observations as possible fall within that margin of error, often called the epsilon tube. Instead of minimizing squared error, as in Least Squares, it minimizes the magnitude of the coefficients subject to the constraint that each observation's error stays within the allowed margin; this margin is a hyper-parameter that can be tuned. Because some points, such as outliers, will inevitably fall outside any reasonable margin, slack variables, which measure each observation's deviation beyond the margin, are added to the objective function, and an associated penalty hyper-parameter controls how heavily those deviations are punished during training.
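As a minimal sketch of the idea above, the snippet below uses scikit-learn's `SVR`, where `epsilon` sets the width of the error margin and `C` scales the penalty on slack (deviations beyond the margin). The toy data and the specific hyper-parameter values are illustrative assumptions, not prescriptions.

```python
import numpy as np
from sklearn.svm import SVR

# Illustrative 1-D data: y = 2x plus mild noise (assumed example, not from the text)
rng = np.random.RandomState(0)
X = np.linspace(0, 10, 50).reshape(-1, 1)
y = 2 * X.ravel() + rng.normal(scale=0.5, size=50)

# epsilon: half-width of the error tube around the hyperplane (errors inside it cost nothing)
# C: how heavily deviations beyond the tube (the slack) are penalized
model = SVR(kernel="linear", epsilon=0.5, C=1.0)
model.fit(X, y)

# With a linear kernel, the fitted slope should sit near the true value of 2
slope = float(model.coef_[0][0])
print(slope)
```

Shrinking `epsilon` forces the fit to track more points exactly (more support vectors), while lowering `C` tolerates larger slack and yields a flatter, more regularized model; both are typically chosen by cross-validation.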