### How would you evaluate a classification model using ROC/AUC?

The ROC curve is produced by plotting the True Positive Rate (TPR) on the y-axis against the False Positive Rate (FPR) on the x-axis, for every possible decision threshold. The Area Under the Curve (AUC) summarizes the curve as a single number: 1.0 corresponds to a perfect classifier, while 0.5 corresponds to random guessing.
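As a minimal sketch of how this is computed in practice, assuming scikit-learn is available and using a small synthetic label/score pair for illustration:

```python
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

# Synthetic example: true labels and predicted scores (probabilities)
y_true = np.array([0, 0, 1, 1, 0, 1, 0, 1])
y_score = np.array([0.1, 0.4, 0.35, 0.8, 0.2, 0.9, 0.6, 0.7])

# Sweep every decision threshold to get the (FPR, TPR) pairs of the curve
fpr, tpr, thresholds = roc_curve(y_true, y_score)

# AUC: probability a random positive scores higher than a random negative
auc = roc_auc_score(y_true, y_score)
print(auc)  # → 0.875
```

`roc_curve` returns the points you would plot (FPR on x, TPR on y); `roc_auc_score` integrates that curve directly from the scores.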


The False Positive Rate measures the proportion of actual negative observations that were incorrectly predicted as positive: FPR = FP / (FP + TN). Likewise, the True Positive Rate (also called recall or sensitivity) is TPR = TP / (TP + FN).

One of the most useful tools for evaluating the performance of any classification algorithm is the confusion matrix, which tabulates predicted labels against actual labels and yields the TP, FP, TN, and FN counts that the rates above are built from.
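To make the connection concrete, here is a small sketch (scikit-learn assumed, labels synthetic) that builds a confusion matrix at a fixed threshold and derives FPR and TPR from its cells:

```python
import numpy as np
from sklearn.metrics import confusion_matrix

# Synthetic true labels and hard predictions (already thresholded)
y_true = np.array([0, 0, 1, 1, 0, 1, 0, 1])
y_pred = np.array([0, 1, 1, 1, 0, 0, 0, 1])

# For binary labels, ravel() unpacks the 2x2 matrix in this order
tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()

fpr = fp / (fp + tn)  # negatives wrongly flagged positive
tpr = tp / (tp + fn)  # positives correctly found (recall)
print(fpr, tpr)  # → 0.25 0.75
```

Each point on the ROC curve is exactly this pair of numbers, recomputed as the classification threshold varies.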
