### What are some common distance metrics that can be used in clustering?

Some common distance metrics are: Euclidean distance, Mahalanobis distance, Manhattan distance, Minkowski distance, cosine similarity, and Jaccard distance.


The Euclidean Distance, or L2 norm, is the most common distance metric used in clustering.
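As a minimal sketch in plain Python (the function name is illustrative), the L2 norm is just the square root of the summed squared component differences:

```python
import math

def euclidean(a, b):
    # L2 norm of the difference: sqrt(sum of squared component differences)
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

print(euclidean([0, 0], [3, 4]))  # classic 3-4-5 triangle → 5.0
```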

Cosine similarity measures similarity using the cosine of the angle between two vectors in p-dimensional space; it depends only on orientation, not on vector magnitude.
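A sketch of the computation (helper name is illustrative): the dot product of the two vectors divided by the product of their norms, which equals the cosine of the angle between them:

```python
import math

def cosine_similarity(a, b):
    # cos(theta) = (a . b) / (||a|| * ||b||)
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

print(cosine_similarity([1, 0], [0, 1]))  # orthogonal vectors → 0.0
print(cosine_similarity([2, 2], [1, 1]))  # same direction → 1.0
```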

The Kullback-Leibler (KL) Divergence quantifies how one probability distribution differs from another. Note that it is asymmetric (D(P‖Q) ≠ D(Q‖P) in general), so it is not a true distance metric.
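As a sketch for discrete distributions given as probability lists (function name illustrative), using the standard definition D(P‖Q) = Σ pᵢ log(pᵢ/qᵢ):

```python
import math

def kl_divergence(p, q):
    # D_KL(P || Q) = sum p_i * log(p_i / q_i); terms with p_i = 0 contribute 0
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

print(kl_divergence([0.5, 0.5], [0.5, 0.5]))  # identical distributions → 0.0
print(kl_divergence([0.9, 0.1], [0.5, 0.5]))  # positive for differing distributions
```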

The Jaccard Index measures similarity between two sets as the ratio of the size of their intersection to the size of their union; the Jaccard distance is one minus this index.
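A minimal sketch using Python's built-in set operations (function names are illustrative):

```python
def jaccard_index(a, b):
    # |A ∩ B| / |A ∪ B|
    a, b = set(a), set(b)
    return len(a & b) / len(a | b)

def jaccard_distance(a, b):
    return 1 - jaccard_index(a, b)

print(jaccard_index({1, 2, 3}, {2, 3, 4}))  # 2 shared items of 4 total → 0.5
```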

Mutual information measures the amount of information shared between two random variables; it is zero when the variables are independent.
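For discrete variables observed as paired samples, mutual information can be estimated from empirical joint and marginal frequencies, I(X;Y) = Σ p(x,y) log[p(x,y) / (p(x)p(y))]. A sketch (function name illustrative):

```python
import math
from collections import Counter

def mutual_information(xs, ys):
    # I(X; Y) = sum over (x, y) of p(x, y) * log( p(x, y) / (p(x) * p(y)) )
    n = len(xs)
    pxy = Counter(zip(xs, ys))  # joint frequencies
    px = Counter(xs)            # marginal frequencies of X
    py = Counter(ys)            # marginal frequencies of Y
    return sum(
        (c / n) * math.log((c / n) / ((px[x] / n) * (py[y] / n)))
        for (x, y), c in pxy.items()
    )

# identical variables share all their information: I(X; X) = H(X)
print(mutual_information([0, 0, 1, 1], [0, 0, 1, 1]))  # → log(2) ≈ 0.693
```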

The Minkowski Distance is a general form for computing distances using an Lp norm.
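As a sketch, the Lp norm of the difference vector, (Σ |xᵢ − yᵢ|ᵖ)^(1/p), recovers Manhattan distance at p=1 and Euclidean distance at p=2 (function name illustrative):

```python
def minkowski(a, b, p):
    # Lp norm: (sum |x_i - y_i|^p) ** (1/p)
    return sum(abs(x - y) ** p for x, y in zip(a, b)) ** (1 / p)

print(minkowski([0, 0], [3, 4], 2))  # p=2 recovers Euclidean → 5.0
print(minkowski([0, 0], [3, 4], 1))  # p=1 recovers Manhattan → 7.0
```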

The Manhattan distance, or L1 norm, is the sum of the absolute differences between the components of two vectors.
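A one-line sketch (function name illustrative):

```python
def manhattan(a, b):
    # L1 norm: sum of absolute component differences
    return sum(abs(x - y) for x, y in zip(a, b))

print(manhattan([1, 2], [4, 6]))  # |1-4| + |2-6| → 7
```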

The Mahalanobis Distance is a multivariate form of the Euclidean Distance that accounts for correlation between dimensions.
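The distance is defined as √((x − y)ᵀ S⁻¹ (x − y)), where S is the covariance matrix. A sketch hard-coded to the 2×2 case to avoid a linear-algebra dependency (function name and 2D restriction are illustrative; in practice one would use a library such as SciPy):

```python
import math

def mahalanobis_2d(x, y, cov):
    # d = sqrt((x - y)^T S^-1 (x - y)) for a 2x2 covariance matrix S
    (a, b), (c, d) = cov
    det = a * d - b * c
    inv = [[d / det, -b / det], [-c / det, a / det]]  # 2x2 matrix inverse
    diff = [x[0] - y[0], x[1] - y[1]]
    tmp = [inv[0][0] * diff[0] + inv[0][1] * diff[1],
           inv[1][0] * diff[0] + inv[1][1] * diff[1]]
    return math.sqrt(diff[0] * tmp[0] + diff[1] * tmp[1])

# with the identity covariance (uncorrelated, unit-variance dimensions),
# Mahalanobis reduces to Euclidean distance
print(mahalanobis_2d([3, 4], [0, 0], [[1, 0], [0, 1]]))  # → 5.0
```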
