

What is Vector Normalization? How is it useful?


If the documents in a corpus vary in length, larger documents are likely to have higher word counts across the vocabulary simply because they contain more words. Normalization scales the word counts so that documents can be compared on an even footing regardless of length. Common techniques for normalizing text vectors include L2 normalization (dividing the vector by its Euclidean length) and dividing by the number of tokens in the document, which turns raw counts into the rate of occurrence of each token within the document.
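A minimal sketch of both techniques, using hypothetical count vectors where one document is simply a longer copy of the other (the vocabulary and vectors are invented for illustration):

```python
import numpy as np

# Hypothetical word-count vectors over a shared 3-word vocabulary.
# The second document is twice as long, so its raw counts are doubled.
short_doc = np.array([2.0, 1.0, 1.0])
long_doc = np.array([4.0, 2.0, 2.0])

def l2_normalize(v):
    """Scale a count vector to unit Euclidean length."""
    return v / np.linalg.norm(v)

def rate_normalize(v):
    """Divide counts by the total token count, giving per-token rates."""
    return v / v.sum()

# After either normalization, the two vectors coincide, because the
# documents have the same relative word frequencies despite their
# different lengths.
print(l2_normalize(short_doc))   # unit-length direction of the counts
print(rate_normalize(long_doc))  # term rates summing to 1
```

Either scheme removes the length effect; L2 normalization is the conventional choice when documents will be compared with cosine similarity, since the dot product of two unit vectors is exactly their cosine similarity.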


