
What is the Central Limit Theorem (CLT), and what are its implications for statistical inference?


The Central Limit Theorem (CLT) states that, given a sufficiently large sample size, the sampling distribution of the sample mean will be approximately normal (Gaussian), regardless of the underlying distribution of the data, provided that distribution has a finite variance. This implies that even if the underlying distribution is far from normal, such as a skewed continuous distribution like the Gamma or a discrete distribution like the Binomial, the means of sufficiently large samples drawn from the population can be modeled using a normal distribution. As the sample size n increases, the sampling distribution also becomes more tightly concentrated around the population mean: its standard deviation, called the standard error, equals σ/√n, where σ is the population standard deviation.
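To see the theorem in action, here is a minimal simulation sketch in Python with NumPy. The Gamma(2, 3) population and the sample counts are arbitrary choices for illustration; the point is that the means of repeated samples cluster around the population mean, with a spread matching the predicted standard error σ/√n.

```python
import numpy as np

rng = np.random.default_rng(seed=42)

# Right-skewed Gamma population; shape and scale are arbitrary
# values chosen only for this illustration.
shape, scale = 2.0, 3.0
n_samples = 10_000   # number of repeated samples
sample_size = 50     # observations per sample

# Draw all samples at once and compute each sample's mean.
samples = rng.gamma(shape, scale, size=(n_samples, sample_size))
sample_means = samples.mean(axis=1)

# For a Gamma(shape, scale) population, the mean is shape * scale
# and the standard deviation is sqrt(shape) * scale.
pop_mean = shape * scale
pop_std = np.sqrt(shape) * scale

print(f"Mean of sample means: {sample_means.mean():.3f} "
      f"(population mean: {pop_mean:.3f})")
print(f"Std of sample means:  {sample_means.std():.3f}")
print(f"Predicted std error:  {pop_std / np.sqrt(sample_size):.3f}")
```

Plotting a histogram of `sample_means` would show the familiar bell shape, even though the Gamma population itself is skewed.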

For example, consider estimating the income of all adults in the United States. Income is typically a right-skewed variable, meaning that a small proportion of adults earn far more than the population average. Suppose we were able to obtain the income of every adult in a number of communities that happen to be representative of the nation as a whole. We could then record the average income in each community and plot a histogram of those sample means. If we sampled enough communities, this histogram, an approximation of the sampling distribution, would resemble a bell-shaped normal curve; and if each community sample were larger, the distribution of the means would be even more tightly concentrated around the population mean.
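As a follow-up sketch, the snippet below uses a lognormal distribution as a stand-in for a right-skewed income population (the parameters are made up for illustration, not fitted to real income data). It shows how the spread of the community averages shrinks roughly in proportion to 1/√n as each community sample grows:

```python
import numpy as np

rng = np.random.default_rng(seed=0)

def sample_mean_spread(sample_size, n_communities=5_000):
    """Std dev of the sample means across repeated samples.

    Uses hypothetical lognormal "income" data; the parameters
    below are arbitrary and only meant to produce a skewed shape.
    """
    incomes = rng.lognormal(mean=10.5, sigma=0.8,
                            size=(n_communities, sample_size))
    return incomes.mean(axis=1).std()

for n in (10, 100, 1000):
    print(f"sample size {n:>4}: spread of sample means = "
          f"{sample_mean_spread(n):,.0f}")
```

Each tenfold increase in sample size cuts the spread of the sample means by roughly a factor of √10 ≈ 3.16, as the standard error formula predicts.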
