Parametric models assume that the data are generated from a specific distribution with a fixed, finite number of parameters: for example, a Normal distribution with mean 0 and variance 1, or a linear regression whose parameter count equals the number of features plus one for the intercept.
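The linear regression case can be sketched as follows. This is a minimal illustration with made-up data: however many observations we collect, the model is always described by exactly `n_features + 1` numbers.

```python
import numpy as np

# Hypothetical data: 3 features plus an intercept means exactly 4
# parameters, regardless of how many observations we collect.
rng = np.random.default_rng(0)
n_samples, n_features = 100, 3
X = rng.normal(size=(n_samples, n_features))
y = X @ np.array([2.0, -1.0, 0.5]) + 1.0 + rng.normal(scale=0.1, size=n_samples)

# Prepend a column of ones for the intercept, then solve ordinary
# least squares for the parameter vector.
X_design = np.column_stack([np.ones(n_samples), X])
params, *_ = np.linalg.lstsq(X_design, y, rcond=None)

print(len(params))  # 4 = n_features + 1, fixed before seeing any data
```

Doubling the number of samples changes the parameter *estimates*, but never the parameter *count*; that count is what makes the model parametric.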

Non-parametric models make no such assumption. This does not mean that no parameters are involved; rather, the parameters that define the model are not fixed in advance. A non-parametric process may involve many, potentially infinitely many, parameters, so as the number of observations grows, more parameters may be needed to model the data. Examples of non-parametric models in machine learning include K-Nearest Neighbors and decision trees.
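K-Nearest Neighbors makes this concrete: "fitting" simply memorizes the training set, so the model's effective size grows with every observation added. Below is a minimal one-dimensional sketch (the class name and data are illustrative, not from any particular library):

```python
import numpy as np

class KNNClassifier:
    """Minimal 1-D k-nearest-neighbors sketch: the 'model' is the data itself."""

    def __init__(self, k=3):
        self.k = k
        self.X = None
        self.y = None

    def fit(self, X, y):
        # Fitting just stores the training set; the effective number of
        # "parameters" therefore grows with the number of observations.
        self.X = np.asarray(X, dtype=float)
        self.y = np.asarray(y)
        return self

    def predict(self, x):
        # Find the k training points closest to x and take a majority vote.
        dists = np.abs(self.X - x)
        nearest = np.argsort(dists)[: self.k]
        values, counts = np.unique(self.y[nearest], return_counts=True)
        return values[np.argmax(counts)]

model = KNNClassifier(k=3).fit([1.0, 1.2, 0.9, 5.0, 5.2, 4.8], [0, 0, 0, 1, 1, 1])
print(model.predict(1.1))  # 0
print(model.predict(5.1))  # 1
```

Contrast this with the linear regression above: adding more training points to KNN literally enlarges the model, which is exactly the non-parametric behavior described in the text.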