When we use classical machine learning techniques, we must specify in advance the number of parameters our model will use to represent the data (number of clusters, number of Gaussians, etc.). In doing so, we make our models inflexible. In this talk we will study nonparametric models, specifically Bayesian Nonparametric (BNP) models, whose main goal is greater flexibility: in BNP models the effective number of parameters can be inferred automatically from the data.
The outline is as follows:
- Parametric vs Nonparametric models
- A review on probability distributions
- Bayesian nonparametric methods
- Dirichlet Process
- Python (and maybe R) libraries for BNP
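As a small taste of the Dirichlet Process part of the talk, here is a minimal sketch of the stick-breaking construction using only NumPy. The function name and parameters are illustrative, not from any particular library; it draws a truncated set of mixture weights from a DP with concentration `alpha`, showing how the model itself decides how many components carry meaningful mass.

```python
import numpy as np

def stick_breaking(alpha, n_atoms, seed=None):
    """Truncated stick-breaking weights for a Dirichlet Process.

    Draw beta_k ~ Beta(1, alpha), then set
        pi_k = beta_k * prod_{j < k} (1 - beta_j),
    i.e. each weight is a fraction of the stick left over so far.
    (Illustrative sketch, not a library API.)
    """
    rng = np.random.default_rng(seed)
    betas = rng.beta(1.0, alpha, size=n_atoms)
    # Length of stick remaining before each break: 1, (1-b1), (1-b1)(1-b2), ...
    remaining = np.concatenate(([1.0], np.cumprod(1.0 - betas)[:-1]))
    return betas * remaining

weights = stick_breaking(alpha=2.0, n_atoms=1000, seed=0)
# The weights form (almost) a probability vector; with a small alpha,
# most of the mass concentrates on the first few atoms.
print(weights[:5], weights.sum())
```

A smaller `alpha` concentrates mass on fewer atoms (fewer clusters), while a larger `alpha` spreads it out, which is exactly the knob that replaces a hand-picked "number of clusters".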