You know and love the Gaussian / Normal / Bell Curve. It shows up in almost every domain, and it is the workhorse of statistics.
If one Gaussian distribution is awesome, what would be more awesome? TWO GAUSSIAN DISTRIBUTIONS!
That is what Gaussian Mixture Models (GMMs) are: take your old friend, the single Gaussian distribution, and mix in another Gaussian distribution (or several).
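To make the idea concrete, here is a minimal sketch of drawing samples from a two-component mixture. The weights, means, and standard deviations are made-up values for illustration, not anything from the talk materials:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters: mixing weights, component means, component std devs
weights = np.array([0.4, 0.6])
means = np.array([-2.0, 3.0])
stds = np.array([1.0, 0.5])

n = 1000
# First pick which Gaussian each sample comes from, then draw from it
components = rng.choice(2, size=n, p=weights)
samples = rng.normal(means[components], stds[components])
```

A histogram of `samples` would show two bumps, one per component, which is exactly the shape a single Gaussian cannot capture.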
How does that dark magic happen? The Expectation–Maximization (EM) Algorithm.
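As a preview of where the talk is headed, the E-step/M-step loop for a two-component 1-D mixture can be sketched in a few lines. This is a bare-bones illustration, not the talk's implementation: the initialization scheme and fixed iteration count are arbitrary choices for the sketch.

```python
import numpy as np

def em_gmm_1d(x, n_iter=50):
    """Minimal EM for a two-component 1-D Gaussian mixture (illustrative)."""
    # Crude initialization from the data range (an arbitrary choice here)
    w = np.array([0.5, 0.5])
    mu = np.array([x.min(), x.max()])
    var = np.array([x.var(), x.var()])
    for _ in range(n_iter):
        # E-step: responsibility of each component for each point
        dens = w * np.exp(-(x[:, None] - mu) ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)
        resp = dens / dens.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means, and variances from responsibilities
        nk = resp.sum(axis=0)
        w = nk / len(x)
        mu = (resp * x[:, None]).sum(axis=0) / nk
        var = (resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk
    return w, mu, var

# Synthetic data from two known Gaussians, to check that EM recovers them
rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(-2, 1, 400), rng.normal(3, 0.5, 600)])
w, mu, var = em_gmm_1d(x)
```

On this well-separated synthetic data the estimated means land near -2 and 3 and the weights near 0.4 and 0.6.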
This technical talk will start with a quick review of the Gaussian, then move on to GMMs, and discuss how to estimate a GMM with the EM algorithm. An introductory level of statistics is assumed. If you need a refresher, check out https://galvanizeopensource.github.io/stats-shortcourse/ or https://www.khanacademy.org/math/statistics-probability/modeling-distributions-of-data
Suggested Preparation Materials:
- Listen to the EM episode of the Talking Machines podcast
- Watch the GMM & EM series, videos 16.3-16.13, from mathematicalmonk
- Read "What is the expectation maximization algorithm?" (pdf in the readings folder)
Challenge Preparation Materials:
- Read "EM Demystified: An Expectation-Maximization Tutorial" in the readings folder
- Work through a theoretical statistical treatment of EM