# A Gentle Tutorial of the {EM} Algorithm and its Application to Parameter Estimation for Gaussian Mixture and Hidden Markov Models

A Gentle Tutorial of the EM Algorithm and its Application to Parameter Estimation for Gaussian Mixture and Hidden Markov Models (April 1998) by Jeff A. Bilmes
@techreport{bilmes-1998,
abstract = {We describe the maximum-likelihood parameter estimation problem and how the {Expectation-Maximization} ({EM}) algorithm can be used for its solution. We first describe the abstract
form of the {EM} algorithm as it is often given in the literature. We then develop the {EM} parameter estimation procedure for two applications: 1) finding the parameters of a mixture of
Gaussian densities, and 2) finding the parameters of a hidden Markov model ({HMM}) (i.e.,
the {Baum-Welch} algorithm) for both discrete and Gaussian mixture observation models.
We derive the update equations in fairly explicit detail, but we do not prove any convergence properties. We try to emphasize intuition rather than mathematical rigor.},
author = {Bilmes, Jeff A.},
institution = {International Computer Science Institute},
keywords = {algorithmic, bayes, learning, math, probability, unsupervised-learning},
month = apr,
title = {A Gentle Tutorial of the {EM} Algorithm and its Application to Parameter Estimation for Gaussian Mixture and Hidden Markov Models},
url = {http://facartes.unal.edu.co/ggonzalez/ml/bilmes98gentle.pdf},
year = {1998}
}
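The first application the abstract mentions — EM for a mixture of Gaussian densities — can be sketched in a few lines. The code below is a minimal illustration for a two-component 1-D mixture, not the tutorial's own notation or derivation: the function name `em_gmm_1d`, the median-split initialisation, and the variance floor are all my assumptions, while the E-step (compute responsibilities) and M-step (re-estimate weights, means, variances) follow the standard updates the tutorial derives.

```python
import math
import random

def em_gmm_1d(data, n_iter=50):
    """Minimal EM for a two-component 1-D Gaussian mixture (illustrative sketch).

    Returns (weights, means, variances) after n_iter EM sweeps.
    """
    # Crude initialisation (an assumption, not from the tutorial):
    # split the sorted data at the median and use each half's mean.
    xs = sorted(data)
    half = len(xs) // 2
    mu = [sum(xs[:half]) / half, sum(xs[half:]) / (len(xs) - half)]
    var = [1.0, 1.0]
    w = [0.5, 0.5]

    def pdf(x, m, v):
        # Univariate Gaussian density N(x; m, v).
        return math.exp(-(x - m) ** 2 / (2 * v)) / math.sqrt(2 * math.pi * v)

    for _ in range(n_iter):
        # E-step: responsibilities gamma[i][k] = P(component k | x_i).
        gamma = []
        for x in data:
            p = [w[k] * pdf(x, mu[k], var[k]) for k in range(2)]
            s = sum(p)
            gamma.append([pk / s for pk in p])
        # M-step: re-estimate mixture weights, means, and variances
        # from the expected (soft) component assignments.
        for k in range(2):
            nk = sum(g[k] for g in gamma)
            w[k] = nk / len(data)
            mu[k] = sum(g[k] * x for g, x in zip(gamma, data)) / nk
            var[k] = sum(g[k] * (x - mu[k]) ** 2 for g, x in zip(gamma, data)) / nk
            var[k] = max(var[k], 1e-6)  # guard against variance collapse
    return w, mu, var
```

On data drawn from two well-separated Gaussians, the estimated means should land near the true component means; the HMM case (Baum-Welch) replaces the per-point responsibilities with forward-backward expectations over state sequences but keeps the same E-step/M-step structure.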