Expectation-Maximization (EM) Algorithm in the General Case
- Choose initial values of the parameters θ^(0)
- Expectation step: in the j-th step, compute Q(θ', θ^(j)) = E(l0(θ'; T) | X, θ^(j)) as a function of the dummy argument θ'
- Maximization step: in the j-th step, calculate the new estimate θ^(j+1) by maximizing Q(θ', θ^(j)) over θ'
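The alternation of the two steps above can be sketched as a generic loop. The function and argument names (`e_step`, `q_maximizer`) are hypothetical placeholders, since both steps are problem-specific:

```python
# Minimal sketch of the generic EM loop; e_step and q_maximizer are
# hypothetical, problem-specific callables (not names from the text).
def em(theta0, e_step, q_maximizer, n_iterations=100):
    """Alternate building Q(., theta_j) and maximizing it over theta'."""
    theta = theta0
    for _ in range(n_iterations):
        q = e_step(theta)        # represents Q(theta', theta_j)
        theta = q_maximizer(q)   # theta_(j+1) = argmax over theta'
    return theta
```

In a concrete problem, `e_step` would compute the expected complete-data log-likelihood given the current parameters, and `q_maximizer` would return its maximizer in closed form or numerically.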
EM algorithm for the Gaussian Mixture Model
- Choose initial values of the parameters
- Expectation step: in the j-th step, compute the n × k matrix W = (w_ij) with the weights w_ij = π_j φ(x_i; μ_j, Σ_j) / Σ_{r=1}^k π_r φ(x_i; μ_r, Σ_r)
- Maximization step: in the j-th step, for all r = 1, ..., k compute:
- The mixture weights π_r = n_r / n, where n_r = Σ_{i=1}^n w_ir is the "amount" of the feature vectors that are assigned to the r-th mixture component
- Mean estimates μ_r = (1/n_r) Σ_{i=1}^n w_ir x_i
- Covariance estimates Σ_r of size p × p with Σ_r = (1/n_r) Σ_{i=1}^n w_ir (x_i − μ_r)(x_i − μ_r)^T
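The E- and M-steps above can be sketched in NumPy. This is an illustrative implementation, not the text's own code; the function names (`em_gmm`, `mvn_pdf`) and the small ridge added to the covariances for numerical stability are my assumptions:

```python
import numpy as np

def mvn_pdf(X, mean, cov):
    """Multivariate normal density φ(x_i; mean, cov), evaluated row-wise."""
    p = X.shape[1]
    d = X - mean
    inv = np.linalg.inv(cov)
    norm = 1.0 / np.sqrt((2 * np.pi) ** p * np.linalg.det(cov))
    return norm * np.exp(-0.5 * np.einsum('ij,jk,ik->i', d, inv, d))

def em_gmm(X, k, n_iterations=50, rng=None):
    """Sketch of EM for a k-component Gaussian mixture (hypothetical API)."""
    rng = np.random.default_rng(rng)
    n, p = X.shape
    # Initialization as described in the text: k random observations as means,
    # uniform weights 1/k, the data covariance for every component.
    mu = X[rng.choice(n, k, replace=False)].copy()
    pi = np.full(k, 1.0 / k)
    sigma = np.array([np.cov(X.T) + 1e-6 * np.eye(p) for _ in range(k)])
    for _ in range(n_iterations):
        # E-step: n x k weight matrix W, rows normalized to sum to 1
        dens = np.column_stack(
            [pi[r] * mvn_pdf(X, mu[r], sigma[r]) for r in range(k)])
        W = dens / dens.sum(axis=1, keepdims=True)
        # M-step: n_r is the "amount" assigned to component r
        n_r = W.sum(axis=0)
        pi = n_r / n
        mu = (W.T @ X) / n_r[:, None]
        for r in range(k):
            d = X - mu[r]
            sigma[r] = (W[:, r, None] * d).T @ d / n_r[r] + 1e-6 * np.eye(p)
    return pi, mu, sigma, W
```

The closed-form M-step updates follow directly from maximizing the weighted Gaussian log-likelihood, which is why no numerical optimizer is needed inside the loop.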
- Perform nTrials starts of the EM algorithm with nIterations iterations each, using the start values:
- Initial means - k different random observations from the input data set
- Initial weights - the values of 1/k
- Initial covariance matrices - the covariance of the input data
- Regard the result of the best EM run, in terms of the likelihood function value, as the result of the initialization
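The multi-start scheme above can be sketched as follows. `run_em` is a hypothetical stand-in for one full EM run that returns the fitted parameters together with the achieved log-likelihood; the best run is kept:

```python
import numpy as np

def best_of_restarts(X, k, run_em, n_trials=10, n_iterations=50):
    """Run EM n_trials times and keep the run with the highest
    log-likelihood (hypothetical helper; run_em is a placeholder)."""
    best_params, best_ll = None, -np.inf
    for trial in range(n_trials):
        # Each trial uses fresh random start values (seeded by trial here).
        params, ll = run_em(X, k, n_iterations, seed=trial)
        if ll > best_ll:
            best_params, best_ll = params, ll
    return best_params, best_ll
```

Comparing runs by log-likelihood is what makes random restarts useful: EM only finds a local maximum, so the best of several starts is a cheap guard against poor local optima.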