% This code for learning a Gaussian mixture model through EM
% was given to me as a homework solution in a machine learning
% class. I found it helpful in understanding EM. I hope
% the original author doesn't mind me placing it here.
% This code assumes there are 3 Gaussians and the
% data has 2 dimensions (and, below, 200 data points in x).

mu = initmu;
sigma = zeros(2,2,3);
for j=1:3,
  sigma(:,:,j) = eye(2);
end;
prior = ones(3,1) * 1/3;

for iter = 1:20,  % 20 iterations seems to be enough for this problem

  % E step: compute soft assignments (responsibilities) of each
  % point to each of the 3 Gaussians
  assignment = zeros(200,3);
  for i=1:200,
    prob = zeros(3,1);
    prob(1) = prior(1)*gaussian(x(i,:)', mu(1,:)', sigma(:,:,1));
    prob(2) = prior(2)*gaussian(x(i,:)', mu(2,:)', sigma(:,:,2));
    prob(3) = prior(3)*gaussian(x(i,:)', mu(3,:)', sigma(:,:,3));
    tot = sum(prob);
    assignment(i,1) = prob(1) / tot;
    assignment(i,2) = prob(2) / tot;
    assignment(i,3) = prob(3) / tot;
  end;

  % M step: re-estimate means, covariances, and priors from the
  % weighted data
  nassign = zeros(3,1);
  newmu = zeros(3,2);
  newSigma = zeros(2,2,3);
  for i=1:200,
    for j=1:3,
      newmu(j,:) = newmu(j,:) + assignment(i,j) * x(i,:);
      nassign(j) = nassign(j) + assignment(i,j);
    end;
  end;
  for j=1:3,
    newmu(j,:) = newmu(j,:)/nassign(j);
  end;
  for i=1:200,
    for j=1:3,
      newSigma(:,:,j) = newSigma(:,:,j) + ...
        assignment(i,j)*(x(i,:)-newmu(j,:))'*(x(i,:)-newmu(j,:));
    end;
  end;
  for j=1:3,
    newSigma(:,:,j) = newSigma(:,:,j)/nassign(j);
  end;

  mu = newmu;
  sigma = newSigma;
  prior = nassign/sum(nassign);

end;
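
% The script calls a helper `gaussian` that is not defined here (nor is it
% a MATLAB builtin); presumably it evaluates a multivariate normal density.
% A minimal sketch of what it might look like, inferred from the call sites
% above (column-vector point xv and mean m, covariance matrix S) -- the
% name and signature are assumptions, not the original author's code.
% It would go in its own file, gaussian.m:

function p = gaussian(xv, m, S)
  % Density of N(m, S) evaluated at xv, for d-dimensional column vectors.
  d = length(xv);
  diff = xv - m;
  % S \ diff solves S*z = diff, avoiding an explicit matrix inverse
  p = exp(-0.5 * (diff' * (S \ diff))) / sqrt((2*pi)^d * det(S));
end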