
Chapter 32: Expectation Maximization

pp. 1276-1318

Authors

, École Polytechnique Fédérale de Lausanne
Summary

We formulated the maximum-likelihood (ML) approach in the previous chapter, where an unknown parameter θ is estimated by maximizing the log-likelihood function. We showed there that in some cases of interest this problem can be solved analytically in closed form, yielding an expression for the parameter estimate in terms of the observations. However, there are many important scenarios where a closed-form ML solution cannot be pursued, either because of mathematical intractability or because of missing data or unobservable latent variables. In this chapter, we motivate and describe the expectation maximization (EM) procedure as a useful tool for constructing ML estimates under these more challenging conditions. We also illustrate how EM can be used to fit mixture models to data.
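As an illustration of the idea (not taken from the chapter itself), the following sketch runs EM on a one-dimensional two-component Gaussian mixture using NumPy. The function name, the initialization of the means at the data extremes, and the fixed iteration count are our own choices for a minimal, self-contained example:

```python
import numpy as np

def em_gmm_1d(x, K=2, iters=100):
    """Minimal EM sketch for a 1-D Gaussian mixture: alternate the
    E-step (posterior responsibilities of each component for each
    sample) with the M-step (closed-form weighted ML updates)."""
    n = x.size
    pi = np.full(K, 1.0 / K)                       # mixing weights
    mu = np.linspace(x.min(), x.max(), K)          # spread-out initial means
    var = np.full(K, x.var())                      # broad initial variances
    for _ in range(iters):
        # E-step: r[n, k] ∝ pi_k * N(x_n; mu_k, var_k), computed in the
        # log domain and shifted by the row maximum for numerical stability
        log_p = (-0.5 * (x[:, None] - mu) ** 2 / var
                 - 0.5 * np.log(2 * np.pi * var) + np.log(pi))
        log_p -= log_p.max(axis=1, keepdims=True)
        r = np.exp(log_p)
        r /= r.sum(axis=1, keepdims=True)
        # M-step: weighted ML estimates given the responsibilities
        Nk = r.sum(axis=0)
        pi = Nk / n
        mu = (r * x[:, None]).sum(axis=0) / Nk
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / Nk
    return pi, mu, var

# usage: two well-separated synthetic clusters
rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(-4, 1, 500), rng.normal(4, 1, 500)])
pi, mu, var = em_gmm_1d(x)
```

For well-separated clusters like these, the recovered means land near -4 and 4 and the mixing weights near 0.5 each; in harder problems EM is sensitive to initialization and is typically restarted from several starting points.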
