Chapter 39: Decoding Hidden Markov Models

pp. 1563-1608

Authors

Ali H. Sayed, École Polytechnique Fédérale de Lausanne
Summary

We continue our discussion of hidden Markov models (HMMs) and consider in this chapter the solution of decoding problems. Specifically, given a sequence of observations {y_n, n = 1, 2, …, N}, we would like to devise mechanisms that allow us to estimate the underlying sequence of state or latent variables {z_n, n = 1, 2, …, N}. That is, we would like to recover the state evolution that “most likely” explains the measurements. We already know how to perform decoding for the case of mixture models with independent observations by using (38.12a)–(38.12b). The solution is more challenging for HMMs because of the dependency among the states.
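
The classical solution to this decoding problem is the Viterbi algorithm, a dynamic-programming recursion over the trellis of states. As a rough illustration only (not the chapter's own derivation or notation), the following NumPy sketch implements log-domain Viterbi decoding for an HMM with discrete emissions; the function name viterbi_decode, the argument layout, and the toy parameters are illustrative assumptions.

```python
import numpy as np

def viterbi_decode(y, pi, A, B):
    """Most likely state sequence for an HMM with discrete emissions (illustrative sketch).

    y  : observation indices, shape (N,)
    pi : initial state distribution, shape (K,)
    A  : transition matrix, A[i, j] = P(z_n = j | z_{n-1} = i), shape (K, K)
    B  : emission matrix,  B[j, m] = P(y_n = m | z_n = j), shape (K, M)
    """
    N, K = len(y), len(pi)
    log_pi, log_A, log_B = np.log(pi), np.log(A), np.log(B)

    delta = np.empty((N, K))           # best log-probability of any path ending in state k at step n
    psi = np.zeros((N, K), dtype=int)  # back-pointers for reconstructing that path

    delta[0] = log_pi + log_B[:, y[0]]
    for n in range(1, N):
        scores = delta[n - 1][:, None] + log_A   # scores[i, j]: best path reaching state j through state i
        psi[n] = np.argmax(scores, axis=0)
        delta[n] = np.max(scores, axis=0) + log_B[:, y[n]]

    # Backtrack from the most likely terminal state.
    z = np.empty(N, dtype=int)
    z[-1] = np.argmax(delta[-1])
    for n in range(N - 2, -1, -1):
        z[n] = psi[n + 1, z[n + 1]]
    return z

# Toy example: two hidden states, three observation symbols (made-up numbers).
pi = np.array([0.6, 0.4])
A = np.array([[0.7, 0.3],
              [0.4, 0.6]])
B = np.array([[0.5, 0.4, 0.1],
              [0.1, 0.3, 0.6]])
y = np.array([0, 1, 2, 2, 1])
print(viterbi_decode(y, pi, A, B))     # [0 0 1 1 1]
```

Unlike per-sample decoding in the mixture-model case, the maximization at each step couples consecutive states through the transition matrix A, which is precisely the dependency that makes HMM decoding more challenging.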
