
14 - Myopic policy bounds for POMDPs and sensitivity to model parameters

from Part III - Partially Observed Markov Decision Processes: Structural Results

Published online by Cambridge University Press:  05 April 2016

Vikram Krishnamurthy
Affiliation: Cornell University/Cornell Tech

Summary

Chapter 12 discussed stopping time POMDPs and gave sufficient conditions for the optimal policy to have a monotone structure. In this chapter we consider more general POMDPs (not necessarily with a stopping action) and present the following structural results:

  1. Upper and lower myopic policy bounds using copositivity dominance: For general POMDPs it is difficult to provide sufficient conditions for monotone policies. Instead, we provide sufficient conditions under which the optimal policy can be upper- and lower-bounded by judiciously chosen myopic policies. These sufficient conditions involve the copositive ordering described in Chapter 10. The myopic policy bounds are constructed to maximize the volume of belief states where they coincide with the optimal policy. Numerical examples illustrate these myopic policies for continuous- and discrete-valued observations. (A minimal sketch of what a myopic policy computes appears after this list.)

  2. Lower myopic policy bounds using Blackwell dominance: Suppose the observation probabilities for actions 1 and 2 can be related via the factorization B(1) = B(2) R, where R is a stochastic matrix. We then say that B(2) Blackwell dominates B(1). If this Blackwell dominance holds, we will show that a myopic policy coincides with the optimal policy for all belief states where choosing action 2 yields a smaller instantaneous cost than choosing action 1. Thus, the myopic policy forms a lower bound to the optimal policy. We provide two examples: scheduling an optimal filter versus an optimal predictor, and scheduling with ultrametric observation matrices. (A numerical feasibility check of the factorization is sketched after this list.)

  3. Sensitivity to POMDP parameters: The final result considered in this chapter is: how does the optimal cumulative cost of a POMDP depend on the transition and observation probabilities? We provide two sets of results: ordinal and cardinal. The ordinal results use the copositive ordering of transition matrices and Blackwell dominance of observation matrices, which together yield an ordering of the achievable optimal costs of a POMDP. The cardinal results determine explicit formulas for the sensitivity of the POMDP optimal costs and policy to small variations of the transition and observation probabilities. (A simplified sensitivity illustration follows this list.)
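To fix ideas for item 1, the following is a minimal sketch of the object being bounded: a myopic policy simply minimizes the instantaneous expected cost c(u)'π at each belief π, ignoring the action's effect on future beliefs. This illustrates only the definition, not the book's copositivity construction; the cost vectors and beliefs are illustrative assumptions.

```python
import numpy as np

def myopic_policy(belief, costs):
    """Pick the action minimizing the instantaneous expected cost
    c(u)' pi at the current belief pi, ignoring future beliefs."""
    expected = [c_u @ belief for c_u in costs]
    return int(np.argmin(expected))

# Illustrative 2-state, 2-action example (all numbers are assumptions).
costs = [np.array([1.0, 4.0]),   # c(1): cheap in state 1, dear in state 2
         np.array([3.0, 2.0])]   # c(2)
for pi in (np.array([0.9, 0.1]), np.array([0.2, 0.8])):
    print(pi, "-> action", myopic_policy(pi, costs) + 1)
```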
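For item 2, the Blackwell dominance condition B(1) = B(2) R is a linear feasibility problem in the stochastic matrix R, so it can be checked numerically. Below is a minimal sketch using scipy.optimize.linprog; the observation matrices and the confusion channel R are illustrative assumptions, not examples from the chapter.

```python
import numpy as np
from scipy.optimize import linprog

def blackwell_dominates(B2, B1):
    """Check whether B(2) Blackwell dominates B(1), i.e. whether there
    exists a stochastic matrix R with B(1) = B(2) R.
    Returns such an R, or None if none exists."""
    X, Y1 = B1.shape
    _, Y2 = B2.shape
    n = Y2 * Y1                        # unknowns: entries of R, row-major
    A_eq = np.zeros((X * Y1 + Y2, n))
    b_eq = np.zeros(X * Y1 + Y2)
    # Equality constraints B2 @ R = B1 (one equation per (x, j) pair).
    for x in range(X):
        for j in range(Y1):
            row = x * Y1 + j
            for i in range(Y2):
                A_eq[row, i * Y1 + j] = B2[x, i]
            b_eq[row] = B1[x, j]
    # Each row of R sums to 1, so that R is a stochastic matrix.
    for i in range(Y2):
        A_eq[X * Y1 + i, i * Y1:(i + 1) * Y1] = 1.0
        b_eq[X * Y1 + i] = 1.0
    # Zero objective: we only test feasibility, with 0 <= r_ij <= 1.
    res = linprog(c=np.zeros(n), A_eq=A_eq, b_eq=b_eq, bounds=[(0, 1)] * n)
    return res.x.reshape(Y2, Y1) if res.success else None

# Illustrative example: B1 is a noisier version of B2 by construction.
B2 = np.array([[0.8, 0.2], [0.1, 0.9]])
R  = np.array([[0.7, 0.3], [0.2, 0.8]])   # confusion channel (assumption)
B1 = B2 @ R
print(blackwell_dominates(B2, B1))        # recovers a valid R
```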
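The chapter derives the cardinal sensitivity formulas for the POMDP case itself. As a simplified illustration of what such a result looks like, consider the fully observed, fixed-policy analogue: the discounted cost is V = (I - ρP)⁻¹c, and perturbing P in a direction ΔP changes it to first order by ΔV = ρ(I - ρP)⁻¹ΔP V. The sketch below verifies this against a finite difference; all numbers are assumptions, and this is not the chapter's POMDP formula.

```python
import numpy as np

rho = 0.9                                   # discount factor (assumption)
P = np.array([[0.7, 0.3], [0.4, 0.6]])      # transition matrix (assumption)
c = np.array([1.0, 3.0])                    # per-stage cost (assumption)

# Discounted cost of the fixed policy: V = (I - rho P)^{-1} c.
I = np.eye(2)
V = np.linalg.solve(I - rho * P, c)

# Perturbation direction with zero row sums, so P + eps*dP stays
# stochastic for small eps. First-order sensitivity:
#   dV = rho (I - rho P)^{-1} dP V.
dP = np.array([[-0.05, 0.05], [0.05, -0.05]])
dV = rho * np.linalg.solve(I - rho * P, dP @ V)

# Finite-difference check: the two agree to O(eps).
eps = 1e-4
V_pert = np.linalg.solve(I - rho * (P + eps * dP), c)
print(dV, (V_pert - V) / eps)
```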

The partially observed Markov decision process

Throughout this chapter we consider the discounted-cost, infinite-horizon POMDPs discussed in §7.6. Let us briefly review this model.
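The model's primitive is the belief state: for each action u there is a transition matrix P(u) and an observation matrix B(u), and the belief is propagated by the HMM filter T(π, y, u) = B_y(u) P'(u) π / σ(π, y, u), where B_y(u) is the diagonal matrix of column y of B(u). A minimal sketch of this update follows; the numerical values are illustrative assumptions.

```python
import numpy as np

def belief_update(pi, y, P, B):
    """One step of the HMM filter propagating the belief state:
    T(pi, y, u) = B_y(u) P'(u) pi / sigma(pi, y, u)."""
    unnorm = np.diag(B[:, y]) @ P.T @ pi   # B_y(u) P'(u) pi
    return unnorm / unnorm.sum()           # divide by sigma(pi, y, u)

# Illustrative two-state example (all numbers are assumptions).
P = np.array([[0.9, 0.1], [0.2, 0.8]])    # transition matrix P(u)
B = np.array([[0.7, 0.3], [0.4, 0.6]])    # observation matrix B(u)
pi = np.array([0.5, 0.5])                 # prior belief
print(belief_update(pi, y=0, P=P, B=B))
```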

Type: Chapter
Information: Partially Observed Markov Decision Processes: From Filtering to Controlled Sensing, pp. 312–340
Publisher: Cambridge University Press
Print publication year: 2016
