
Comparing two algorithms for automatic planning by robots in stochastic environments*

Published online by Cambridge University Press:  09 March 2009

Alan D. Christiansen
Affiliation:
Computer Science Department, Tulane University, New Orleans, LA 70118-5674 (USA). Supported at CMU by an AT&T Bell Laboratories Ph.D. Scholarship and by the National Science Foundation under grant DMC-8520475. A portion of this work was completed during a visit to the Laboratoire d'Informatique Fondamentale et d'Intelligence Artificielle (LIFIA) in Grenoble, France, supported by INRIA.
Kenneth Y. Goldberg
Affiliation:
Institute for Robotics and Intelligent Systems, University of Southern California, Los Angeles, CA 90089-0273 (USA). Supported by the National Science Foundation under Awards No. IRI-9123747 and DDM-9215362 (Strategic Manufacturing Initiative).

Summary

Planning a sequence of robot actions is especially difficult when the outcome of actions is uncertain, as is inevitable when interacting with the physical environment. In this paper we consider the case of finite state and action spaces where actions can be modeled as Markov transitions. Finding a plan that achieves a desired state with maximum probability is known to be an NP-Complete problem. We consider two algorithms: an exponential-time algorithm that maximizes probability, and a polynomial-time algorithm that maximizes a lower bound on the probability. As these algorithms trade off plan time for plan quality, we compare their performance on a mechanical system for orienting parts. Our results lead us to identify two properties of stochastic actions that can be used to choose between these planning algorithms for other applications.
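The trade-off described above can be made concrete with a toy sketch. The model, states, and transition probabilities below are illustrative inventions, not the paper's part-orienting system or its actual algorithms: an exhaustive planner enumerates all action sequences up to a horizon and computes the exact probability of reaching the goal (exponential in the horizon), while a greedy planner follows most-likely transitions, whose product gives only a lower bound on success probability (polynomial time).

```python
import itertools

# Hypothetical toy model: 3 states, goal state 2.
# P[action][state] maps each next state to its transition probability.
P = {
    'a': {0: {0: 0.2, 1: 0.8}, 1: {1: 0.3, 2: 0.7}, 2: {2: 1.0}},
    'b': {0: {0: 0.5, 2: 0.5}, 1: {0: 0.6, 2: 0.4}, 2: {2: 1.0}},
}
GOAL = 2

def success_prob(plan, start=0):
    """Exact probability of ending in GOAL after an open-loop plan."""
    dist = {start: 1.0}
    for a in plan:
        nxt = {}
        for s, p in dist.items():
            for s2, q in P[a][s].items():
                nxt[s2] = nxt.get(s2, 0.0) + p * q
        dist = nxt
    return dist.get(GOAL, 0.0)

def exhaustive_plan(horizon):
    """Exponential-time: score all |A|^horizon sequences, keep the best."""
    return max(itertools.product(P, repeat=horizon), key=success_prob)

def greedy_plan(horizon, start=0):
    """Polynomial-time: follow the most likely transition at each step.
    The product of those single-step probabilities is only a lower bound
    on the true success probability of the resulting plan."""
    plan, s = [], start
    for _ in range(horizon):
        a = max(P, key=lambda act: max(P[act][s].values()))
        plan.append(a)
        s = max(P[a][s], key=P[a][s].get)
    return plan

best = exhaustive_plan(3)
quick = greedy_plan(3)
# The exhaustive plan can never do worse than the greedy one.
assert success_prob(best) >= success_prob(quick)
```

With two actions and horizon L, the exhaustive planner scores 2^L sequences, while the greedy planner does O(L) work; this mirrors (in caricature) the plan-time versus plan-quality trade-off the paper measures experimentally.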

Information

Type: Articles
Copyright © Cambridge University Press 1995
