Exact distributions for reward functions on semi-Markov and Markov additive processes

Published online by Cambridge University Press:  14 July 2016

Valeri T. Stefanov*
Affiliation:
The University of Western Australia
* Postal address: School of Mathematics and Statistics, The University of Western Australia, Crawley, WA 6009, Australia. Email address: stefanov@maths.uwa.edu.au
Abstract

The distribution theory for reward functions on semi-Markov processes has been of interest since the early 1960s. The relevant asymptotic distribution theory has been satisfactorily developed. On the other hand, it has been noticed that it is difficult to find exact distribution results which lead to the effective computation of such distributions. Note that there is no satisfactory exact distribution result for rewards accumulated over deterministic time intervals [0, t], even in the special case of continuous-time Markov chains. The present paper provides neat general results which lead to explicit closed-form expressions for the relevant Laplace transforms of general reward functions on semi-Markov and Markov additive processes.
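The abstract concerns the distribution of a reward accumulated over a deterministic interval [0, t], for which closed-form Laplace transforms are derived in the paper. As a purely illustrative sketch of the underlying object (not the paper's Laplace-transform method), the following hypothetical Python snippet estimates such a reward distribution by Monte Carlo for a two-state continuous-time Markov chain with state-dependent reward rates; all function names and parameter values are assumptions made for the example.

```python
import random

def simulate_reward(q, r, t, seed=None):
    """Total reward accumulated over [0, t] by a two-state continuous-time
    Markov chain that alternates between states 0 and 1.

    q = (q0, q1): exponential jump rates out of each state (assumed values)
    r = (r0, r1): reward accrued per unit time spent in each state
    """
    rng = random.Random(seed)
    state, clock, reward = 0, 0.0, 0.0
    while True:
        hold = rng.expovariate(q[state])      # exponential holding time
        if clock + hold >= t:
            reward += r[state] * (t - clock)  # truncate at the horizon t
            return reward
        reward += r[state] * hold
        clock += hold
        state = 1 - state                     # alternate between the two states

# Monte Carlo estimate of the mean reward over [0, 5] (illustrative parameters)
rewards = [simulate_reward((1.0, 2.0), (1.0, 0.0), 5.0, seed=i)
           for i in range(2000)]
mean_reward = sum(rewards) / len(rewards)
```

Here the reward is simply the time spent in state 0, so its mean is close to the stationary fraction of time in that state (2/3) multiplied by t. The paper's results replace such simulation by explicit closed-form expressions for the relevant Laplace transforms.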

Information

Type
Research Papers
Copyright
© Applied Probability Trust 2006