Number of successes in Markov trials

Published online by Cambridge University Press:  01 July 2016

U. Narayan Bhat
Ram Lal
Affiliation: Southern Methodist University
Postal address for both authors: Department of Statistical Science, Southern Methodist University, Dallas, TX 75275, USA.

Abstract


Markov trials are a sequence of dependent trials with two outcomes, success and failure, which are the states of a Markov chain. The distribution of the number of successes in n Markov trials and the first-passage time for a specified number of successes are obtained using an augmented Markov chain model.
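As an illustration of the augmented-chain idea, the sketch below computes the distribution of the number of successes S_n in n Markov trials by tracking the pair (current outcome, successes accumulated so far). This is a minimal sketch, not the authors' derivation; the transition probabilities a = P(success | previous success) and b = P(success | previous failure), the initial success probability pi1, and the function name are illustrative assumptions.

import numpy as np

def success_count_distribution(n, a, b, pi1):
    """Return P(S_n = k) for k = 0..n, where S_n counts successes in n Markov trials."""
    # Augmented state: (current outcome, successes so far).
    # dist[s, k] = P(current outcome = s, successes so far = k),
    # with s = 1 for success and s = 0 for failure.
    dist = np.zeros((2, n + 1))
    dist[1, 1] = pi1          # first trial is a success
    dist[0, 0] = 1.0 - pi1    # first trial is a failure
    for _ in range(n - 1):
        new = np.zeros_like(dist)
        # transitions out of a success
        new[1, 1:] += a * dist[1, :-1]          # success again: count goes up by one
        new[0, :]  += (1.0 - a) * dist[1, :]    # failure next: count unchanged
        # transitions out of a failure
        new[1, 1:] += b * dist[0, :-1]
        new[0, :]  += (1.0 - b) * dist[0, :]
        dist = new
    return dist.sum(axis=0)   # marginal distribution of the success count

For example, success_count_distribution(10, 0.9, 0.3, 0.5) returns an array of length 11 whose k-th entry is P(S_10 = k). The first-passage time to a specified number of successes can be read off the same augmented chain by making the states with the target count absorbing, in the spirit of the model described above.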

Type
Letters to the Editor
Copyright
Copyright © Applied Probability Trust 1988 
