
Number of successes in Markov trials

  • U. Narayan Bhat (a1) and Ram Lal (a1)

Abstract

Markov trials are a sequence of dependent trials with two outcomes, success and failure, which are the states of a Markov chain. The distribution of the number of successes in n Markov trials and the first-passage time for a specified number of successes are obtained using an augmented Markov chain model.
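The augmented-chain idea described in the abstract can be illustrated numerically. The sketch below is not taken from the paper: it assumes a two-state outcome chain with transition matrix P (rows and columns ordered failure, success) and an initial distribution pi for the first trial's outcome, and it propagates the augmented state (current outcome, number of successes so far) over n trials. All function and variable names are illustrative only; the first-passage time to a specified number of successes could be read off the same augmented chain by treating that success count as absorbing.

```python
import numpy as np

def success_count_distribution(n, P, pi):
    """Distribution of the number of successes in n Markov trials.

    Illustrative sketch of an augmented Markov chain: outcome states are
    0 = failure, 1 = success; P is the 2x2 one-step transition matrix of
    the outcome chain, pi the distribution of the first trial's outcome.
    The augmented state is (current outcome, successes so far).
    """
    # dist[j, k] = P(current outcome = j, k successes accumulated so far)
    dist = np.zeros((2, n + 1))
    dist[0, 0] = pi[0]          # first trial is a failure: 0 successes
    dist[1, 1] = pi[1]          # first trial is a success: 1 success
    for _ in range(n - 1):      # remaining n - 1 trials
        new = np.zeros_like(dist)
        # a transition into failure leaves the success count unchanged
        new[0, :] = dist[0, :] * P[0, 0] + dist[1, :] * P[1, 0]
        # a transition into success increments the success count by one
        new[1, 1:] = dist[0, :-1] * P[0, 1] + dist[1, :-1] * P[1, 1]
        dist = new
    return dist.sum(axis=0)     # marginal distribution of the count

# Example: a persistent chain; the first trial succeeds with probability 0.5
P = np.array([[0.7, 0.3],
              [0.2, 0.8]])
probs = success_count_distribution(10, P, pi=np.array([0.5, 0.5]))
print(probs, probs.sum())       # probabilities over 0..10 successes, sum to 1
```

Because the augmented chain has only 2(n + 1) states, this direct propagation costs O(n^2) arithmetic for the full distribution, which is the practical appeal of the augmented-chain formulation over enumerating all 2^n outcome sequences.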


Corresponding author

Postal address for both authors: Department of Statistical Science, Southern Methodist University, Dallas, TX 75275, USA.
