THE M/G/1-TYPE MARKOV CHAIN WITH RESTRICTED TRANSITIONS AND ITS APPLICATION TO QUEUES WITH BATCH ARRIVALS

Published online by Cambridge University Press:  21 July 2011

Juan F. Pérez
Affiliation: Performance Analysis of Telecommunication Systems (PATS), Department of Mathematics and Computer Science, University of Antwerp – IBBT, Middelheimlaan 1, B-2020 Antwerp, Belgium. E-mail: juanfernando.perez@ua.ac.be

Benny Van Houdt
Affiliation: Performance Analysis of Telecommunication Systems (PATS), Department of Mathematics and Computer Science, University of Antwerp – IBBT, Middelheimlaan 1, B-2020 Antwerp, Belgium. E-mail: benny.vanhoudt@ua.ac.be

Abstract

We consider M/G/1-type Markov chains in which every transition that decreases the level forces the phase into a small subset of the phase space. We show how this structure, referred to as restricted downward transitions, can be exploited to speed up the computation of the stationary probability vector of the chain. To this end, we define a new M/G/1-type Markov chain with a smaller block size, whose G matrix is used to find the G matrix of the original chain. This approach is then used to analyze the BMAP/PH/1 queue and the BMAP[2]/PH[2]/1 preemptive priority queue, yielding significant reductions in computation time.
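As a point of reference, the following display is a minimal sketch, not taken from the article itself, assuming Neuts' standard block notation for M/G/1-type chains with level-independent blocks A_k and boundary blocks B_k. The matrix G mentioned in the abstract is the minimal nonnegative solution of the fixed-point equation on the right:

\[
P \;=\;
\begin{pmatrix}
B_0 & B_1 & B_2 & B_3 & \cdots \\
A_0 & A_1 & A_2 & A_3 & \cdots \\
0 & A_0 & A_1 & A_2 & \cdots \\
0 & 0 & A_0 & A_1 & \cdots \\
\vdots & & \ddots & \ddots & \ddots
\end{pmatrix},
\qquad
G \;=\; \sum_{k \ge 0} A_k\, G^{k}.
\]

Here the m-by-m block A_k governs transitions from level i \ge 1 to level i + k - 1, and entry (i, j) of G is the probability that, starting from level n + 1 in phase i, the chain first visits level n in phase j. Under the restriction described above, every level-decreasing transition lands in a subset of, say, r \ll m phases, so only r columns of A_0, and consequently of G, can be nonzero, which indicates why a chain with a smaller block size can carry the same first-passage information.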

Type: Research Article
Copyright: © Cambridge University Press 2011
