
DYNAMIC CONTROL OF A SINGLE-SERVER SYSTEM WHEN JOBS CHANGE STATUS

Published online by Cambridge University Press:  07 June 2017

Gabriel Zayas-Cabán
Affiliation:
Center for Healthcare Engineering and Patient Safety, University of Michigan, Ann Arbor, MI, USA. E-mail: gzayasca@umich.edu
Hyun-Soo Ahn
Affiliation:
Ross School of Business, University of Michigan, Ann Arbor, MI, USA. E-mail: hsahn@umich.edu

Abstract

From health care to maintenance shops, many systems must allocate resources to customers or jobs whose service requirements or costs change when they wait too long. We present a new queueing model for this scenario and use a Markov decision process formulation to analyze assignment policies that minimize holding costs. We show that the classic cμ rule is generally not optimal when service or cost requirements can change. Even in a two-class customer model where a class 1 job becomes a class 2 job upon waiting, additional orderings of the service rates are needed to ensure the optimality of simple priority rules. Seemingly intuitive switching-curve structures are likewise not optimal in general; we study these scenarios and provide conditions under which they do hold. Finally, we show that the results for the two-class model do not extend to n ≥ 3 customer classes, and we provide sufficient conditions under which a simple priority rule remains optimal. In short, allowing service and/or cost requirements to change fundamentally alters the structure of the optimal resource allocation policy in queueing systems.
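To make the baseline concrete, the classic cμ rule serves the class with the largest product of holding cost and service rate first. The sketch below (an illustrative helper, not from the paper) computes that ordering for two hypothetical classes; the paper's point is that once a waiting class 1 job can turn into a costlier class 2 job, this ordering need not be optimal without further conditions on the service rates.

```python
def cmu_index(holding_cost, service_rate):
    """Classic c-mu index: holding cost c times service rate mu."""
    return holding_cost * service_rate

def cmu_order(classes):
    """Return class labels sorted by descending c-mu index.

    `classes` maps a label to (holding cost c, service rate mu).
    Hypothetical illustration only: the paper shows this static
    priority can fail once jobs change class while waiting.
    """
    return sorted(classes, key=lambda k: -cmu_index(*classes[k]))

# Two illustrative classes: class 1 is cheap but fast to serve;
# class 2 (what a delayed class 1 job becomes) is costlier but slower.
classes = {1: (1.0, 2.0), 2: (3.0, 1.0)}
print(cmu_order(classes))  # class 2 first, since 3*1 > 1*2
```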

Type
Research Article
Copyright
Copyright © Cambridge University Press 2017 

