Book contents
- Frontmatter
- Contents
- Preface
- Acknowledgments
- I Introduction to Queueing
- II Necessary Probability Background
- 3 Probability Review
- 4 Generating Random Variables for Simulation
- 5 Sample Paths, Convergence, and Averages
- III The Predictive Power of Simple Operational Laws: “What-If” Questions and Answers
- IV From Markov Chains to Simple Queues
- V Server Farms and Networks: Multi-server, Multi-queue Systems
- VI Real-World Workloads: High Variability and Heavy Tails
- VII Smart Scheduling in the M/G/1
- Bibliography
- Index
4 - Generating Random Variables for Simulation
from II - Necessary Probability Background
Published online by Cambridge University Press: 05 February 2013
Summary
In Chapter 3 we reviewed the most common discrete and continuous random variables. This chapter shows how we can use the density function or cumulative distribution function for a distribution to generate instances of that distribution. For example, we might have a system in which the interarrival times of jobs are well modeled by an Exponential distribution and the job sizes (service requirements) are well modeled by a Normal distribution. To simulate the system, we need to be able to generate instances of Exponential and Normal random variables. This chapter reviews the two basic methods used in generating random variables. Both these methods assume that we already have a generator of Uniform(0,1) random variables, as is provided by most operating systems.
Inverse-Transform Method
This method assumes that (i) we know the c.d.f. (cumulative distribution function), FX(x) = P{X ≤ x}, of the random variable X that we are trying to generate, and (ii) this distribution is easily invertible, namely that, given u, we can solve FX(x) = u for x.
The Continuous Case
Idea: We would like to map each instance u of a Uniform(0,1) r.v. generated by our operating system to some x, which is an instance of the random variable X, where X has c.d.f. FX. We assume WLOG that X ranges from 0 to ∞. Suppose there is some mapping that takes each u and assigns it a unique x. Such a mapping is illustrated by g⁻¹(·) in Figure 4.1.
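As a concrete sketch of this idea, consider the Exponential distribution mentioned earlier, whose c.d.f. FX(x) = 1 − e^(−λx) is easily inverted by hand: setting u = FX(x) and solving gives x = −ln(1 − u)/λ. The following minimal Python sketch (the function name and use of the standard `random` module as the Uniform(0,1) source are my own choices, not from the text) applies that inverse to a uniform draw:

```python
import math
import random

def exponential_via_inverse_transform(rate, u=None):
    """Generate an instance of Exponential(rate) by the inverse-transform method.

    The c.d.f. is F(x) = 1 - e^(-rate * x), so solving u = F(x) for x
    gives x = -ln(1 - u) / rate.
    """
    if u is None:
        u = random.random()  # Uniform(0,1), as provided by the OS/library
    return -math.log(1.0 - u) / rate
```

For example, with u = 0.5 and rate λ = 1, the method returns −ln(0.5) = ln 2 ≈ 0.693, the median of the Exponential(1) distribution, as expected since u = 0.5 corresponds to the 50th percentile.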
- Performance Modeling and Design of Computer Systems: Queueing Theory in Action, pp. 70–78. Publisher: Cambridge University Press. Print publication year: 2013.