Book contents
- Frontmatter
- Contents
- About this book
- Acknowledgments
- Introduction
- 0 Notational conventions
- PART ONE BASIC COMPLEXITY CLASSES
- PART TWO LOWER BOUNDS FOR CONCRETE COMPUTATIONAL MODELS
- PART THREE ADVANCED TOPICS
- 17 Complexity of counting
- 18 Average case complexity: Levin's theory
- 19 Hardness amplification and error-correcting codes
- 20 Derandomization
- 21 Pseudorandom constructions: Expanders and extractors
- 22 Proofs of PCP theorems and the Fourier transform technique
- 23 Why are circuit lower bounds so difficult?
- Appendix: Mathematical background
- Hints and selected exercises
- Main theorems and definitions
- Bibliography
- Index
- Complexity class index
20 - Derandomization
from PART THREE - ADVANCED TOPICS
Published online by Cambridge University Press: 05 June 2012
Summary
God does not play dice with the universe.
–Albert Einstein

Anyone who considers arithmetical methods of producing random digits is, of course, in a state of sin.
–John von Neumann, quoted by Knuth, 1981

Randomization is an exciting and powerful paradigm in computer science and, as we saw in Chapter 7, often provides the simplest or most efficient algorithms for many computational problems. In fact, in some areas of computer science, such as distributed algorithms and cryptography, randomization is provably necessary to achieve certain tasks, or to achieve them efficiently. It is therefore natural to conjecture (as many scientists initially did) that for at least some problems, randomization is inherently necessary: one cannot replace the probabilistic algorithm with a deterministic one without a significant loss of efficiency. One concrete version of this conjecture is that BPP ⊈ P (see Chapter 7 for the definition of BPP). Surprisingly, recent research has provided mounting evidence that this is likely to be false. As we will see in this chapter, under very reasonable complexity assumptions, there is in fact a way to derandomize (i.e., transform into a deterministic algorithm) every probabilistic algorithm of the BPP type with only a polynomial loss of efficiency. Thus most researchers today believe that BPP = P. We note that this need not imply that randomness is useless in every setting; we already saw in Chapter 8 its crucial role in the definition of interactive proofs.
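To make the notion of derandomization concrete, the following sketch (not from the book; the function names and the toy algorithm `A` are illustrative) shows the trivial way to derandomize a BPP-style algorithm: run it on every possible random string and take a majority vote. This is always correct when the error probability is below 1/2, but costs a factor of 2^m for m random bits — exactly the exponential blow-up that the pseudorandom-generator constructions of this chapter are designed to reduce to a polynomial overhead.

```python
from itertools import product

def derandomize_by_enumeration(A, x, m):
    """Deterministically simulate a randomized algorithm A(x, r) that uses
    m random bits, by majority vote over all 2^m random strings.
    Correct whenever A errs on fewer than half of the random strings,
    but 2^m times slower than A -- the naive, exponential derandomization."""
    accepts = sum(A(x, r) for r in product((0, 1), repeat=m))
    return accepts * 2 > 2 ** m  # accept iff a strict majority of runs accept

# A toy BPP-style algorithm deciding "x is odd": it gives the wrong answer
# on exactly one of the 2^3 random strings, so its error probability is 1/8.
def A(x, r):
    correct = x % 2 == 1
    if r == (0, 0, 0):  # the single "bad" random string
        return not correct
    return correct
```

Running `derandomize_by_enumeration(A, 7, 3)` returns `True` and `derandomize_by_enumeration(A, 8, 3)` returns `False`: the majority vote washes out the occasional error, at the price of enumerating all seeds.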
- Type: Chapter
- Information: Computational Complexity: A Modern Approach, pp. 402–420
- Publisher: Cambridge University Press
- Print publication year: 2009