We live in a new age for statistical inference, where modern scientific technology such as microarrays and fMRI machines routinely produce thousands and sometimes millions of parallel data sets, each with its own estimation or testing problem. Doing thousands of problems at once involves more than the repeated application of classical methods. Taking an empirical Bayes approach, Bradley Efron, inventor of the bootstrap, shows how information accrues across problems in a way that combines Bayesian and frequentist ideas. Estimation, testing and prediction blend in this framework, producing opportunities for new methodologies of increased power. New difficulties also arise, easily leading to flawed inferences. This book takes a careful look at both the promise and pitfalls of large-scale statistical inference, with particular attention to false discovery rates, the most successful of the new statistical techniques. Emphasis is on the inferential ideas underlying technical developments, illustrated using a large number of real examples.
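To give a flavor of the false discovery rate idea highlighted above, here is a minimal sketch of the Benjamini–Hochberg step-up procedure applied to a simulated large-scale testing problem. This is illustrative code, not code from the book; the function name and the simulation parameters are our own choices.

```python
import math
import numpy as np

def benjamini_hochberg(pvals, q=0.10):
    """Benjamini-Hochberg step-up procedure.

    Returns a boolean mask marking the rejected (discovered) cases,
    controlling the expected false discovery rate at level q.
    """
    p = np.asarray(pvals, dtype=float)
    m = len(p)
    order = np.argsort(p)                      # indices of p-values, ascending
    thresholds = q * np.arange(1, m + 1) / m   # q * i / m for i = 1..m
    below = p[order] <= thresholds
    discoveries = np.zeros(m, dtype=bool)
    if below.any():
        k = np.nonzero(below)[0].max()         # largest i with p_(i) <= q*i/m
        discoveries[order[: k + 1]] = True     # reject all cases up to rank k
    return discoveries

# Simulated microarray-style situation: 950 null cases (z ~ N(0,1))
# and 50 non-null cases (z ~ N(3,1)), tested simultaneously.
rng = np.random.default_rng(0)
z = np.concatenate([rng.normal(0.0, 1.0, 950), rng.normal(3.0, 1.0, 50)])
pvals = np.array([math.erfc(abs(x) / math.sqrt(2)) for x in z])  # two-sided
hits = benjamini_hochberg(pvals, q=0.10)
print(f"{hits.sum()} discoveries out of {len(z)} cases")
```

The step-up rule rejects every case whose ordered p-value falls at or below the largest rank satisfying p\_(i) ≤ q·i/m, so the cutoff adapts to the data rather than being fixed in advance, which is what makes it suitable for thousands of simultaneous tests.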

- In-depth investigation helps statisticians understand complicated new methodology
- Takes the empirical Bayes approach, which combines Bayesian and frequentist viewpoints
- The author, inventor of the bootstrap, has published extensively on both large-scale inference and empirical Bayes methods

### Contents

Introduction and foreword
1. Empirical Bayes and the James–Stein estimator
2. Large-scale hypothesis testing
3. Significance testing algorithms
4. False discovery rate control
5. Local false discovery rates
6. Theoretical, permutation and empirical null distributions
7. Estimation accuracy
8. Correlation questions
9. Sets of cases (enrichment)
10. Combination, relevance, and comparability
11. Prediction and effect size estimation
A. Exponential families
B. Programs and data sets
Bibliography
Index

### Review

'In the last decade, Efron has played a leading role in laying down the foundations of large-scale inference, not only in bringing back and developing old ideas, but also linking them with more recent developments, including the theory of false discovery rates and Bayes methods. We are indebted to him for this timely, readable and highly informative monograph, a book he is uniquely qualified to write. It is a synthesis of many of Efron's own contributions over the last decade with closely related material, together with some connecting theory, valuable comments, and challenges for the future. His avowed aim is "not to have the last word" but to help us deal "with the burgeoning statistical problems of the twenty-first century". He has succeeded admirably.' Terry Speed, International Statistical Review