Core Statistics
Cited by

This book has been cited by the following five publications. This list is generated based on data provided by CrossRef.

Dormann, Carsten F.; Calabrese, Justin M.; Guillera-Arroita, Gurutzeta; Matechou, Eleni; Bahn, Volker; Bartoń, Kamil; Beale, Colin M.; Ciuti, Simone; Elith, Jane; Gerstner, Katharina; Guelat, Jérôme; Keil, Petr; Lahoz-Monfort, José J.; Pollock, Laura J.; Reineking, Björn; Roberts, David R.; Schröder, Boris; Thuiller, Wilfried; Warton, David I.; Wintle, Brendan A.; Wood, Simon N.; Wüest, Rafael O.; and Hartig, Florian. 2018. Model averaging in ecology: a review of Bayesian, information-theoretic, and tactical approaches for predictive inference. Ecological Monographs, Vol. 88, Issue 4, p. 485.

Rogers, Marie; Franklin, Anna; and Knoblauch, Kenneth. 2018. A Novel Method to Investigate How Dimensions Interact to Inform Perceptual Salience in Infancy. Infancy, Vol. 23, Issue 6, p. 833.

Duvvuri, Hiranmayi; Wheeler, Lucas C.; and Harms, Michael J. 2018. pytc: Open-Source Python Software for Global Analyses of Isothermal Titration Calorimetry Data. Biochemistry, Vol. 57, Issue 18, p. 2578.

Wood, Simon N.; Li, Zheyuan; Shaddick, Gavin; and Augustin, Nicole H. 2017. Generalized Additive Models for Gigadata: Modeling the U.K. Black Smoke Network Daily Data. Journal of the American Statistical Association, Vol. 112, Issue 519, p. 1199.

Stickler, Benjamin A. and Schachinger, Ewald. 2016. Basic Concepts in Computational Physics, p. 311.


Book description

Based on a starter course for beginning graduate students, Core Statistics provides concise coverage of the fundamentals of inference for parametric statistical models, including both theory and practical numerical computation. The book considers both frequentist maximum likelihood and Bayesian stochastic simulation while focusing on general methods applicable to a wide range of models and emphasizing the common questions addressed by the two approaches. This compact package serves as a lively introduction to the theory and tools that a beginning graduate student needs in order to make the transition to serious statistical analysis: inference; modeling; computation, including some numerics; and the R language. Aimed also at any quantitative scientist who uses statistical methods, this book will deepen readers' understanding of why and when methods work, and explain how to develop suitable methods for non-standard situations, such as those arising in ecology, big data and genomics.
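
To give a flavour of the two approaches the description mentions, here is a minimal R sketch, not taken from the book: it fits an exponential model to simulated data by numerical maximum likelihood using optim, and then by a basic random-walk Metropolis sampler. The model, the proposal step size, the burn-in length and the flat prior on the log scale are all illustrative assumptions, not the book's own example.

    ## Simulated example: exponential data with true rate 2.
    set.seed(1)
    y <- rexp(100, rate = 2)

    ## Negative log likelihood, parameterised as theta = log(rate) so that
    ## the optimiser works on an unconstrained scale.
    nll <- function(theta, y) -sum(dexp(y, rate = exp(theta), log = TRUE))

    ## Frequentist route: numerical maximum likelihood via optim().
    fit <- optim(0, nll, y = y, method = "BFGS", hessian = TRUE)
    mle <- exp(fit$par)                      ## MLE of the rate
    se  <- mle * sqrt(1 / fit$hessian[1, 1]) ## delta-method standard error

    ## Bayesian route: random-walk Metropolis on theta, with a flat prior
    ## on the log scale, so the log acceptance ratio is a likelihood ratio.
    n <- 10000; th <- numeric(n); th[1] <- 0
    for (i in 2:n) {
      prop <- th[i - 1] + rnorm(1, sd = 0.2) ## step size is an assumed tuning
      if (log(runif(1)) < nll(th[i - 1], y) - nll(prop, y)) th[i] <- prop
      else th[i] <- th[i - 1]
    }
    post <- exp(th[-(1:1000)])               ## sampled rates, after burn-in
    c(mle = mle, se = se, post.mean = mean(post), post.sd = sd(post))

With these (assumed) settings the maximum likelihood estimate and the posterior mean should agree closely, which is the kind of side-by-side frequentist/Bayesian comparison on common questions that the book emphasizes.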

Reviews

'The author keeps this book concise by focusing entirely on topics that are most relevant for scientific modeling via maximum likelihood and Bayesian inference. This makes it an ideal text and handy reference for any math-literate scientist who wants to learn how to build sophisticated parametric models and fit them to data using modern computational approaches. I will be recommending this well-written book to my collaborators.'

Murali Haran - Pennsylvania State University

'Simon Wood has written a must-read book for the instructor, student, and scholar in search of mathematical rigor, practical implementation, or both. The text is relevant to the likelihoodist and Bayesian alike; it is nicely topped off by instructive problems and exercises. Who said that a core inference textbook needs to be dry?'

Geert Molenberghs - Universiteit Hasselt and KU Leuven, Belgium

'Simon Wood’s book Core Statistics is a welcome contribution. Wood’s considerable experience in statistical matters and his thoughtfulness as a writer and communicator consistently shine through. The writing is compact and neutral, with occasional glimpses of Wood’s wry humour. The carefully curated examples, with executable code, will repay imitation and development. I warmly recommend this book to graduate students who need an introduction, or a refresher, in the core arts of statistics.'

Andrew Robinson - University of Melbourne

'This is an interesting book intended for someone who has already taken an introductory course on probability and statistics and who would like to have a nice introduction to the main modern statistical methods and how these are applied using the R language. It covers the fundamentals of statistical inference, including both theory in a concise form and practical numerical computation.'

Vassilis G. S. Vasdekis - Mathematical Reviews

