The evidence that the cosmological constant, Λ, is non-zero is overwhelming: the cosmic expansion is accelerating. Whether this is due to Einstein's famous constant or to some variant of it, i.e. ‘dark energy’, it is the major constituent of the Universe. Currently, we know almost nothing about it. The dominance of Λ, or whatever it is, at redshifts z < 1 means that we can study it via its effect on the cosmic expansion. The Newtonian models provide a good framework for modelling the expansion at these recent times, though we have to remember that we cannot study the propagation of light in the Newtonian context without importing some relativistic concepts.
A particularly relevant model is the ‘benchmark model’, which is geometrically flat and contains both pressure-free matter and a cosmological constant. This model, together with its generalisations to more exotic forms of dark energy, lies at the basis of the interpretation of modern cosmological data. We study it here via the Friedmann–Lemaître equations that describe the evolution of the Hubble expansion rate, H(z). We then move on to generalising the constant Λ to simple models for redshift-dependent dark energy.
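To make this concrete, the sketch below evaluates H(z) from the Friedmann equation for a flat matter-plus-dark-energy model. The parameter values (H0 = 70 km/s/Mpc, Ωm = 0.3) and the constant equation-of-state parameter w are illustrative assumptions, not values adopted in this book; setting w = −1 recovers the constant-Λ case, while w ≠ −1 gives the simplest redshift-dependent generalisation.

```python
import numpy as np

# Friedmann equation for a flat matter + dark-energy model:
# H(z)^2 = H0^2 [ Om (1+z)^3 + (1 - Om) (1+z)^(3(1+w)) ].
# H0 and Om are illustrative round numbers, not values from this book.
H0 = 70.0   # Hubble constant in km/s/Mpc (assumed)
Om = 0.3    # present-day matter density parameter (assumed)

def hubble(z, w=-1.0):
    """Expansion rate H(z); w = -1 recovers the constant-Lambda case."""
    Ode = 1.0 - Om  # flatness fixes the dark-energy density parameter
    return H0 * np.sqrt(Om * (1.0 + z)**3
                        + Ode * (1.0 + z)**(3.0 * (1.0 + w)))

print(hubble(0.0))  # equals H0 at z = 0
print(hubble(1.0))  # expansion rate at z = 1
```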
As in the previous chapter, the exercises here serve as diversions that fill in details.
The Accelerating Cosmic Expansion
Important Remark
The evidence for the existence of an agent that causes the acceleration of the cosmic expansion comes from several quite diverse sources of data: supernovae, the power spectrum of the CMB, the present-day manifestation of primordial structures (‘BAOs’), and large-scale cosmic structure, to name but a few. Each of these pieces of evidence has received ample confirmation from repeated experiments by independent groups. These sources of evidence agree not only as to the existence of this agent, but also as to its magnitude. Even though we have little or no idea what this agent is, its discovery surely stands as one of the great scientific milestones of the past century.
The concept of likelihood is fundamental in the statistical estimation of the parameters that enter into a parameterised model of a dataset. We write down an expression telling us how well the model represents the data, and choose the parameter values that do the best job. Despite the apparent simplicity of this statement, it hides a wealth of issues that have caused a great schism in thinking about how we should draw conclusions from data.
This is the great divide between the frequentist and the Bayesian schools of thought concerning the way likelihood should be implemented in practice. Should we simply take a mechanistic approach and find the most likely parameter set, with an estimate of how confident we are in asserting the answer? Should we somehow fold in our prior prejudices to take account of our past experience? How should we interpret the result of any process that purports to assign a confidence to an answer? What, indeed, would confidence mean in this sense, especially if we cannot analyse any further samples to confirm our result?
The Great Schism
In this chapter we develop the theory of likelihood, which is widely used in deriving parameters that fit models to data. Here, we regard the parameters of the model simply as numbers that are to be determined by some optimisation process. We shall generalise this in subsequent chapters when we come to discuss Bayesian inference, but along the way we shall point out salient differences between the two ways of thinking.
We are interested in the selection of parameters that provide the best fit to the given data. What we mean by best fit is generally encoded in a cost function, which provides a quantitative measure of goodness of fit. The usual criterion is to ask for the smallest set of parameters that provides the least total deviation of the model from the given data. That too involves an assumption regarding precisely what we mean by ‘smallest total deviation’, or indeed by a measure of the deviation.
Once we have made that fit we may be confronted by a new data set, or an augmentation of the first data set. In either case, we may repeat the fitting procedure only to find a different set of parameters than we found in the first place.
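As a minimal sketch of these ideas, the following least-squares fit uses a straight-line model and synthetic data, both purely hypothetical; it also illustrates the point above, that augmenting the data set and refitting generally shifts the estimated parameters.

```python
import numpy as np

# Least-squares cost: total squared deviation of the model from the data.
# The straight-line model and the synthetic data are purely illustrative.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 20)
y = 2.0 * x + 1.0 + rng.normal(0.0, 0.1, x.size)   # 'observed' data

# Fit y = a x + b by minimising sum_i (y_i - a x_i - b)^2.
a, b = np.polyfit(x, y, deg=1)
print(a, b)

# Augment the data set and refit: the estimates generally shift,
# showing that the 'best' parameters depend on the sample at hand.
x_new = np.linspace(1.0, 2.0, 20)
y_new = 2.0 * x_new + 1.0 + rng.normal(0.0, 0.1, x_new.size)
a2, b2 = np.polyfit(np.concatenate([x, x_new]),
                    np.concatenate([y, y_new]), deg=1)
print(a2, b2)
```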
The ‘Renaissance in Cosmology’ began with the discovery by Penzias and Wilson (1965) of the Cosmic Microwave Background Radiation, now known variously as the ‘CMB’, ‘CMBR’, ‘CBR’, ‘MWB’ or simply the ‘Microwave Background’ or ‘Relict Radiation’. Their paper had the rather unprepossessing title ‘A Measurement of Excess Antenna Temperature at 4080 Mc/s’ and ran to less than two pages. In the journal, the Penzias–Wilson paper is immediately preceded by the paper of Dicke, Peebles, Roll and Wilkinson (Dicke et al., 1965), entitled ‘Cosmic Black-Body Radiation’.
That paper explained that they too had been searching for this radiation field and, most importantly, set out what the significance of the discovery would be for our physical understanding of the Universe (see Figure 3.3).
These two papers changed the course of cosmology.
Discovery of the CMB
During the 1950s and early 1960s the main issue was the great debate between the Steady State and Big Bang theories. This debate had centred on the apparently discrepant radio source counts in deep surveys made at different frequencies by radio astronomers in Cambridge and in Sydney.
The discovery of the cosmic microwave background radiation by Penzias and Wilson in 1965 turned out to be decisive in establishing a new paradigm in physics: the Hot Big Bang theory (Penzias and Wilson, 1965). However, it should be recognised that the source of the excess radiation they had discovered was still under dispute even in the 1970s, and it was not until the first results of the COBE-FIRAS experiment in 1990 that the issue was finally settled.
While the discovery of Penzias and Wilson was itself serendipitous, in that they were not looking for the cosmic background radiation, the idea that the cosmic background radiation was there to be found was well entrenched in both Princeton in the US and Moscow in the (then) Soviet Union. The theoretical framework for this had been set up by George Gamow, who had suggested in the late 1940s that the chemical elements were created in a Hot Big Bang and that the evidence for this would lie in the discovery of the relict radiation.
Classical field theory, which concerns the generation and interaction of fields, is a logical precursor to quantum field theory, and can be used to describe phenomena such as gravity and electromagnetism. Written for advanced undergraduates, and appropriate for graduate level classes, this book provides a comprehensive introduction to field theories, with a focus on their relativistic structural elements. Such structural notions enable a deeper understanding of Maxwell's equations, which lie at the heart of electromagnetism, and can also be applied to modern variants such as Chern–Simons and Born–Infeld. The structure of field theories and their physical predictions are illustrated with compelling examples, making this book perfect as a text in a dedicated field theory course, for self-study, or as a reference for those interested in classical field theory, advanced electromagnetism, or general relativity. Demonstrating a modern approach to model building, this text is also ideal for students of theoretical physics.
Following a long-term international collaboration between leaders in cosmology and the philosophy of science, this volume addresses foundational questions at the limits of science across these disciplines, questions raised by observational and theoretical progress in modern cosmology. Space missions have mapped the Universe back to its early instants, opening up questions on what came before the Big Bang, the nature of space and time, and the quantum origin of the Universe. As the foundational volume of an emerging academic discipline, it brings together experts from the relevant fields to lay out the fundamental problems of contemporary cosmology and explore routes toward possible solutions. Written for graduates and researchers in physics and philosophy, the volume also makes a particular effort to inform academics from other fields, as well as the educated public, who wish to understand our modern vision of the Universe, the related philosophical questions, and their significant impacts on scientific methodology.
The boundary between science and philosophy is often blurred at the frontiers of knowledge. This is because one is dealing with proposals which are not amenable to the usual type of scientific tests, at least initially. Some scientists have an antipathy to philosophy and therefore regard such proposals disparagingly. However, that may be short-sighted because historically science had its origin in natural philosophy and the science/philosophy boundary has continuously shifted as fresh data accumulate. The criteria for science itself have also changed. So ideas on the science/philosophy boundary may eventually become proper science. Sometimes the progress of science may even be powered from this boundary, with new paradigms emerging from there.
A particularly interesting example of this in the context of the physical sciences is cosmology. This is because the history of physics involves the extension of knowledge outwards to progressively larger scales and inwards to progressively smaller ones, and the scientific status of ideas at the smallest and largest scales has always been controversial. Cosmology involves both extremes and so is doubly vulnerable to anti-philosophical criticisms. While cosmography concerns the structure of the Universe on the largest scales, these being dominated by gravity, cosmogony studies the origin of the Universe and involves arbitrarily small scales, where the other forces of nature prevail. Indeed, there is a sense in which the largest and smallest scales merge at the Big Bang. So cosmology has often had to struggle to maintain its scientific respectability, and more conservative physicists still regard some cosmological speculations as going beyond proper science. One example concerns the current debate over the multiverse. The issue is not just whether other Universes exist but whether such speculations can be classified as science even if they do, since they may never be seen.
While most of this chapter focuses on cosmology, two other problems straddling the boundary between physics and philosophy are also discussed. The first concerns black holes. Although these objects were predicted by general relativity a century ago, Albert Einstein thought they were just mathematical artefacts and it was 50 years before observational evidence emerged for their physical reality.
In physical cosmology we are faced with an empirical context of gradually diminishing returns from new observations. This is true in a fundamental sense, since the amount of information we can expect to collect through astronomical observations is finite, owing to the fact that we occupy a particular vantage point in the history and spatial extent of the Universe. Arguably, we may approach the observational limit in the foreseeable future, at least in relation to some scientific hypotheses (Ellis, 2014). There is no guarantee that the amount and types of information we are able to collect will be sufficient to statistically test all reasonable hypotheses that may be posed. There is under-determination both in principle and in practice (Butterfield, 2014; Ellis, 2014; Zinkernagel, 2011). These circumstances are not new; indeed, cosmology has had to contend with this problem throughout its history. For example, Whitrow (1949) relates the same concerns, and points back to remarks by Blaise Pascal in the seventeenth century: ‘But if our view be arrested there let our imagination pass beyond; … We may enlarge our conceptions beyond all imaginable space; we only produce atoms in comparison with the reality of things’. Already with Thales, epistemological principles of uniformity and consistency have been used to structure the locally imaginable into something considered globally plausible. The primary example in contemporary cosmology is the Cosmological Principle of large-scale isotropy and homogeneity. In the following, the aim will be to apply such epistemological principles to the procedure of cosmological model inference itself.
The state of affairs described above naturally leads to a view of model inference as inference to the best explanation/model (e.g. Lipton, 2004; Maher, 1993), since some degree of explanatory ambiguity appears unavoidable in principle. This is consistent with a Bayesian interpretation of probability, which includes a priori assumptions explicitly. As in science generally, inference in cosmology is based on statistical testing of models in light of empirical data. A large body of literature has built up in recent years discussing various aspects of these methods, with Bayesian statistics becoming a standard framework (Hobson, 2010; Jaynes, 2003; von Toussaint, 2011). The necessary foundations of Bayesian inference will be presented in the next section.
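As a foretaste of that section, here is a minimal sketch of the Bayesian update at the heart of such inference, posterior ∝ likelihood × prior, evaluated on a grid for a toy one-parameter Gaussian model. Every number in it is an illustrative assumption, not a result from the literature cited above.

```python
import numpy as np

# Toy Bayesian update on a grid: posterior = likelihood * prior (normalised).
# The one-parameter Gaussian model and all numbers are illustrative.
theta = np.linspace(-5.0, 5.0, 1001)      # parameter grid
dtheta = theta[1] - theta[0]
prior = np.exp(-0.5 * theta**2 / 4.0)     # broad Gaussian prior (sd = 2)
data = np.array([0.9, 1.3, 1.1])          # hypothetical measurements
sigma = 0.5                               # assumed known noise level

# Gaussian log-likelihood summed over the data points.
loglike = -0.5 * ((data[:, None] - theta[None, :])**2).sum(axis=0) / sigma**2
posterior = prior * np.exp(loglike - loglike.max())
posterior /= posterior.sum() * dtheta     # normalise to unit area

print(theta[np.argmax(posterior)])        # posterior mode, near the data mean
```

Note how the prior enters explicitly: with a broad prior the posterior mode sits close to the sample mean, whereas a narrow prior would pull it toward the prior's centre, which is precisely the feature that distinguishes this approach from the purely mechanistic optimisation discussed earlier.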
This chapter is about foundational themes underlying the scientific study of cosmology:
• What issues will a theory of cosmology deal with?
• What kinds of causation will be taken into account as we consider the relation between chance and necessity in cosmology?
• What kinds of data and arguments will we use to test theories, when they stretch beyond the bounds of observational probing?
• Should we weaken the need for testing and move to a post-empirical phase, as some have suggested?
These are philosophical issues at the foundation of the cosmological enterprise. The answers may be obvious or taken for granted by scientists in many cases, and so seem hardly worth mentioning; but that attitude has demonstrably led to some questionable statements about what is reliably known in cosmology, particularly in popular books and public statements. The premise of this chapter is that it is better to think these issues through carefully and make them explicit, rather than having unexamined assumed views about them shaping cosmological theories and their public presentation. Thus, as in other subjects, being philosophical about what is being undertaken will help clarify practice in the area.
The basic enterprise of cosmology is to use tested physical theories to understand major aspects of the universe in which we live, as observed by telescopes of all kinds. The foundational issue arising is the uniqueness of the universe [66, 27, 28]. Standard methods of scientific theory testing rely on comparing similar objects to determine regularities, so they cannot easily be applied in the cosmological context, where there is no other similar object to use in any comparison. We have to extrapolate from aspects of the universe to hypotheses about the seen and unseen universe as a whole. Furthermore, physical explanations of developments in the very early universe depend on extrapolating physical theories beyond the bounds where they can be tested in a laboratory or particle collider. Hence philosophical issues of necessity arise as we push the theory beyond its testable limits. Making them explicit clarifies what is being done and illuminates issues that need careful attention.