From an applications viewpoint, the main reason to study the subject of this book is to help deal with the complexity of describing random, time-varying functions. A random variable can be interpreted as the result of a single measurement. The distribution of a single random variable is fairly simple to describe. It is completely specified by the cumulative distribution function F(x), a function of one variable. It is relatively easy to approximately represent a cumulative distribution function on a computer. The joint distribution of several random variables is much more complex, for in general it is described by a joint cumulative probability distribution function, F(x1, x2, …, xn), which is much more complicated than n functions of one variable. A random process, for example a model of time-varying fading in a communication channel, involves many, possibly infinitely many (one for each time instant t within an observation interval) random variables. Woe the complexity!
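To make the blow-up concrete, here is a rough back-of-the-envelope count (the grid resolution of 100 points per axis and the choice of n = 10 variables are arbitrary illustrative values, not from the text):

```python
# Storing one cumulative distribution function on a grid is cheap;
# storing a joint CDF on the full n-dimensional grid is not.
grid, n = 100, 10
print(n * grid)    # n separate one-variable CDFs: 1,000 stored values
print(grid ** n)   # one joint CDF on the full n-dimensional grid: 10**20 values
```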
This book helps prepare the reader to understand and use the following methods for dealing with the complexity of random processes:
• Work with moments, such as means and covariances.
• Make extensive use of processes with special properties. Most notably, Gaussian processes are characterized entirely by their means and covariances; Markov processes are characterized by one-step transition probabilities or transition rates, together with their initial distributions; and independent increment processes are characterized by the distributions of single increments.
• Appeal to models or approximations based on limit theorems for reduced-complexity descriptions, especially in connection with averages of independent, identically distributed random variables. The law of large numbers tells us, in a certain sense, that a probability distribution can be characterized by its mean alone. The central limit theorem similarly tells us that a probability distribution can be characterized by its mean and variance. (Both effects are illustrated in the short simulation following this list.) These limit theorems are analogous to, and in fact examples of, perhaps the most powerful tool ever discovered for dealing with the complexity of functions: Taylor's theorem, by which a function on a small interval can be approximated using its value and a small number of derivatives at a single point.
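The following sketch illustrates both limit theorems by simulation (using NumPy; the exponential distribution, sample size and number of trials are arbitrary choices, not taken from the book). It averages i.i.d. random variables and checks that the sample means concentrate near the true mean, while the normalized error behaves like a standard Gaussian:

```python
import numpy as np

rng = np.random.default_rng(0)
n, trials = 10_000, 2_000
# i.i.d. draws from an (arbitrarily chosen) exponential distribution: mean 2, variance 4
x = rng.exponential(scale=2.0, size=(trials, n))

sample_means = x.mean(axis=1)
# Law of large numbers: the sample means concentrate around the true mean 2,
# with spread roughly sqrt(4 / n) = 0.02.
print(sample_means.mean(), sample_means.std())

# Central limit theorem: the normalized error is approximately standard normal,
# so about 95% of it should fall within +/- 1.96.
z = (sample_means - 2.0) / np.sqrt(4.0 / n)
print(np.mean(np.abs(z) < 1.96))
```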
The hype around globalization in early-twenty-first-century political and economic debates may convey an impression that we now are in an entirely new phase of economic development. This chapter will show that the presumption is wrong. A dose of elementary economic history is often helpful when the popular media forget about the past.
Globalization is market integration on a world scale. Market integration means that domestic markets are increasingly dependent on international markets. Prices and hence factor rewards will reflect global rather than local demand and supply conditions. Globalization is the product of intensified trade, capital mobility and migration. In that process prices, interest rates and – with a time lag – wages tend to converge and react faster to international shocks. The first wave of globalization started in the middle of the nineteenth century when barriers to trade, migration and capital mobility were abolished or weakened at the same time as the speed of information transmission increased. In most respects markets were as globalized around 1900 as they were at the beginning of the present century. In fact labour mobility across borders was less restricted before 1914 than it is now. However, there was an anti-globalization backlash early in the twentieth century with two World Wars and the Great Depression. That policy reversal affected commodity, labour and capital markets to the extent that the late-nineteenth-century globalization level was not regained until the 1970s or 1980s, when the second globalization period gained momentum.
Market integration operates through trade and arbitrage and the ultimate manifestation of a fully integrated market is the law of one price. The law of one price proposes that the price of identical goods that are traded is the same in all geographical locations. This is strictly true, of course, only if transport and transaction costs are zero, which they are not.
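Stated a little more formally (the notation here is illustrative, not the author's): if $\tau_{AB}$ denotes the per-unit transport and transaction cost of moving a good between locations A and B, arbitrage only forces the two prices into the band

$$ \lvert p_A - p_B \rvert \;\le\; \tau_{AB}, $$

so the law of one price, $p_A = p_B$, holds exactly only in the frictionless limit $\tau_{AB} = 0$. Falling transport and transaction costs narrow the band, which is how trade and arbitrage produce the price convergence described above.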
This edition has been thoroughly revised and a large amount of new material has been added reflecting new research results and the recent development of the European economy. Paul Sharp, my former PhD student and now Professor at the University of Southern Denmark in Odense, has assisted me in this work and he has the principal responsibility for Chapters 8 and 9.
We thank Marc Klemp for revising the Glossary and for his comments and suggestions on Chapter 3.
Claudia Riani has contributed to the development of the companion website and we thank Martin Lundrup Ingerslev for research assistance.
Industrial Revolution, Industrious Revolution and Industrial Enlightenment
The pre-industrial era witnessed a number of ground-breaking innovations and improvements, but they were typically generated by learning by doing. Producers learned that things worked, but had limited understanding of why things worked. From the seventeenth century, decisive efforts were directed towards gaining more and better knowledge of the ‘laws of nature’. However, it is wrong to believe that the British Industrial Revolution, the period 1770–1830, was based on scientific discoveries. Decisive steps were taken in that period towards a more profound understanding of nature, but these accomplishments had little immediate impact on production technologies. The iconic invention of the eighteenth century, the steam engine, is the exception that confirms this rule. The steam engine developed by Thomas Newcomen (1663–1729) relied on the results of scientific inquiry from the preceding century by the Italians Galileo Galilei (1564–1642) and Evangelista Torricelli (1608–97), the Dutchman Christiaan Huygens (1629–95), and Otto von Guericke (1602–86), a German, regarding atmospheric pressure, the weight of air and the nature of a vacuum. Contemporaries of Newcomen made significant contributions, in particular the French inventor Denis Papin (1647–1712?), who invented the piston. In the first generation of steam engines, the steam was condensed in a cylinder, which created a vacuum, and then the piston was pushed into the cylinder by atmospheric pressure.
The massive breakthrough of technologies, which sprang out of abstract theoretical inquiry coupled with empirical testing, did not arrive until the second half of the nineteenth century and mostly in the closing decades of that century. There is no denying, however, that systematic experiments, often combined with limited or flawed theoretical knowledge, became more common before and during the Industrial Revolution.
These misconceptions regarding the role of science contributed to very optimistic assessments of economic growth in the traditional historical narrative of what made Britain ‘the first industrial nation’.
Random processes can be passed through linear systems in much the same way as deterministic signals can. A time-invariant linear system is described in the time domain by an impulse response function, and in the frequency domain by the Fourier transform of the impulse response function. In a sense, we shall see that the Fourier transform provides a diagonalization of wide-sense stationary (WSS) random processes, just as the Karhunen–Loève expansion provides a diagonalization of a random process defined on a finite interval. While a mean square (m.s.) continuous random process on a finite interval has a finite average energy, a WSS random process has a finite average energy per unit time, called its power.
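As a discrete-time illustration, the sketch below (using NumPy/SciPy; the white-noise input, FIR filter and spectral-estimation parameters are arbitrary choices, not taken from the chapter) filters a WSS white-noise process through a linear time-invariant filter and checks the standard input–output relation S_Y(f) = |H(f)|² S_X(f), where H is the Fourier transform of the impulse response:

```python
import numpy as np
from scipy import signal

rng = np.random.default_rng(0)
fs = 1.0                                    # normalized sampling rate
x = rng.standard_normal(2**18)              # white Gaussian noise input (WSS)
b = signal.firwin(65, cutoff=0.2, fs=fs)    # an arbitrary low-pass FIR impulse response
y = signal.lfilter(b, 1.0, x)               # output process y = h * x

# Estimate input and output power spectral densities and compare them with
# the linear-system relation S_Y(f) = |H(f)|^2 S_X(f).
f, S_x = signal.welch(x, fs=fs, nperseg=4096)
_, S_y = signal.welch(y, fs=fs, nperseg=4096)
_, H = signal.freqz(b, 1.0, worN=f, fs=fs)

passband = f < 0.15                          # stay away from the filter's transition band
ratio = S_y[passband] / (np.abs(H[passband])**2 * S_x[passband])
print(ratio.mean())                          # close to 1, as the relation predicts
```

Comparing the estimated densities of input and output, rather than their absolute levels, sidesteps the scaling conventions of the spectral estimator.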
Nearly all the definitions and results of this chapter can be carried through in either discrete time or continuous time. The set of frequencies relevant for continuous-time random processes is all of ℝ, while the set of frequencies relevant for discrete-time random processes is the interval [−π, π]. For ease of notation we shall primarily concentrate on continuous-time processes and systems in the first two sections, and give the corresponding definition for discrete time in the third section.
Representations of baseband random processes and narrowband random processes are discussed in Sections 8.4 and 8.5. Roughly speaking, baseband random processes are those which have power only in low frequencies. A baseband random process can be recovered from samples taken at a sampling frequency that is at least twice as large as the largest frequency component of the process. Thus, operations and statistical calculations for a continuous-time baseband process can be reduced to considerations for the discrete-time sampled process. Roughly speaking, narrowband random processes are those processes which have power only in a band (i.e. interval) of frequencies. A narrowband random process can be represented as a baseband random process that is modulated by a deterministic sinusoid. Complex random processes naturally arise as baseband equivalent processes for real-valued narrowband random processes. A related discussion of complex random processes is given in the last section of the chapter.
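The following sketch (again NumPy/SciPy; the sampling rate, bandwidth and carrier frequency are arbitrary illustrative values) builds a baseband process by low-pass filtering white noise and then forms a narrowband process by modulating it with a deterministic sinusoid, so that the estimated power spectrum shifts from near zero frequency to a band around the carrier:

```python
import numpy as np
from scipy import signal

rng = np.random.default_rng(1)
fs = 1000.0                               # Hz (illustrative)
t = np.arange(2**16) / fs

# Baseband process: white noise low-pass filtered to roughly |f| < 20 Hz.
b = signal.firwin(301, cutoff=20.0, fs=fs)
x = signal.lfilter(b, 1.0, rng.standard_normal(t.size))

# Narrowband process: the baseband process modulated by a deterministic sinusoid.
fc = 200.0                                # carrier frequency, Hz
y = np.sqrt(2) * x * np.cos(2 * np.pi * fc * t)

f, S_x = signal.welch(x, fs=fs, nperseg=4096)
_, S_y = signal.welch(y, fs=fs, nperseg=4096)
print(f[np.argmax(S_x)], f[np.argmax(S_y)])   # power peaks near 0 Hz vs. near 200 Hz
```

Since the baseband process here has essentially no power above about 20 Hz, it could in principle be recovered from samples taken at 40 Hz or more, which is the sampling idea described above.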