Modern electronic devices operate, in general, on digital principles. That is, signals are transmitted in numerical form such that the numbers are coded by binary digits. A binary digit has only two states: ‘one’ and ‘zero’, or ‘high’ and ‘low’, etc. The reason for relying almost exclusively on digital information is that binary data can be easily manipulated and can be reliably stored and retrieved. That this approach is practical and economically advantageous is due to the great advances in large-scale integration and chip manufacture, as already discussed. In this chapter we will consider digital systems and the representation and storage of binary data. We will conclude by discussing the architecture of a small 3-bit computer, which nevertheless contains all the important features of large machines.
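As a minimal sketch (in Python, not part of the text), the eight values that a 3-bit word can encode are simply the integers 0 through 7, each written as a pattern of three binary digits:

```python
# Illustrative sketch: the eight states of a 3-bit word.
# Each integer 0..7 is coded by three binary digits ('high'/'low' signals).
for n in range(8):
    bits = format(n, "03b")  # zero-padded 3-digit binary string, e.g. 5 -> '101'
    print(n, "->", bits)
```

With three digits there are 2³ = 8 distinct states; each additional bit doubles the number of representable values.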
Elements of Boolean algebra
In digital logic circuits a variable can take only one of the two possible values: 1 or 0. The rules for operating with such variables were first discussed by the British mathematician George Boole (1815–64) and are now referred to by his name. Since in pure logic a statement is either true or false, Boolean algebra can be applied when manipulating logic statements as well. This material is conceptually simple yet it is most relevant to the understanding of complex logic circuits.
Boolean algebra contains three basic operations: AND, OR and Complement. The result of these operations can best be represented by a truth table, as introduced in Section 1.9, where the symbols for the corresponding circuits were also given.
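The three basic operations can be sketched directly in code; the following Python fragment (an illustration, not part of the text) prints their truth table for all combinations of two binary variables:

```python
# Truth table for the three basic Boolean operations on variables A, B in {0, 1}.
# AND: A & B, OR: A | B, Complement of A: 1 - A.
print("A B | A AND B  A OR B  NOT A")
for a in (0, 1):
    for b in (0, 1):
        print(f"{a} {b} |    {a & b}        {a | b}       {1 - a}")
```

AND is 1 only when both inputs are 1; OR is 0 only when both inputs are 0; Complement simply exchanges the two states.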
The motion of a fluid is extremely complex because the individual molecules are subject to random thermal motion as well as to the collective motion of the fluid as a whole. Thus we consider a small element dτ of the fluid and follow its motion as a function of time. We will assume that the fluid is incompressible, so that the mass dm = ρ dτ contained in the volume dτ remains fixed and the density ρ is constant throughout the fluid; we will also assume that the fluid is non-viscous, that is, there are no internal frictional forces. These two assumptions are applicable to motion through air when the velocity v is small compared to the velocity of sound vs, i.e. v ≪ vs. The velocity of sound is a measure of the random thermal velocity of the molecules; its value for air at s.t.p. is vs ≃ 330 m/s. When necessary we will relax these assumptions.
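As a numerical sketch of these assumptions (in Python; the element size, density and flow speed below are illustrative values chosen by us, only vs = 330 m/s comes from the text):

```python
# Sketch: mass of a small fluid element and the low-speed condition v << v_s.
rho = 1.3      # density of air at s.t.p., kg/m^3 (assumed illustrative value)
d_tau = 1e-6   # a small fluid element of 1 mm^3, in m^3 (hypothetical)
v_s = 330.0    # velocity of sound in air at s.t.p., m/s (from the text)
v = 30.0       # a typical flow velocity, m/s (hypothetical)

dm = rho * d_tau  # mass carried by the element: dm = rho * d_tau
print(f"dm = {dm:.2e} kg")
print(f"v / v_s = {v / v_s:.3f}")  # well below 1, so incompressibility holds
```

For v/vs of order 0.1 or less, density changes in the flow are negligible and the incompressible, non-viscous description is a good approximation.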
The simplest form of flow occurs when the velocity at each point of the liquid remains constant in time. This is illustrated in Fig. 7.1(a), where the element dτ follows the path from the point P to Q to R with velocities vP, vQ, vR; at a later time another element of the fluid will be at P, but it will again follow the path through Q to R and have the same velocities.
Communication implies the transmission of messages and is the basis of human civilization. Speech, smoke signals, or written notes are all forms of communication. We will be concerned principally with communication over large distances, often referred to as telecommunications. Telecommunications are based on the transmission of electromagnetic (em) waves from a sending to a receiving station. The em wave can propagate either in a guided structure, such as a pair of conductors, a waveguide or an optical fiber, or it can propagate in free space. As technology progressed, higher-frequency em waves became available; they offer important advantages as information carriers.
In Chapter 3 we introduce some general principles of information transmission. We examine the analysis of an arbitrary signal into a Fourier series, methods for modulating the carrier, and the sampling theorem for digital encoding of analog signals. The topic of noise in communication channels and of the expected level of random noise is treated next. Finally a brief overview of information theory is given. Information theory assigns a quantitative measure to the information contained in a message and is used to define the capacity of a communication channel.
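The quantitative measure assigned by information theory can be sketched briefly (in Python; this standard Shannon measure is an illustration we supply, not a formula quoted from the text). A message of probability p carries −log₂ p bits, and the average over a source's symbols is its entropy:

```python
import math

def entropy_bits(probs):
    """Shannon entropy H = -sum(p * log2(p)), in bits per symbol."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair binary source carries the most information per symbol;
# a biased source carries less, and a certain outcome carries none.
print(entropy_bits([0.5, 0.5]))   # 1.0 bit per symbol
print(entropy_bits([0.9, 0.1]))   # about 0.469 bits per symbol
print(entropy_bits([1.0]))        # 0.0 bits: no uncertainty, no information
```

This average information rate is what is compared against a channel's capacity to decide whether error-free transmission is possible.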
Chapter 4 is devoted to the problems of the generation, propagation and detection of electromagnetic radiation at different frequencies. The physical laws governing these phenomena are Maxwell's equations and are universally valid. Different frequencies however present different problems in their transmission through the atmosphere and in their propagation along guided structures.
It is well known that certain materials conduct electricity with little resistance whereas others are good insulators. There also exist materials whose resistivity is between that of good conductors and insulators, and is strongly dependent on temperature; these materials are called semiconductors. Silicon (Si), germanium (Ge) and compounds such as gallium arsenide (GaAs) are semiconductors, silicon being by far the most widely used material. Solids, in general, are crystalline and their electrical properties are determined by the atomic structure of the overall crystal. This can be understood by analogy to the energy levels of a free atom.
A free atom, for instance the hydrogen atom, exhibits discrete energy levels which can be exactly calculated. A schematic representation of such an energy diagram is shown in Fig. 1.1(a). If two hydrogen atoms are coupled, as in the hydrogen molecule, the number of energy levels doubles as shown in part (b) of the figure. If the number of atoms that are coupled to each other is very large – as is the case for a crystal – the energy levels coalesce into energy bands as in Fig. 1.1(c). The electrons in the crystal can only have energies lying in these bands.
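The discrete levels of hydrogen mentioned above can be written down explicitly; as a short sketch (in Python; the Rydberg energy of 13.6 eV is a standard value we supply, not one quoted in the text), the level energies follow En = −13.6 eV/n²:

```python
# Discrete energy levels of the free hydrogen atom: E_n = -13.6 eV / n^2.
# The Rydberg energy 13.6 eV is a standard value (assumption, not from the text).
RYDBERG_EV = 13.6

for n in range(1, 5):
    print(f"n = {n}: E = {-RYDBERG_EV / n**2:6.2f} eV")
```

The levels crowd together as n grows, which foreshadows how, in a crystal with very many coupled atoms, closely spaced levels merge into continuous bands.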
When an atom is not excited the electrons occupy the lowest possible energy levels. In accordance with the Pauli principle only two electrons (one with spin projection up and the other down) can be found at any one particular energy level.
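The filling rule can be sketched as follows (in Python; the helper `fill_levels` is hypothetical, introduced only for illustration): electrons are placed into the lowest levels first, at most two per level, one with each spin projection.

```python
# Sketch: ground-state filling of discrete levels, two electrons per level
# (Pauli principle: one spin-up and one spin-down electron per level).
def fill_levels(n_electrons):
    """Return the occupancy of each level, lowest first."""
    occupancy = []
    remaining = n_electrons
    while remaining > 0:
        filled = min(2, remaining)  # a level holds at most two electrons
        occupancy.append(filled)
        remaining -= filled
    return occupancy

print(fill_levels(5))  # [2, 2, 1]: two full levels, one half-filled
```

For an odd number of electrons the highest occupied level is only half-filled, a fact that matters for the electrical behaviour of the resulting bands.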
Ours is the age of technology, rivaling the industrial revolution in its impact on the course of civilization. Whether the great achievements of technology, and our dependence on them, have improved our lot or led inexorably to a ‘strange new world’ we shall not debate here. Instead we focus on the physical laws that make technology possible in the first place. Our aim is to understand and explain modern technology, as distinct from describing it.
Even when the principles underlying a technical process or device are well understood, a great deal of engineering effort and a long manufacturing infrastructure are needed to translate them into practice. In turn, the technical skills that are developed lead to new possibilities in basic research and to new applications. For instance, the laser could have been easily built at the turn of the century; yet it was a long road starting with the development of radar and followed by the invention of the maser that led to the proposal for the laser. The use of computers in so many manufacturing areas and research fields is another example of the interplay between technology and basic science.
Because of the complexity of modern devices and of the rapid advances in all scientific fields, the need for specialization is acute. Thus, often, science students are only vaguely aware of the applications of the principles they have learned, whereas engineering students are too involved to appreciate the power of the physical law.
Microelectronics are found today at the heart of almost every device or machine. Be it an automobile, a cash register or just a digital watch, it is controlled by electronic circuits built on small semiconductor chips. While the complexity of the functions performed by these devices has increased by several orders of magnitude, their size is continuously decreasing. It is this remarkable achievement that has made possible the development of powerful processors and computers and has even raised the possibility of achieving artificial intelligence.
The basic building block of all microcircuits is the transistor, invented in 1948 by John Bardeen, Walter Brattain and William Shockley at Bell Telephone Laboratories. The first chapter is devoted to a discussion of the transistor beginning with a brief review of the structure of semiconductors and of the motion of charge carriers across junctions. We discuss the p–n junction and bipolar as well as field-effect transistors. We then consider modern techniques used in very large scale integration (VLSI) of circuit elements as exemplified by Metal-Oxide-Silicon (MOS) devices.
In the second chapter we take a broader look at how a processor, or computer, is organized and how it can be built out of individual logical circuit elements or gates. We review binary algebra and consider elementary circuits and the representation of data and of instructions; we also discuss the principles of mass data storage on magnetic devices. Finally we examine the architecture of a typical computer and analyze the sequence of operations in executing a particular task.