The senses of taste and smell are the chemical senses, induced when chemical substances interact with the tongue and the nasal cavity, respectively. Physical quantities such as light (photons), sound waves, and pressure (or temperature) are received by the senses of sight, hearing, and touch, respectively, as shown in Table 1.1.
Chemoreception occurs even in unicellular organisms, although some organisms, such as deep-sea fish, lack the sense of sight among the physical senses. Protozoa such as amoebae and microbes such as colon bacilli show chemotaxis: they move toward some chemical substances and away from others. The former behavior is called positive chemotaxis and the latter negative chemotaxis. Colon bacilli show positive chemotaxis toward sweet-tasting amino acids and negative chemotaxis toward substances that taste strongly bitter or sour. This behavior is quite reasonable, because sweet-tasting substances serve as energy sources for living organisms, whereas strongly bitter or sour substances are often harmful.
As the above examples indicate, our likes and dislikes for chemical substances (i.e., foodstuffs) can be considered as an essential matter related to our safety. The senses of taste and smell are those used for checking the safety of substances that are ingested, and hence they have developed in higher animals in the same way as in unicellular living organisms that survive using chemical senses. The development of taste and odor sensors is a growing area of biomimetic technologies intended to mimic the original purpose of biological systems.
The design process cannot be readily characterized. That is, there is not a set of step-by-step rules by which a successful design can be realized. Furthermore, even after a design that satisfies all requirements is completed, it may be difficult to judge whether it is indeed optimal. It is not uncommon at a project's completion for those involved to express the thought that if only they had done it another way, it would have been so much easier – the benefit of hindsight.
When discussing design, we often think in terms of large, complex systems. But the design of such a system might be the result of an individual's intense effort spanning several years such as the development of wide-band frequency modulated broadcasting by Armstrong in the 1930s (Armstrong 1940). Alternatively, a design might be the result of the intense effort of a team of scientists and engineers as was, for example, the design of the compact audio disc, which was the joint effort of two normally competing corporations, Sony and Philips, on opposite sides of the world (Miyaoka 1984). A large design project, however, involves solving many small design problems – problems that at first glance may seem to be trivial but on carrying out the design prove to be otherwise. At times, an overall design may need to be modified because one or more of its components cannot be realized.
Essentially all electronic systems require a nonvarying supply voltage (or current), that is, a dc voltage (or dc current). On the other hand, the electric power supplied by utilities is characterized by an alternating voltage and current having a sinusoidal time dependence. In North America, a frequency of 60 Hz is common, whereas 50 Hz is used in most other areas of the world. Utility potentials depend on the usage: residential service is 120 V (rms) in North America, whereas 220–240 V is common for residential service elsewhere.
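These utility figures are easy to relate: the peak of a sinusoid follows from its rms value as Vp = √2 · Vrms, and the period is the reciprocal of the frequency. A minimal sketch (the helper names are mine, not from the text):

```python
import math

def peak_from_rms(v_rms):
    """Peak amplitude of a sinusoid from its rms value: Vp = sqrt(2) * Vrms."""
    return math.sqrt(2.0) * v_rms

def period_ms(freq_hz):
    """Period of the utility sinusoid in milliseconds: T = 1/f."""
    return 1000.0 / freq_hz

# North American residential service: 120 V rms at 60 Hz
vp_na = peak_from_rms(120.0)   # about 170 V peak
t_na = period_ms(60.0)         # about 16.7 ms per cycle

# Typical service elsewhere: 230 V rms at 50 Hz
vp_other = peak_from_rms(230.0)  # about 325 V peak
t_other = period_ms(50.0)        # 20 ms per cycle
```

So the 120-V outlet actually swings to roughly ±170 V each cycle, which matters when choosing diode and capacitor ratings in a supply.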
A semiconductor junction diode allows a current in only one direction; its reverse-biased current is negligibly small and can be ignored for nearly all applications. Hence, a diode may be used to convert an alternating source of current to a current with a single direction – a process generally referred to as rectification. For many electronic applications it is also necessary to transform the utility voltage to a desired voltage using an iron-core transformer.
The resistor RL of the power supply of Figure 6.1 represents the load to which electrical power is to be supplied. The secondary voltage of the transformer vTrans(t) is rectified by the diode, resulting in a load voltage vLoad(t) that has a single polarity. The load current vLoad(t)/RL also has a single polarity.
The usefulness of the supply shown in Figure 6.1 is very limited because the load voltage is zero for a significant portion of each period of the input voltage.
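The single-polarity behavior described for Figure 6.1 can be sketched with an ideal-diode model, which neglects the roughly 0.7-V forward drop of a real silicon diode; the function names and sample values below are mine, chosen only for illustration:

```python
import math

def half_wave(v_in):
    """Ideal-diode half-wave rectifier: the load sees the input
    only when the diode is forward biased (v_in positive)."""
    return v_in if v_in > 0.0 else 0.0

def load_waveform(v_peak, freq_hz, n_samples):
    """Sample one full period of the rectified load voltage."""
    period = 1.0 / freq_hz
    return [half_wave(v_peak * math.sin(2 * math.pi * freq_hz * k * period / n_samples))
            for k in range(n_samples)]

# One period of a rectified sinusoid (assumed 17-V peak secondary, 60 Hz)
v_load = load_waveform(v_peak=17.0, freq_hz=60.0, n_samples=100)

# The diode conducts only on positive half-cycles, so roughly half
# the samples are zero -- the limitation the text points out.
fraction_zero = sum(1 for v in v_load if v == 0.0) / len(v_load)
```

The `fraction_zero` result near one half is exactly the problem: the load receives no power for nearly half of every cycle, which motivates full-wave rectification and filter capacitors.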
The idea of a field-effect transistor predates that of the junction transistor by two decades. In the late 1920s, Julius Edgar Lilienfeld proposed using an electric field to control the conductance of a semiconductor crystal (Sah 1988). Although Lilienfeld was granted three patents for proposed devices, there is no evidence that he was able to build an actual working transistor, probably because the required semiconductor technologies were not available at the time. It was Shockley's 1939 consideration of a related field-effect process, the “Schottky gate,” that initiated his thought processes and ultimately led to the invention of the point-contact transistor in 1948 (Shockley 1976). Shockley recognized that a surface field-effect played a role in the operation of this device (Shockley and Pearson 1948). Not only did the invention of the bipolar junction transistor follow this device, but so too the junction field-effect transistor (Shockley 1952).
Semiconductor field-effect devices rely on a single type of carrier for conduction, that is, either free electrons or holes. Hence, these devices are frequently referred to as unipolar transistors (single-polarity charges). The earliest commercially produced field-effect device is the junction field-effect transistor (JFET) in which conduction is controlled by a reverse-biased junction diode. This device, therefore, is characterized by a very high input resistance (negligible diode current for static conditions). Junction field-effect transistors are utilized both as discrete devices and, most frequently, in conjunction with bipolar junction transistors in integrated circuits.
Our daily lives are shaped by electronic systems. In the home we have a myriad of electronic accessories: radios, TVs, VCRs, hi-fis, camcorders, cassette and CD players, telephone answering machines, microwave ovens, and personal computers. Not so obvious but just as much a part of our lives are sophisticated electronic controls such as the microprocessor engine control of our car. We utilize a telephone system that functions with electronic devices to amplify and transfer telephone signals. Our conversations are carried around the world using a combination of microwave or fiber-optic links and satellites. Electronic radar systems are relied on for a safe flight from one airport to the next, and electronic sensors and computers “fly” a modern jet airplane. Modern medical practice depends on extremely complex diagnostic and monitoring electronic systems. Moreover, the commercial and industrial sectors could no longer function without electronic communications and information processing systems. The video monitor is a pervasive reminder of the new electronic world.
For better and at times for worse, electronics has changed our lives. Although we are in constant touch with what is happening around the world, we are also at the peril of weapons of unimaginable destructive power that rely on electronic developments. An understanding of electronics is imperative not only for designing and using electronic systems but for directing the evolution of electronic systems so that they serve to improve the human condition.
Although the invention of the transistor at Bell Telephone Laboratories in 1947 was destined to revolutionize the field of electronics, the initial response to the public announcement on June 30, 1948, was anything but overwhelming. The New York Times carried the news in its daily column “The News of Radio,” which dealt with new radio shows. Near the end of the column appeared the following (New York Times 1948):
A device called a transistor which has several applications in radio where a vacuum tube ordinarily is employed, was demonstrated yesterday at Bell Telephone Laboratories, 463 West Street, where it was invented.
Three additional brief paragraphs completed the announcement. A more fitting debut of the transistor was provided by three letters to Physical Review directed toward the scientific and technical community (Bardeen and Brattain 1948; Brattain and Bardeen 1948; Shockley and Pearson 1948). “The Transistor – A Crystal Triode” appeared as the cover article of the September 1948 issue of Electronics (Fink and Rockett 1948). Although the terms crystal triode and semiconductor triode were used to describe the transistor, by way of analogy with the vacuum tube triode (Figure 1.4), the name transistor has prevailed. The term transistor, that is transfer resistor, is attributed to John R. Pierce (already mentioned in relation to satellite communication) (Shockley 1976). Thus commenced the era of solid-state electronics in which transistors not only play an important role but within which entirely new electronic devices have been developed.
The field of electronics or microelectronics today encompasses a vast quantity of knowledge and practice. The topics that can be covered in a basic course must, by necessity, be limited to avoid a mere encyclopedic cataloging of various electronic circuits and systems. There are, however, a set of underlying concepts that one needs to grasp to understand electronics. It is the goal of the author to provide students and instructors with an accessible treatment of those modern electronic concepts along with appropriate applications. Applications are considered essential to grasp the utility of general concepts as well as to appreciate their limitations. The approach used in the text is to cover a limited number of topics well, as opposed to a cursory coverage of a very wide range of topics that may do little more than leave one with an extensive vocabulary.
The text provides more than adequate material for a one-semester, junior-level electronics course. A good working knowledge of linear circuits along with a reasonable understanding of calculus and physics is required. Although there is a progression in the complexity of the material covered, the text provides a flexibility in selecting the material to cover. The author has attempted to provide sufficient descriptive material to indicate not only what is being done but also to show how a particular circuit is used. Examples with detailed solutions utilizing analytic solutions and computer simulations conclude most sections. In addition, numerous references are cited to allow the interested student to learn more about a particular topic.
Several types of transistors are used in modern electronic systems, both individually as discrete devices and in conjunction with other transistors in integrated circuits. Transistors have three or more terminals and, as is the case for junction diodes, transistors are nonlinear elements. The bipolar junction transistor (BJT) that will be discussed in this chapter, as well as the field-effect transistor of the next chapter, are active devices. Electronic amplifying circuits in which a small input voltage, current, or both, produces a larger output voltage, current, or both depend on active devices. Amplification is required for nearly all electronic systems. The analysis of transistor circuits is considerably more difficult (a greater challenge) than that of circuits with two-terminal passive elements – resistors, capacitors, and inductors.
The history of active electronic circuits dates from the invention of the vacuum tube (the audion) by Lee De Forest in 1906 (De Forest 1906). From the very beginning the challenge was to develop circuits to utilize this new device. Edwin H. Armstrong was foremost among the early designers of electronic circuits that were initially used to improve wireless communication (Armstrong 1915). The junction transistor, developed in 1950 (following the invention of its predecessor, the point-contact transistor, in 1948), and other transistors, have replaced vacuum tubes for most (but not all!) applications. These applications, however, tend to rely on electronic circuits similar to those initially used with vacuum tubes (Electronics 1980).
Modern electronic systems depend on the technology developed for fabricating integrated circuits. Integrated circuit technologies have achieved complexities unimaginable using discrete components and have made economical mass production possible. Electronic circuit design and integrated circuit fabrication require highly specialized, distinct areas of technological expertise. However, to produce optimal integrated circuits, it is necessary that the practitioners of these two technologies interact. The result of this joint effort has been a cornucopia of general purpose and specialized integrated circuits.
An understanding of physical and chemical processes utilizing high-vacuum techniques and high-temperature reactions is required for designing and fabricating integrated circuits. Moreover, the physical dimensions and tolerances of integrated circuit elements are much smaller than those utilized by more conventional engineering disciplines. Thus, the design and fabrication of integrated circuits is an extremely challenging endeavor.
Figure A.1 provides a perspective on commonly used physical dimensions. For measurements associated with everyday activities, we tend to think in terms of either inches and feet or centimeters and meters, depending on our cultural background. For the elements of an integrated circuit, these are very large dimensions. On a logarithmic scale, integrated circuit dimensions, which are generally expressed in microns (µm, 10−6 m), tend to fall midway between the size of atoms and everyday measurements. Through conventional machining techniques, tolerances of 0.001 in., ≈25 µm, are not uncommon, whereas the smallest dimension of a typical machined item might be only 0.01 in., ≈0.25 mm.
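The conversions quoted above are easy to verify from the exact definition 1 in = 25.4 mm; a short sketch (the constant and helper names are mine):

```python
import math

MICRONS_PER_INCH = 25400.0  # exact, since 1 in is defined as 25.4 mm

def inches_to_microns(inches):
    """Convert a length in inches to microns (micrometers)."""
    return inches * MICRONS_PER_INCH

tol_um = inches_to_microns(0.001)  # machining tolerance: 25.4 um
dim_um = inches_to_microns(0.01)   # smallest machined dimension: 254 um, i.e. ~0.25 mm

# The "midway on a logarithmic scale" remark: atoms ~1e-10 m,
# IC features ~1e-6 m, everyday objects ~1 m.
log_atom = math.log10(1e-10)      # -10
log_ic = math.log10(1e-6)         # -6
log_everyday = math.log10(1.0)    # 0
```

On the log scale, the micron (exponent −6) indeed sits roughly between atomic dimensions (exponent −10) and everyday ones (exponent 0).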
Negative feedback, when used with an amplifier, reduces the gain of the overall circuit because part of the output signal is used to “negate” a portion of the input signal. If properly designed, negative feedback circuits can result in improved performance characteristics – in particular, lower distortion, improved frequency and impedance characteristics, and a smaller dependence on supply voltages. To realize these benefits, an amplifier is required that has a gain considerably in excess of that which would otherwise be needed. With the advent of commercially produced integrated circuits, high-gain, low-cost amplifiers suitable for negative feedback circuits became readily available. Integrated circuit operational amplifiers (IC op amps) are now widely used “building blocks,” both as individual integrated circuits (replacing discrete transistors for many applications) and within more complex integrated circuits.
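The gain reduction and the reduced dependence on the amplifier itself follow from the standard closed-loop relation A_f = A / (1 + Aβ), where A is the open-loop gain and β here denotes the feedback fraction (not a transistor current gain). A small numerical sketch with assumed values:

```python
def closed_loop_gain(a_open, beta):
    """Ideal negative-feedback closed-loop gain: A_f = A / (1 + A*beta)."""
    return a_open / (1.0 + a_open * beta)

beta = 0.01  # feedback fraction, giving an ideal closed-loop gain of 1/beta = 100

g1 = closed_loop_gain(1e5, beta)  # ~99.90 with open-loop gain 100,000
g2 = closed_loop_gain(2e5, beta)  # ~99.95: doubling the open-loop gain
                                  # changes the closed-loop gain by only ~0.05%
```

This is why the text says the amplifier must have gain "considerably in excess" of what is needed: the excess gain is traded for stability and predictability of the overall circuit.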
The concept of positive feedback electronic circuits predates that of negative feedback (Tucker 1972). Positive feedback was initially used to increase the gain of early low-gain vacuum tube circuits. With positive feedback (regenerative circuits), an enormous increase in the sensitivity of radio receivers was achieved. Only after high-gain amplifier circuits were developed in the 1920s did the concept of using negative feedback emerge. Harold Black is credited with having first proposed this concept in 1927. According to published accounts, the idea of an electronic amplifier with negative feedback was the result of a sudden insight that Black had while crossing the Hudson River by ferry on his way to work in Manhattan (Mabon 1975; O'Neill 1985).
So far, we have made measurements without saying anything about antennas, or about how power gets from the transmitter to the receiver. In our measurements, a 50-Ω load has taken the place of the antenna. However, instead of dissipating the power as heat, an antenna radiates power as electromagnetic waves. One thing that makes antennas interesting is that they necessarily involve both the voltages and currents that we study in circuits and the electric and magnetic fields that make up radio waves. This gives antennas a special place in the history of physics. They were the crucial components that Hertz developed in the 1880s to demonstrate that Maxwell's equations for electricity and magnetism are correct. In the 1960s, a special parabolic antenna allowed Arno Penzias and Robert Wilson at Bell Telephone Laboratories to discover the cosmic background radiation. That measurement earned them a Nobel Prize, and it gave an entirely new interpretation to the history of the universe.
An antenna is characterized by its impedance and its pattern, which is a plot of where the power goes for a transmitting antenna. Traditionally, antennas have been analyzed as transmitters, and most antenna engineers think entirely in terms of transmitting antennas. If we know how an antenna transmits, we can use the reciprocity theorem to figure out how the antenna works in reception. The physical descriptions of transmission and reception are actually quite different, however, and the physics of receiving antennas is in many ways as interesting as that of transmitting antennas.
Cables allow us to transmit electrical signals from one circuit to another. For example, we might attach coaxial cable between a function generator and an oscilloscope (Figure 4.1a) and plastic-coated twin lead between an antenna and a television (Figure 4.1b). Usually, when we analyze the circuit, we assume that the voltage at one end of the cable is the same as the voltage at the other end and that the current at the beginning is the same as the current at the end. This is appropriate if the frequency is low. However, at high frequencies the cable itself begins to have an effect. A fundamental limitation is the speed of light. If the voltage at one end of the cable changes appreciably in less time than it takes light to propagate to the other end, we should expect the voltage to be different at the two ends. Another way of saying this is that we would expect the voltages at the ends to be different when the length of the cable becomes an appreciable fraction of a wavelength.
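The "appreciable fraction of a wavelength" criterion is straightforward to compute. The wavelength in the cable is λ = v/f, where v is the signal velocity, somewhat below the speed of light; the velocity factor of 0.66 below is typical of solid-polyethylene coax and is an assumption, not a value from the text:

```python
C_VACUUM = 3.0e8         # speed of light in vacuum, m/s (approximate)
VELOCITY_FACTOR = 0.66   # assumed: typical for solid-polyethylene coax

def wavelength_m(freq_hz, vf=VELOCITY_FACTOR):
    """Wavelength inside the cable: lambda = v / f, with v = vf * c."""
    return vf * C_VACUUM / freq_hz

def is_electrically_short(length_m, freq_hz, fraction=0.1):
    """Common rule of thumb: lumped-circuit analysis is safe when the
    cable is shorter than about one tenth of a wavelength."""
    return length_m < fraction * wavelength_m(freq_hz)

# A 2-m bench cable: fine at 1 MHz, not at 100 MHz
short_at_1mhz = is_electrically_short(2.0, 1e6)      # lambda ~ 198 m -> True
short_at_100mhz = is_electrically_short(2.0, 100e6)  # lambda ~ 1.98 m -> False
```

The one-tenth-wavelength threshold is a conventional engineering rule of thumb, not a sharp physical boundary.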
Distributed Capacitance and Inductance
However, even when the cable is considerably shorter than a wavelength, it can have a large effect. We found in Problem 3 that a cable has capacitance. This capacitance is associated with the charges that the voltages on the line induce. We can take the capacitance into account in a circuit by adding a capacitance between the wires (Figure 4.1c).
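For a coaxial cable, the capacitance referred to here follows the standard electrostatic result for concentric cylinders, C′ = 2πε₀εᵣ / ln(b/a) per unit length, with inner-conductor radius a and shield radius b. The dimensions below are illustrative, roughly those of common 50-Ω coax; they are assumptions of mine, not values from the text or from Problem 3:

```python
import math

EPS0 = 8.854e-12  # permittivity of free space, F/m

def coax_capacitance_per_meter(a_m, b_m, eps_r):
    """Capacitance per unit length of a coaxial cable:
    C' = 2*pi*eps0*eps_r / ln(b/a)."""
    return 2.0 * math.pi * EPS0 * eps_r / math.log(b_m / a_m)

# Assumed dimensions, roughly like RG-58 coax with polyethylene dielectric:
c_per_m = coax_capacitance_per_meter(a_m=0.45e-3, b_m=1.47e-3, eps_r=2.25)
c_pf_per_m = c_per_m * 1e12  # on the order of 100 pF/m
```

A value near 100 pF/m explains why even a short patch cable noticeably loads a high-impedance, high-frequency circuit node.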
So far the filters we have made have had only two elements: a capacitor and a resistor or inductor. We can improve the response of our filters by adding more elements. This allows us to make the pass band flatter and the roll-off steeper. Multielement filters behave somewhat like transmission lines, and we need to have the right input and output resistance to avoid problems with reflections. Analyzing these filters by hand is quite difficult, but the calculations are easy on a computer. For this we will use a computer program called Puff, which is included with this book. Instructions for running the program are given in Appendix C.
Ladder Filters
We will consider ladder networks with alternating series and shunt elements like the discrete transmission line we studied in Problem 11. If the series elements are inductors and the shunt elements are capacitors, then the circuit acts as a low-pass filter (Figure 5.1a, b). At low frequencies, the impedance of the inductors and the admittance of the capacitors are small, and the input signal passes through to the output with little loss. In contrast, at high frequencies the inductors begin to act as voltage dividers and the capacitors as current dividers. This reduces the power transmitted to the load. We can also make high-pass filters with series capacitors and shunt inductors (Figure 5.1c, d).
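The low-pass behavior described above can be checked numerically. One standard hand-analysis tool (not specific to this text, and simpler than what Puff does internally) is to cascade ABCD matrices for the series and shunt elements; the component values below are assumed for illustration:

```python
import math

def series(z):
    """ABCD matrix of a series impedance z."""
    return [[1.0, z], [0.0, 1.0]]

def shunt(y):
    """ABCD matrix of a shunt admittance y."""
    return [[1.0, 0.0], [y, 1.0]]

def cascade(*ms):
    """Multiply 2x2 ABCD matrices in order, left to right."""
    r = [[1.0, 0.0], [0.0, 1.0]]
    for m in ms:
        r = [[r[0][0]*m[0][0] + r[0][1]*m[1][0], r[0][0]*m[0][1] + r[0][1]*m[1][1]],
             [r[1][0]*m[0][0] + r[1][1]*m[1][0], r[1][0]*m[0][1] + r[1][1]*m[1][1]]]
    return r

def gain(freq_hz, l_h, c_f, r_ohm=50.0):
    """|Vload/Vsource| of a series-L, shunt-C, series-L low-pass T ladder
    between equal source and load resistances r_ohm."""
    w = 2.0 * math.pi * freq_hz
    zl, yc = 1j * w * l_h, 1j * w * c_f
    m = cascade(series(zl), shunt(yc), series(zl))
    a, b, c, d = m[0][0], m[0][1], m[1][0], m[1][1]
    # Standard ABCD result with source resistance R and load resistance R:
    return abs(r_ohm / (a * r_ohm + b + c * r_ohm * r_ohm + d * r_ohm))

# Assumed values, roughly a 50-ohm design: L = 80 nH, C = 64 pF
low = gain(1e6, 80e-9, 64e-12)   # ~0.5: the matched source divider, little filter loss
high = gain(1e9, 80e-9, 64e-12)  # far smaller: deep in the stop band
```

The factor of 0.5 at low frequency is just the source-load voltage divider of a matched system; the filter's job shows up in how quickly the gain falls above the cutoff.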
Many different filters have been developed, giving a wide choice of amplitude, phase, pass-band, and stop-band characteristics.
Puff is a circuit simulator for linear circuits. It calculates scattering parameters and makes microstrip and stripline layouts. It also makes time-domain plots. The program is named after the magic dragon in the song by the popular American singing group Peter, Paul and Mary. Puff originated as a teaching tool for Caltech's microwave circuits course. It was created as an inexpensive and simple-to-use alternative to professional software whose high costs, copy protection schemes, and training requirements create difficulties in the academic environment. Puff uses a simple interactive schematic-capture type environment. After a circuit is laid out on the screen using cursor keys, a frequency or time domain analysis is available with a few keystrokes. This process is faster than using net lists, and errors are rare since the circuit is always visible on the screen. Intended for students and researchers, public distribution of the program began in 1987. Puff use, originally limited to Caltech, UCLA, and Cornell, has since spread to many other universities and colleges. The program has also become popular with working engineers, scientists, and amateur radio operators. Over 20,000 copies of versions 1.0, 1.5, 2.0, and 2.1 have been distributed worldwide, and translations have been made to Russian, Polish, and Japanese.
When a transistor is active, the current gain β is large, in the range of 100 or more. This means that we can use the transistor as an amplifier to increase the power of a signal. The amplifier may be considered the single most important device in communications electronics, and it is key to both receivers and transmitters. Developing amplifiers has been a central focus of electrical engineering from the days of the first vacuum tubes, and it is just as important today. There are many issues to consider in designing an amplifier. In transmitters, we are very interested in efficiency. High efficiency makes it easier to dissipate the heat and allows long battery life in portable transmitters. In receivers, it is important to add as little noise as possible to the signal. In this chapter, we study linear amplifiers, where the amplitude of the output tracks the amplitude of the input. In the next chapter, we consider saturating amplifiers, where only the frequency of the output follows the input.
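The active-region relation behind this statement is Ic = β·Ib: a small base current controls a collector current β times larger. A minimal sketch, with β = 100 assumed:

```python
def collector_current(i_base, beta=100.0):
    """Active-region bipolar transistor approximation: Ic = beta * Ib."""
    return beta * i_base

# A 10-uA base current controls a 1-mA collector current when beta = 100.
ic = collector_current(10e-6)   # 1.0e-3 A
current_gain = ic / 10e-6       # recovers beta = 100
```

Because the collector circuit can also operate at a higher impedance than the base circuit, the available power gain can be much larger than β itself.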
Common-Emitter Amplifier
The basic transistor amplifier is shown in Figure 9.1a. It uses an npn transistor with a load resistor R at the collector. The supply voltage is written as Vcc. It is traditional to double the subscript of a supply voltage to distinguish it from an AC voltage. This circuit is called a common-emitter amplifier. You do have to be on your guard with amplifier names.
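With the load resistor R at the collector, the output node sits at Vout = Vcc − IcR = Vcc − βIbR while the transistor remains active, so small changes in base current become large voltage swings at the collector. A sketch with values assumed purely for illustration (they are not taken from Figure 9.1a):

```python
def collector_voltage(vcc, i_base, r_load, beta=100.0):
    """Common-emitter output node, transistor active:
    Vout = Vcc - beta * Ib * R."""
    return vcc - beta * i_base * r_load

# Assumed: Vcc = 10 V, R = 1 kOhm, beta = 100.
# A 20-uA change in base current moves the collector by 2 V.
v1 = collector_voltage(10.0, 30e-6, 1e3)  # 7.0 V
v2 = collector_voltage(10.0, 50e-6, 1e3)  # 5.0 V
```

This simple relation also shows the limits: drive the base hard enough and the predicted Vout falls toward zero, at which point the transistor saturates and the linear model no longer applies.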