By the early 1970s, most of the basic electronic development that made digital technology feasible was in place. One great burst of innovation over two decades had produced the transistor, the p-n junction, the CMOS process for integrated circuits, magnetic disk storage, heterojunction semiconductor lasers, fiber optic communications, and semiconductor imagers and displays.
The speed with which researchers put the new devices to practical use was almost as remarkable. They quickly developed an array of systems that gather, process, and transmit information.
The most visible example was the computer. In 1982, Time magazine focused popular attention on the computer's importance by naming it “Machine of the Year,” a widely discussed departure from the “Man of the Year” honor the magazine had bestowed since 1927.
Since then, computers have become part of everyday life. Smaller, more powerful microprocessors and supporting devices, combined with plummeting costs, have sparked an endless stream of new computerized consumer and industrial products.
However, while the computer has been central to the Information Age, the most revolutionary transformation has taken place in communications. Data transmission, once limited to 300 bits per second over analog phone lines, now rides digital links carrying multi-megabit-per-second streams. Voice has gone digital and wireless. In just ten years, the Internet has metamorphosed from an “insider” network for scientists into an indispensable, ubiquitous communications backbone linking billions of people around the globe.
Both computing and communications, in turn, have been enabled by advances in software architecture.