As soon as an Analytical Engine exists, it will necessarily guide the future course of the science. Whenever any result is sought by its aid, the question will then arise – by what course of calculation can these results be arrived at by the machine in the shortest time?
Charles Babbage
Beginnings
What is an algorithm? The word is derived from the name of the Persian scholar Mohammad Al-Khowarizmi (see B.5.1 and Fig. 5.1). In the introduction to his classic book Algorithmics: The Spirit of Computing, computer scientist David Harel gives the following definition:
An algorithm is an abstract recipe, prescribing a process that might be carried out by a human, by a computer, or by other means. It thus represents a very general concept, with numerous applications. Its principal interest and use, however, is in those cases where the process is to be carried out by a computer.
Thus an algorithm can be regarded as a “recipe” detailing the mathematical steps to follow to carry out a particular task. This could be a numerical algorithm for solving a differential equation, or an algorithm for a more abstract task, such as sorting a list of items according to some specified property. The word algorithmics was introduced by J. F. Traub in a textbook in 1964 and popularized as a key field of study in computer science by Donald Knuth (B.5.2) and David Harel (B.5.3). Once the steps of an algorithm for a particular task have been identified, the programmer chooses a programming language to express the algorithm in a form that the computer can understand.
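To make the recipe metaphor concrete, here is a minimal sketch of one classic sorting algorithm, insertion sort, written in Python. The function name and the sample list are ours, introduced purely for illustration; the point is that the informal recipe “repeatedly take the next item and insert it among the items already sorted” becomes precise once expressed in a programming language.

    def insertion_sort(items):
        """Sort a list in place by inserting each item into its
        correct position among the items already sorted."""
        for i in range(1, len(items)):
            current = items[i]
            j = i - 1
            # Shift larger items one slot to the right to open a gap.
            while j >= 0 and items[j] > current:
                items[j + 1] = items[j]
                j -= 1
            items[j + 1] = current
        return items

    print(insertion_sort([5, 2, 4, 6, 1, 3]))  # prints [1, 2, 3, 4, 5, 6]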
In 2012 the U.S. National Research Council published the report “Continuing Innovation in Information Technology.” The report contained an updated version of the Tire Tracks figure, first published in 1995. Figure A.2 gives examples of how computer science research, in universities and in industry, has directly led to the introduction of entirely new categories of products that have ultimately provided the basis for new billion-dollar industries. Most of the university-based research has been federally funded.
The bottom row of the figure shows the specific computer science research areas in which major investments have resulted in the information technology industries and companies shown at the top of the figure. The vertical red tracks represent university-based research, and the blue tracks represent industry research and development. The dashed black lines indicate the periods following the introduction of significant commercial products resulting from this research. The green lines mark the establishment of billion-dollar industries, with the thick green lines showing where some of these industries have achieved multibillion-dollar markets.
No one saw these mice coming. No one, that is, in my field, writing science fictions. Oh, a few novels were written about those Big Brains, a few New Yorker cartoons were drawn showing those immense electric craniums that needed whole warehouses to THINK in. But no one in all of future writing foresaw those big brutes dieted down to fingernail earplug size so you could shove Moby Dick in one ear and pull Job and Ecclesiastes out the other.
Ray Bradbury
Early visions
The British science fiction writer Brian Aldiss traces the origin of science fiction to Mary Shelley’s Frankenstein in 1818. In her book, the unwise scientist Victor Frankenstein deliberately uses his knowledge of anatomy, chemistry, electricity, and physiology to create a living creature. An alternative starting point dates to the second half of the nineteenth century, with the writing of Jules Verne (B.17.1) and Herbert George (H. G.) Wells (B.17.2). This was a very exciting time for science: in 1859 Charles Darwin had published On the Origin of Species; in 1864 James Clerk Maxwell had unified the theories of electricity and magnetism; in 1869 Mendeleev had brought some order to chemistry with his Periodic Table of the Elements; and Joule and Kelvin were laying the foundations of thermodynamics. Verne had the idea of combining modern science with an adventure story to create a new type of fiction. After publishing his first such story, “Five Weeks in a Balloon,” in 1863, he wrote:
I have just finished a novel in a new form, a new form – do you understand? If it succeeds, it will be a gold mine.
There are many “popular” books on science that provide accessible accounts of the recent developments of modern science for the general reader. However, there are very few popular books about computer science – arguably the “science” that has changed the world the most in the last half century. This book is an attempt to address this imbalance and to present an accessible account of the origins and foundations of computer science. In brief, the goal of this book is to explain how computers work, how we arrived at where we are now, and where we are likely to be going in the future.
The key inspiration for writing this book came from Physics Nobel Prize recipient Richard Feynman. In his lifetime, Feynman was one of the few physicists well known to the general public. There were three main reasons for this recognition. First, there were some wonderful British television programs of Feynman talking about his love for physics. Second, there was his best-selling book “Surely You’re Joking, Mr. Feynman!”: Adventures of a Curious Character, an entertaining collection of stories about his life in physics – from his experiences at Los Alamos and the Manhattan atomic bomb project, to his days as a professor at Cornell and Caltech. And third, there was his participation, while he was battling the cancer that eventually took his life, in the inquiry following the Challenger space shuttle disaster. His live demonstration, at a televised press conference, of the effects of freezing water on the rubber O-rings of the space shuttle booster rockets was a wonderfully understandable explanation of the origin of the disaster.
Video games are bad for you? That’s what they said about rock and roll.
Shigeru Miyamoto
The first computer games
Since the earliest days, computers have been used for serious purposes and for fun. When computing resources were scarce and expensive, using computers for games was frowned upon and was typically an illicit late-night occupation of graduate students. Yet from these first clandestine experiments, computer video games have grown into big business. In 2012, global video game sales grew by more than 10 percent to more than $65 billion. In the United States, a 2011 survey found that more than 90 percent of children aged between two and seventeen played video games. In addition, the Entertainment Software Association in the United States estimated that 40 percent of all game players are now women and that women over the age of eighteen make up a third of the total game-playing population. In this chapter we take a look at how this multibillion-dollar industry began and how video games have evolved from male-dominated “shoot ’em up” arcade games to more family-friendly casual games on smartphones and tablets.
One of the first computer games was written for the EDSAC computer at Cambridge University in 1952. Graduate student Alexander Douglas used a computer game as an illustration for his PhD dissertation on human-computer interaction. The game was based on what is called tic-tac-toe in the United States and noughts and crosses in the United Kingdom. Although Douglas did not name his game, computer historian Martin Campbell-Kelly saved the game in a file called OXO for his simulator program, and this name now seems to have escaped into the wild. The player competed against the computer, and output was programmed to appear on the computer’s cathode ray tube (CRT), which served as a display screen. The source code was short and, predictably, the computer could play a perfect game of tic-tac-toe (Fig. 9.1).
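Why could EDSAC play a perfect game? Tic-tac-toe is small enough that a program can examine every possible continuation before choosing a move. The following sketch is a modern reconstruction of that idea in Python using the standard minimax technique; it is our illustration, not Douglas’s EDSAC code, and the names and board encoding are our own.

    # Rows, columns, and diagonals of a 3 x 3 board stored as a flat list.
    WIN_LINES = [(0, 1, 2), (3, 4, 5), (6, 7, 8),
                 (0, 3, 6), (1, 4, 7), (2, 5, 8),
                 (0, 4, 8), (2, 4, 6)]

    def winner(board):
        """Return 'X' or 'O' if either has three in a row, else None."""
        for a, b, c in WIN_LINES:
            if board[a] != ' ' and board[a] == board[b] == board[c]:
                return board[a]
        return None

    def minimax(board, player):
        """Return (score, move) for the side to move, assuming perfect
        play: X tries to maximize the score, O tries to minimize it."""
        w = winner(board)
        if w is not None:
            return (1 if w == 'X' else -1), None
        moves = [i for i, square in enumerate(board) if square == ' ']
        if not moves:
            return 0, None  # board full: a draw
        best = None
        for m in moves:
            board[m] = player
            score, _ = minimax(board, 'O' if player == 'X' else 'X')
            board[m] = ' '  # undo the trial move
            if (best is None
                    or (player == 'X' and score > best[0])
                    or (player == 'O' and score < best[0])):
                best = (score, m)
        return best

    # With perfect play by both sides, tic-tac-toe is always a draw.
    score, move = minimax([' '] * 9, 'X')
    print(score, move)  # prints: 0 0

Exhaustive search like this is feasible only because the game tree of tic-tac-toe is tiny, which is why even a 1952 machine could already play flawlessly.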
It seems reasonable to envision, for a time 10 or 15 years hence, a “thinking center” that will incorporate the functions of present-day libraries together with anticipated advances in information storage and retrieval.... The picture readily enlarges itself into a network of such centers, connected to one another by wide-band communication lines and to individual users by leased-wire services. In such a system, the speed of the computers would be balanced, and the cost of gigantic memories and the sophisticated programs would be divided by the number of users.
J. C. R. Licklider
The network is the computer
Today, with the Internet and World Wide Web, it seems obvious that computers become much more powerful in all sorts of ways when they are connected together. In the 1970s this was far from obvious. This chapter is about how the Internet of today came about. As we can see from Licklider’s (B.10.1) quotation beginning this chapter, in addition to arguing for the importance of interactive computing in his 1960 paper “Man-Computer Symbiosis,” Lick also envisaged linking computers together, a practice we now call computer networking. Larry Roberts, Bob Taylor’s hand-picked successor at the Department of Defense’s Advanced Research Projects Agency (ARPA), was the person responsible for funding and overseeing the construction of the ARPANET, the first North American wide area network (WAN). A WAN links together computers over a large geographic area, such as a state or country, enabling the linked computers to share resources and exchange information.
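As a toy illustration of computers exchanging information over a network, here is a minimal sketch, ours rather than anything from the chapter, of an echo exchange using Python’s standard socket library. The address and port are placeholders; over a real WAN the two endpoints would sit on machines far apart rather than on one machine.

    import socket
    import threading

    HOST, PORT = "127.0.0.1", 50007  # placeholder loopback address and port

    ready = threading.Event()

    def serve_once():
        """Accept a single connection and echo back whatever arrives."""
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
            srv.bind((HOST, PORT))
            srv.listen(1)
            ready.set()  # signal that the server is accepting connections
            conn, _ = srv.accept()
            with conn:
                data = conn.recv(1024)
                conn.sendall(b"echo: " + data)

    threading.Thread(target=serve_once, daemon=True).start()
    ready.wait()  # do not connect before the server is listening

    # The "client" end: connect, send a message, print the reply.
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as cli:
        cli.connect((HOST, PORT))
        cli.sendall(b"hello, network")
        print(cli.recv(1024).decode())  # prints: echo: hello, network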
Computers now impact almost every aspect of our lives, from our social interactions to the safety and performance of our cars. How did this happen in such a short time? And this is just the beginning. In this book, Tony Hey and Gyuri Pápay lead us on a journey from the early days of computers in the 1930s to the cutting-edge research of the present day that will shape computing in the coming decades. Along the way, they explain the ideas behind hardware, software, algorithms, Moore's Law, the birth of the personal computer, the Internet and the Web, the Turing Test, Jeopardy's Watson, World of Warcraft, spyware, Google, Facebook and quantum computing. This book also introduces the fascinating cast of dreamers and inventors who brought these great technological developments into every corner of the modern world. This exciting and accessible introduction will open up the universe of computing to anyone who has ever wondered where his or her smartphone came from.