Video games are bad for you? That’s what they said about rock and roll.
Shigeru Miyamoto
The first computer games
Since the earliest days, computers have been used for serious purposes and for fun. When computing resources were scarce and expensive, using computers for games was frowned upon and was typically an illicit late-night occupation of graduate students. Yet from these first clandestine experiments, video games have grown into big business. In 2012, global video game sales grew by more than 10 percent to over $65 billion. In the United States, a 2011 survey found that more than 90 percent of children aged between two and seventeen played video games. In addition, the Entertainment Software Association in the United States estimated that 40 percent of all game players are now women and that women over the age of eighteen make up a third of the total game-playing population. In this chapter we take a look at how this multibillion-dollar industry began and how video games have evolved from male-dominated “shoot ’em up” arcade games to more family-friendly casual games on smartphones and tablets.
One of the first computer games was written for the EDSAC computer at Cambridge University in 1952. Graduate student Alexander Douglas used a computer game as an illustration for his PhD dissertation on human-computer interaction. It was based on the game known as tic-tac-toe in the United States and noughts and crosses in the United Kingdom. Although Douglas did not name his game, computer historian Martin Campbell-Kelly saved it in a file called OXO for his simulator program, and this name now seems to have escaped into the wild. The player competed against the computer, with output appearing on the machine’s cathode ray tube (CRT), used as a display screen. The source code was short and, predictably, the computer could play a perfect game of tic-tac-toe (Fig. 9.1).
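Tic-tac-toe is small enough that a program can search the entire game tree, which is why even a very short program can play perfectly. Douglas’s program was written in EDSAC order code and is not reproduced here; the following minimal Python sketch of minimax search (board layout, symbols, and function names are ours, purely for illustration) shows why perfect play is so easy to achieve.

# A minimal sketch (not Douglas's EDSAC code): minimax search for
# tic-tac-toe in Python. The full game tree has fewer than 9! = 362,880
# move sequences, so exhaustive search is trivial -- hence perfect play.

def winner(board):
    # Return 'X' or 'O' if that player has three in a row, else None.
    lines = [(0, 1, 2), (3, 4, 5), (6, 7, 8),   # rows
             (0, 3, 6), (1, 4, 7), (2, 5, 8),   # columns
             (0, 4, 8), (2, 4, 6)]              # diagonals
    for a, b, c in lines:
        if board[a] != ' ' and board[a] == board[b] == board[c]:
            return board[a]
    return None

def minimax(board, player):
    # Return (score, move): +1 if X can force a win, -1 if O can, 0 if draw.
    w = winner(board)
    if w is not None:
        return (1 if w == 'X' else -1), None
    moves = [i for i, cell in enumerate(board) if cell == ' ']
    if not moves:
        return 0, None                      # board full: a draw
    best = None
    for m in moves:
        board[m] = player                   # try the move...
        score, _ = minimax(board, 'O' if player == 'X' else 'X')
        board[m] = ' '                      # ...then undo it
        if (best is None
                or (player == 'X' and score > best[0])
                or (player == 'O' and score < best[0])):
            best = (score, m)
    return best

score, move = minimax([' '] * 9, 'X')
print(score, move)   # prints "0 0": perfect play from an empty board is a draw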
It seems reasonable to envision, for a time 10 or 15 years hence, a “thinking center” that will incorporate the functions of present-day libraries together with anticipated advances in information storage and retrieval.... The picture readily enlarges itself into a network of such centers, connected to one another by wide-band communication lines and to individual users by leased-wire services. In such a system, the speed of the computers would be balanced, and the cost of gigantic memories and the sophisticated programs would be divided by the number of users.
J. C. R. Licklider
The network is the computer
Today, with the Internet and World Wide Web, it seems obvious that computers become far more powerful in all sorts of ways when they are connected together. In the 1970s this was not so obvious. This chapter is about how the Internet of today came about. As we can see from Licklider’s (B.10.1) quotation beginning this chapter, in addition to arguing for the importance of interactive computing in his 1960 paper “Man-Computer Symbiosis,” Lick also envisaged linking computers together, a practice we now call computer networking. Larry Roberts, Bob Taylor’s hand-picked successor at the Department of Defense’s Advanced Research Projects Agency (ARPA), was responsible for funding and overseeing the construction of the ARPANET, the first North American wide area network (WAN). A WAN links together computers over a large geographic area, such as a state or country, enabling the linked computers to share resources and exchange information.
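The ARPANET’s actual machinery (special-purpose IMPs and the NCP protocol) is long gone, but the basic idea it pioneered, programs on distant machines exchanging data over a shared network, survives unchanged. The Python sketch below is only a loose modern analogy using TCP sockets; the host name and port number are invented for illustration.

# A loose modern analogy, not the ARPANET's actual machinery: two programs
# exchanging a message over TCP. The host name and port are invented.
import socket

PORT = 9000  # arbitrary illustrative port

def serve_once():
    # Run on the "remote" machine: accept one connection and echo it back.
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.bind(("", PORT))
        srv.listen(1)
        conn, _addr = srv.accept()
        with conn:
            data = conn.recv(1024)
            conn.sendall(b"echo: " + data)

def send_message(host, text):
    # Run on the "local" machine: connect across the network, exchange data.
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.connect((host, PORT))
        s.sendall(text.encode())
        return s.recv(1024).decode()

# e.g. send_message("host.example.edu", "hello")  # hypothetical remote host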
Computers now impact almost every aspect of our lives, from our social interactions to the safety and performance of our cars. How did this happen in such a short time? And this is just the beginning. In this book, Tony Hey and Gyuri Pápay lead us on a journey from the early days of computers in the 1930s to the cutting-edge research of the present day that will shape computing in the coming decades. Along the way, they explain the ideas behind hardware, software, algorithms, Moore's Law, the birth of the personal computer, the Internet and the Web, the Turing Test, Jeopardy's Watson, World of Warcraft, spyware, Google, Facebook and quantum computing. This book also introduces the fascinating cast of dreamers and inventors who brought these great technological developments into every corner of the modern world. This exciting and accessible introduction will open up the universe of computing to anyone who has ever wondered where his or her smartphone came from.
A valuable resource for working programmers, as well as a fount of useful algorithmic tools for computer scientists, this new edition of the popular calendars book expands the treatment of the previous edition to new calendar variants: generic cyclical calendars and astronomical lunar calendars as well as the Korean, Vietnamese, Aztec, and Tibetan calendars. The authors frame the calendars of the world in a completely algorithmic form, allowing easy conversion among these calendars and the determination of secular and religious holidays. LISP code for all the algorithms is available on the Web.
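The unifying device in the authors’ approach is a common fixed day number, with day 1 falling on January 1, 1 CE of the proleptic Gregorian calendar, so that conversion between any two calendars passes through this count. The book’s own code is in Lisp; the Python sketch below renders the standard Gregorian-to-fixed-date rule purely as an illustration of the idea.

# Sketch of the fixed-date idea in Python (the book's own code is in Lisp).
# Day 1 of the count is January 1, 1 CE on the proleptic Gregorian calendar.

def is_gregorian_leap_year(year):
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

def gregorian_to_fixed(year, month, day):
    # Standard Gregorian-to-fixed rule; floor division matches the math.
    prior = year - 1
    days = (365 * prior                    # days in all prior years...
            + prior // 4                   # ...plus Julian-style leap days
            - prior // 100                 # ...minus skipped century years
            + prior // 400                 # ...plus retained 400-year leap days
            + (367 * month - 362) // 12)   # days in prior months of this year
    if month > 2:                          # correct the month approximation
        days -= 1 if is_gregorian_leap_year(year) else 2
    return days + day

assert gregorian_to_fixed(1, 1, 1) == 1
assert gregorian_to_fixed(2000, 3, 1) - gregorian_to_fixed(2000, 2, 28) == 2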
Artificial intelligence (AI) is a field within computer science that is attempting to build enhanced intelligence into computer systems. This book traces the history of the subject, from the early dreams of eighteenth-century (and earlier) pioneers to the more successful work of today's AI engineers. AI is becoming more and more a part of everyone's life. The technology is already embedded in face-recognizing cameras, speech-recognition software, Internet search engines, and health-care robots, among other applications. The book's many diagrams and easy-to-understand descriptions of AI programs will help the casual reader gain an understanding of how these and other AI systems actually work. Its thorough (but unobtrusive) end-of-chapter notes containing citations to important source materials will be of great use to AI scholars and researchers. This book promises to be the definitive history of a field that has captivated the imaginations of scientists, philosophers, and writers for centuries.
The mathematician and engineer Charles Babbage (1791–1871) is best remembered for his 'calculating machines', which are considered the forerunner of modern computers. Over the course of his life he wrote a number of books based on his scientific investigations, but in this volume, published in 1864, Babbage writes in a more personal vein. He points out at the beginning of the work that it 'does not aspire to the name of autobiography', though the chapters sketch out the contours of his life, beginning with his family, his childhood and formative years studying at Cambridge, and moving through various episodes in his scientific career. However, the work also diverges into his observations on other topics, as indicated by chapter titles such as 'Street Nuisances' and 'Wit'. Babbage's colourful recollections give an intimate portrait of the life of one of Britain's most influential inventors.
Computing and Language Variation explores dialects and social differences in language computationally, examining topics such as how (and how much) linguistic differences impede intelligibility and how national borders accelerate and direct change.
This textbook, for second- or third-year students of computer science, presents insights, notations, and analogies to help them describe and think about algorithms like an expert, without grinding through lots of formal proof. Solutions to many problems are provided to let students check their progress, while class-tested PowerPoint slides are on the web for anyone running the course. By looking at both the big picture and easy step-by-step methods for developing algorithms, the author guides students around the common pitfalls. He stresses paradigms such as loop invariants and recursion to unify a huge range of algorithms into a few meta-algorithms. The book fosters a deeper understanding of how and why each algorithm works. These insights are presented in a careful and clear way, helping students to think abstractly and preparing them for creating their own innovative ways to solve problems.
Despite the rapid growth of interest in the computer analysis of language, the field has lacked an integrated introduction; this book aims to provide one. Inevitably, when many different approaches are still being considered, a straightforward work of synthesis would be neither possible nor practicable. Nevertheless, Ralph Grishman provides a valuable survey of various approaches to the problems of syntax analysis, semantic analysis, text analysis and natural language generation, while considering in greater detail those that seem to him most productive. The book is written for readers with some background in computer science and finite mathematics, but advanced knowledge of programming languages or compilers is not necessary, nor is a background in linguistics. The exposition is always clear and students will find the exercises and extensive bibliography supporting the text particularly helpful.
This book provides a clear and accessible introduction to formal, and especially Montague, semantics within a linguistic framework. It presupposes no previous background in logic, but takes the student step-by-step from simple predicate/argument structures and their interpretation through to Montague's intensional logic. It covers all the major aspects, including set theory, propositional logic, type theory, lambda abstraction, traditional and generalised quantifiers, inference, tense and aspect, possible worlds semantics, and intensionality. Throughout, the emphasis is on the use of logical tools for linguistic semantics, rather than on purely logical topics, and the introductory chapter situates formal semantics within the general framework of linguistic semantics. It assumes some basic knowledge of linguistics, but aims to be as non-technical as possible within a technical subject. Formal Semantics will be welcomed by students of linguistics, artificial intelligence and cognitive science alike.
Information and Communication Technologies (ICTs) have profoundly changed many aspects of life, including the nature of entertainment, work, communication, education, healthcare, industrial production and business, social relations and conflicts. They have had a radical and widespread impact on our moral lives and hence on contemporary ethical debates. The Cambridge Handbook of Information and Computer Ethics, first published in 2010, provides an ambitious and authoritative introduction to the field, with discussions of a range of topics including privacy, ownership, freedom of speech, responsibility, technological determinism, the digital divide, cyber warfare, and online pornography. It offers an accessible and thoughtful survey of the transformations brought about by ICTs and their implications for the future of human life and society, for the evaluation of behaviour, and for the evolution of moral values and rights. It will be a valuable book for all who are interested in the ethical aspects of the information society in which we live.
This textbook is an introduction to denotational semantics and its applications to programming languages. Dr Allison emphasizes a practical approach and the student is encouraged to write and test denotational definitions. The first section is devoted to the mathematical foundations of the subject and sufficient detail is given to illustrate the fundamental problems. The remainder of the book covers the use of denotational semantics to describe sequential programming languages such as Algol, Pascal and C. Throughout, numerous exercises, usually in Pascal, will help the student practise writing definitions and carry out simple applications. The book culminates in discussing an executable semantics of the logic-programming language Prolog. As an introduction, the book will give advanced undergraduates in computer science, and graduates new to the subject, a readily accessible account of one of the central topics of computer science.
This 1992 collection takes the exciting step of examining natural language phenomena from the perspective of both computational linguistics and formal semantics. Computational linguistics has until now been primarily concerned with the construction of computational models for handling the complexities of linguistic form, but has not tackled the questions of representing or computing meaning. Formal semantics, on the other hand, has attempted to account for the relations between forms and meanings, without necessarily attending to computational concerns. The book introduces the reader to the two disciplines and considers the prospects for the more unified and comprehensive computational theory of language which might obtain from their amalgamation. Of great interest to those working in the fields of computation, logic, semantics, artificial intelligence and linguistics generally.
Computability and Logic has become a classic because of its accessibility to students without a mathematical background and because it covers not simply the staple topics of an intermediate logic course, such as Gödel's incompleteness theorems, but also a large number of optional topics, from Turing's theory of computability to Ramsey's theorem. This 2007 fifth edition has been thoroughly revised by John Burgess. Including a selection of exercises, adjusted for this edition, at the end of each chapter, it offers a simpler treatment of the representability of recursive functions, a traditional stumbling block for students on the way to the Gödel incompleteness theorems. This updated edition is also accompanied by a website as well as an instructor's manual.
This textbook is designed as a first book on concurrent programming for computer science undergraduates, and provides a comprehensive introduction to the problems of concurrency. Concurrency is of vital importance in many areas of computer science, particularly in operating systems. It is also increasingly being taught in undergraduate courses. The book builds on the student's familiarity with sequential programming in a high-level language, which will make it very accessible to computer science students. The book is concerned mainly with the high-level aspects of concurrency, which will be equally applicable to traditional time-sliced systems and to more recent truly parallel systems.
In this lively series of essays, Tom Dean explores interesting fundamental topics in computer science with the aim of showing how computers and computer programs work and how the various subfields of computer science are connected. Along the way, he conveys his fascination with computers and enthusiasm for working in a field that has changed almost every aspect of our daily lives. The essays touch on a wide range of topics, from digital logic and machine language to artificial intelligence and searching the World Wide Web, considering such questions as: How can a computer learn to recognize junk email? What happens when you click on a link in a browser? How can you program a robot to do two things at once? Are there limits on what computers can do? The author invites readers to experiment with short programs written in several languages. Through these interactions he grounds the models and metaphors of computer science and makes the underlying computational ideas more concrete. The accompanying web site http://www.cs.brown.edu/~tld/talk/ provides easy access to code fragments from the book, tips on finding and installing software, links to online resources, exercises and sample lectures.
This fourth edition of one of the classic logic textbooks has been thoroughly revised by John Burgess. The aim is to increase the pedagogical value of the book for the core market of students of philosophy and for students of mathematics and computer science as well. This book has become a classic because of its accessibility to students without a mathematical background, and because it covers not simply the staple topics of an intermediate logic course such as Gödel's incompleteness theorems, but also a large number of optional topics from Turing's theory of computability to Ramsey's theorem. John Burgess has now enhanced the book by adding a selection of problems at the end of each chapter, and by reorganising and rewriting chapters to make them more independent of each other and thus to increase the range of options available to instructors as to what to cover and what to defer.
This textbook is an introduction to the design and writing of computer programs. It leads the reader through all the stages of program construction from the original specifications through to the final program. The formal verification of intermediate versions of the program is studied in considerable detail. The authors show how, given the formal specification of a program, data structure and program structure diagrams are drawn and then converted into a procedural program in a program design language (PDL). They demonstrate the conversion of PDL into a variety of real programming languages including Pascal, FORTRAN, COBOL, and Assembler. The book also includes chapters on abstract data types, analysing existing programs, and a small case study. First-year undergraduates in computer science and graduates taking courses in computing will find this a comprehensive introduction to program construction.
Free logic is an important field of philosophical logic that first appeared in the 1950s. J. Karel Lambert was one of its founders and coined the term itself. The essays in this collection (written over a period of 40 years) explore the philosophical foundations of free logic and its application to areas as diverse as the philosophy of religion and computer science. Amongst the applications on offer are those to the analysis of existence statements, to definite descriptions and to partial functions. The volume contains a proof that free logics of any kind are non-extensional and then uses that proof to show that Quine's theory of predication and referential transparency must fail. The purpose of this collection is to bring an important body of work to the attention of a new generation of professional philosophers, computer scientists and mathematicians.