As soon as an Analytical Engine exists, it will necessarily guide the future course of the science. Whenever any result is sought by its aid, the question will then arise – by what course of calculation can these results be arrived at by the machine in the shortest time?
Charles Babbage
Beginnings
What is an algorithm? The word is derived from the name of the Persian scholar Mohammad Al-Khowarizmi (see B.5.1 and Fig. 5.1). In the introduction to his classic book Algorithmics: The Spirit of Computing, computer scientist David Harel gives the following definition:
An algorithm is an abstract recipe, prescribing a process that might be carried out by a human, by a computer, or by other means. It thus represents a very general concept, with numerous applications. Its principal interest and use, however, is in those cases where the process is to be carried out by a computer.
Thus an algorithm can be regarded as a “recipe” detailing the mathematical steps to follow in order to carry out a particular task. This could be a numerical algorithm for solving a differential equation or an algorithm for completing a more abstract task, such as sorting a list of items according to some specified property. The word algorithmics was introduced by J. F. Traub in a textbook in 1964 and popularized as a key field of study in computer science by Donald Knuth (B.5.2) and David Harel (B.5.3). Once the steps of an algorithm for a particular task have been identified, the programmer chooses a programming language to express the algorithm in a form that the computer can understand.
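To make the idea concrete, here is a minimal sketch of one such recipe, written in Python purely for illustration (the text itself is not tied to any particular language): an insertion sort that orders a list of items according to a specified property, here the length of each word.

```python
# Insertion sort: a simple "recipe" for ordering a list of items.
# The "key" argument lets us sort according to some specified property,
# e.g. the length of a word or the price of a product.

def insertion_sort(items, key=lambda x: x):
    result = list(items)              # work on a copy; leave the input untouched
    for i in range(1, len(result)):
        current = result[i]
        j = i - 1
        # Shift larger elements one place to the right until the
        # correct slot for `current` has been found.
        while j >= 0 and key(result[j]) > key(current):
            result[j + 1] = result[j]
            j -= 1
        result[j + 1] = current
    return result

if __name__ == "__main__":
    words = ["algorithm", "recipe", "sort", "computer"]
    print(insertion_sort(words, key=len))   # shortest word first
```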
In 2012 the U.S. National Research Council published the report “Continuing Innovation in Information Technology.” The report contained an updated version of the Tire Tracks figure, first published in 1995. Figure A.2 gives examples of how computer science research, in universities and in industry, has directly led to the introduction of entirely new categories of products that have ultimately provided the basis for new billion-dollar industries. Most of the university-based research has been federally funded.
The bottom row of the figure shows specific computer science research areas where major investments have resulted in the different information technology industries and companies shown at the top of the figure. The vertical red tracks represent university-based research and the blue tracks represent industry research and development. The dashed black lines indicate periods following the introduction of significant commercial products resulting from this research, and the green lines represent the establishment of billion-dollar industries with the thick green lines showing the achievement of multibillion-dollar markets by some of these industries.
No one saw these mice coming. No one, that is, in my field, writing science fictions. Oh, a few novels were written about those Big Brains, a few New Yorker cartoons were drawn showing those immense electric craniums that needed whole warehouses to THINK in. But no one in all of future writing foresaw those big brutes dieted down to fingernail earplug size so you could shove Moby Dick in one ear and pull Job and Ecclesiastes out the other.
Ray Bradbury
Early visions
The British science fiction writer Brian Aldiss traces the origin of science fiction to Mary Shelley’s Frankenstein in 1818. In her book, the unwise scientist Victor Frankenstein deliberately uses his knowledge of anatomy, chemistry, electricity, and physiology to create a living creature. An alternative starting point dates back to the second half of the nineteenth century and the writings of Jules Verne (B.17.1) and Herbert George (H. G.) Wells (B.17.2). This was a very exciting time for science – in 1859 Charles Darwin had published the Origin of Species; in 1864 James Clerk Maxwell had unified the theories of electricity and magnetism; in 1869 Mendeleev had brought some order to chemistry with his Periodic Table of the Elements; and Joule and Kelvin were laying the foundations of thermodynamics. Verne had the idea of combining modern science with an adventure story to create a new type of fiction. After publishing his first such story, “Five Weeks in a Balloon,” in 1863, he wrote:
I have just finished a novel in a new form, a new form – do you understand? If it succeeds, it will be a gold mine.
There are many “popular” books on science that provide accessible accounts of the recent developments of modern science for the general reader. However, there are very few popular books about computer science – arguably the “science” that has changed the world the most in the last half century. This book is an attempt to address this imbalance and to present an accessible account of the origins and foundations of computer science. In brief, the goal of this book is to explain how computers work, how we arrived at where we are now, and where we are likely to be going in the future.
The key inspiration for writing this book came from Physics Nobel Prize recipient Richard Feynman. In his lifetime, Feynman was one of the few physicists well known to the general public. There were three main reasons for this recognition. First, there were some wonderful British television programs of Feynman talking about his love for physics. Second, there was his best-selling book “Surely You’re Joking, Mr. Feynman!”: Adventures of a Curious Character, an entertaining collection of stories about his life in physics – from his experiences at Los Alamos and the Manhattan atomic bomb project, to his days as a professor at Cornell and Caltech. And third, there was his participation, while he was battling the cancer that eventually took his life, in the enquiry following the Challenger space shuttle disaster. His live demonstration, at a televised press conference, of the effects of freezing water on the rubber O-rings of the space shuttle booster rockets was a wonderfully understandable explanation of the origin of the disaster.
Video games are bad for you? That’s what they said about rock and roll.
Shigeru Miyamoto
The first computer games
Since the earliest days, computers have been used both for serious purposes and for fun. When computing resources were scarce and expensive, using computers for games was frowned upon and was typically an illicit occupation of graduate students late at night. Yet from these first clandestine experiments, video games have grown into big business. In 2012, global video game sales grew by more than 10 percent to more than $65 billion. In the United States, a 2011 survey found that more than 90 percent of children aged between two and seventeen played video games. In addition, the Entertainment Software Association in the United States estimated that 40 percent of all game players are now women and that women over the age of eighteen make up a third of the total game-playing population. In this chapter we take a look at how this multibillion-dollar industry began and how video games have evolved from male-dominated “shoot ’em up” arcade games to more family-friendly casual games on smart phones and tablets.
One of the first computer games was written for the EDSAC computer at Cambridge University in 1952. Graduate student Alexander Douglas used a computer game as an illustration for his PhD dissertation on human-computer interaction. The game was based on the game called tic-tac-toe in the United States and noughts and crosses in the United Kingdom. Although Douglas did not name his game, computer historian Martin Campbell-Kelly saved the game in a file called OXO for his simulator program, and this name now seems to have escaped into the wild. The player competed against the computer, and the output was programmed to appear on the computer’s cathode ray tube (CRT), which served as a display screen. The source code was short and, predictably, the computer could play a perfect game of tic-tac-toe (Fig. 9.1).
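Douglas’s original EDSAC program is not reproduced here; the following Python sketch is only a modern illustration of why a computer can play perfect tic-tac-toe: the game tree is so small that a program can simply search it exhaustively with the standard minimax procedure.

```python
# A modern illustration (not Douglas's EDSAC code) of perfect tic-tac-toe play:
# the game tree is small enough to search exhaustively with minimax.

LINES = [(0,1,2),(3,4,5),(6,7,8),(0,3,6),(1,4,7),(2,5,8),(0,4,8),(2,4,6)]

def winner(board):
    for a, b, c in LINES:
        if board[a] != " " and board[a] == board[b] == board[c]:
            return board[a]
    return None

def minimax(board, player):
    """Best achievable outcome for 'X' (+1 win, 0 draw, -1 loss), both sides optimal."""
    w = winner(board)
    if w:
        return 1 if w == "X" else -1
    if " " not in board:
        return 0
    scores = []
    for i, cell in enumerate(board):
        if cell == " ":
            board[i] = player
            scores.append(minimax(board, "O" if player == "X" else "X"))
            board[i] = " "
    return max(scores) if player == "X" else min(scores)

def best_move(board, player="X"):
    """Choose the square with the best minimax score for the given player."""
    best, best_score = None, None
    for i, cell in enumerate(board):
        if cell == " ":
            board[i] = player
            score = minimax(board, "O" if player == "X" else "X")
            board[i] = " "
            better = best is None or (score > best_score if player == "X" else score < best_score)
            if better:
                best, best_score = i, score
    return best

if __name__ == "__main__":
    board = ["X", "O", "X",
             " ", "O", " ",
             " ", " ", " "]
    # X must block O's middle column; minimax chooses square 7.
    print(best_move(board, player="X"))
```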
It seems reasonable to envision, for a time 10 or 15 years hence, a “thinking center” that will incorporate the functions of present-day libraries together with anticipated advances in information storage and retrieval.... The picture readily enlarges itself into a network of such centers, connected to one another by wide-band communication lines and to individual users by leased-wire services. In such a system, the speed of the computers would be balanced, and the cost of gigantic memories and the sophisticated programs would be divided by the number of users.
J. C. R. Licklider
The network is the computer
Today, with the Internet and World Wide Web, it seems obvious that computers become much more powerful in all sorts of ways when they are connected together. In the 1970s this was far from obvious. This chapter is about how the Internet of today came about. As we can see from Licklider’s (B.10.1) quotation at the beginning of this chapter, in addition to arguing for the importance of interactive computing in his 1960 paper “Man-Computer Symbiosis,” Lick also envisaged linking computers together, a practice we now call computer networking. Larry Roberts, Bob Taylor’s hand-picked successor at the Department of Defense’s Advanced Research Projects Agency (ARPA), was the person responsible for funding and overseeing the construction of the ARPANET, the first North American wide area network (WAN). A WAN links together computers over a large geographic area, such as a state or country, enabling the linked computers to share resources and exchange information.
This paper presents a new numerical approach using a genetic algorithm (GA) to search for the singularity-free cylindrical space of a 4-RRR planar redundant parallel manipulator, and investigates the effects of the joint position (namely, the length ratios of the two links) of each leg on the singularity-free cylindrical space. A previous method investigated the maximal singularity-free zone in a three-dimensional (3-D) space within a given workspace. The method in this paper improves on that approach by optimizing the maximal singularity-free zone in a two-dimensional (2-D) plane while considering the whole workspace. This improvement helps to reduce the search time and to find a larger singularity-free zone. Furthermore, the effect of the joint position of each leg on the maximal singularity-free zone is studied, revealing a larger singularity-free zone than before. The results show that changing the joint positions of one or two legs may be more practical than changing the joint positions of more legs. The approach can be used to analyze the maximal singularity-free zone of other three-degree-of-freedom (3-DOF) planar parallel mechanisms and will be useful for the optimal design of redundant parallel manipulators.
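The abstract does not give implementation details, so the following Python fragment is a purely hypothetical sketch of the kind of GA search it describes; the singularity_free_radius fitness function is a placeholder, which a real implementation would replace with the manipulator’s Jacobian-based singularity analysis.

```python
import random

# Hypothetical sketch of a genetic-algorithm (GA) search for a maximal
# singularity-free zone centre in a 2-D plane. The fitness function below is a
# placeholder; a real implementation would evaluate the manipulator's Jacobian
# to compute the largest singularity-free radius around each candidate point.

def singularity_free_radius(x, y):
    # Placeholder fitness: pretend the best zone is centred near (0.3, 0.6).
    return max(0.0, 1.0 - ((x - 0.3) ** 2 + (y - 0.6) ** 2) ** 0.5)

def evolve(pop_size=40, generations=100, mutation=0.05):
    # Each individual is a candidate zone centre (x, y) inside the unit square.
    pop = [(random.random(), random.random()) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda p: singularity_free_radius(*p), reverse=True)
        parents = pop[: pop_size // 2]                      # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            (x1, y1), (x2, y2) = random.sample(parents, 2)
            child = ((x1 + x2) / 2, (y1 + y2) / 2)          # arithmetic crossover
            child = (child[0] + random.gauss(0, mutation),  # Gaussian mutation
                     child[1] + random.gauss(0, mutation))
            children.append(child)
        pop = parents + children
    return max(pop, key=lambda p: singularity_free_radius(*p))

if __name__ == "__main__":
    cx, cy = evolve()
    print("best zone centre:", (round(cx, 3), round(cy, 3)))
```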
We present a novel motorized semi-autonomous mobile hospital bed guided by a human operator and a reactive navigation algorithm. The reactive navigation algorithm is launched when the sensory device detects that the hospital bed is in potential danger of collision. The semi-autonomous bed is able to deliver critical neurosurgery (head trauma) patients safely and quickly to target locations in dynamic, uncertain hospital environments, such as crowded corridors, while avoiding both stationary and moving obstacles en route. We do not restrict the nature or the motion of the obstacles: their shapes may be time-varying or deforming, and they may undergo arbitrary motions. The only information available to the navigation system is the current distance to the nearest obstacle. The performance of the proposed navigation algorithm is verified through theoretical studies, and simulation and experimental results confirm its performance in real-world scenarios.
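The control law is not spelled out in the abstract; as a hypothetical sketch of the reactive pattern it describes, the Python fragment below follows the operator’s guidance until the measured distance to the nearest obstacle drops below a safety threshold, at which point a simple avoidance behaviour takes over. The threshold value and the evasive turn are invented here for illustration only.

```python
# Hypothetical sketch of a reactive navigation loop. Only the current distance
# to the nearest obstacle is assumed to be available, as stated in the abstract;
# sensor access and motor commands are placeholders.

SAFETY_DISTANCE = 0.8   # metres; switch to avoidance below this range (assumed value)
CRUISE_SPEED = 1.0      # m/s when following the operator's guidance (assumed value)

def control_step(nearest_obstacle_distance, operator_heading):
    """Return (speed, heading) for one control cycle."""
    if nearest_obstacle_distance < SAFETY_DISTANCE:
        # Reactive avoidance: slow down in proportion to how close the obstacle
        # is, and turn away from the operator's current heading.
        speed = CRUISE_SPEED * nearest_obstacle_distance / SAFETY_DISTANCE
        heading = operator_heading + 90.0    # degrees; a simple evasive turn
        return speed, heading
    # No imminent danger: follow the human operator.
    return CRUISE_SPEED, operator_heading

if __name__ == "__main__":
    for distance in (2.0, 0.6, 0.2):
        print(distance, control_step(distance, operator_heading=0.0))
```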
Wireless sensor networks are an emerging technology with a wide range of applications in military and civilian domains. The book begins by detailing the basic principles and concepts of wireless sensor networks, including information gathering, energy management and the structure of sensory nodes. It proceeds to examine advanced topics, covering localisation, topology, security and evaluation of wireless sensor networks, highlighting international research being carried out in this area. Finally, it features numerous examples of applications of this technology to a range of domains, such as wireless, multimedia, underwater and underground wireless sensor networks. The concise but clear presentation of the important principles, techniques and applications of wireless sensor networks makes this guide an excellent introduction for anyone new to the subject, as well as an ideal reference for practitioners and researchers.
Do you need to get up to date with the world's most popular networking technology? With this resource you will discover everything you need to know about Ethernet and its implementation in the automotive industry. Enhance your technical understanding and better inform your decision-making process so that you can experience the benefits of Ethernet implementation. From new market opportunities to lower costs and less complex processes, this is the first book to provide a comprehensive overview of automotive Ethernet. Covering electromagnetic requirements and physical layer technologies, Quality of Service, the use of VLANs, IP, and Service Discovery, as well as network architecture and testing, this unique and comprehensive resource is a must-have, whether you are a professional in the automotive industry or an academic who needs a detailed overview of this revolutionary technology and its historical background.
Whether you are a developer, engineer, researcher or student, this practical guide gives you everything you need to know about NFC technology and its applications. You will learn what differentiates NFC from other short-range technologies such as contactless cards, RFID and Bluetooth, as well as discovering the opportunities it provides, from a fast and instinctive user interface with no infrastructure requirements to the world of Secure Elements, Trusted Service Managers, mobile wallets and the Internet of Things. With critical applications in areas including advertising, retail and transportation, this book demonstrates how you can use NFC technology practically to make transactions easier and quicker. All of this is supplemented with an array of in-depth case studies and real-life examples to reinforce your understanding, along with detailed coverage of the problems associated with the wider commercial introduction of NFC and strategies that can be used to aid its future development.
Written by leading authorities in database and Web technologies, this book is essential reading for students and practitioners alike. The popularity of the Web and Internet commerce provides many extremely large datasets from which information can be gleaned by data mining. This book focuses on practical algorithms that have been used to solve key problems in data mining and can be applied successfully to even the largest datasets. It begins with a discussion of the map-reduce framework, an important tool for parallelizing algorithms automatically. The authors explain the tricks of locality-sensitive hashing and stream processing algorithms for mining data that arrives too fast for exhaustive processing. Other chapters cover the PageRank idea and related tricks for organizing the Web, the problems of finding frequent itemsets and clustering. This second edition includes new and extended coverage on social networks, machine learning and dimensionality reduction.
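As a toy illustration of the map-reduce framework mentioned above, the following single-process Python sketch expresses the classic word-count computation with separate map, shuffle, and reduce phases; in a real deployment each phase would be distributed across many machines, and this is not code taken from the book.

```python
from collections import defaultdict

# Toy single-process illustration of the map-reduce pattern: word count.
# In a real system the map calls, the shuffle, and the reduce calls would
# each be spread across many machines working in parallel.

def map_phase(document):
    # Emit a (word, 1) pair for every word in the document.
    return [(word.lower(), 1) for word in document.split()]

def shuffle(pairs):
    # Group all emitted values by key, as the framework does between map and reduce.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Sum the counts for each word.
    return {word: sum(counts) for word, counts in groups.items()}

if __name__ == "__main__":
    documents = ["the web is large", "the data is larger"]
    pairs = [pair for doc in documents for pair in map_phase(doc)]
    print(reduce_phase(shuffle(pairs)))
```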
We give an algebraic characterization of the syntax and semantics of a class of untyped functional programming languages.
To this end, we introduce a notion of 2-signature: such a signature specifies not only the terms of a language, but also reduction rules on those terms. To any 2-signature (S, A) we associate a category of ‘models’. We then prove that this category has an initial object, which integrates the terms freely generated by S, and which is equipped with reductions according to the rules given in A. We call this initial object the programming language generated by (S, A). Models of a 2-signature are built from relative monads and modules over such monads. Through the use of monads, the models – and in particular, the initial model – come equipped with a substitution operation that is compatible with reduction in a suitable sense.
The initiality theorem is formalized in the proof assistant Coq, yielding machinery that, when fed a 2-signature, provides the associated programming language together with its reduction relation and certified substitution.
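The canonical instance of such a 2-signature is the untyped lambda calculus, whose signature specifies the term constructors (application and abstraction) and whose single rule is beta-reduction. The Python sketch below is only an informal illustration of this "terms plus reduction rules" reading, using de Bruijn indices; it does not model the relative monads and modules used in the paper.

```python
# Informal illustration of "terms generated by a signature, plus reduction rules":
# the untyped lambda calculus with beta-reduction, using de Bruijn indices so
# that substitution needs no variable renaming.

from dataclasses import dataclass

@dataclass
class Var:
    index: int        # bound variable, as a de Bruijn index

@dataclass
class App:
    fun: object       # application  (t u)
    arg: object

@dataclass
class Lam:
    body: object      # abstraction  (lambda. t)

def shift(term, by, cutoff=0):
    """Adjust free indices when a term is moved under (or out from under) a binder."""
    if isinstance(term, Var):
        return Var(term.index + by) if term.index >= cutoff else term
    if isinstance(term, App):
        return App(shift(term.fun, by, cutoff), shift(term.arg, by, cutoff))
    return Lam(shift(term.body, by, cutoff + 1))      # Lam case

def subst(term, value, depth=0):
    """Substitute `value` for the variable bound at `depth`."""
    if isinstance(term, Var):
        return shift(value, depth) if term.index == depth else term
    if isinstance(term, App):
        return App(subst(term.fun, value, depth), subst(term.arg, value, depth))
    return Lam(subst(term.body, value, depth + 1))    # Lam case

def beta_step(term):
    """One beta-reduction step at the root: (lambda. b) a  ~>  b[a]."""
    if isinstance(term, App) and isinstance(term.fun, Lam):
        # Substitute the argument, then remove the binder that was consumed.
        return shift(subst(term.fun.body, shift(term.arg, 1)), -1)
    return term

if __name__ == "__main__":
    identity = Lam(Var(0))
    k_combinator = Lam(Lam(Var(1)))
    print(beta_step(App(identity, k_combinator)))   # identity applied to K reduces to K
```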
Timed automata and register automata are well-known models of computation over timed words and data words, respectively. The former have clocks that allow one to test the time elapsed between two events, whereas the latter have registers that can store data values for later comparison. Although the two models appear to behave differently, several decision problems have the same (un)decidability and complexity results for both. As a prominent example, emptiness is decidable for alternating automata with one clock or one register, in both cases with non-primitive recursive complexity. This is not by chance.
This work confirms that there is indeed a tight relationship between the two models. We show that a run of a timed automaton can be simulated by a register automaton over an ordered data domain, and conversely that a run of a register automaton can be simulated by a timed automaton. These reductions take exponential time and hold in both the finite- and infinite-word settings. Our results allow decidability results to be transferred back and forth between these two kinds of models, as well as complexity results modulo an exponential-time reduction. We demonstrate the usefulness of these reductions by obtaining new results on register automata.
Computers now impact almost every aspect of our lives, from our social interactions to the safety and performance of our cars. How did this happen in such a short time? And this is just the beginning. In this book, Tony Hey and Gyuri Pápay lead us on a journey from the early days of computers in the 1930s to the cutting-edge research of the present day that will shape computing in the coming decades. Along the way, they explain the ideas behind hardware, software, algorithms, Moore's Law, the birth of the personal computer, the Internet and the Web, the Turing Test, Jeopardy's Watson, World of Warcraft, spyware, Google, Facebook and quantum computing. This book also introduces the fascinating cast of dreamers and inventors who brought these great technological developments into every corner of the modern world. This exciting and accessible introduction will open up the universe of computing to anyone who has ever wondered where his or her smartphone came from.
In a recent paper, Girard (2012) proposed to use his recent construction of a geometry of interaction in the hyperfinite factor (Girard 2011) in an innovative way to characterize complexity classes. We begin by giving a detailed explanation of both the choices and the motivations behind Girard's definitions. We then provide a complete proof that the complexity class co-NL can be characterized using this new approach. As a technical tool, we introduce the non-deterministic pointer machine, a concrete model of computation.
Performance evaluation of wireless sensor networks (WSNs), as of any communications network, can be conducted using simulation analysis, analytic modeling, and measurement/testing techniques. Evaluation of WSN systems is needed at every stage of their life cycle. There is no point in designing and implementing a new system that does not have competitive performance or that fails to meet its objectives and quality-of-service requirements. Performance evaluation of an existing system is also important, since it helps to determine how well the system is performing and whether any improvements are needed in order to enhance its performance [1].
After a system has been built and is running, its performance can be assessed by using the measurement technique. In order to evaluate the performance of a component or subsystem that cannot be measured, for example, during the design and development phases, it is necessary to use analytic and/or simulation modeling so as to predict the performance [1–46].
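As a minimal, hypothetical example of the simulation approach, the following Python sketch uses Monte Carlo simulation to estimate the probability that a packet crosses a multi-hop sensor path, given an assumed per-hop loss rate and retransmission limit; a real WSN study would use a full network simulator and realistic radio and energy models.

```python
import random

# Hypothetical Monte Carlo sketch of simulation-based performance evaluation:
# estimate the probability that a packet crosses a multi-hop sensor path, given
# an assumed per-hop loss rate and a retransmission limit. All parameter values
# below are illustrative assumptions, not measurements from a real WSN.

def delivered(hops, loss_rate, max_retries):
    for _ in range(hops):
        for _ in range(max_retries + 1):
            if random.random() > loss_rate:   # this transmission succeeded
                break
        else:
            return False                      # all attempts on this hop failed
    return True

def estimate_delivery_probability(hops=5, loss_rate=0.2, max_retries=2, trials=100_000):
    successes = sum(delivered(hops, loss_rate, max_retries) for _ in range(trials))
    return successes / trials

if __name__ == "__main__":
    print("estimated delivery probability:",
          round(estimate_delivery_probability(), 3))
```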
The objective of this chapter is to provide an up-to-date treatment of the techniques that can be used to evaluate the performance of WSN systems.
Background information
Wireless sensor networks (WSNs) have certain unique aspects that distinguish them from other wireless networks. These aspects include: