This is the revised and expanded 1998 edition of a popular introduction to the design and implementation of geometry algorithms arising in areas such as computer graphics, robotics, and engineering design. The basic techniques used in computational geometry are all covered: polygon triangulations, convex hulls, Voronoi diagrams, arrangements, geometric searching, and motion planning. The self-contained treatment presumes only an elementary knowledge of mathematics, but reaches topics on the frontier of current research, making it a useful reference for practitioners at all levels. The second edition contains material on several new topics, such as randomized algorithms for polygon triangulation, planar point location, 3D convex hull construction, intersection algorithms for ray-segment and ray-triangle, and point-in-polyhedron. The code in this edition is significantly improved from the first edition (more efficient and more robust), and four new routines are included. Java versions for this new edition are also available. All code is accessible from the book's Web site (http://cs.smith.edu/~orourke/) or by anonymous ftp.
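To give a flavour of the kind of algorithm the book covers, here is a minimal 2D convex hull sketch in Python (Andrew's monotone chain variant; an illustrative sketch of the technique, not the book's own C code):

```python
def cross(o, a, b):
    """Z-component of (a - o) x (b - o); positive means a left turn."""
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

def convex_hull(points):
    """Andrew's monotone chain: hull vertices in counter-clockwise order."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts

    def half_hull(seq):
        h = []
        for p in seq:
            # pop while the last two points and p do not make a left turn
            while len(h) >= 2 and cross(h[-2], h[-1], p) <= 0:
                h.pop()
            h.append(p)
        return h

    lower = half_hull(pts)
    upper = half_hull(reversed(pts))
    # drop the last point of each half: it repeats at the start of the other
    return lower[:-1] + upper[:-1]
```

For instance, `convex_hull([(0, 0), (1, 0), (1, 1), (0, 1), (0.5, 0.5)])` discards the interior point and returns the four corners of the square.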
Teaches students how to solve ethical dilemmas in the field of computing, taking a philosophical, rather than a legal, approach to the topic. It first examines the principles of Idealism, Realism, Pragmatism, Existentialism, and Philosophical Analysis, explaining how each of them might be adopted as a basis for solving computing dilemmas. The book then presents a worksheet of key questions to be used in solving dilemmas. Twenty-nine cases, drawn from the real-life experiences of computer professionals, are included in the book as a means to let students experiment with solving ethical dilemmas and identify the philosophical underpinnings of the solutions.
This easy-to-read guide provides a concise introduction to the engineering background of modern communication systems, from mobile phones to data compression and storage. Background mathematics and specific engineering techniques are kept to a minimum so that only a basic knowledge of high-school mathematics is needed to understand the material covered. The authors begin with many practical applications in coding, including the repetition code, the Hamming code and the Huffman code. They then explain the corresponding information theory, from entropy and mutual information to channel capacity and the information transmission theorem. Finally, they provide insights into the connections between coding theory and other fields. Many worked examples are given throughout the book, using practical applications to illustrate theoretical definitions. Exercises are also included, enabling readers to double-check what they have learned and gain glimpses into more advanced topics, making this perfect for anyone who needs a quick introduction to the subject.
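As a taste of one of the coding applications mentioned, here is a minimal Huffman-code construction in Python (an illustrative sketch, not taken from the book; the function and variable names are ours):

```python
import heapq
from collections import Counter

def huffman_code(text):
    """Build a binary prefix code: frequent symbols get shorter codewords."""
    freq = Counter(text)
    if len(freq) == 1:  # degenerate single-symbol alphabet
        return {next(iter(freq)): "0"}
    # heap entries: (weight, tiebreak, {symbol: partial codeword})
    heap = [(w, i, {s: ""}) for i, (s, w) in enumerate(sorted(freq.items()))]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        # merge the two lightest subtrees, prefixing 0/1 to their codewords
        w1, _, c1 = heapq.heappop(heap)
        w2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + code for s, code in c1.items()}
        merged.update({s: "1" + code for s, code in c2.items()})
        heapq.heappush(heap, (w1 + w2, counter, merged))
        counter += 1
    return heap[0][2]
```

On the input "aaabbc", the most frequent symbol 'a' receives a one-bit codeword while 'b' and 'c' receive two-bit codewords.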
For undergraduate and beginning graduate students, this textbook explains and examines the central concepts used in modern programming languages, such as functions, types, memory management, and control. The book is unique in its comprehensive presentation and comparison of major object-oriented programming languages. Separate chapters examine the history of objects, Simula and Smalltalk, and the prominent languages C++ and Java. The author presents foundational topics, such as lambda calculus and denotational semantics, in an easy-to-read, informal style, focusing on the main insights provided by these theories. Advanced topics include concurrency, concurrent object-oriented programming, program components, and inter-language interoperability. A chapter on logic programming illustrates the importance of specialized programming methods for certain kinds of problems. This book will give the reader a better understanding of the issues and tradeoffs that arise in programming language design, and a better appreciation of the advantages and pitfalls of the programming languages they use.
The principles of cognition are becoming increasingly important in the areas of signal processing, communications and control. In this groundbreaking book, Simon Haykin, a pioneer in the field and an award-winning researcher, educator and author, sets out the fundamental ideas of cognitive dynamic systems. Weaving together the various branches of study involved, he demonstrates the power of cognitive information processing and highlights a range of future research directions. The book begins with a discussion of core topics such as cognition and sensing, dealing, in particular, with the perception-action cycle. Bayesian filtering, machine learning and dynamic programming are then addressed. Building on these foundations, there is detailed coverage of two important practical applications, cognitive radar and cognitive radio. Blending theory and practice, this insightful book is aimed at all graduate students and researchers looking for a thorough grounding in this fascinating field.
This textbook for advanced undergraduates and graduate students emphasizes algorithms for a range of strategies for locomotion, sensing, and reasoning. It concentrates on wheeled and legged mobile robots but discusses a variety of other propulsion systems. This edition includes advances in robotics and intelligent machines over the ten years prior to publication, including significant coverage of SLAM (simultaneous localization and mapping) and multi-robot systems. It includes additional mathematical background and an extensive list of sample problems. Various mathematical techniques that were assumed in the first edition are now briefly introduced in appendices at the end of the text to make the book more self-contained. Researchers as well as students in the field of mobile robotics will appreciate this comprehensive treatment of state-of-the-art methods and key technologies.
Fundamentals of OOP and Data Structures in Java is a text for an introductory course on classical data structures. Part One of the book presents the basic principles of Object-Oriented Programming (OOP) and Graphical User Interface (GUI) programming with Java as the example language. Part Two introduces each of the major data structures with supporting, GUI-based laboratory programs designed to reinforce the basic concepts and principles of the text. These laboratories allow the reader to explore and experiment with the properties of each data structure. All source code for the laboratories is available on the web. By integrating the principles of OOP and GUI programming, this book takes the unique path of presenting the fundamental issues of data structures within the context of paradigms that are essential to today's professional software developer. The authors assume the reader has only an elementary understanding of Java and no experience with OOP.
This new, expanded textbook describes all phases of a modern compiler: lexical analysis, parsing, abstract syntax, semantic actions, intermediate representations, instruction selection via tree matching, dataflow analysis, graph-coloring register allocation, and runtime systems. It includes good coverage of current techniques in code generation and register allocation, as well as functional and object-oriented languages, that are missing from most books. In addition, more advanced chapters are now included so that it can be used as the basis for a two-semester or graduate course. The most accepted and successful techniques are described in a concise way, rather than as an exhaustive catalog of every possible variant. Detailed descriptions of the interfaces between modules of a compiler are illustrated with actual C header files. The first part of the book, Fundamentals of Compilation, is suitable for a one-semester first course in compiler design. The second part, Advanced Topics, which includes the advanced chapters, covers the compilation of object-oriented and functional languages, garbage collection, loop optimizations, SSA form, loop scheduling, and optimization for cache-memory hierarchies.
Free logic is an important field of philosophical logic that first appeared in the 1950s. J. Karel Lambert was one of its founders and coined the term itself. The essays in this collection (written over a period of 40 years) explore the philosophical foundations of free logic and its application to areas as diverse as the philosophy of religion and computer science. Amongst the applications on offer are those to the analysis of existence statements, to definite descriptions and to partial functions. The volume contains a proof that free logics of any kind are non-extensional and then uses that proof to show that Quine's theory of predication and referential transparency must fail. The purpose of this collection is to bring an important body of work to the attention of a new generation of professional philosophers, computer scientists and mathematicians.
Hundreds of millions of people across the world use the Internet every day. Its functions vary, from shopping and banking to chatting and dating. From a psychological perspective, the Internet has become a major vehicle for interpersonal communication that can significantly affect people's decisions, behaviors, attitudes and emotions. Moreover, its existence has created a virtual social environment in which people can meet, negotiate, collaborate and exchange goods and information. Cyberspace is not just a technical device but a phenomenon that has reduced the world to a proverbial global village, fostering collaboration and international cooperation and thus reducing the barriers of geographical distance and indigenous cultures. Azy Barak and a team of prominent social scientists review a decade of scientific investigations into the social, behavioral and psychological aspects of cyberspace, collating state-of-the-art knowledge in each area. Together they develop emerging conceptualizations and envisage directions and applications for future research.
The past decade has seen many advances in physical layer wireless communication theory and their implementation in wireless systems. This textbook takes a unified view of the fundamentals of wireless communication and explains the web of concepts underpinning these advances at a level accessible to an audience with a basic background in probability and digital communication. Topics covered include MIMO (multi-input, multi-output) communication, space-time coding, opportunistic communication, OFDM and CDMA. The concepts are illustrated using many examples from real wireless systems such as GSM, IS-95 (CDMA), IS-856 (1 x EV-DO), Flash OFDM and UWB (ultra-wideband). Particular emphasis is placed on the interplay between concepts and their implementation in real systems. An abundant supply of exercises and figures reinforce the material in the text. This book is intended for use on graduate courses in electrical and computer engineering and will also be of great interest to practising engineers.
One of the most cited books in physics of all time, Quantum Computation and Quantum Information remains the best textbook in this exciting field of science. This 10th anniversary edition includes an introduction from the authors setting the work in context. This comprehensive textbook describes such remarkable effects as fast quantum algorithms, quantum teleportation, quantum cryptography and quantum error-correction. Quantum mechanics and computer science are introduced before moving on to describe what a quantum computer is, how it can be used to solve problems faster than 'classical' computers and its real-world implementation. It concludes with an in-depth treatment of quantum information. Containing a wealth of figures and exercises, this well-known textbook is ideal for courses on the subject, and will interest beginning graduate students and researchers in physics, computer science, mathematics, and electrical engineering.
We are surrounded by products that have minds of their own. Computing power, in the form of microcontrollers, microprocessors, sensors, and data storage chips, has become so cheap that manufacturers are building connectivity and embedded intelligence into all types of consumer goods. These 'smart products' are fundamentally changing both the competitive landscape for business and the daily lives of consumers. This book analyzes the evolution of smart products to help managers understand the impact of embedded product intelligence on corporate strategy, consumer value, and industry competition. It describes four different ecosystem strategies for designing and launching smart products: the control-focused Hegemon, the standards-focused Federator, the high growth and brand-focused Charismatic Leader, and the disruptive industry Transformer. This ecosystem model is then applied to smart products in the automotive, wireless, energy, residential, and health industries. The book concludes with recommendations for successfully managing smart products and services.
Following a discussion of various forms of set-theoretical foundations of category theory and the controversial question of whether category theory does or can provide an autonomous foundation of mathematics, this article concentrates on the question of whether there is a foundation for “unlimited” or “naive” category theory. The author proposed four criteria for such a foundation some years ago. The article describes how much had previously been accomplished on one approach to meeting those criteria, then resolves one important obstacle that had been encountered in that approach, and finally explains what remains to be done if one is to have a fully satisfactory solution.
This paper deals with the decidability of semigroup freeness. More precisely, the freeness problem over a semigroup S is defined as: given a finite subset X ⊆ S, decide whether each element of S has at most one factorization over X. To date, the decidabilities of the following two freeness problems have been closely examined. In 1953, Sardinas and Patterson proposed a now famous algorithm for the freeness problem over free monoids. In 1991, Klarner, Birget and Satterfield proved the undecidability of the freeness problem over three-by-three integer matrices. Both results led to the publication of many subsequent papers. The aim of the present paper is (i) to present general results about freeness problems, (ii) to study the decidability of freeness problems over various particular semigroups (special attention is devoted to multiplicative matrix semigroups), and (iii) to propose precise, challenging open questions in order to promote the study of the topic.
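The Sardinas–Patterson algorithm mentioned above, for the freeness problem over free monoids, can be sketched as follows (a minimal Python illustration over sets of words; the function and variable names are ours):

```python
def dangling(u, v):
    """If u is a proper prefix of v, return the leftover suffix; else None."""
    if v.startswith(u) and len(v) > len(u):
        return v[len(u):]
    return None

def is_uniquely_decodable(code):
    """Sardinas-Patterson test: True iff every word has at most one
    factorization over the given finite set of codewords."""
    code = set(code)
    # S1: dangling suffixes from one codeword being a proper prefix of another
    current = {dangling(u, v) for u in code for v in code} - {None}
    seen = set()
    while current:
        if current & code:      # a dangling suffix is itself a codeword
            return False        # -> some word has two factorizations
        frozen = frozenset(current)
        if frozen in seen:      # suffix sets cycle without hitting a codeword
            return True
        seen.add(frozen)
        nxt = set()
        for w in current:
            for c in code:
                for d in (dangling(w, c), dangling(c, w)):
                    if d is not None:
                        nxt.add(d)
        current = nxt
    return True
```

For example, {0, 10, 110} is a prefix code and hence free, while {a, ab, ba} is not free: the word "aba" factors both as a·ba and as ab·a.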
Since Findler and Felleisen (Findler, R. B. & Felleisen, M. 2002) introduced higher-order contracts, many variants have been proposed. Broadly, these fall into two groups: some follow Findler and Felleisen (2002) in using latent contracts, purely dynamic checks that are transparent to the type system; others use manifest contracts, where refinement types record the most recent check that has been applied to each value. These two approaches are commonly assumed to be equivalent—different ways of implementing the same idea, one retaining a simple type system, and the other providing more static information. Our goal is to formalize and clarify this folklore understanding. Our work extends that of Gronski and Flanagan (Gronski, J. & Flanagan, C. 2007), who defined a latent calculus λC and a manifest calculus λH, gave a translation φ from λC to λH, and proved that if a λC term reduces to a constant, so does its φ-image. We enrich their account with a translation ψ from λH to λC and prove an analogous theorem. We then generalize the whole framework to dependent contracts, whose predicates can mention free variables. This extension is both pragmatically crucial, supporting a much more interesting range of contracts, and theoretically challenging. We define dependent versions of λH and two dialects (“lax” and “picky”) of λC, establish type soundness—a substantial result in itself, for λH — and extend φ and ψ accordingly. Surprisingly, the intuition that the latent and manifest systems are equivalent now breaks down: the extended translations preserve behavior in one direction, but in the other, sometimes yield terms that blame more.
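For readers unfamiliar with latent contracts, here is a minimal Python sketch of the idea: checks attached dynamically, invisible to the type system, with blame assigned to the offending party. The decorator and names are illustrative only, not the paper's λC or λH calculi:

```python
def contract(pre, post):
    """Latent contract: check the argument against `pre` and the
    result against `post` at run time, blaming the appropriate party."""
    def wrap(f):
        def checked(x):
            if not pre(x):
                # caller supplied a bad argument
                raise AssertionError("blame caller: precondition violated")
            y = f(x)
            if not post(y):
                # the function itself produced a bad result
                raise AssertionError("blame " + f.__name__ +
                                     ": postcondition violated")
            return y
        return checked
    return wrap

@contract(pre=lambda n: isinstance(n, int) and n >= 0,
          post=lambda r: r >= 1)
def factorial(n):
    return 1 if n == 0 else n * factorial(n - 1)
```

Calling `factorial(5)` passes both checks and returns 120, while `factorial(-1)` raises an error blaming the caller; a manifest system would instead record the checked property in the value's refinement type.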
Region-based memory management (RBMM) is a form of compile time memory management, well-known from the world of functional programming. In this paper we describe our work on implementing RBMM for the logic programming language Mercury. One interesting point about Mercury is that it is designed with strong type, mode, and determinism systems. These systems not only provide Mercury programmers with several direct software engineering benefits, such as self-documenting code and clear program logic, but also give language implementors a large amount of information that is useful for program analyses. In this work, we make use of this information to develop program analyses that determine the distribution of data into regions and transform Mercury programs by inserting into them the necessary region operations. We prove the correctness of our program analyses and transformation. To execute annotated programs, we have implemented runtime support that tackles the two main challenges posed by backtracking. First, backtracking can require regions removed during forward execution to be “resurrected”; and second, any memory allocated during computation that has been backtracked over must be recovered promptly without waiting for the regions involved to come to the end of their life. We describe in detail our solution of both these problems. We study in detail how our RBMM system performs on a selection of benchmark programs, including some well-known difficult cases for RBMM. Even with these difficult cases, our RBMM-enabled Mercury system obtains clearly faster runtimes for 15 out of 18 benchmarks compared to the base Mercury system with its Boehm runtime garbage collector, with an average runtime speedup of 24%, and an average reduction in memory requirements of 95%. In fact, our system achieves optimal memory consumption in some programs.
David L. Waltz died on March 22, 2012 after suffering from brain cancer.
Dave was a good friend to Natural Language Engineering, and provided some sage advice when Roberto Garigliano and I started working on the proposed journal in the early 1990s; he subsequently agreed to serve as a founding editorial board member.