The TCP/IP family of protocols has become the de facto standard in the world of networking, is found in virtually all computer communication systems, and forms the basis of today's Internet. TCP/IP Essentials is a hands-on guide to TCP/IP technologies, and shows how the protocols are implemented in practice. The book contains a series of extensively tested laboratory experiments that span the various elements of protocol definition and behavior. Topics covered include bridges, routers, LANs, static and dynamic routing, multicast and real-time service, and network management and security. The experiments are described in a Linux environment, with parallel notes on Solaris implementation. The book includes many homework exercises, and supplementary material for instructors is available. The book is aimed at students of electrical and computer engineering and students of computer science taking courses in networking. It is also an ideal guide for engineers studying for networking certifications.
The new edition of this successful and established textbook retains its two original intentions of explaining how to program in the ML language, and teaching the fundamentals of functional programming. The major change is the early and prominent coverage of modules, which are extensively used throughout. In addition, the first chapter has been totally rewritten to make the book more accessible to those without experience of programming languages. The main features of the new Standard Library for the revised version of ML are described and many new examples are given, while references have also been updated. Dr Paulson has extensive practical experience of ML and has stressed its use as a tool for software engineering; the book contains many useful pieces of code, which are freely available (via the Internet) from the author. He shows how to use lists, trees, higher-order functions and infinite data structures. Many illustrative and practical examples are included. Efficient functional implementations of arrays, queues, priority queues, etc. are described. Larger examples include a general top-down parser, a lambda-calculus reducer and a theorem prover. The combination of careful explanation and practical advice will ensure that this textbook continues to be the preferred text for many courses on ML.
The unique feature of this compact student's introduction is that it presents concepts in an order that closely follows a standard mathematics curriculum, rather than structuring the book around features of the software. As a result, the book provides a brief introduction to those aspects of the Mathematica software program most useful to students. The second edition of this well-loved book is completely rewritten for Mathematica 6, including coverage of the new dynamic interface elements, several hundred exercises and a new chapter on programming. This book can be used in a variety of courses, from precalculus to linear algebra. Used as a supplementary text it will aid in bridging the gap between the mathematics in the course and Mathematica. In addition to its course use, this book will serve as an excellent tutorial for those wishing to learn Mathematica and brush up on their mathematics at the same time.
C# programmers: no more translating data structures from C++ or Java to use in your programs! Mike McMillan provides a tutorial on how to use data structures and algorithms plus the first comprehensive reference for C# implementation of data structures and algorithms found in the .NET Framework library, as well as those developed by the programmer. The approach is very practical, using timing tests rather than Big O notation to analyze the efficiency of an approach. Coverage includes arrays and array lists, linked lists, hash tables, dictionaries, trees, graphs, and sorting and searching algorithms, as well as more advanced algorithms such as probabilistic algorithms and dynamic programming. This is the perfect resource for C# professionals and students alike.
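The timing-test approach described above can be illustrated with a minimal sketch. This is not the book's code: it is a hypothetical example in Python rather than C#, purely to keep it self-contained, and the function names are illustrative. The idea is to compare two implementations empirically by wall-clock time rather than reasoning about Big O:

```python
import time
import bisect

def time_call(fn, *args, repeats=5):
    """Return the best wall-clock time (seconds) over several runs."""
    best = float("inf")
    for _ in range(repeats):
        start = time.perf_counter()
        fn(*args)
        best = min(best, time.perf_counter() - start)
    return best

data = list(range(1_000_000))  # a large sorted dataset

def linear_search(xs, target):
    return target in xs  # scans the whole list in the worst case

def binary_search(xs, target):
    i = bisect.bisect_left(xs, target)  # halves the range each step
    return i < len(xs) and xs[i] == target

# A timing test makes the difference concrete on real data:
t_lin = time_call(linear_search, data, 999_999)
t_bin = time_call(binary_search, data, 999_999)
```

On a dataset this size the binary search measurably outperforms the linear scan, which is the kind of concrete evidence a timing test yields where asymptotic notation only gives a trend.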
Device testing represents the single largest manufacturing expense in the semiconductor industry, costing over $40 billion a year. The most comprehensive and wide-ranging book of its kind, Testing of Digital Systems covers everything you need to know about this vitally important subject. Starting right from the basics, the authors take the reader through automatic test pattern generation, design for testability and built-in self-test of digital circuits before moving on to more advanced topics such as IDDQ testing, functional testing, delay fault testing, memory testing, and fault diagnosis. The book includes detailed treatment of the latest techniques including test generation for various fault models, discussion of testing techniques at different levels of integrated circuit hierarchy and a chapter on system-on-a-chip test synthesis. Written for students and engineers, it is both an excellent senior/graduate level textbook and a valuable reference.
With clear and easy-to-understand explanations, this book covers the fundamental concepts and coding methods of signal compression, whilst still retaining technical depth and rigor. It contains a wealth of illustrations, step-by-step descriptions of algorithms, examples and practice problems, which make it an ideal textbook for senior undergraduate and graduate students, as well as a useful self-study tool for researchers and professionals. Principles of lossless compression are covered, as are various entropy coding techniques, including Huffman coding, arithmetic coding and Lempel-Ziv coding. Scalar and vector quantization and trellis coding are thoroughly explained, and a full chapter is devoted to mathematical transformations including the KLT, DCT and wavelet transforms. The workings of transform and subband/wavelet coding systems, including JPEG2000 and SBHP image compression and H.264/AVC video compression, are explained and a unique chapter is provided on set partition coding, shedding new light on SPIHT, SPECK, EZW and related methods.
Multiagent systems combine multiple autonomous entities, each having diverging interests or different information. This overview of the field offers a computer science perspective, but also draws on ideas from game theory, economics, operations research, logic, philosophy and linguistics. It will serve as a reference for researchers in each of these fields, and be used as a text for advanced undergraduate or graduate courses. The authors emphasize foundations to create a broad and rigorous treatment of their subject, with thorough presentations of distributed problem solving, game theory, multiagent communication and learning, social choice, mechanism design, auctions, cooperative game theory, and modal logics of knowledge and belief. For each topic, basic concepts are introduced, examples are given, proofs of key results are offered, and algorithmic considerations are examined. An appendix covers background material in probability theory, classical logic, Markov decision processes and mathematical programming.
Numerical Mathematics is a unique textbook which presents rudimentary numerical mathematics in conjunction with computational laboratory assignments. No previous knowledge of calculus or linear algebra is presupposed, and thus the book is tailor-made for undergraduate students, as well as prospective mathematics teachers. The material in the book emphasises algorithmic aspects of mathematics, which are made viable through numerical assignments, in which the traditional 'chalk-and-talk' lecturer turns, in part, into a laboratory instructor. The book is not a numerical methods book, containing ready-made computational recipes. Rather, it guides the student to create the algorithm required for any given assignment - expressed in whichever programming language is used - on the basis of the underlying mathematics. The computational assignments cover iterative processes, area approximations, solution of linear systems, acceleration of series summation, interpolative approximations, and construction of computer-library functions. Throughout the book, strong emphasis is placed on vital concepts such as error bounds, precision control, numerical efficiency and computational complexity, as well as round-off errors and numerical stability. It is the authors' belief that the material presented in this book is part and parcel of the mathematical foundations that should be acquired by a student in the microcomputer era.
Information and Communication Technologies (ICTs) have profoundly changed many aspects of life, including the nature of entertainment, work, communication, education, healthcare, industrial production and business, social relations and conflicts. They have had a radical and widespread impact on our moral lives and hence on contemporary ethical debates. The Cambridge Handbook of Information and Computer Ethics, first published in 2010, provides an ambitious and authoritative introduction to the field, with discussions of a range of topics including privacy, ownership, freedom of speech, responsibility, technological determinism, the digital divide, cyber warfare, and online pornography. It offers an accessible and thoughtful survey of the transformations brought about by ICTs and their implications for the future of human life and society, for the evaluation of behaviour, and for the evolution of moral values and rights. It will be a valuable book for all who are interested in the ethical aspects of the information society in which we live.
Are all film stars linked to Kevin Bacon? Why do the stock markets rise and fall sharply on the strength of a vague rumour? How does gossip spread so quickly? Are we all related through six degrees of separation? There is a growing awareness of the complex networks that pervade modern society. We see them in the rapid growth of the internet, the ease of global communication, the swift spread of news and information, and in the way epidemics and financial crises develop with startling speed and intensity. This introductory book on the new science of networks takes an interdisciplinary approach, using economics, sociology, computing, information science and applied mathematics to address fundamental questions about the links that connect us, and the ways that our decisions can have consequences for others.
This textbook provides a clear and concise introduction to computer architecture and implementation. Two important themes are interwoven throughout the book. The first is an overview of the major concepts and design philosophies of computer architecture and organization. The second is the early introduction and use of analytic modeling of computer performance. The author begins by describing the classic von Neumann architecture, and then presents in detail a number of performance models and evaluation techniques. He goes on to cover user instruction set design, including RISC architecture. A unique feature of the book is its memory-centric approach - memory systems are discussed before processor implementations. The author also deals with pipelined processors, input/output techniques, queuing models, and extended instruction set architectures. Each topic is illustrated with reference to actual IBM and Intel architectures. The book contains many worked examples and over 130 homework exercises. It is an ideal textbook for a one-semester undergraduate course in computer architecture and implementation.
The popularity of the Web and Internet commerce provides many extremely large datasets from which information can be gleaned by data mining. This book focuses on practical algorithms that have been used to solve key problems in data mining and which can be used on even the largest datasets. It begins with a discussion of the map-reduce framework, an important tool for parallelizing algorithms automatically. The authors explain the tricks of locality-sensitive hashing and stream processing algorithms for mining data that arrives too fast for exhaustive processing. The PageRank idea and related tricks for organizing the Web are covered next. Other chapters cover the problems of finding frequent itemsets and clustering. The final chapters cover two applications: recommendation systems and Web advertising, each vital in e-commerce. Written by two authorities in database and Web technologies, this book is essential reading for students and practitioners alike.
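The map-reduce idea mentioned above can be sketched in a few lines. This is a toy, single-machine illustration, not the book's code: a real map-reduce framework runs the map and reduce phases in parallel across many machines, with the shuffle handled by the framework. The function names here are illustrative:

```python
from collections import defaultdict
from itertools import chain

def map_phase(doc):
    """Mapper: emit a (key, value) pair for every word occurrence."""
    return [(word, 1) for word in doc.split()]

def shuffle(pairs):
    """Group values by key, as the framework does between map and reduce."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(key, values):
    """Reducer: combine all values for one key; here, a word count."""
    return key, sum(values)

docs = ["the quick brown fox", "the lazy dog", "the fox"]
pairs = chain.from_iterable(map_phase(d) for d in docs)
counts = dict(reduce_phase(k, vs) for k, vs in shuffle(pairs).items())
# e.g. counts["the"] == 3 and counts["fox"] == 2
```

Because each mapper call and each reducer call is independent, the framework can distribute them across a cluster automatically, which is what makes the model attractive for the very large datasets the book targets.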
The nervous system is made up of a large number of interacting elements. To understand how such a complex system functions requires the construction and analysis of computational models at many different levels. This book provides a step-by-step account of how to model the neuron and neural circuitry to understand the nervous system at all levels, from ion channels to networks. Starting with a simple model of the neuron as an electrical circuit, the book gradually adds more detail to include the effects of neuronal morphology, synapses, ion channels and intracellular signalling. The principle of abstraction is explained through chapters on simplifying models, and how simplified models can be used in networks. This theme is continued in a final chapter on modelling the development of the nervous system. Requiring an elementary background in neuroscience and some high school mathematics, this textbook is an ideal basis for a course on computational neuroscience.
The computational education of biologists is changing to prepare students for the complex datasets of today's life science research. In this concise textbook, the authors' fresh pedagogical approaches lead biology students from first principles towards computational thinking. A team of renowned bioinformaticians take innovative routes to introduce computational ideas in the context of real biological problems. Intuitive explanations promote deep understanding, using little mathematical formalism. Self-contained chapters show how computational procedures are developed and applied to central topics in bioinformatics and genomics, such as the genetic basis of disease, genome evolution or the tree of life concept. Using bioinformatic resources requires a basic understanding of what bioinformatics is and what it can do. Rather than just presenting tools, the authors - each a leading scientist - engage the students' problem-solving skills, preparing them to meet the computational challenges of their life science careers.
This book provides a systematic analysis of many common argumentation schemes and a compendium of 96 schemes. The study of these schemes, or forms of argument that capture stereotypical patterns of human reasoning, is at the core of argumentation research. Surveying all aspects of argumentation schemes from the ground up, the book takes the reader from the elementary exposition in the first chapter to the latest state of the art in the research efforts to formalize and classify the schemes, outlined in the last chapter. It provides a systematic and comprehensive account, with notation suitable for computational applications that increasingly make use of argumentation schemes.
Computers and the Law provides readers with an introduction to the legal issues associated with computing – particularly in the massively networked context of the Internet. Assuming no previous knowledge of the law or any special knowledge of programming or computer science, this textbook offers undergraduates of all disciplines and professionals in the computing industry an understanding of basic legal principles and an awareness of the peculiarities associated with legal issues in cyberspace. This is not a law school casebook, but rather a variety of carefully chosen, relevant cases presented in redacted form. The full cases are available on an ancillary Web site. The pervasiveness of computing in modern society has generated numerous legal ambiguities. This book introduces readers to the fundamental workings of the law in physical space and suggests the opportunity to create new types of laws with nontraditional goals.
This textbook, first published in 2003, emphasises the fundamentals and the mathematics underlying computer graphics. The minimal prerequisites, a basic knowledge of calculus and vectors plus some programming experience in C or C++, make the book suitable for self-study or for use as an advanced undergraduate or introductory graduate text. The author gives a thorough treatment of transformations and viewing, lighting and shading models, interpolation and averaging, Bézier curves and B-splines, ray tracing and radiosity, and intersection testing with rays. Additional topics, covered in less depth, include texture mapping and colour theory. The book covers some aspects of animation, including quaternions, orientation, and inverse kinematics, and includes source code for a ray-tracing software package. The book is intended for use along with any OpenGL programming book, but the crucial features of OpenGL are briefly covered to help readers get up to speed. Accompanying software is available freely from the book's web site.
The dream of automatic language translation is now closer thanks to recent advances in the techniques that underpin statistical machine translation. This class-tested textbook from an active researcher in the field provides a clear and careful introduction to the latest methods and explains how to build machine translation systems for any two languages. It introduces the subject's building blocks from linguistics and probability, then covers the major models for machine translation: word-based, phrase-based, and tree-based, as well as machine translation evaluation, language modeling, discriminative training and advanced methods to integrate linguistic annotation. The book also reports the latest research, presents the major outstanding challenges, and enables novices as well as experienced researchers to make novel contributions to this exciting area. Ideal for students at undergraduate and graduate level, or for anyone interested in the latest developments in machine translation.
This introduction to the basic ideas of structural proof theory contains a thorough discussion and comparison of various types of formalization of first-order logic. Examples are given of several areas of application, namely: the metamathematics of pure first-order logic (intuitionistic as well as classical); the theory of logic programming; category theory; modal logic; linear logic; first-order arithmetic and second-order logic. In each case the aim is to illustrate the methods in relatively simple situations and then apply them elsewhere in much more complex settings. There are numerous exercises throughout the text. In general, the only prerequisite is a standard course in first-order logic, making the book ideal for graduate students and beginning researchers in mathematical logic, theoretical computer science and artificial intelligence. For the new edition, many sections have been rewritten to improve clarity, new sections have been added on cut elimination, and solutions to selected exercises have been included.
This book gives a comprehensive description of the architecture of microprocessors, from simple in-order short-pipeline designs to out-of-order superscalars. It discusses topics such as:
- the policies and mechanisms needed for out-of-order processing, such as register renaming, reservation stations, and reorder buffers;
- optimizations for high performance, such as branch predictors, instruction scheduling, and load-store speculation;
- design choices and enhancements to tolerate latency in the cache hierarchy of single and multiple processors;
- state-of-the-art multithreading and multiprocessing, emphasizing single-chip implementations.
Topics are presented as conceptual ideas, with metrics to assess the performance impact, if appropriate, and examples of realization. The emphasis is on how things work at a black-box and algorithmic level. The author also provides sufficient detail at the register-transfer level so that readers can appreciate how design features enhance performance as well as complexity.