Linearity is a multi-faceted and ubiquitous notion in the analysis and development of programming language concepts. We study linearity from a denotational perspective by picking out programs that correspond to linear functions between domains.
We propose a PCF-like language that imposes linear constraints on the use of variables, so that only linear functions can be programmed. To obtain a full abstraction result, we introduce some higher-order operators related to exception handling and parallel evaluation. We study several notions of operational equivalence and show that they coincide for our language. Finally, we present a new operational evaluation of the language that provides the basis for a real implementation. It exploits the denotational linearity to provide an efficient SECD-like evaluation semantics that avoids the use of closures.
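The linear constraint on variable use can be illustrated with a minimal sketch (not the paper's system; all names here are invented for this example): a syntactic check that every bound variable of a lambda term occurs exactly once, which rules out both duplication and discarding.

```python
# Minimal illustrative sketch: syntactic linearity of lambda terms.
# A term is linear when every bound variable is used exactly once.
from dataclasses import dataclass

@dataclass
class Var:
    name: str

@dataclass
class Lam:
    param: str
    body: object

@dataclass
class App:
    fn: object
    arg: object

def occurrences(term, name):
    """Count free occurrences of `name` in `term`."""
    if isinstance(term, Var):
        return 1 if term.name == name else 0
    if isinstance(term, Lam):
        return 0 if term.param == name else occurrences(term.body, name)
    if isinstance(term, App):
        return occurrences(term.fn, name) + occurrences(term.arg, name)
    raise TypeError(term)

def is_linear(term):
    """True if every bound variable in `term` is used exactly once."""
    if isinstance(term, Var):
        return True
    if isinstance(term, Lam):
        return occurrences(term.body, term.param) == 1 and is_linear(term.body)
    if isinstance(term, App):
        return is_linear(term.fn) and is_linear(term.arg)
    raise TypeError(term)

# \x. x is linear; \x. x x duplicates x; \x. \y. x discards y.
print(is_linear(Lam("x", Var("x"))))                 # True
print(is_linear(Lam("x", App(Var("x"), Var("x")))))  # False
print(is_linear(Lam("x", Lam("y", Var("x")))))       # False
```

A real linear type system would track usage through typing contexts rather than by counting occurrences, but the constraint enforced is the same.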
Although the language Lucid was not originally intended to support computing with infinite data structures, the notion of (infinite) sequence quickly came to the fore, together with a demand-driven computation model in which demands are propagated for the values of particular variables at particular index points. This naturally generalized to sequences of multiple dimensions so that a programmer could, for example, write a program that could be understood as a (nonterminating) loop in which one of the loop variables is an infinite vector.
Programmers inevitably found use for more and more dimensions, which led to a problem that is fully solved for the first time in this paper. The problem is that the implementation's cache requires some estimate of the dimensions actually used to compute a value being fetched. This estimate can be difficult or (if dimensions are passed as parameters) impossible to obtain, and the demand-driven evaluation model for Lucid breaks down.
We outline the evolution of Lucid which gave rise to this problem, and outline the solution, as used for the implementation of TransLucid, the latest descendant of Lucid.
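The demand-driven model and its caching problem can be sketched as follows (an illustrative toy, not the TransLucid implementation; all names are invented here): a variable is a function from an index point, a mapping of dimension names to indices, to a value, and demands are memoised in a cache whose key must mention exactly the dimensions the variable depends on.

```python
# Illustrative sketch of demand-driven evaluation of a Lucid-style
# "infinite" sequence. The cache key is built from the dimensions the
# variable is known to use -- the estimate that, as the paper explains,
# becomes hard or impossible to obtain in general.
cache = {}

def demand(var, point):
    """Demand the value of `var` at index point `point`, caching results."""
    key = (var.__name__, tuple(sorted((d, point[d]) for d in var.dims)))
    if key not in cache:
        cache[key] = var(point)
    return cache[key]

def fib(point):
    # The Fibonacci sequence as an infinite stream varying in dimension 't'.
    t = point["t"]
    if t < 2:
        return t
    return demand(fib, {**point, "t": t - 1}) + demand(fib, {**point, "t": t - 2})

fib.dims = {"t"}  # statically known set of relevant dimensions

print(demand(fib, {"t": 10}))  # 55
```

If `fib.dims` could not be determined (for instance, if the dimension name arrived as a parameter), the cache key would be wrong and memoisation would break, which is the failure mode the paper addresses.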
This paper solves the known problem of eliminating unnecessary internal element construction, as well as variable elimination, in XML processing with (a subset of) XQuery, without ignoring the issues of document order. The semantics of XQuery is context sensitive and requires preservation of document order. In this paper, we propose, as far as we are aware, the first XQuery fusion that can deal with both the document order and the context of XQuery expressions. More specifically, we carefully design a context representation of XQuery expressions based on the Dewey order encoding, develop a context-preserving XQuery fusion for ordered trees by static emulation of the XML store, and prove that our fusion is correct. Our XQuery fusion has been implemented, and all the examples in this paper have been run through the system.
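The Dewey order encoding that the context representation builds on can be sketched briefly (a hedged illustration, not the paper's formal development): a node's identifier is the path of child positions from the root, so document order is plain lexicographic comparison and ancestorship is a prefix test.

```python
# Illustrative sketch of Dewey-order node identifiers for an ordered tree:
# a node ID is the tuple of child positions from the root.

def document_order(a, b):
    """-1, 0, or 1 according to document order of Dewey IDs a and b."""
    return (a > b) - (a < b)  # tuples compare lexicographically

def is_ancestor(a, b):
    """True if node a is a proper ancestor of node b."""
    return len(a) < len(b) and b[:len(a)] == a

root = ()
first_child = (1,)
grandchild = (1, 2)
second_child = (2,)

print(document_order(first_child, second_child))  # -1: (1,) precedes (2,)
print(is_ancestor(first_child, grandchild))       # True
print(is_ancestor(first_child, second_child))     # False
```

Because both document order and structural relationships are recoverable from the identifiers alone, an encoding of this kind lets a transformation reason about order statically, without materialising the intermediate XML store.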
The Computer-based Corpsman Training System (CBCTS) and its forebear the Tactical Combat Casualty Care Simulation (TC3sim) are serious games designed to train military combat medical personnel. The designs of the two games do not differ significantly. TC3sim was built for the U.S. Army and involves Iraq scenarios. CBCTS has some upgraded visuals and is skinned for the Marine Corps. Its scenarios are geared toward Afghanistan. Their designs share the same learning objectives, the same medical interactions, the same assessment model, and the same physiological simulations. In their development, the complexity of simulating synthetic casualties and the combinations of user interactions were significantly underestimated. However, success came from two factors. The development of a simple user interface allowed users to quickly learn how to play the game and manage the large number of medical interactions. The employment of iterative releases allowed for constant feedback to be collected and integrated back into the game design.
Introduction
The Computer-based Corpsman Training System (CBCTS) is a first-person serious game designed to train U.S. Navy combat medical personnel who are assigned to the U.S. Marines (called corpsmen) how to respond to casualties on the battlefield. It is based on the U.S. Army’s Tactical Combat Casualty Care Simulation (TC3sim). CBCTS and TC3sim are essentially the same game, but CBCTS’s visuals were customized for the U.S. Marines. Instead of Iraq, CBCTS uses scenarios set in Afghanistan. The warfighters’ characters have also been reskinned to be appropriate for the Marines and Navy services.
We introduce the motivations, history, technical approach, and design choices behind DARWARS Ambush!, a game-based, convoy operations trainer that was heavily used by the U.S. Army and Marines for five years. We discuss a number of the practical deployment concerns we addressed and discuss how we cultivated relationships to build a community of committed users. As one of the first large-scale, successful serious games for learning, DARWARS Ambush! broke new ground and led to many lessons learned on how to best design, develop, and deploy a serious game. We discuss some of our experiences, decisions, and lessons learned, and conclude with some recommendations that may help new efforts attain success as well.
Introduction
In late 2004, DARPA Program Manager Dr. Ralph Chatham asked BBN Technologies, which was already under contract on his DARWARS Training Superiority Program, whether we could quickly – within six months – deploy a training system to help soldiers better respond to convoy ambushes then prevalent in Iraq. At that time, convoy ambushes involving small arms, rocket-propelled grenades (RPGs), or improvised explosive devices (IEDs) were a leading cause of casualties. The U.S. military had recognized the need for increased training for convoy operations and aggressively pursued a variety of training solutions, including live-fire training exercises, marksmanship trainers, and driver training systems (see Steele, 2004; Tiron, 2004 for examples). Dr. Chatham recognized the need for a squad-level team trainer that would focus on situational awareness, communication, and coordination.
The Virtual Dental Implant Trainer (VDIT) is a 3-D simulation environment for dental students to practice dental implant surgery procedures. It provides a highly authentic surgery experience for trainees looking to practice techniques learned elsewhere, or for experienced dentists looking to refresh their skills. Because of its focus on being a practice environment, VDIT does not contain many of the instructional design techniques often found in many other training simulations. Furthermore, there is limited use of game elements found in many other serious games. However, given the tasks and emphasis on practice, this is acceptable. With additional effort VDIT could be transitioned into a more effective and engaging instructional environment.
Introduction
The Virtual Dental Implant Trainer (VDIT) is a highly accurate procedural training simulation environment for dentists. VDIT is not intended to be a stand-alone learning experience for those first learning how to perform dental implant surgery. Rather, it was specifically designed to be used in conjunction with other training, or for those seeking a practice environment. These decisions on use greatly affected the game’s design. The remaining sections of this chapter look at the effectiveness of these decisions on VDIT.
Session types and contracts are two formalisms used to study client–server protocols. In this paper, we study the relationship between them. The main result is the existence of a fully abstract model of session types; this model is based on a natural interpretation of these types into a subset of contracts.
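The client–server relationship at the heart of session types can be sketched with a toy duality computation (a hedged illustration with invented notation, not the paper's model): a session type is a sequence of send (`!`) and receive (`?`) actions, and a client and server can interact safely when their types are duals.

```python
# Minimal illustrative sketch: duality of first-order session types,
# represented as lists of (polarity, payload-type) pairs.

def dual(session):
    """Swap sends and receives: the server side of a client protocol."""
    flip = {"!": "?", "?": "!"}
    return [(flip[polarity], payload) for polarity, payload in session]

client = [("!", "int"), ("?", "bool")]  # send an int, then receive a bool
server = dual(client)
print(server)  # [('?', 'int'), ('!', 'bool')]
```

Real session types also include branching, choice, and recursion; the paper's contribution is to interpret such types into a subset of contracts and show the interpretation fully abstract.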
The Computer-based Corpsman Training System (CBCTS) was developed by ECS, Inc. for the U.S. Army Research, Development and Engineering Command. Game design elements complement the instructional design elements to produce an award-winning learning game. Notable design features include a well-designed tutorial, opportunities for decision making, time to reflect and replay a scenario, and implicit and explicit feedback. While game and instructional elements work very well together in CBCTS, suggestions are made in this chapter to increase instructional guidance to gain learning efficiencies without jeopardizing gameplay. These suggestions will benefit all learning game designers striving to improve their own games. Game designers are cautioned that additional elements may increase the design and development resource requirements, and instructional and gameplay trade-offs have to be considered. Some of these trade-offs are briefly addressed.
Introduction
The Computer-based Corpsman Training System (CBCTS) is a learning game that provides combat corpsmen realistic training to prepare them to apply their skills in a combat situation. CBCTS was developed by ECS, Inc. for the U.S. Army Research, Development and Engineering Command (RDECOM). The game supports training for Navy combat medics who are assigned to the U.S. Marine Corps. CBCTS is used at the Army Medical Department (AMEDD) Center and School as part of the curriculum to prepare combat medics.
We define and study hierarchies of topological spaces induced by the classical Borel and Luzin hierarchies of sets. Our hierarchies are divided into two classes: hierarchies of countably based spaces induced by their embeddings into Pω, and hierarchies of spaces (not necessarily countably based) induced by their admissible representations. We concentrate on the non-collapse property of the hierarchies and on the relationships between hierarchies in the two classes.
We propose a new method to verify that a higher-order, tree-processing functional program conforms to an input/output specification. Our method reduces the verification problem to multiple verification problems for higher-order multi-tree transducers, which are then transformed into higher-order recursion schemes and model-checked. Unlike previous methods, our new method can deal with arbitrary higher-order functional programs manipulating algebraic data structures, as long as certain invariants on intermediate data structures are provided by a programmer. We have proved the soundness of the method and implemented a prototype verifier.
Delta modelling is an approach to facilitate the automated product derivation for software product lines. It is based on a set of deltas specifying modifications that are incrementally applied to a core product. The applicability of deltas depends on application conditions over features. This paper presents abstract delta modelling, which explores delta modelling from an abstract, algebraic perspective. Compared to the previous work, we take a more flexible approach to conflicts between modifications by introducing the notion of conflict-resolving deltas. Furthermore, we extend our approach to allow the nesting of delta models for increased modularity. We also present conditions on the structure of deltas to ensure unambiguous product generation.
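The core mechanism can be sketched concretely (an illustrative toy with invented names, not the paper's algebraic formulation): a core product is a mapping from module names to implementations, and a delta is a modification guarded by an application condition over the selected features.

```python
# Illustrative sketch of delta-based product derivation: deltas whose
# application condition holds for the chosen feature set are applied,
# in order, to a copy of the core product.

def apply_deltas(core, deltas, features):
    """Derive a product by applying every applicable delta to `core`."""
    product = dict(core)
    for condition, modify in deltas:
        if condition(features):
            modify(product)
    return product

core = {"engine": "basic"}

deltas = [
    (lambda fs: "sport" in fs, lambda p: p.update(engine="turbo")),
    (lambda fs: "eco" in fs,   lambda p: p.update(engine="hybrid")),
    # A conflict-resolving delta: applicable only when both conflicting
    # features are selected, and ordered after the two conflicting deltas
    # so that its modification determines the result.
    (lambda fs: {"sport", "eco"} <= fs,
     lambda p: p.update(engine="turbo-hybrid")),
]

print(apply_deltas(core, deltas, {"sport"}))         # {'engine': 'turbo'}
print(apply_deltas(core, deltas, {"sport", "eco"}))  # {'engine': 'turbo-hybrid'}
```

The last delta plays the role of a conflict-resolving delta in the paper's sense: without it, the "sport" and "eco" deltas would modify the same module and the derived product would depend on their application order.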
In the present chapter we investigate the formal aspects of adding definitions to a type system. In this we follow the pioneering work of N.G. de Bruijn (cf. de Bruijn, 1970). As the basic system we take λC, the most powerful system in the λ-cube. System λC is suitable for the PAT-interpretation, because it encapsulates λP. But it also covers the nice second order aspects of λ2. Therefore, λC appears to be enough for the purpose of ‘coding’ mathematics and mathematical reasonings and is an excellent candidate for the natural extension we want, being almost inevitable for practical applications: the addition of definitions.
We start with an extension leading from λC to a system called λD0. This system contains a formal version of definitions in the usual sense, the so-called descriptive definitions, so it can be used for a great number of applications in the realm of logic and mathematics. But λD0 does not yet allow a satisfactory representation of axioms and axiomatic notions; these will be considered in the following chapter, in which a small, further extension of λD0 leads to our final system λD. (We have noticed before that we do not consider inductive and recursive definitions, since we can do without them; see Section 8.2.)
In order to give a proper description of λD0, we first extend our set of expressions, as given in Definition 6.3.1 for λC.
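Schematically, and with notation simplified relative to the formal development that follows, a descriptive definition introduces a constant with a parameter list, a body and a type, to be unfolded on demand:

```latex
% Schematic shape of a descriptive definition in $\lambda D_0$
% (notation simplified): the constant $a$, with parameters
% $x_1, \ldots, x_n$, abbreviates the body $M$ of type $N$.
\[
  x_1 : A_1,\ \ldots,\ x_n : A_n \;\triangleright\; a(x_1, \ldots, x_n) := M : N
\]
```

Each occurrence of $a(\ldots)$ in a later expression can then be replaced by the body $M$ with the parameters instantiated, which is the unfolding step the extended expression language must support.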
The aim of the book is, firstly, to give an introduction to type theory, an evolving scientific field at the crossroads of logic, computer science and mathematics. Secondly, the book explains how type theory can be used for the verification of mathematical expressions and reasonings.
Type theory enables one to provide a ‘coded’ version – i.e. a full formalisation – of many mathematical topics. The formal system underlying type theory forces the user to work in a very precise manner. The real power of type theory is that well-formedness of the formalised expressions implies logical and mathematical correctness of the original content.
An attractive property of type theory is that it becomes possible and feasible to do the encoding in a ‘natural’ manner, such that one follows (and recognises) the way in which these subjects were presented originally. Another important feature of type theory is that proofs are treated as first-class citizens, in the sense that proofs do not remain meta-objects, but are coded as expressions (terms) of the same form as the rest of the formalisation.
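As a small illustration of proofs as first-class terms (in Lean 4 syntax, which is one of several proof assistants based on this idea and not the specific system developed in the book), a proposition is a type and a proof is an expression inhabiting it:

```lean
-- The proposition A → B → A is a type; the lambda-term below is
-- a proof of it, written as an ordinary expression.
example (A B : Prop) : A → B → A :=
  fun a _ => a
```

The checker verifies the proof simply by type-checking the term, which is the sense in which well-formedness implies correctness.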
The authors intend to address a broad audience, ranging from university students to professionals. The exposition is gentle and gradual, developing the material at a steady pace, with ample examples and comments, cross-references and motivations. Theoretical issues relevant for logic and computer science alternate with practical applications in the area of fundamental mathematical subjects.
This book, Type Theory and Formal Proof: An Introduction, is a gentle, yet profound, introduction to systems of types and their inhabiting lambda-terms. The book appears shortly after Lambda Calculus with Types (Barendregt et al., 2013). Although these books have a partial overlap, they have very different goals. The latter book studies the mathematical properties of some formalisms of types and lambda-terms. The book in your hands is focused on the use of types and lambda-terms for the complete formalisation of mathematics. For this reason it also treats higher order and dependent types. The act of defining new concepts, essential for mathematical reasoning, forms an integral part of the book. Formalising makes it possible for arbitrary mathematical concepts and proofs to be represented on a computer, and enables machine verification of the well-formedness of definitions and of the correctness of proofs. The resulting technology elevates the subject of mathematics and its applications to their most complete and reliable form.
The endeavour to reach this level of precision was started by Aristotle, by his introduction of the axiomatic method and quest for logical rules. For classical logic Frege completed this quest (and Heyting for the intuitionistic logic of Brouwer). Frege did not get far with his intended formalisation of mathematics: he used an inconsistent foundation. In 1910 Whitehead and Russell introduced types to remedy this. These authors made proofs largely formal, except that substitutions still had to be understood and performed by the reader.