The sheer complexity of computer systems has meant that automated reasoning, i.e. the ability of computers to perform logical inference, has become a vital component of program construction and of programming language design. This book meets the demand for a self-contained and broad-based account of the concepts, the machinery and the use of automated reasoning. The mathematical logic foundations are described in conjunction with practical application, all with the minimum of prerequisites. The approach is constructive, concrete and algorithmic: a key feature is that methods are described with reference to actual implementations (for which code is supplied) that readers can use, modify and experiment with. This book is ideally suited for those seeking a one-stop source for the general area of automated reasoning. It can be used as a reference, or as a place to learn the fundamentals, either in conjunction with advanced courses or for self-study.
By
Andrew Herbert, Microsoft Research, Cambridge, United Kingdom
Edited by
Yves Bertot; Gérard Huet, Institut National de Recherche en Informatique et en Automatique (INRIA), Rocquencourt; Jean-Jacques Lévy, Institut National de Recherche en Informatique et en Automatique (INRIA), Rocquencourt; Gordon Plotkin, University of Edinburgh
In 2005 Gilles Kahn discussed with Rick Rashid, Stephen Emmott and myself a proposal for Microsoft Research, Cambridge and INRIA to establish a joint research laboratory in France, building on the long-term informal collaboration between the two institutions. The research focus of the joint laboratory was an important point of discussion. In addition to building on our mutual strengths in areas such as software specification, an important topic was a shared desire to create a programme of research in computational science: using the concepts and methods of computer science to accelerate the pace of scientific development and to explore the potential for new approaches to science that exploit computer science concepts and methods. This paper explores what computational science is and the contribution it can make to scientific progress. It is in large part abridged from the report “Towards 2020 Science”, published by a group of experts assembled by Microsoft Research who met over three intense days to debate and consider the role and future of science, looking towards 2020 and, in particular, the importance and impact of computing and computer science in that vision.
Introduction
Computers have played an increasingly important role in science for 50 years. At the end of the twentieth century there was a transition from computers supporting scientists to do conventional science to computer science itself becoming part of the fabric of science and how science is done.
The Milner-Damas typing algorithm W is one of the classic algorithms in computer science. In this paper we describe a formalized soundness and completeness proof for this algorithm. Our formalization is based on names for both term and type variables, and is carried out in Isabelle/HOL using the Nominal Datatype Package. It turns out that in our formalization we have to deal with a number of issues that are often overlooked in informal presentations of W.
“Alpha-conversion always bites you when you least expect it.”
A remark made by Xavier Leroy when discussing with us the informal proof about W in his PhD thesis.
Introduction
Milner's polymorphic type system for ML is probably the most influential programming language type system. The second author learned about it from a paper by Clément et al. He was immediately taken by their view that type inference can be seen as Prolog execution, in particular because the Isabelle system, on which he had started to work, was based on a paradigm similar to that of the Typol language developed by Kahn and his co-workers. Milner himself had provided the explicit type inference algorithm W and proved its soundness. Completeness was shown later by Damas and Milner. Neither soundness nor completeness of W is trivial, because of the presence of the Let-construct (which is not expanded during type inference).
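To make the special role of Let concrete, here is a small illustration (ours, not taken from the paper) of why Let-bound variables need separate treatment in W: a Let-bound identifier is generalized to a polymorphic type scheme, while a lambda-bound one stays monomorphic.

```ocaml
(* Illustration only (not the paper's formalization): Let-polymorphism in ML.
   [id] is generalized to the scheme 'a -> 'a, so both uses type-check. *)
let pair =
  let id = fun x -> x in
  (id 1, id true)

(* A lambda-bound identifier is kept monomorphic by W, so the analogous term
     (fun id -> (id 1, id true)) (fun x -> x)
   is rejected: [id] cannot be used at both int -> int and bool -> bool. *)
```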
By
Pierre Bernhard, I3S, University of Nice-Sophia Antipolis and CNRS, France,
Frédéric Hamelin, I3S, University of Nice-Sophia Antipolis and CNRS, France
Gilles Kahn and I were classmates at École Polytechnique where, in the academic year 1965–1966, he taught me programming (this was in MAGE 2, a translation in French of Fortran 2 I believe, on a punched tape computer SETI PALAS 250), then we met again and became good friends at Stanford University, where he was a computer science student while I was in aeronautics and astronautics. Our paths were to get closer starting in the spring of 1980 when we started planning and, from 1983 on, heading INRIA Sophia-Antipolis together.
Gilles always believed that game theory was worth pursuing. He was adamant that our laboratory should take advantage of my being conversant with the topic, and he was instrumental in keeping it alive in the lab.
He was later to be the president of INRIA who presided over the introduction of “biological systems” as a full-fledged scientific theme of INRIA. Although this came after I had left INRIA, it again matched my personal scientific taste. By then I had embraced behavioural ecology as my main domain of interest and of application of dynamic games, thanks in large part to Eric Wajnberg of INRA, but also out of an old desire to look into the ecological applications of these techniques.
This is why I think it fitting to write here a few words about games and behavioural ecology, and also about population dynamics and evolution, which are closely related topics.
The evolution of programming languages involves isolating and describing abstractions that allow us to solve problems more elegantly, efficiently, and reliably, and then providing appropriate linguistic support for these abstractions. Ideally, a new abstraction can be described precisely with a mathematical semantics, and the semantics leads to logical techniques for reasoning about programs that use the abstraction. Gilles Kahn's early work on stream processing networks is a beautiful example of this process at work.
Gilles began thinking about parallel graph programs at Stanford, and he developed his ideas in a series of papers starting in 1971. Gilles' original motivation was to provide a formal model for reasoning about aspects of operating systems programming, based on early data flow models of computation. But the model he developed turned out to be of much more general interest, both in terms of program architecture and in terms of semantics. During his Edinburgh visit in 1975–76, Gilles and I collaborated on a prototype implementation of the model that allowed further development and experimentation, reported in a joint paper. By 1976 it was clear that his model, while inspired by early data flow research, was also closely connected to several other developments, including coroutines, Landin's notion of streams, and the then emerging lazy functional languages.
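The connection to streams and lazy functional languages can be made concrete with a small sketch (an illustration of ours, not Gilles' original formulation): channels are represented as lazy streams, and a process is a function from streams to streams, the essence of the deterministic network model.

```ocaml
(* A minimal sketch of the stream view of Kahn networks (illustrative only):
   a channel is an infinite lazy stream, a process is a stream function. *)
type 'a stream = Cons of 'a * 'a stream Lazy.t

let rec from n : int stream = Cons (n, lazy (from (n + 1)))

(* A simple process: multiply every token on its input channel by two. *)
let rec double (Cons (x, xs)) = Cons (2 * x, lazy (double (Lazy.force xs)))

(* Observe a finite prefix of a channel. *)
let rec take n (Cons (x, xs)) =
  if n = 0 then [] else x :: take (n - 1) (Lazy.force xs)

let () =
  take 5 (double (from 0))             (* the network: producer feeding a doubler *)
  |> List.iter (Printf.printf "%d ")   (* prints: 0 2 4 6 8 *)
```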
Gilles Kahn was a serious scientist, but part of his style and effectiveness was in the great sense of curiosity and fun that he injected in the most technical topics. Some of his later projects involved connecting computing and the traditional sciences. I offer a perspective on the culture shock between biology and computing, in the style in which I would have explained it to him.
The nature of nature
In a now classic peer-reviewed commentary, “Can a Biologist Fix a Radio?”, Yuri Lazebnik describes the serious difficulties that scientists have in understanding biological systems. As an analogy, he describes the approach biologists would take if they had to study radios, instead of biological organisms, without having prior knowledge of electronics.
We would eventually find how to open the radios and will find objects of various shape, color, and size […]. We would describe and classify them into families according to their appearance. We would describe a family of square metal objects, a family of round brightly colored objects with two legs, round-shaped objects with three legs and so on. Because the objects would vary in color, we will investigate whether changing the colors affects the radio's performance. Although changing the colors would have only attenuating effects (the music is still playing but a trained ear of some people can discern some distortion), this approach will produce many publications and result in a lively debate.
By
Erik Sandewall, Linköping University and Royal Institute of Technology, Stockholm, Sweden
The purpose of the research reported here was to explore an alternative way of organizing the general software structure in computers, eliminating the traditional distinctions between operating system, programming language, database system, and several other kinds of software. We observed that there is a lot of costly duplication of concepts and of facilities in the conventional architecture, and believe that most of that duplication can be eliminated if the software is organized differently. This article describes Leordo, an experimental software system that has been built to explore an alternative design and to try to verify the hypothesis that a much more compact design is possible and that concept duplication can be eliminated or at least greatly reduced. Definite conclusions in those respects cannot yet be made, but the indications are positive and the design that has been
Introduction
Project goal and design goals
Leordo is a software project and an experimental software system that integrates capabilities that are usually found in several different software systems:
in the operating system
in the programming language and programming environment
in an intelligent agent system
in a text formatting system
and more. I believe that it should be possible to make a much more concise, efficient, and user-friendly design of the total software system in a conventional (PC-type) computer by integrating capabilities and organizing them in a new way.
Dataflow models of computation have intrigued computer scientists since the 1970s. They were first introduced by Jack Dennis as a basis for parallel programming languages and architectures, and by Gilles Kahn as a model of concurrency. Interest in these models of computation has been recently rekindled by the resurrection of parallel computing, due to the emergence of multicore architectures. However, Dennis and Kahn approached dataflow very differently. Dennis' approach was based on an operational notion of atomic firings driven by certain firing rules. Kahn's approach was based on a denotational notion of processes as continuous functions on infinite streams. This paper bridges the gap between these two points of view, showing that sequences of firings define a continuous Kahn process as the least fixed point of an appropriately constructed functional. The Dennis firing rules are sets of finite prefixes satisfying certain conditions that ensure determinacy. These conditions result in firing rules that are strictly more general than the blocking reads of the Kahn–MacQueen implementation of Kahn process networks, and solve some compositionality problems in the dataflow model. This work was supported in part by the Center for Hybrid and Embedded Software Systems (CHESS) at UC Berkeley, which receives support from the National Science Foundation (NSF awards #0720882 (CSR-EHS: PRET), #0647591 (CSR-SGER), and #0720841 (CSR-CPS)), the US Army Research Office (ARO #W911NF-07-2-0019), the US Air Force Office of Scientific Research (MURI #FA9550-06-0312 and AF-TRUST #FA9550-06-1-0244), the Air Force Research Lab (AFRL), the State of California Micro Program, and the following companies: Agilent, Bosch, DGIST, National Instruments, and Toyota.
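As a rough, hedged illustration (not the paper's construction): an actor's firing consumes one token per input and produces one output token, and repeating the firing yields a deterministic function on streams, the kind of process whose semantics is the least fixed point described above.

```ocaml
(* Illustrative sketch only: a Dennis-style "add" actor and the Kahn-style
   stream function obtained by firing it repeatedly. Finite prefixes of
   channels are modelled as lists. *)

(* One firing: consume one token from each input, produce one output token. *)
let fire_add x y = x + y

(* Repeated firing lifts the actor to a function on (prefixes of) streams;
   when a needed token is missing, the process simply waits (here: stops). *)
let rec add_process xs ys =
  match xs, ys with
  | x :: xs', y :: ys' -> fire_add x y :: add_process xs' ys'
  | _ -> []

(* add_process [1; 2; 3] [10; 20] = [11; 22] *)
```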
This paper is an overview of 15 years of collaboration between R&D teams at Dassault Aviation and several research projects at INRIA. This collaboration was related to Gilles Kahn's work on generic programming environments, program transformation, and user interfaces for proof assistants.
It is also an evocation of personal memories about Gilles, my perception of the impact of the research he carried out and supervised, and his dedication to INRIA.
Introduction
Since 1990, Dassault Aviation has been working with formal methods and programming tools developed at INRIA by Gilles' research group (CROAP) or by other groups led by scientists close to him, such as Gérard Berry and Gérard Huet.
Formal methods, more specifically the synchronous languages Esterel and Lustre and the proof assistant Coq, have been evaluated and introduced into our engineering processes to enhance our development tools for safety-critical software, especially software embedded in flight control systems.
As for the programming tools developed by CROAP with the generative environment Centaur, it happened that in 1995 some of its language-specific instantiations targeted scientific computation. More precisely, they were designed to assist classical transformations of large Fortran codes (porting, parallelization, differentiation). Since Dassault Aviation has always developed its computational fluid dynamics (CFD) codes in-house, there was some motivation in our company to experiment with tools that claimed to partially automate time-consuming tasks that were then done manually.
By
Bruno Courcelle, Institut Universitaire de France Université Bordeaux 1, Laboratoire Bordelais de Recherche en Informatique
The communication by Gilles Kahn, Jean Vuillemin and myself at the second International Colloquium on Automata, Languages and Programming, held in Saarbrücken in 1974, appears in French in the proceedings and has never been published as a journal article. However, in 2002 Todd Veldhuizen wrote an English translation, which is reproduced in the next chapter.
Apropos of Chapter 8
It was quite a surprise for me to receive a message from Todd Veldhuizen saying that he had translated from French a 30-year-old conference paper, presented at the second International Colloquium on Automata, Languages and Programming held in Saarbrücken in 1974, of which I am a co-author with G. Kahn and J. Vuillemin. He did that work because he felt the paper was “seminal”. First of all I would like to thank him for this work. The publication of his translation in a volume dedicated to the memory of Gilles Kahn is a testimony to the gratitude of Jean Vuillemin and myself towards him, and a recognition of an important scientific contribution of Gilles among many others.
In this overview, I indicate a few research directions that can be traced back to that communication. I give only a few related references; this overview is not a thorough bibliographical review of related articles.
By
Bengt Nordström, Chalmers University of Technology and the University of Göteborg
The structure of documents of various degrees of formality, from scientific papers with layout information and programs with their documentation to completely formal proofs, can be expressed by assigning a type to the abstract syntax tree of the document. By using dependent types, an idea from type theory, it is possible to express very strong syntactic criteria for the well-formedness of documents. This structure can be used to automatically generate parsers, type checkers and structure-oriented editors.
Introduction
We are interested in finding a general framework for describing the structure of many kinds of documents, such as
books and articles
“live” documents (like a web document with parts to be filled in)
programs
formal proofs.
Are there any good reasons why we use different programs to edit and print articles, programs and formal proofs? A unified view on these kinds of documents would make it possible to use only one structure-oriented editor to build all of them, and it would be easier to combine documents of different kinds, for instance scientific papers, programs with their documentation, informal and formal proofs and simple web forms.
Such a view requires that we have a good framework to express syntactic wellformedness (from things like the absence of a title in a footnote to correctness of a formal proof) and to express how the document should be edited and presented.
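As a small, hedged sketch of the idea (our illustration, not the framework of the paper), one can already approximate this in a language with generalized algebraic data types: the type of a document tree records the context it may appear in, so that an ill-formed document, such as a title inside a footnote, is rejected by the type checker.

```ocaml
(* Illustrative sketch only: using the type of the abstract syntax tree to
   enforce a well-formedness criterion ("no title inside a footnote").
   A phantom type parameter tracks the context of a document fragment. *)
type toplevel = Toplevel
type nested = Nested

type _ doc =
  | Text : string -> 'c doc                  (* allowed in any context *)
  | Title : string -> toplevel doc           (* only at top level *)
  | Footnote : nested doc -> toplevel doc    (* body must be "nested" material *)
  | Seq : 'c doc * 'c doc -> 'c doc

(* Well-formed: a title followed by a footnote containing plain text. *)
let ok : toplevel doc =
  Seq (Title "Kahn networks", Footnote (Text "see the next chapter"))

(* Ill-formed and rejected by the type checker:
     Footnote (Title "oops")
   because Title produces a toplevel doc, not a nested doc. *)
```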
Semantics of programming languages and interactive environments for the development of proofs and programs are two important aspects of Gilles Kahn's scientific contributions. In his paper “The semantics of a simple language for parallel programming”, he proposed an interpretation of (deterministic) parallel programs (now called Kahn networks) as stream transformers based on the theory of complete partial orders (cpos). A restriction of this language to synchronous programs is the basis of the data-flow Lustre language which is used for the development of critical embedded systems.
We present a formalization of this seminal paper in the Coq proof assistant. For that purpose, we developed a general library for cpos. Our cpos are defined with an explicit function computing the least upper bound (lub) of an increasing sequence of elements. This is different from what Kahn developed for the standard Coq library, where only the existence of lubs (for arbitrary directed sets) is required, giving no way to explicitly compute a fixpoint. We define a cpo structure for the type of possibly infinite streams. It is then possible to define formally what a Kahn network is and what its semantics is, achieving the goal of having the concept closed under composition and recursion. The library is illustrated with an example taken from the original paper as well as the Sieve of Eratosthenes, an example of a dynamic network.
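For readers unfamiliar with the sieve as a dynamic network, the following sketch (an executable illustration in OCaml, not the Coq development itself) shows the shape of the example: each prime discovered spawns a new filtering process on the stream flowing downstream.

```ocaml
(* Illustration only: the Sieve of Eratosthenes as a dynamically growing
   network of stream processes, the dynamic-network example mentioned above. *)
type 'a stream = Cons of 'a * 'a stream Lazy.t

let rec from n = Cons (n, lazy (from (n + 1)))

let rec filter p (Cons (x, xs)) =
  if p x then Cons (x, lazy (filter p (Lazy.force xs)))
  else filter p (Lazy.force xs)

(* Each output prime p installs a new process filtering out multiples of p. *)
let rec sieve (Cons (p, xs)) =
  Cons (p, lazy (sieve (filter (fun n -> n mod p <> 0) (Lazy.force xs))))

let rec take n (Cons (x, xs)) =
  if n = 0 then [] else x :: take (n - 1) (Lazy.force xs)

let _ = take 5 (sieve (from 2))   (* [2; 3; 5; 7; 11] *)
```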
By
Jean-Jacques Lévy, INRIA and Microsoft Research–INRIA Joint Centre
Asclepios is the name of a research project team officially launched on November 1st, 2005 at INRIA Sophia-Antipolis, to study the Analysis and Simulation of Biological and Medical Images. This research project team follows a previous one, called Epidaure, initially dedicated to Medical Imaging and Robotics research. These two project teams were strongly supported by Gilles Kahn, who used to have regular scientific interactions with their members. More generally, Gilles Kahn had a unique vision of the growing importance of the interaction of the Information Technologies and Sciences with the Biological and Medical world. He was one of the originators of the creation of a specific BIO theme among the main INRIA research directions, which now comprises 16 different research teams including Asclepios, whose research objectives are described and illustrated in this article.
Introduction
The revolution of biomedical images and quantitative medicine
There is an irreversible evolution of medical practice toward more quantitative and personalized decision processes for prevention, diagnosis and therapy. This evolution is supported by a continually increasing number of biomedical devices providing in vivo measurements of structures and processes inside the human body, at scales varying from the organ to the cellular and even molecular level. Among all these measurements, biomedical images of various forms increasingly play a central role.
To meet the demand for more quantitative and personalized medicine based on larger and more complex sets of measurements, there is a crucial need to develop: (1) advanced image-analysis tools capable of extracting the pertinent information from biomedical images and signals; (2) advanced models of the human body to correctly interpret this information; and (3) large distributed databases to calibrate and validate these models.
For me, Gilles Kahn was first of all an article, in French if you please, a text that was the starting point of my research:
G. Kahn and G. Plotkin, Domaines concrets, TR IRIA-Laboria 336 (1978), which appeared in an English version in 1993 (a sign of its lasting influence) in the volume of tributes to Corrado Böhm.
One could not have imagined a better lure for the young man I was, having arrived at computer science as the outcome of a hesitation between mathematics (intimidating) and languages (the real ones). Another colleague who left us too soon, Maurice Gross, had helped me choose a third way and had guided me towards the DEA in Theoretical Computer Science at Paris 7. The courses of Luc Boasson and Dominique Perrin had already hooked me, but the encounter with concrete domains definitively “caught” me, both because they were structures resembling lattices (which I had met fairly early in my studies thanks to the Leçons d'Algèbre Moderne of Paul Dubreil and Marie-Louise Dubreil-Jacotin, recommended by my mathematics teacher) and because Gérard Berry, who had put this work into my hands, had a rich idea for building on that stone.
The guiding idea of this article was to give a general definition of data structures, encompassing lists, trees, records, records with variants, and so on, and, as one does in all good mathematics, a good notion of morphism between these structures. This definition was given in two equivalent facets, related by a representation theorem: one concrete, in terms of cells (tree nodes, record fields, …) and values, the other abstract, in terms of partial orders.
By
G. Ramalingam, Microsoft Research India; Bangalore, India,
Thomas Reps, University of Wisconsin; Madison, WI; USA
Program representation graphs (PRGs) are an intermediate representation for programs. (They are closely related to program dependence graphs.) In this paper, we develop a mathematical semantics for PRGs that, inspired by Kahn's semantics for a parallel programming language, interprets PRGs as dataflow graphs. We also study the relationship between this semantics and the standard operational semantics of programs. We show that (i) the semantics of PRGs is more defined than the standard operational semantics, and (ii) for states on which a program terminates normally, the PRG semantics is identical to the standard operational semantics.
Introduction
In this paper, we develop a mathematical semantics for program representation graphs (PRGs) and study its relationship to a standard (operational) semantics of programs. Program representation graphs are an intermediate representation of programs, introduced by Yang et al. in an algorithm for detecting program components that exhibit identical execution behaviors. They combine features of static single-assignment (SSA) forms and program dependence graphs (PDGs). (See Fig. 10.1 for an example program and its PRG.) PRGs have also been used in an algorithm for merging program variants.
Program dependence graphs have been used as an intermediate program representation in various applications such as vectorization, parallelization, and merging program variants. A number of variants of the PDG have been used as the basis for efficient program analysis by optimizing compilers as well as other tools.
By
Roberto M. Amadio, Université Paris Diderot, PPS, UMR-7126,
Mehdi Dogguy, Université Paris Diderot, PPS, UMR-7126
The Sπ-calculus is a synchronous π-calculus which is based on the SL model. The latter is a relaxation of the Esterel model where the reaction to the absence of a signal within an instant can only happen at the next instant. In the present work, we present and characterize a compositional semantics of the Sπ-calculus based on suitable notions of labelled transition system and bisimulation. Based on this semantic framework, we explore the notion of determinacy and the related one of (local) confluence.
Introduction
Let P be a program that can repeatedly interact with its environment. A derivative of P is a program to which P reduces after a finite number of interactions with the environment. A program terminates if all its internal computations terminate and it is reactive if all its derivatives are guaranteed to terminate. A program is determinate if after any finite number of interactions with the environment the resulting derivative is unique up to semantic equivalence.
Most conditions found in the literature that entail determinacy are rather intuitive; however, the formal statement of these conditions and the proof that they indeed guarantee determinacy can be rather intricate, in particular in the presence of name mobility, as available in a paradigmatic form in the π-calculus.
Our purpose here is to provide a streamlined theory of determinacy for the synchronous π-calculus introduced in.
By
Paul Klint, Centrum voor Wiskunde en Informatica and University of Amsterdam
Gilles Kahn was a great colleague and good friend who left us much too early. In this paper I will sketch our joint research projects, the many discussions we had, some personal recollections, and the influence these have had on the current state of the art in meta-level language technology.
Getting acquainted
Bâtiment 8
On a sunny day at the beginning of July 1983 I parked my beige Citroën Dyane in the parking lot in front of Bâtiment 8, INRIA Rocquencourt. At the time, the buildings gave the impression that the US military, who had constructed the premises at Rocquencourt, had also been the last to ever use a paint brush. Inside lived an energetic research family, and I was hosted by project CROAP, headed by Gilles Kahn. My roommates Véronique Donzeau-Gouge and Bertrand Mélèse helped me find a desk in a corner of the cramped building and helped me set up a Multics account on the Honeywell-Bull mainframe.
After some flirtations with computer graphics, software portability and the Unix operating system, I had turned to the study of string-processing languages, on which I wrote a PhD in 1982. The main topic was the Summer programming language, which featured objects, success/failure-driven control flow, string matching and composition, and a “try” mechanism that allowed an arbitrary sequence of statements to be executed and undid all their side effects in case the execution resulted in failure.
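As a hedged sketch of the flavour of that mechanism (a reconstruction from the description above, with invented names, not Summer's actual syntax or implementation), a "try" can be modelled as snapshotting the state, running the statements, and restoring the snapshot when a failure is signalled.

```ocaml
(* Illustration only: a toy "try" that undoes side effects on failure.
   The single-reference "store" and all names are invented for this sketch. *)
exception Fail

let try_block (store : int ref) (body : unit -> unit) : bool =
  let snapshot = !store in
  try body (); true                      (* success: keep the effects *)
  with Fail -> store := snapshot; false  (* failure: roll back the effects *)

let () =
  let counter = ref 0 in
  let ok = try_block counter (fun () -> incr counter; raise Fail) in
  Printf.printf "ok=%b counter=%d\n" ok !counter   (* prints: ok=false counter=0 *)
```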