It is usually agreed that programming languages for implementing (other) programming languages, or ‘implementation languages’, should be simple, low-level languages close to the machine code and the operating system. In this paper it is argued that a very high level implementation language is a good idea, of particular importance for knowledge-based systems, and that Lisp (as a language and as a system) is very well suited to be a very high level implementation language. The significance of special-purpose programming languages is also discussed, along with their requirements for a very high level implementation language.
This article reviews research on how people use and understand linguistic expressions of uncertainty, with a view toward the needs of researchers and others interested in artificial intelligence systems. We discuss and present empirical results within an inductively developed theoretical framework consisting of two background assumptions and six principles describing the underlying cognitive processes.
The Third International Workshop on Deontic Logic in Computer Science (ΔEON'96) took place in Sesimbra, Portugal, on 11–13 January 1996. It consisted of 12 refereed technical presentations and four invited talks. The invited speakers were Nuel Belnap (Pittsburgh University, USA), Andrew Jones (Oslo University, Norway), Krister Segerberg (Uppsala University, Sweden) and Marek Sergot (Imperial College, UK).
Recently there has been an increase in the number of computer-aided design systems that explicitly represent knowledge about the functionality of engineering designs. Reviewing these systems provides an understanding of the methods workers use to encapsulate knowledge of functionality within their systems. A number of issues are addressed to reveal the nature of their approaches. The developers' perception of functionality is discussed to identify variations in the understanding of function and to establish the existence of any consensus. Methods of representing this knowledge are examined, identifying the representation types, or combinations thereof, that are used and the advantages to be gained from any single representation. Illustrations of the manipulation of function show how this type of knowledge can be used to support reasoning during early-stage design. A survey of relationships with other design characteristics testifies to how the manipulation of functionality can be used to affect other aspects of a design. Through knowledge of these relationships, some workers have posited models of the design process. A study of these provides evidence of the role of function in design and the stages at which its use is significant.
This paper attempts to assess the practical utility of nonmonotonic logic in diagnostic problem solving. We begin with a brief review of the main assumptions which motivate work in this area, and discuss two logic-based approaches which involve nonmonotonic arguments. Then we consider two recent proposals for the application of default logic to diagnosis, as well as a proposal based on counterfactual logic. In conclusion, we briefly compare these methods with other diagnostic reasoning paradigms found in the Artificial Intelligence literature.
In qualitative reasoning research, much effort has been spent on developing representation and reasoning formalisms. Only recently has the process of constructing models in terms of these formalisms been recognised as an important research topic in its own right. Approaches addressing this topic are examined in this review. For this purpose, a general model of the task of constructing qualitative models is developed that serves as a frame of reference in considering these approaches. Two categories of approaches are identified: model composition and model induction. The former compose a model from predefined partial models; the latter infer a model from behavioural data. Similarities and differences between the approaches are discussed using the general task model as a reference. It appears that the majority of approaches focus on automating model construction entirely. Assessing and debugging a model in cooperation with a modeller is identified as an important topic for future research.
This paper presents a review of pattern matching techniques. The application areas for pattern matching are extensive, ranging from CAD systems to chemical analysis and from manufacturing to image processing. Published techniques and methods are classified and assessed within the context of three key issues: pattern classes, similarity types and matching methods. The review shows that the techniques and approaches are as diverse and varied as the applications.
First-generation expert systems have significant limitations, often attributed to their not being sufficiently deep. However, a generally accepted answer to “What is a deep expert system?” is still to be given. To answer this question one needs to answer “Why do first-generation systems exhibit the limitations they do?” thus identifying what is missing from first-generation systems and therefore setting the design objectives for second-generation (i.e. deep) systems. Several second-generation architectures have been proposed; inherent in each of these architectures is a definition of deepness. Some of the proposed architectures have been designed with the objective of alleviating a subset, rather than the whole set, of the first-generation limitations. Such approaches are prone to local, non-robust solutions. In this paper we analyze the limitations (under the categories: human-computer interaction, problem-solving flexibility, and extensibility) of the first-generation expert systems, thus setting design goals for second-generation systems. On the basis of this analysis, proposed second-generation architectures are reviewed and compared. The paper concludes by presenting requirements for a generic second-generation architecture.
The Japanese Fifth Generation Computer Systems (FGCS) project has chosen logic programming for its core programming language. It has recognized the major contribution that logic programming has to make not only in artificial intelligence but in database systems and software specification as well. It has recognized and intends to exploit the greater potential that logic programming has to offer for taking advantage of the parallelism possible with innovative multiprocessor computer architectures.
Finance is a challenging yet appropriate domain for model-based reasoning, an area of research otherwise grounded in classical physics. Among the many features that suggest a model-based approach are that firms have formal internal structures, business entities have idealizable behaviours, and there is a history of formal analysis of business problems. This article discusses the motivations and foundations of the model-based approach, and surveys several existing artificial intelligence programs that exploit its advantages. The survey shows that there are ample opportunities for useful systems and significant research in this area. However, accomplishing either of these goals depends crucially upon moving beyond qualitative models based only on accounting information, which tend not to capture the actual complexities of the domain.