The success of a software system strongly depends on the ability to turn a precise domain analysis into a concrete architecture. Even if the domain model rests on sound ontological bases, there is often a wide semantic gap between the conceptual model and the concrete components that should reify it. To bridge this gap, relevant domain concepts should be engineered by identifying the corresponding architectural abstractions, which can then be realized by concrete software components. Space plays a crucial role in many application domains, but, surprisingly, related architectural abstractions have not yet emerged. This paper proposes space-related abstractions derived from the application of classical software engineering principles, in particular the information-hiding principle, which leads to an operational definition of space. Basic abstractions are refined to deal with architectural aspects. As the underlying software engineering principles are close to those that underlie the definition of space ontologies, the conjecture is that the proposed spatial architectural abstractions might form the basis for a formalization in ontological terms.
Ingo Wegener passed away on 26 November 2008, after a three-year struggle with brain cancer. This is a tremendous loss for all those who knew him and worked with him. Ingo is survived by his wife, Christa.
After studying Mathematics in Bielefeld and holding a post in Frankfurt am Main, in 1987 Ingo became a full professor of Computer Science, in Efficient Algorithms and Complexity Theory, at the Technische Universität Dortmund, a position he held until his death.
Random geometric graphs have been one of the fundamental models for reasoning about wireless networks: one places n points at random in a region of the plane (typically a square or circle), and then connects pairs of points by an edge if they are within a fixed distance of one another. In addition to giving rise to a range of basic theoretical questions, this class of random graphs has been a central analytical tool in the wireless networking community.
For many of the primary applications of wireless networks, however, the underlying environment has a large number of obstacles, and communication can only take place among nodes when they are close in space and when they have line-of-sight access to one another – consider, for example, urban settings or large indoor environments. In such domains, the standard model of random geometric graphs is not a good approximation of the true constraints, since it is not designed to capture the line-of-sight restrictions.
Here we propose a random-graph model incorporating both range limitations and line-of-sight constraints, and we prove asymptotically tight results for k-connectivity. Specifically, we consider points placed randomly on a grid (or torus), such that each node can see up to a fixed distance along the row and column it belongs to. (We think of the rows and columns as ‘streets’ and ‘avenues’ among a regularly spaced array of obstructions.) Further, we show that when the probability of node placement is a constant factor larger than the threshold for connectivity, near-shortest paths between pairs of nodes can be found, with high probability, by an algorithm using only local information. In addition to analysing connectivity and k-connectivity, we also study the emergence of a giant component, as well as an approximation question, in which we seek to connect a set of given nodes in such an environment by adding a small set of additional ‘relay’ nodes.
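The grid-with-line-of-sight model described above can be sketched in a few lines: nodes occupy cells of an n × n grid independently with probability p, and two nodes are joined when they share a row or column (a ‘street’ or ‘avenue’) and lie within the sight range ω. The function names and parameters below are illustrative, not taken from the paper.

```python
import random
from collections import deque

def los_graph(n, p, omega, seed=0):
    """Place nodes on an n x n grid independently with probability p,
    then join pairs that share a row or column and are at most omega
    cells apart (the line-of-sight range)."""
    rng = random.Random(seed)
    nodes = [(r, c) for r in range(n) for c in range(n) if rng.random() < p]
    occupied = set(nodes)
    adj = {v: [] for v in nodes}
    for (r, c) in nodes:
        # Scan only rightwards and downwards so each edge is added once.
        for d in range(1, omega + 1):
            for u in ((r, c + d), (r + d, c)):
                if u in occupied:
                    adj[(r, c)].append(u)
                    adj[u].append((r, c))
    return nodes, adj

def is_connected(nodes, adj):
    """Breadth-first search from an arbitrary node."""
    if not nodes:
        return True
    seen = {nodes[0]}
    queue = deque([nodes[0]])
    while queue:
        v = queue.popleft()
        for u in adj[v]:
            if u not in seen:
                seen.add(u)
                queue.append(u)
    return len(seen) == len(nodes)
```

Simulations built on such a sketch are a common way to probe where the connectivity threshold lies before (or after) proving it analytically.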
The assessment of damage of structural concrete elements relies on the engineering judgment of the person in charge of the inspection and evaluation. This paper describes a rule-based prototype expert system developed to assist an engineer in assessing post-earthquake damage to structural concrete elements, providing guidance for inspection as well as criteria for evaluation and courses of action to take afterwards. The expert system, called DASE, identifies the most likely failure modes that may be developed by columns or beams of a damaged building, determines the severity of damage, and suggests immediate actions to take afterwards, such as rehabilitation procedures or tests, to maintain an acceptable local safety level. Floor damage classification and restoration guidelines can also be provided, assuming that all of the structural components in a building's floor have been inspected. The Analytic Hierarchy Process has been implemented as the overall framework to determine the severity of damage, since it allows judgments and personal values to be represented in a systematic and rational manner. DASE provides graphics that customize the user interface and an explanation facility. The knowledge acquisition process consisted primarily of the analysis of documented knowledge found through literature research.
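The Analytic Hierarchy Process step mentioned above can be illustrated with a minimal sketch: criterion weights are taken from the principal eigenvector of a pairwise-comparison matrix (here approximated by power iteration), and per-criterion damage scores are aggregated into one severity index. This is a generic AHP sketch, not DASE's actual implementation.

```python
def ahp_weights(M, iters=100):
    """Approximate the principal eigenvector of a pairwise-comparison
    matrix M by power iteration; the normalised result gives the
    relative weight of each damage criterion."""
    n = len(M)
    w = [1.0 / n] * n
    for _ in range(iters):
        w = [sum(M[i][j] * w[j] for j in range(n)) for i in range(n)]
        total = sum(w)
        w = [x / total for x in w]
    return w

def severity(weights, scores):
    """Aggregate per-criterion damage scores into a single severity index."""
    return sum(w * s for w, s in zip(weights, scores))
```

For a consistent 2 × 2 matrix saying "criterion A matters twice as much as criterion B", the weights come out as 2/3 and 1/3, and the severity index is simply their weighted average of the scores.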
The feasibility and relative merits of integrating knowledge-based systems (KBSs) and artificial neural networks (ANNs) for application to engineering problems are presented and evaluated. The strength of KBSs lies in their ability to represent human judgment and solve problems by providing explanations from and reasoning with heuristic knowledge. ANNs demonstrate problem solving characteristics not inherent in KBSs, including an ability to learn from example, develop a generalized solution applicable to a range of examples of the problem, and process information extremely rapidly. In this respect, KBSs and ANNs are complementary, rather than alternatives, and may be integrated into a system that exploits the advantages of both technologies. The scope of application and quality of solutions produced by such a hybrid extend beyond the boundaries of the individual technologies. This paper identifies and describes how KBSs and ANNs can be integrated, and provides an evaluation of the advantages that will accrue in engineering applications.
Inductive learning is proposed as a tool for synthesizing domain knowledge from data generated by a model-based simulator. When an inductive engine is used to generate decision rules, the pre-classification step becomes more complicated in the presence of multiple competing objectives. Instead of relying on a domain expert to perform this pre-classification task, a clustering algorithm is used to eliminate human biases involved in the selection of a classification function for pre-classification. It is shown that the use of a clustering algorithm for pre-classification not only further automates the process of knowledge synthesis, but also improves the quality of the rules generated by the inductive engine.
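One common way to realise the clustering-based pre-classification is k-means: simulator outputs are grouped into clusters, and the cluster labels then serve as class labels for the inductive engine. The paper does not specify the algorithm, so the following is a minimal, illustrative sketch.

```python
import random

def kmeans(points, k, iters=20, seed=0):
    """Plain k-means on tuples of numbers: assign each point to its
    nearest centre, then move each centre to its cluster's mean."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    clusters = [[] for _ in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            j = min(range(k),
                    key=lambda i: sum((a - b) ** 2
                                      for a, b in zip(p, centers[i])))
            clusters[j].append(p)
        centers = [tuple(sum(c) / len(cl) for c in zip(*cl)) if cl else centers[i]
                   for i, cl in enumerate(clusters)]
    return centers, clusters
```

Each resulting cluster index plays the role of a class label, replacing the expert-chosen classification function that would otherwise introduce human bias.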
Learning engineering is a new subarea of knowledge engineering dealing with the methodological aspects of using learning systems in knowledge acquisition. In this paper, the justification for the development of learning engineering is provided, and its major subdomains and research directions are briefly discussed.
The task of modeling, i.e., of creating a set of equations that can be used to predict the behavior of a physical object, is a key step in engineering analysis. This paper describes a computer system, MSG, for generating mathematical models to analyze physical systems involving heat transfer behavior. MSG is motivated by the need for modeling in an automated design process. The models are sets of equations which may include algebraic equations, ordinary differential equations and partial differential equations. MSG uses a strong domain theory to guide model construction in three sequential tasks: identify regions of interest on an object, determine relevant heat transfer and energy storage processes, and transform these processes into equations. The decisions in these tasks are guided by estimates of variation in temperature and material property, and the relative strengths of heat transfer processes.
Human designers often adopt strategies from previous similar cases to guide their search in new design tasks. We have developed an approach for automated design strategy capture and reuse. That approach has been implemented in DDIS, a prototype structural design system that uses a blackboard framework to integrate case-based and domain-based reasoning. Plans, goals, and critical constraints from user-selected previous cases are combined with case-independent reasoning to solve underconstrained parametric structural design problems. This article presents a detailed example of design strategy recording and reuse in base plate design for electrical transmission poles.
This article poses the notion that it is possible and desirable to formalize and apply design critiques in a specialized framework. It describes GENCRIT (GENeric CRItiquing Tool), one such framework for design critiquing. The article starts by highlighting the role of critics in the design process. It then goes on to bring out the need for a critic-building tool, viz. one that aids in the rapid development of multiple critics. GENCRIT combines knowledge-based techniques and a multifactor decision-making model to develop an integrated approach to evaluation that encompasses a wide range of designs. Critics developed using GENCRIT evaluate candidate designs based on the critiquing knowledge provided by experts, give justifications for the evaluation, and suggest improvements. The working of GENCRIT is illustrated with two examples: a constructibility critic for reinforced concrete buildings and a bridge design critic.
Computational abstraction of engineering design leads to an elegant theory defining (1) the process of design as an abstract model of computability, the Turing machine; (2) the artifacts of design as enumerated strings from a (possibly multidimensional) grammar; and (3) design specifications or constraints as formal state changes that govern string enumeration. Using this theory, it is shown that engineering design is a computable function. A computational methodology based on the theory is then developed that can be described as a form follows function design paradigm.
This paper presents a novel formulation of the configuration-design problem that achieves the benefits of the concurrent engineering (CE) design paradigm. In CE, all design concerns (manufacturability, testability, etc.) are applied to an evolving design throughout the design cycle. CE identifies conflicts early on, avoids costly redesign, and leads to better products. Our formulation is based on a distributed dynamic interval constraint-satisfaction problem (DDICSP) model. Persistent catalog agents map onto DDICSP variables and constraint agents map onto DDICSP constraints. These agents use a set of operations and heuristics to navigate the design space to eliminate sets of designs until a solution is found. Experimental results show that an architecture where each catalog agent resides on a separate computer has performance advantages over nondistributed approaches.
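The elimination behaviour of the catalog and constraint agents can be conveyed with a small arc-consistency-style pruning loop: each catalog is a set of candidate parts, and entries that cannot satisfy some constraint with any entry of the partner catalog are discarded until a fixed point is reached. The motor/gearbox example and all names below are invented for illustration and are not from the DDICSP formulation itself.

```python
def prune_catalogs(catalogs, constraints):
    """Remove catalog entries that cannot satisfy a binary constraint
    with any entry of the partner catalog, repeating until no more
    entries can be eliminated (a fixed point)."""
    changed = True
    while changed:
        changed = False
        for a, b, ok in constraints:
            for entry in list(catalogs[a]):
                if not any(ok(entry, other) for other in catalogs[b]):
                    catalogs[a].remove(entry)
                    changed = True
    return catalogs
```

For example, with motors of output torque {10, 20}, gearboxes of ratio {2, 6}, and the requirement torque × ratio ≥ 50 (stated in both directions), the ratio-2 gearbox is eliminated because no motor can satisfy the load through it, while both motors survive via the ratio-6 gearbox.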
An approach to CAD and CAM modeling and to the design of CAD/CAM systems is presented. Models of the product and of the process are represented by logical assertions in a common logical language. CAD/CAM functions are represented by the application of logical inference rules, which correspond to the derivation of new information as well as to actions. This allows all the different kinds of model and specification used in design and manufacturing to be represented in a computer in a common form. It therefore allows the representation of constraints and rules connecting any aspects of design and manufacturing together.
This approach has all the advantages of formal specification, namely, ease of expression, communication, standardization and abstraction. At the same time, we demonstrate its practical implementation in an efficient, industry-compatible form, and we report practical experience with using this approach for CAD/CAM models and for intelligent CAD/CAM functions.
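A toy forward-chaining interpreter conveys the flavour of representing CAD/CAM functions as inference rules over model assertions: rules whose premises already hold in the fact base fire and add new assertions. The facts and rules shown are invented examples, not the paper's actual logical language.

```python
def forward_chain(facts, rules):
    """Derive new assertions by repeatedly firing rules whose premises
    are all present in the fact base, until nothing new is derivable.
    Each rule is a (premises, conclusion) pair."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if set(premises) <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts
```

Starting from product-model assertions such as "the part has a hole" and "the material is steel", chained rules can derive process-planning assertions ("drilling is required") and, from those, tooling assertions, illustrating how design and manufacturing knowledge connect through one common representation.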
Fixtures are used in almost all manufacturing operations. They take on many different forms, ranging from simple mechanical vises to modular fixtures to specialized fixtures. For cylindrical parts or simple prismatic parts, automatic design of fixtures is feasible. However, for parts of irregular shape such as castings or forged parts, the automation of fixture design is a complex problem. In this paper we address the problem of fixture layout for parts of complicated shape. Issues related to automatic design such as completeness, soundness, and computing efficiency are first discussed. A theoretical framework based on algebraic formulations is developed which models human reasoning employed for workpiece restraint. These formulations will ensure the completeness and soundness of the design methodology. They also provide a mechanism for constraint propagation in the design space. Finally, the generality of the framework is discussed.
We show that the usual notion of constraint propagation is but one of a number of similar inferences useful in quantitative reasoning about physical objects. These inferences are expressed formally as rules for the propagation of ‘labeled intervals’ through equations. We prove the rules' correctness and illustrate their utility for reasoning about objects (such as motors or transmissions) which assume a continuum of different states. The inferences are the basis of a ‘mechanical design compiler’, which has correctly produced detailed designs from ‘high level’ descriptions for a variety of power transmission and temperature sensing systems.
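The propagation of intervals through an equation such as P = T · ω (power = torque × angular speed) can be sketched with naive interval arithmetic: each variable's interval is intersected with the interval implied by the other two. This omits the labels and the general rule set of the paper; the symbols P, T, ω are used only for the example.

```python
def i_mul(a, b):
    """Interval product: min/max over all endpoint products."""
    p = [x * y for x in a for y in b]
    return (min(p), max(p))

def i_div(a, b):
    """Interval quotient; assumes 0 is not contained in b."""
    q = [x / y for x in a for y in b]
    return (min(q), max(q))

def i_meet(a, b):
    """Intersection of two intervals."""
    return (max(a[0], b[0]), min(a[1], b[1]))

def propagate(P, T, w):
    """One propagation pass through P = T * w: narrow each interval
    using the current bounds on the other two."""
    P = i_meet(P, i_mul(T, w))      # forward: product of factors
    T = i_meet(T, i_div(P, w))      # backward: P / w bounds T
    w = i_meet(w, i_div(P, T))      # backward: P / T bounds w
    return P, T, w
```

For instance, with P ∈ [0, 4], T ∈ [1, 2], ω ∈ [3, 5], one pass tightens P to [3, 4] and ω to [3, 4]: the equation alone rules out the other values, which is exactly the kind of inference a design compiler can exploit.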