The circular economy (CE) seeks to replace traditional linear models by focusing on resource reuse and circulation. However, developing effective CE business strategies is difficult due to complex user behaviors and product flows. Existing scenario analysis tools often rely on survey-based conjoint methods, raising concerns about discrepancies with real purchasing patterns. This study introduces a data-driven simulation approach that employs a consumer preference model and product circulation processes based on actual operational data. Applied to a second-hand PC rental business, our method more accurately reproduces market behavior and reveals that targeting certain customer segments can enhance profitability and resource utilization. These findings underscore the approach’s value as a practical tool for pre-evaluating strategies in CE businesses.
Variation simulation approaches are frequently used to analyse the effects of geometrical variations on final product quality. Various software tools are used during product development, and they differ strongly in their specified goals, contexts of use, and users. Although a few workarounds and information-sharing strategies exist, switching software usually means the simulation model must be rebuilt from scratch, leading to redundant manual effort and uncertainties. This paper examines the potential and limitations of the Quality Information Framework (QIF) information model for improving collaborative work within a heterogeneous simulation software landscape by exchanging variation simulation model-related information in a standardised Model-Based Definition sense. An application scenario shows how QIF can bridge the gap between tools used in early and late design phases.
This research examines, during the human-AI interaction process, how generative AI’s depiction of human bodies reflects and perpetuates able-bodied norms, positioning disabled or grotesque bodies as “errors.” Through a feminist and disability studies lens and employing archival research and visual analysis, this research challenges traditional notions of bodily normativity, advocating for inclusivity in AI-generated imagery. It underscores how labeling nonconformity as an error perpetuates able-bodied standards while erasing the visibility and autonomy of disabled bodies. By critiquing generative AI’s role in reinforcing societal norms, this study calls for reimagining human-AI interactions with a shift in perception and advocates for an approach that neither devalues nor excludes disabled bodies.
Food production systems are shaped by external factors, such as social events and economic shifts, which influence and are influenced by labour dynamics—e.g., workforce availability—and human factors—e.g., worker skills. Using a systems approach, this paper explores how labour shortages impacting worker teams—such as shifts in the mix of availability, skills, and human behaviours—affect production and quality. UK apple harvesting is chosen as a case study due to its reliance on skilled seasonal migrant workers. Findings highlight the need for strategies such as upskilling local workers, enhancing training programmes, and adopting new technologies to mitigate labour shortages and enable high-performance collaborative worker groups.
Lightweight design is critical for improving the efficiency and sustainability of engineering applications. Laminated composites, with their high strength-to-weight ratio and tailored material properties, play a key role but introduce interlaminar stresses, particularly near free edges where delamination failures often occur. Addressing these stresses typically requires computationally expensive 3D finite element simulations, limiting their use in early design stages. This study presents a machine learning approach using Gaussian process regression and artificial neural networks to efficiently predict interlaminar stresses based on in-plane stress data from shell FE simulations. Achieving high predictive accuracy, this method enables cost-effective, early-stage composite design optimization under complex loading scenarios.
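The surrogate-modelling idea in the abstract above can be illustrated with a minimal sketch: fit a Gaussian process regressor that maps in-plane stress components to an interlaminar stress quantity. The data here are synthetic and the feature choice is an assumption for illustration; the study's actual inputs come from shell FE simulations.

```python
# Illustrative sketch only: a Gaussian process surrogate mapping
# in-plane shell stresses to one interlaminar stress component.
# Synthetic data stand in for the paper's FE simulation results.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

rng = np.random.default_rng(0)
# Scaled in-plane stress components (sigma_x, sigma_y, tau_xy) as features.
X = rng.uniform(-1.0, 1.0, size=(200, 3))
# Stand-in target: a smooth function of the features plus small noise.
y = 0.5 * X[:, 0] - 0.3 * X[:, 2] + 0.05 * rng.normal(size=200)

gpr = GaussianProcessRegressor(kernel=ConstantKernel() * RBF(), alpha=1e-3)
gpr.fit(X, y)

# The GP returns both a prediction and an uncertainty estimate,
# which is useful for judging surrogate trustworthiness in design loops.
y_pred, y_std = gpr.predict(X[:5], return_std=True)
```

Once trained, such a surrogate replaces the expensive 3D FE evaluation inside early-stage optimization loops, with the predictive standard deviation flagging regions where more training data would be needed.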
Additive manufacturing (AM) enables the creation of complex internal geometries, including cooling channels. Yet, the impact of AM-induced surface roughness on their fluid dynamics remains underexplored. The goal of this study is to provide insight into the effects of surface roughness on the fluid dynamics of AM channels. A parametric surface roughness model and computational fluid dynamics (CFD) simulations were employed to examine three representative AM channel cross-sections: diamond, droplet, and circular. The findings indicate that diamond profiles result in higher pressure losses and turbulence intensity compared to the other cross-sections. In contrast, droplet profiles exhibit lower pressure losses and turbulence intensity compared to diamond profiles, while circular channels remain optimal in non-overhang areas.
The overall quality of final Digital Twin (DT) solutions and their ability to produce useful insights are key considerations for researchers and for the industry to readily adopt them. However, validation of DTs is often neglected in existing research dedicated to their development. Further, there is a lack of methodologies for building bi-directional information exchanges between virtual and real spaces, potentially hindering effective decision-making. This work presents a comparative analysis of several quantitative metrics by implementing them on the Digital Twin of a railway braking system as a use case. Their suitability as performance measures for validation and as thresholds to support decision-making is assessed. Their integration into a novel DT structure is shown to contribute to a well-rounded validation procedure and a practical decision-making framework.
Today, manufacturing companies are adopting servitization strategies and the Product-Service System model to enhance value and remain competitive. Often, this transition also means embracing a System-of-Systems (SoS) perspective. Concurrently, companies face challenges with volatile, uncertain, complex, and ambiguous (VUCA) environments. One way to tackle VUCA is to utilize simulation modeling. However, developing SoS simulations can be complex and cumbersome. This paper extracts lessons learned from six case studies to identify effective and ineffective practices in developing simulation models. The analysis has led to nine design principles for more effective simulation modeling. Furthermore, the paper explores simulation techniques for modeling SoS and discusses effective VUCA management. Finally, the paper proposes four future research directions to advance SoS simulation research.
Lightweight design plays a crucial role in enhancing efficiency and sustainability. The strategic use of advanced materials, such as fiber-reinforced plastics, can help achieve lightweight designs. However, the anisotropic material properties of composite materials also lead to new challenges in the design and manufacturing process. Additionally, due to the layered structure of composite parts, the number of design points increases drastically. Moreover, the complex manufacturing process, including curing, makes composite parts prone to variations. Therefore, this research paper presents an innovative lightweight design approach that aims to overcome these difficulties by linking the individual simulation steps, providing a continuous simulation strategy, and taking variations into account. Finally, the presented simulation strategy is applied to an electrified cross skate.
Carbon fiber-reinforced plastics (CFRP) have a great lightweight potential due to their high strength-to-weight ratio. However, new challenges arise due to complex production processes and the large number of design parameters that are subject to variations. This study advances simulation methodologies to address these challenges by modeling the entire CFRP production process while accounting for fiber angle variations at each step. The approach enables prediction of assembly stresses and deformations by utilizing surrogate models, and supports further approaches, such as tolerance optimization and process refinement. Two case studies demonstrate the effectiveness of the method and illustrate its potential to support the optimization of the production process.
Recent academic contributions explore the integration of Digital Twins (DTs) within smart Product-Service System (sPSS). This integration aims to innovate business propositions, hardware and services. However, gaps persist in developing DT environments to support early-stage collaborative innovation for sPSS, and limited studies explore how real-time synchronized digital replicas enhance value co-creation in this area. This paper addresses this gap by presenting a framework and practical example of integrating value-driven decision support into early sPSS conceptual design. A case study on the development of the Smart Electric Vehicle (SEV) conducted with a global automotive Original Equipment Manufacturer (OEM) demonstrates the framework’s efficacy. Through qualitative data analyses based on experimental validation in a case company, the DT proves effective in aiding decision makers in selecting value-adding configurations within specific scenarios. Furthermore, the DT serves as a visual decision-making tool, fostering collaboration across diverse teams within the automotive company. This collaboration facilitates value creation across practitioners with varied backgrounds, emphasizing the DT’s role in enhancing early-stage innovation and co-creation processes in the sPSS domain.
Often in Software Engineering, a modeling formalism has to support scenarios of inconsistency in which several requirements either reinforce or contradict each other. Paraconsistent transition systems are proposed in this paper as one such formalism: states evolve through two accessibility relations capturing weighted evidence of a transition or its absence, respectively. Their weights come, parametrically, from a residuated lattice. This paper explores both i) a category of these systems, and the corresponding compositional operators and ii) a modal logic to reason upon them. Furthermore, two notions of crisp and graded simulation and bisimulation are introduced in order to relate two paraconsistent transition systems. Finally, results of modal invariance, for specific subsets of formulas, are discussed.
In this chapter, we describe how to jointly model continuous quantities, by representing them as multiple continuous random variables within the same probability space. We define the joint cumulative distribution function and the joint probability density function and explain how to estimate the latter from data using a multivariate generalization of kernel density estimation. Next, we introduce marginal and conditional distributions of continuous variables and also discuss independence and conditional independence. Throughout, we model real-world temperature data as a running example. Then, we explain how to jointly simulate multiple random variables, in order to correctly account for the dependence between them. Finally, we define Gaussian random vectors which are the most popular multidimensional parametric model for continuous data, and apply them to model anthropometric data.
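The chapter's two central operations—estimating a joint density with multivariate kernel density estimation and jointly simulating dependent variables—can be sketched as follows. This is a minimal sketch using SciPy's `gaussian_kde` on synthetic correlated data, not the book's temperature dataset.

```python
# Minimal sketch of multivariate kernel density estimation and
# joint simulation, assuming synthetic correlated 2-D data in place
# of the chapter's real-world temperature example.
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(1)
# Two dependent variables: correlated Gaussian samples.
cov = np.array([[1.0, 0.8], [0.8, 1.0]])
samples = rng.multivariate_normal([0.0, 0.0], cov, size=1000)

# gaussian_kde expects shape (dimensions, n_samples).
kde = gaussian_kde(samples.T)
density = kde(np.array([[0.0], [0.0]]))  # estimated joint density at the origin

# Joint simulation: resampling from the fitted density preserves the
# dependence between the variables, unlike sampling each marginal separately.
new_draws = kde.resample(5, seed=2)
```

Sampling each variable independently from its marginal would destroy the correlation; resampling from the fitted joint density is what "correctly accounting for dependence" means here.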
The European Resuscitation Council (ERC) establishes guidelines for cardiopulmonary resuscitation (CPR) under standard conditions and special circumstances but without specific instructions for nighttime situations with reduced visibility. The aim of this study was to evaluate the feasibility of performing CPR at night under two different conditions, in darkness with ambient light and with the additional illumination of a headlamp, as well as to determine the quality of the maneuver.
Methods:
A randomized crossover pilot study involving nineteen lifeguards was conducted, with each participant performing two five-minute CPR tests: one in complete darkness with a headlamp, and one in a natural night environment at the beach without additional lighting. Both tests were conducted at night with a 30:2 ratio of chest compressions (CC) to ventilations using the mouth-to-pocket-mask technique, with a 30-minute break between them. Outcome measures included quality of CPR, number of CCs, mean depth of CCs, mean rate of CCs, and number of effective ventilations. Results were reported as the mean or median difference (MD) between the two conditions with a 95% confidence interval (CI) using techniques for paired data.
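The paired analysis described above—a mean difference with a 95% confidence interval for a crossover design—can be sketched in a few lines. The compression counts below are synthetic placeholders, not the study's data.

```python
# Illustrative paired-difference analysis for a crossover design:
# mean difference (MD) and a t-based 95% CI. Numbers are synthetic.
import math
import statistics

with_lamp    = [112, 108, 115, 110, 109, 113, 111, 114]
without_lamp = [104, 101, 109, 103, 102, 106, 105, 107]

# Each participant serves as their own control: analyze the differences.
diffs = [b - a for a, b in zip(with_lamp, without_lamp)]  # without minus with
n = len(diffs)
md = statistics.mean(diffs)
se = statistics.stdev(diffs) / math.sqrt(n)
t_crit = 2.365  # t-quantile for a 95% CI with n - 1 = 7 degrees of freedom
ci = (md - t_crit * se, md + t_crit * se)
```

A CI that excludes zero, as in the study's compression-count and compression-rate results, indicates a statistically significant difference between the two lighting conditions.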
Results:
There were no statistically significant differences between the two lighting conditions for the outcomes of CPR quality, mean depth of CCs, or number of effective ventilations. The number of CCs was lower when performed without the headlamp (MD: -8; 95%CI, -15 to 0). In addition, the mean rate of CCs was lower when performed without the headlamp (MD: -3; 95%CI, -5 to -1).
Conclusions:
The rescuers performed CPR at night with good quality, both in darkness and with the illumination of a headlamp. The use of additional lighting with a headlamp does not appear to be essential for conducting resuscitation.
The ARLE GPS tool provides computer-aided design support for solving problems in the spatial planning and design of houses, using a robust design model with physical-biological and cost strategies. This enables architects to eliminate uncertainties and to make robust decisions by applying computational thinking to decision making and action implementation. This support enables the architect to deal with the complexity arising from the interrelationships between the design variables and transforms the spatial planning problem, which is conceptualized as ill-defined, into a well-defined problem. A scientific method is used, based on mathematical modeling of the action-decision field of geometric design variables, rather than a drawn method involving sketches. The tool acts as an aid mechanism, an assembler, a simulator, and an evaluator of geometric prototypes (virtual or graphical) and can be used to systematize the assembly or modeling of the FPL structure, particularly with respect to the performance required of a house. The candidate solution provided by the tool defines the spatial dimensions of the rooms in the house, the topological data of the assembly sequence, and the connections between rooms. The architect converts this virtual prototype into a graphical FPL prototype, which is then modeled, refined, and evaluated continuously and objectively with the aid of ARLE GPS until a solution is obtained that satisfies the requirements, constraints, and objectives of the problem. In this way, a solution to the problem (i.e., the project) can be captured and generated.
This chapter covers applications of quantum computing in the area of nuclear and particle physics. We cover algorithms for simulating quantum field theories, where end-to-end problems include computing fundamental physical quantities and scattering cross sections. We also discuss simulations of nuclear physics, which encompasses individual nuclei as well as dense nucleonic matter such as neutron stars.
This chapter covers applications of quantum computing in the area of quantum chemistry, where the goal is to predict the physical properties and behaviors of atoms, molecules, and materials. We discuss algorithms for simulating electrons in molecules and materials, including both static properties such as ground state energies and dynamic properties. We also discuss algorithms for simulating static and dynamic aspects of vibrations in molecules and materials.
To enhance the radiological and nuclear emergency preparedness of hospitals while responding to the refugee crisis, the Government of the Republic of Moldova implemented an innovative approach supported by the World Health Organization (WHO). This initiative featured a comprehensive package that integrated health system assessment, analysis of existing plans and procedures, and a novel medical training component. The training, based on relevant WHO and International Atomic Energy Agency (IAEA) guidance, combined theory with contemporary adult learning solutions, such as practical skill stations, case reviews, and clinical simulation exercises.
This method allowed participants to identify and address gaps in their emergency response capacities, enhancing their ability to ensure medical management of radiological and nuclear events. This course is both innovative and adaptable, offering a potential model for other countries seeking to strengthen radiological and nuclear emergency response capabilities of the acute care clinical providers.
Due to the scarcity of data, the demographic regime of pre-plague England is poorly understood. In this article, we review the existing literature to estimate the mean age at first marriage for women (24) and men (27), the remaining life expectancy at first marriage for men (25 years), the mean household size (5.8), and marital fertility around 1300. Based on these values, we develop a macrosimulation that creates a consistent image of English demography at its medieval population peak, one that reflects a Western European marriage pattern with a relatively high share of celibates.
This chapter surveys some of the many types of models used in science, and some of the many ways scientists use models. Of particular interest for our purposes are the relationships between models and other aspects of scientific inquiry, such as data, experiments, and theories. Our discussion shows important ways in which modeling can be thought of as a distinct and autonomous scientific activity, while also showing that models can be crucial for making use of data and theories and for performing experiments. The growing reliance on simulation models has raised new and important questions about the kind of knowledge gained by simulations and the relationship between simulation and experimentation. Is it important to distinguish between simulation and experimentation, and if so, why?