After discussing the divorce of configuration and observable that is characteristic of the quantum description of reality, this book introduces the reader to the remarkable computational power afforded by quantum computation.
The digital twin (DT) approach has gained recognition as a promising solution to the challenges faced by the Architecture, Engineering, Construction, Operations, and Management (AECOM) industries. However, its broader application across some AECOM sectors remains limited. A significant obstacle is that traditional DTs rely on deterministic models, which require deterministic input parameters. This limits their accuracy, as they do not account for the substantial uncertainties that are inherent in AECOM projects. These uncertainties are particularly pronounced in geotechnical design and construction. To address this challenge, we propose a probabilistic digital twin (PDT) framework that extends traditional DT methodologies by incorporating uncertainties and is tailored to the requirements of geotechnical design and construction. The PDT framework provides a structured approach to integrating all sources of uncertainty, including aleatoric, data, model, and prediction uncertainties, and propagates them throughout the entire modeling process. To ensure that site-specific conditions are accurately reflected as additional information is obtained, the PDT leverages Bayesian methods for model updating. The effectiveness of the PDT framework is showcased through an application to a highway foundation construction project, demonstrating its potential to integrate existing probabilistic methods to improve decision-making and project outcomes in the face of significant uncertainties. By embedding these methods within the PDT framework, we lower the barriers to practical implementation, making probabilistic approaches more accessible and applicable in real-world engineering workflows.
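The Bayesian model updating that the abstract mentions can be illustrated with a minimal sketch: a conjugate Normal update of a single scalar parameter as site measurements arrive. The function name, parameter values, and interpretation (a stiffness-like soil parameter) are illustrative assumptions, not details from the paper.

```python
def bayes_update_normal(mu0, var0, obs, obs_var):
    """Conjugate update of a Normal prior over a scalar parameter
    (e.g., a soil stiffness) given one noisy site measurement."""
    post_var = 1.0 / (1.0 / var0 + 1.0 / obs_var)
    post_mu = post_var * (mu0 / var0 + obs / obs_var)
    return post_mu, post_var

# Prior belief, then two field measurements arriving during construction:
mu, var = 50.0, 25.0
for y in (42.0, 44.0):
    mu, var = bayes_update_normal(mu, var, y, obs_var=9.0)
# The posterior mean moves toward the data and the variance shrinks.
```

Each new observation tightens the posterior, which is the mechanism that lets a PDT reflect site-specific conditions as information accumulates.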
This paper documents the details of the design, verification, and certification of a novel technology: a remote monitoring system (digital twin) for a voyage data recorder, referred to as the HermAce Gateway. The electronic components, data-transfer, and storage principles are described, explaining how the HermAce Gateway communicates and records safety-critical messages. Various prospective benefits to the industry are provided, primarily regarding the opportunities for remote support and testing that the digital twin facilitates. The HermAce Gateway was independently verified through a combination of semi-automated software-in-the-loop and selected complementary hardware-in-the-loop tests. Different types of communication were simulated in multiple ways, including approximating real-world scenarios. Alarms contained in correctly formed messages were found to be detected and recorded by the HermAce Gateway, and the paper discusses how this evidence can be quantified in the context of reducing uncertainty in the reliability of a digital twin. Certification of a digital system is a new concept in the maritime industry. The identification of functional requirements, which informed the verification testing, and the development of an AI register for what is expected to be an increasing number of such systems are also documented.
This article examines the impact of generative artificial intelligence (GAI) on higher education, emphasizing its effects in broader educational contexts. As AI continues to reshape the landscape of teaching and learning, it is imperative for higher education institutions to adapt rapidly to equip graduates for the challenges of a progressively automated global workforce. However, a critical question emerges: will GAI lead to a more inclusive future of learning, or will it deepen existing divides and create a future where educational access and success are increasingly unequal? This study employs both theoretical and empirical approaches to explore the transformative potential of GAI. Drawing upon the literature on AI and education, we establish a framework that categorizes the essential knowledge and skills needed by graduates in the GAI era. This framework includes four key capability sets: AI ethics, AI literacy (focusing on human-replacement technologies), human–AI collaboration (emphasizing human augmentation), and human-distinctive capacities (highlighting unique human intelligence). Our empirical analysis involves scrutinizing GAI policy documents and the core curricula mandated for all graduates across leading Asian universities. Contrary to expectations of a uniform AI-driven educational transformation, our findings expose significant disparities in AI readiness and implementation among these institutions. These disparities, shaped by national and institutional specifics, are likely to exacerbate existing inequalities in educational outcomes, leading to divergent futures for individuals and universities alike in the age of GAI. Thus, this article not only maps the current landscape but also forecasts the widening educational gaps that GAI might engender.
Wheel-leg composite robots exhibit robust mobility and exceptional obstacle-crossing capabilities in complex environments. This paper proposes a novel transformable wheel-leg composite structure and presents the design of a wheel-leg composite obstacle-crossing robot, fundamentally configured as a two-wheeled quadruped. The research encompasses a comprehensive analysis of the robot’s overall mechanical structure, a detailed kinematic investigation of its body and obstacle-crossing gait planning, virtual prototype dynamics simulation, and field experimentation. Utilizing advanced modeling software, a 3D model of the robot was established. The kinematic characteristics of the robot in both wheeled and legged modes were thoroughly examined. Specifically, for the legged mode, the Denavit-Hartenberg coordinate system was established, and a detailed kinematic model was analyzed. The obstacle-crossing gait was planned based on the robot’s leg action mechanism. Furthermore, the Lagrangian method was employed to develop a mathematical model for the dynamics of the robot in both wheeled and legged modes, allowing for a comprehensive force analysis. To validate the feasibility and rationality of the robot’s obstacle-crossing capabilities under various conditions, extensive simulations and prototype tests were conducted across diverse terrains. The results provide valuable insights and practical guidance for the structural design of wheel-leg composite obstacle-crossing robots, contributing to advancements in this promising field.
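The Denavit-Hartenberg modeling mentioned above can be sketched with the standard single-joint DH transform chained into a planar two-link forward-kinematics example. The link lengths and joint angles below are illustrative placeholders, not the robot's actual parameters.

```python
import numpy as np

def dh_transform(theta, d, a, alpha):
    """Homogeneous transform for one joint in the standard
    Denavit-Hartenberg convention (theta, d, a, alpha)."""
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([
        [ct, -st * ca,  st * sa, a * ct],
        [st,  ct * ca, -ct * sa, a * st],
        [0.0,      sa,       ca,      d],
        [0.0,     0.0,      0.0,    1.0],
    ])

# Forward kinematics of a planar two-link leg: two revolute joints at 45 deg,
# link lengths 0.3 m and 0.2 m (all values illustrative).
T = dh_transform(np.pi / 4, 0.0, 0.3, 0.0) @ dh_transform(np.pi / 4, 0.0, 0.2, 0.0)
foot_xy = T[:2, 3]  # foot position in the base frame
```

Chaining one such transform per joint gives the foot pose in the base frame, which is the basis for the gait planning the abstract describes.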
Detecting cracks in underwater dams is crucial for ensuring the quality and safety of the dam. However, underwater dam cracks are easily obscured by aquatic plants. Traditional single-view visual inspection methods cannot effectively extract the feature information of occluded cracks, whereas multi-view crack images can recover occluded target features through feature fusion. In addition, underwater turbulence causes nonuniform diffusion of suspended sediments, so the image features of the multiple viewpoints are flooded by nonuniform noise that degrades the fusion effect. To address these issues, this paper proposes a multi-view fusion network (MVFD-Net) for crack detection in occluded underwater dams. First, we propose a feature reconstruction interaction encoder (FRI-Encoder), which exchanges the multi-scale local features extracted by a convolutional neural network with the global features extracted by a transformer encoder and performs feature reconstruction at the end of the encoder, enhancing feature extraction while suppressing interference from the nonuniform scattering noise. Subsequently, a multi-scale gated adaptive fusion module is introduced between the encoder and the decoder to perform gated feature fusion, which further complements and recovers detail information flooded by noise. Additionally, this paper designs a multi-view feature fusion module that fuses multi-view image features to restore occluded crack features and achieve the detection of occluded cracks. Through extensive experimental evaluations, the MVFD-Net algorithm achieves excellent performance when compared with current mainstream algorithms.
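The gated fusion idea can be illustrated with a minimal element-wise sketch: a gate in (0, 1) decides, per position, how much of each view's feature to keep. In the actual network the gate parameters would be learned; all names and values here are illustrative assumptions.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gated_fusion(feat_a, feat_b, w_gate, b_gate):
    """Element-wise gated fusion of two views' features: a gate in (0, 1)
    weights view A against view B at every position."""
    g = sigmoid(w_gate * (feat_a + feat_b) + b_gate)
    return g * feat_a + (1.0 - g) * feat_b

# With zero gate weight and bias the gate is 0.5 everywhere,
# so the fusion reduces to an element-wise average of the two views.
feat_a = np.array([1.0, 2.0])
feat_b = np.array([3.0, 0.0])
fused = gated_fusion(feat_a, feat_b, w_gate=0.0, b_gate=0.0)  # -> [2.0, 1.0]
```

Because the gate is a convex weight, the fused feature always stays between the two views' values, which is what lets a clean view compensate for a noise-flooded one.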
Trajectory optimization is a critical research area in robotics and automation, especially in manufacturing industries where mechanical systems are often required to minimize the execution time or the consumed energy. In this context, the most common mechanical systems are those with a single degree of freedom because of their simplicity and ease of control. In this paper, we present an approach for the online optimization of minimum-time and minimum-energy trajectories for a robotic system with one degree of freedom. Point-to-point motions of the considered linear axis are planned online, without idle times, by leveraging a verified dynamic model of the robotic system, which also includes an accurate identification of friction parameters. Both minimum-time and minimum-energy trajectories are considered, and the performance of the online optimization using a selected open-source optimization tool is verified in different dynamic conditions of the system. The results of extensive experiments on a belt-driven robotic axis demonstrate the feasibility and the energy-saving capabilities of the proposed approach, as well as the flexibility of the online trajectory optimization in different scenarios, while meeting the kinematic and dynamic limits of the system and guaranteeing low computational time.
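For a single axis, the classical baseline for a minimum-time point-to-point move is the trapezoidal (or triangular) velocity profile. The sketch below is that textbook baseline under symmetric rest-to-rest limits, ignoring friction and the paper's model-based refinements; the numeric limits are illustrative.

```python
import math

def trapezoidal_profile(dist, v_max, a_max):
    """Minimum execution time of a rest-to-rest point-to-point move
    under symmetric velocity and acceleration limits."""
    t_acc = v_max / a_max                    # time to reach v_max
    d_acc = 0.5 * a_max * t_acc ** 2         # distance covered while accelerating
    if 2.0 * d_acc >= dist:                  # triangular profile: v_max never reached
        return 2.0 * math.sqrt(dist / a_max)
    t_cruise = (dist - 2.0 * d_acc) / v_max  # constant-velocity cruise phase
    return 2.0 * t_acc + t_cruise

# A 1 m move with v_max = 0.5 m/s and a_max = 2 m/s^2 takes 2.25 s.
t = trapezoidal_profile(1.0, 0.5, 2.0)  # -> 2.25
```

An online optimizer refines such a profile against the identified dynamic model, trading the cruise speed and acceleration against energy when a minimum-energy objective is selected.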
Efficient local trajectory optimization of coordinated manipulators is a bottleneck task in narrow feeding scenarios. To optimize trajectories locally and generate collision-free trajectories with local support performance, an analytical reinforcement method is proposed. First, the multiple coordinated machines operating in the narrow space are transformed into decentralized dynamic constraints on the target manipulator. Combined with the circle-envelope method in the dynamic constraint, a collision-free gradient optimization function determines the support region of the locally optimal trajectory. Based on forward and inverse kinematics, collision-prone poses of the target manipulator outside the support region are analytically optimized, and a chi-square distribution further ensures smooth interpolation of the variable-period trajectory outside the fixed-period support region. For emergency collision avoidance of the coordinated manipulator in a flexible stamping line, the analytical reinforcement method is successfully verified by generating collision-free, smooth trajectories. It provides an optimization direction for rapidly improving the work efficiency of multi-machine coordination in narrow feeding scenarios.
Semi-simplicial and semi-cubical sets are commonly defined as presheaves over, respectively, the semi-simplex or semi-cube category. Homotopy type theory then popularized an alternative definition, where the set of $n$-simplices or $n$-cubes is instead regrouped into the families of the fibers over their faces, leading to a characterization we call indexed. Moreover, it is known that semi-simplicial and semi-cubical sets are related to iterated Reynolds parametricity, respectively, in their unary and binary variants. We exploit this correspondence to develop an original uniform indexed definition of both augmented semi-simplicial and semi-cubical sets, and fully formalize it in Coq.
While modern definitions of business processes exist and are shared in the business process management (BPM) community, a commonly agreed meta-model is still missing. Nonetheless, several different business process meta-models have been proposed and discussed in the literature, which look at business process models from different perspectives, focusing on different aspects and often using different labels for denoting the same element or element relation.
In this paper, we extend and consolidate an effort to build a business process meta-model starting from elements and relations discovered by inspecting the relevant literature through a systematic literature review. The obtained literature-based business process meta-model, which is purposely built to disclose critical issues, is then inspected, compared to a previous, more restricted version, and discussed. The analysis confirms a lack of attention to some crucial business process elements, as well as the presence of some unclear relations and subsumption cycles. Moreover, it brings to light new issues and inconsistencies in the meta-models proposed in the literature, which we address, at least in part, using an ontological analysis.
Stochastic actor-oriented models (SAOMs) were designed in the social network setting to capture network dynamics representing a variety of influences on network change. The standard framework assumes the observed networks are free of false positive and false negative edges, which may be an unrealistic assumption. We propose a hidden Markov model (HMM) extension to these models, consisting of two components: 1) a latent model, which assumes that the unobserved, true networks evolve according to a Markov process as they do in the SAOM framework; and 2) a measurement model, which describes the conditional distribution of the observed networks given the true networks. An expectation-maximization algorithm is developed for parameter estimation. We address the computational challenge posed by a massive discrete state space, of a size exponentially increasing in the number of vertices, through the use of the missing information principle and particle filtering. We present results from a simulation study, demonstrating that our approach improves estimation accuracy, in contrast to the standard SAOM, when the underlying networks are observed with noise. We apply our method to functional brain networks inferred from electroencephalogram data, revealing larger effect sizes when compared to the naive approach of fitting the standard SAOM.
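The particle-filtering step can be illustrated in miniature with a bootstrap filter on a one-dimensional Gaussian random-walk HMM, a toy stand-in for the paper's network-valued state space. All parameters, names, and the observation sequence are illustrative assumptions.

```python
import math
import random

def particle_filter(obs, n=500, step_sd=0.5, obs_sd=1.0):
    """Bootstrap particle filter for a 1-D Gaussian random-walk HMM:
    propagate particles through the latent model, weight them by the
    measurement likelihood, resample, and record the posterior mean."""
    particles = [0.0] * n
    estimates = []
    for y in obs:
        # Latent transition: each particle takes a Gaussian step.
        particles = [x + random.gauss(0.0, step_sd) for x in particles]
        # Measurement model: Gaussian likelihood of the observation.
        weights = [math.exp(-0.5 * ((y - x) / obs_sd) ** 2) for x in particles]
        # Resample in proportion to the weights.
        particles = random.choices(particles, weights=weights, k=n)
        estimates.append(sum(particles) / n)
    return estimates

random.seed(0)
est = particle_filter([0.2, 0.5, 1.1, 0.9])
```

The same propagate-weight-resample loop carries over when the state is a network rather than a scalar; only the transition and measurement models change.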
Collaborative engineering design is increasingly important for modern engineering practices as projects routinely require collaboration across multiple domains. Reaching shared understanding within the team is a critical factor in constructing a successful and enjoyable collaboration. One way to promote shared understanding is through the use of design artifacts and design representations as boundary objects. Different design representations have unique characteristics that benefit the engineering design process but could also hinder the development of shared understanding. It is important to identify the limitations of the design artifacts to select the suitable design artifact for the situation and mitigate potential adverse effects, including design fixation and miscommunication. Despite previous studies’ findings, there are still unsolved questions regarding the exact effect of the modality of the design representations on the development of team-shared understanding. This work examines three types of commonly used design representations in the engineering design community, namely, textual description, hand sketch and engineering CAD model. Their unique effect on the development of shared understanding is investigated in a collaborative engineering design setting. The results indicate that the modality of the design artifact affects the development of shared understanding, and that using visual representations can yield better team outcomes regardless of the modality complexity, mainly for design structures. This work shows the importance of using the proper design representation in collaborative engineering design tasks, and such a finding is a critical and timely reminder in the current age when team interactions constantly involve text-dominant online communications.
This self-contained guide introduces the two pillars of data science, probability theory and statistics, side by side, in order to illuminate the connections between statistical techniques and the probabilistic concepts they are based on. The topics covered in the book include random variables, nonparametric and parametric models, correlation, estimation of population parameters, hypothesis testing, principal component analysis, and both linear and nonlinear methods for regression and classification. Examples throughout the book draw from real-world datasets to demonstrate concepts in practice and confront readers with fundamental challenges in data science, such as overfitting, the curse of dimensionality, and causal inference. Code in Python reproducing these examples is available on the book's website, along with videos, slides, and solutions to exercises. This accessible book is ideal for undergraduate and graduate students, data science practitioners, and others interested in the theoretical concepts underlying data science methods.
The manufacturing sector is witnessing a paradigm shift toward servitization, where companies are transitioning from selling products to offering product–service systems. This shift creates additional challenges, where the providers must ensure the expected value throughout the operational phase of the solution. Especially when dealing with a system-of-systems (SoS), evaluating performance across diverse contexts and business models while understanding the interconnectedness between systems becomes critical. To address these challenges during the design phase, this article presents a novel integrated simulation framework that supports the development team in exploring value from a SoS perspective. This framework utilizes agent-based simulation and offers three key features: multifidelity, modularity, and multidisciplinarity. The applicability of the proposed framework is further demonstrated in a quarry industry case.
Human-centered design involves designing for users who may have social identities that are dissimilar from designers’ social identities. These differences could impact designers’ ability to understand users’ needs and integrate considerations of social identity into design decisions. Reflective interventions could encourage designers to actively consider social identity in design and our aim in this research is to explore this hypothesis through an experimental study. We tested the effects of completing a social identity-based reflection exercise on novice designers’ task clarification behavior. We also qualitatively examined the quality and content of the reflection responses. We find that participants who completed the intervention generated more social identity-focused design requirements, irrespective of the persona provided to them. Additionally, the content analysis revealed that designers who occupy minority identities (e.g., women and students of color) were more likely to provide deeper and higher-quality reflection responses. These findings suggest that reflective interventions could be an effective mechanism to promote inclusive design, leading to the design of products that users across social identities can use equitably. Furthermore, designers with different social identities may require different reflection cues (e.g., ones more focused on their personal experiences), to encourage deeper reflection on the effects of social identity in design.
Unmanned surface vehicles (USVs) frequently encounter inadequate energy levels while navigating to their destinations, which complicates their successful berthing in intricate harbor environments. A bacterial foraging optimization (BFO) algorithm that takes energy consumption into account and incorporates multiple constraints (MC-BFO) is proposed. The energy consumption model is redefined for wind environments, enhancing the sensitivity of USVs to wind conditions. Additionally, a reward function is integrated into the algorithm, and the fitness function is reconstructed to improve the goal orientation of the USV. This approach enables the USV to maintain a reasonable path length while pursuing low energy consumption, resulting in more practical navigation. Constraining the USV’s sailing posture for smoother paths and restricting the USV’s heading and speed near the berthage facilitate safe berthing. Finally, three distinct experimental environments are established to compare the paths generated by MC-BFO, BFO, and a genetic algorithm under both downwind and upwind conditions, ensuring consistency in relevant parameters. Data on sailing posture, energy consumption, and path length are collected, generalized, and analyzed. The results indicate that MC-BFO effectively reduces energy consumption while maintaining an acceptable path length, resulting in smoother and more coherent paths compared to traditional segmented planning. In conclusion, this method significantly enhances the quality of the berthing path.
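The chemotaxis (tumble-and-swim) step at the heart of BFO can be sketched as follows, with plain distance-to-berth standing in for the paper's energy- and posture-aware fitness. Step sizes, the cost function, and the berth location are all illustrative assumptions.

```python
import math
import random

def chemotaxis_step(pos, cost, step=0.1, swim_len=4):
    """One BFO chemotaxis move: tumble to a random heading, then keep
    swimming along that heading while the cost strictly improves."""
    angle = random.uniform(0.0, 2.0 * math.pi)
    dx, dy = math.cos(angle), math.sin(angle)
    best = cost(pos)
    for _ in range(swim_len):
        cand = (pos[0] + step * dx, pos[1] + step * dy)
        if cost(cand) >= best:
            break
        pos, best = cand, cost(cand)
    return pos

# Drive a USV from the origin toward a berth at (1, 1), using plain
# distance as the cost; MC-BFO would add energy and posture terms here.
random.seed(1)
pos = (0.0, 0.0)
for _ in range(200):
    pos = chemotaxis_step(pos, lambda p: math.hypot(1.0 - p[0], 1.0 - p[1]))
```

MC-BFO's contribution lies in what goes into `cost`: the wind-aware energy model, the reward term, and the posture constraints near the berthage all reshape this landscape.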
This article examines the creation of An Urban Archive as an English Garden, a work that uses GPU-accelerated low-resolution wavefield synthesis (WFS) to combine field recordings, live performance and generative audio in real time. Owing to computational overhead, WFS is often pre-rendered, leading to a less dynamic and more static scope for the embodied and intersubjective nature of human sensory understanding, a tendency that can also be found in traditional soundscape composition. We argue that engagement with real-time WFS offers a new approach to soundscape composition, wherein the musical system may take on multiple agencies: that of virtual environment, co-creator, archive and hybrid instrument. Through a post-phenomenological lens, an analysis of the work’s creation through different domains reveals how these technologies afford novel practices to engage with our sonic environments. Additionally, the article unpacks how this same process, grounded in site-responsive design, offers new approaches to composition, performance and artistic collaboration across these practices.
This paper presents the design and experimental validation of a robust flight control strategy for quadrotor unmanned aerial vehicles (UAVs) based on the Interconnection and Damping Assignment Passivity-Based Control (IDA-PBC) methodology. The proposed approach is specifically tailored to the Parrot Bebop 2, a commercial UAV. The IDA-PBC control law is derived using the Hamiltonian model of the UAV dynamics obtained from experimental data to represent the dynamics of all six degrees of freedom, including translational and rotational motions. The control strategy was validated through numerical simulations and experimental tests conducted in an indoor flight setup using MATLAB, Robot Operating System, and an OptiTrack motion capture system. Numerical and experimental results demonstrate that the controller effectively tracks desired flight trajectories, ensuring stable and robust performance.
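The control design the abstract describes can be summarized by the standard IDA-PBC relations in their general textbook form (not the paper's specific derivation): the quadrotor is written as a port-Hamiltonian system and a target Hamiltonian $H_d$ with a minimum at the desired equilibrium is assigned, subject to the matching condition on the unactuated directions.

```latex
% Open-loop port-Hamiltonian model
\dot{x} = \bigl[J(x) - R(x)\bigr]\,\nabla H(x) + g(x)\,u
% Matching condition on the unactuated directions, with g^{\perp}(x)\,g(x) = 0
g^{\perp}(x)\Bigl\{\bigl[J_d(x) - R_d(x)\bigr]\nabla H_d(x)
  - \bigl[J(x) - R(x)\bigr]\nabla H(x)\Bigr\} = 0
```

Here $J_d = -J_d^{\top}$ is the desired interconnection matrix and $R_d = R_d^{\top} \succeq 0$ the desired damping; once the matching condition holds, the closed loop inherits stability at the minimum of $H_d$ from passivity.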