This paper discusses the development of synthetic cohomology in Homotopy Type Theory (HoTT), as well as its computer formalisation. The objectives of this paper are (1) to generalise previous work on integral cohomology in HoTT by the current authors and Brunerie (2022) to cohomology with arbitrary coefficients and (2) to provide the mathematical details of, as well as extend, results underpinning the computer formalisation of cohomology rings by the current authors and Lamiaux (2023). With respect to objective (1), we provide new direct definitions of the cohomology group operations and of the cup product, which, just as in the previous work by the current authors and Brunerie (2022), enable significant simplifications of many earlier proofs in synthetic cohomology theory. In particular, the new definition of the cup product allows us to give the first complete formalisation of the axioms needed to turn the cohomology groups into a graded commutative ring. We also establish that this cohomology theory satisfies the HoTT formulation of the Eilenberg–Steenrod axioms for cohomology and study the classical Mayer–Vietoris and Gysin sequences. With respect to objective (2), we characterise the cohomology groups and rings of various spaces, including the spheres, torus, Klein bottle, real/complex projective planes, and infinite real projective space. All results have been formalised in Cubical Agda, and we obtain multiple new numbers, similar to the famous ‘Brunerie number’, which can be used as benchmarks for computational implementations of HoTT. Some of these numbers are infeasible to compute in Cubical Agda and hence provide new computational challenges and open problems which are much easier to define than the original Brunerie number.
The family of relevant logics can be organized by a hierarchy of increasingly fine-grained variable sharing properties, each requiring that, in a valid entailment $A\to B$, some atom appear in both $A$ and $B$, possibly subject to an additional condition (e.g., with the same sign or nested within the same number of conditionals). In this paper, we consider an exceptionally strong variable sharing property, lericone relevance, that takes into account the path of negations and conditionals in which an atom appears in the parse trees of the antecedent and consequent. We show that lericone relevance holds of the relevant logic $\mathbf {BM}$ (and that a related property, faithful lericone relevance, holds of $\mathbf {B}$) and characterize the largest fragments of classical logic with these properties. Along the way, we consider the consequences of lericone relevance for the theory of subject-matter, for Logan’s notion of hyperformalism, and for the very definition of a relevant logic itself.
In previous publications, it was shown that finite non-deterministic matrices are quite powerful in providing semantics for a large class of normal and non-normal modal logics. However, some modal logics, such as those whose axiom systems contain the Löb axiom or the McKinsey formula, were not analyzed via non-deterministic semantics. Furthermore, modal rules other than the rule of necessitation had not yet been characterized in the framework.
In this paper, we overcome this shortcoming and present a novel approach for constructing semantics for normal and non-normal modal logics based on restricted non-deterministic matrices. This approach offers a uniform semantical framework for modal logics while keeping the interpretation of the involved modal operators the same, thus making different systems of modal logic comparable. It might also lead to a new understanding of the concept of modality.
In this paper, we aim to investigate the fluid model associated with an open large-scale storage network of non-reliable file servers with finite capacity, where new files can be added, and a file with only one copy can be lost or duplicated. The Skorokhod problem with oblique reflection in a bounded convex domain is used to identify the fluid limits. This analysis involves three regimes: the under-loaded, the critically loaded, and the overloaded regimes. The overloaded regime is of particular importance. To identify the fluid limits, new martingales are derived, and an averaging principle is established. This paper extends the results of El Kharroubi and El Masmari [7].
This paper introduces the Arachne System, a scalable, cost-effective mobile microrobot swarm platform designed for educational and research applications. It details the design, functionalities, and potential of the Arachne Bots, emphasizing their accessibility to users with minimal robotics expertise. By providing a comprehensive overview of the system’s hardware, sensory capabilities, and control algorithms, the paper demonstrates the platform’s capacity to democratize and reduce entry barriers in mobile robotic swarms research, fostering innovation and educational opportunities in the field. Extensive experimental validation of the system showcases its broad range of capabilities and effectiveness in real-world implementation.
We discuss the logical principle of extensionality for set-valued operators and its relation to mathematical notions of continuity for these operators in the context of systems of finite types as used in proof mining. Concretely, we initially exhibit an issue that arises with treating full extensionality in the context of the prevalent intensional approach to set-valued operators in such systems. Motivated by this issue, we discuss a range of useful fragments of the full extensionality statement in which the issue is avoided, and examine their interrelations. Further, we study the continuity principles associated with these fragments of extensionality and show how they can be introduced into the logical systems via a collection of axioms that do not contribute to the growth of extractable bounds from proofs. In particular, we place an emphasis on a variant of extensionality and continuity formulated using the Hausdorff metric and, in the course of our discussion, employ a tame treatment of suprema over bounded sets developed by the author in previous work to provide the first proof-theoretically tame treatment of the Hausdorff metric in systems geared for proof mining. To illustrate the applicability of these treatments to the extraction of quantitative information from proofs, we provide an application of proof mining to the Mann iteration of set-valued mappings that are nonexpansive with respect to the Hausdorff metric and extract highly uniform and effective quantitative information on the convergence of that method.
Structural induction is pervasively used by functional programmers and researchers both for informal reasoning and in formal methods for program verification and semantics. In this paper, we promote its dual—structural coinduction—as a technique for understanding corecursive programs only in terms of the logical structure of their context. We illustrate this technique as an informal method of proof that closely matches the style of informal inductive proofs, where it is straightforward to check that all cases are covered and the coinductive hypotheses are used correctly. This intuitive idea is then formalized through a syntactic theory for deriving program equalities, which is justified purely in terms of the computational behavior of abstract machines and proved sound with respect to observational equivalence.
Counting independent sets in graphs and hypergraphs under a variety of restrictions is a classical question with a long history. It is the subject of the celebrated container method, which has found numerous spectacular applications over the years. We consider how many independent sets a graph can have under structural restrictions. We show that any $n$-vertex graph with independence number $\alpha$ without $bK_a$ as an induced subgraph has at most $n^{O(1)} \cdot \alpha ^{O(\alpha )}$ independent sets. This substantially improves the trivial upper bound of $n^{\alpha }$ whenever $\alpha \le n^{o(1)}$ and gives a characterisation of the graphs whose exclusion allows for such an improvement. It is also in general tight up to a constant in the exponent, since there exist triangle-free graphs with $\alpha ^{\Omega (\alpha )}$ independent sets. We also prove that if one additionally assumes the ground graph is $\chi$-bounded, the bound improves to $n^{O(1)} \cdot 2^{O(\alpha )}$, which is tight up to a constant factor in the exponent.
Most research on UAV swarm architectures remains confined to simulation-based studies, with limited real-world implementation and validation. In order to mitigate this issue, this research presents an improved task allocation and formation control system within ARCog-NET (Aerial Robot Cognitive Architecture), aimed at deploying autonomous UAV swarms as a unified and scalable solution. The proposed architecture integrates perception, planning, decision-making, and adaptive learning, enabling UAV swarms to dynamically adjust path planning, task allocation, and formation control in response to evolving mission demands. Inspired by artificial intelligence and cognitive science, ARCog-NET employs an Edge-Fog-Cloud (EFC) computing model, where edge UAVs handle real-time data acquisition and local processing, fog nodes coordinate intermediate control, and cloud servers manage complex computations, storage, and human supervision. This hierarchical structure balances real-time autonomy at the UAV level with high-level optimization and decision-making, creating a collective intelligence system that automatically fine-tunes decision parameters based on configurable triggers. To validate ARCog-NET, a realistic simulation framework was developed using SITL (Software-In-The-Loop) with actual flight controller firmware and ROS-based middleware, enabling high-fidelity emulation. This framework bridges the gap between virtual simulations and real-world deployments, allowing evaluation of performance in environmental monitoring, search and rescue, and emergency communication network deployment. Results demonstrate superior energy efficiency, adaptability, and operational effectiveness compared to conventional robotic swarm methodologies. 
By dynamically optimizing data processing based on task urgency, resource availability, and network conditions, ARCog-NET bridges the gap between theoretical swarm intelligence models and real-world UAV applications, paving the way for future large-scale deployments.
Stroke is a prevalent neurological event that often induces significant motor impairments in the upper extremities, such as hemiplegia, which impacts bimanual coordination and fine motor skills. Robotic-assisted therapy has gained prominence as a contemporary rehabilitation modality, providing augmented motor repetitions and proprioceptive feedback, thereby potentiating neuroplasticity and functional recovery. This pilot study aimed to examine the therapeutic efficacy of a robotic intervention for wrist rehabilitation in two post-stroke adults aged 50–70 years. The intervention protocol, implemented biweekly over four weeks, encompassed 45-minute sessions consisting of passive muscle elongation (5 min) and robotic-facilitated exercises targeting pronation-supination (10 min), flexion-extension (10 min), and radial-ulnar deviation (10 min). Outcome measures included pre- and post-intervention assessments utilizing the motor activity log, Fugl-Meyer Scale, and robotic metrics for muscular strength. Results indicated enhancements in joint range of motion, motor precision, and neuromuscular control, with patient “B” demonstrating superior improvements, particularly in complex motor patterns. In contrast, patient “A” exhibited attenuated progress, attributable to pronounced baseline deficits and fatigue. Specific gains were observed in flexion-extension for patient “A” and pronation-supination for patient “B,” with minimal advancements in radial-ulnar deviation across both subjects. These findings provide preliminary evidence supporting the efficacy of robotic-assisted therapy in motor rehabilitation post-stroke with the novel proposed wrist rehabilitation device.
Recent academic contributions explore the integration of Digital Twins (DTs) within smart Product-Service System (sPSS). This integration aims to innovate business propositions, hardware and services. However, gaps persist in developing DT environments to support early-stage collaborative innovation for sPSS, and limited studies explore how real-time synchronized digital replicas enhance value co-creation in this area. This paper addresses this gap by presenting a framework and practical example of integrating value-driven decision support into early sPSS conceptual design. A case study on the development of the Smart Electric Vehicle (SEV) conducted with a global automotive Original Equipment Manufacturer (OEM) demonstrates the framework’s efficacy. Through qualitative data analyses based on experimental validation in a case company, the DT proves effective in aiding decision makers in selecting value-adding configurations within specific scenarios. Furthermore, the DT serves as a visual decision-making tool, fostering collaboration across diverse teams within the automotive company. This collaboration facilitates value creation across practitioners with varied backgrounds, emphasizing the DT’s role in enhancing early-stage innovation and co-creation processes in the sPSS domain.
Often in Software Engineering, a modeling formalism has to support scenarios of inconsistency in which several requirements either reinforce or contradict each other. Paraconsistent transition systems are proposed in this paper as one such formalism: states evolve through two accessibility relations capturing weighted evidence of a transition or of its absence, respectively. Their weights come, parametrically, from a residuated lattice. This paper explores (i) a category of these systems and the corresponding compositional operators, and (ii) a modal logic to reason about them. Furthermore, two notions of crisp and graded simulation and bisimulation are introduced in order to relate two paraconsistent transition systems. Finally, results of modal invariance, for specific subsets of formulas, are discussed.
In this study, professional engineers and designers (n = 30) participated in a 1-hour-long design activity in which they brainstormed a list of ideas for two design problems (a smart grill and a smart laundry machine), created a sketched concept for each design problem, filled out a survey about their perceptions of the market for the concept they developed, participated in a bias mitigation intervention and then repeated the pre-intervention steps. The design problems were intended to trigger availability bias based on the participants’ occupations (engineers and designers at a kitchen appliance company) as well as conflict between the gender of the participants and the gender-stereotyping of the household tasks fulfilled by the smart machines. Based on correlations in the market survey, the participants, who were mostly men, displayed availability bias toward the smart laundry machine design problem. A key marker of availability bias – an association between participants’ personal enjoyment of the product and the belief that the product would be commercially successful – was eliminated after the bias mitigation intervention. Qualitative analysis of participants’ reflections indicated that the intervention primarily assisted designers in making additional considerations for users, such as increasing accessibility and building awareness of excluded user groups.
The central role of creativity in product engineering is evident in the generation of solutions with high innovative potential. Even in times of artificial intelligence, creativity is still a skill in which humans outperform machines. Product engineering activities often take place in distributed environments, which elevates the importance of creative tasks due to the unique challenges these settings present. Furthermore, these distributed environments frequently involve intercultural teams. Intercultural team settings bring additional benefits but also challenges. To support the creative processes of intercultural, distributed product engineering teams, the cultural synergy spectrum (CSS) method has been developed. The CSS method is designed to assist distributed product engineering teams in being creative while being culturally sensitive. To achieve this goal, mutual understanding is enhanced, and learning within the team is promoted. Using five phases to lead participants through a creative process, the CSS starts with a warm-up, followed by building a knowledge baseline. The third phase targets cultural learning, after which the creativity phase starts; here, the actual problem-solving takes place. The final phase is for reflection and feedback. This study seeks to validate the CSS method’s effectiveness through application in a partially distributed team. Two teams, consisting of mechanical engineers in a research group at Shanghai Jiao Tong University, collaborated to address a practical problem using this method. The team is primarily Chinese, following up on previous validation iterations conducted with teams of more diverse backgrounds who, however, all lived in Germany. To overcome the bias introduced by the intercultural experience of living in another country, this study was performed with researchers in China who have little intercultural experience.
The CSS was applied successfully, proving that the CSS is suitable for the partially distributed or hybrid setting in which it was applied and for the team that applied it. The participants made use of the option to include additional tools and improvements to the method, like a more comprehensive warm-up.
In this chapter, we describe how to jointly model continuous quantities, by representing them as multiple continuous random variables within the same probability space. We define the joint cumulative distribution function and the joint probability density function and explain how to estimate the latter from data using a multivariate generalization of kernel density estimation. Next, we introduce marginal and conditional distributions of continuous variables and also discuss independence and conditional independence. Throughout, we model real-world temperature data as a running example. Then, we explain how to jointly simulate multiple random variables, in order to correctly account for the dependence between them. Finally, we define Gaussian random vectors which are the most popular multidimensional parametric model for continuous data, and apply them to model anthropometric data.
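The multivariate generalization of kernel density estimation mentioned above can be sketched in a few lines. This is a minimal illustration, not the chapter's code: the Gaussian kernel, the scalar bandwidth, and the synthetic two-dimensional data (standing in for the chapter's temperature example) are all assumptions.

```python
import numpy as np

def gaussian_kde(samples, query, bandwidth):
    """Multivariate kernel density estimate with an isotropic Gaussian kernel.

    samples:   (n, d) array of observed data points.
    query:     (m, d) array of points at which to evaluate the estimate.
    bandwidth: scalar h; each kernel is N(sample, h^2 I).
    """
    n, d = samples.shape
    # Pairwise squared distances between query points and samples.
    diffs = query[:, None, :] - samples[None, :, :]   # shape (m, n, d)
    sq_dists = np.sum(diffs ** 2, axis=-1)            # shape (m, n)
    # Normalizing constant of a d-dimensional Gaussian with covariance h^2 I.
    norm = (2 * np.pi * bandwidth ** 2) ** (d / 2)
    # The estimate is the average of n Gaussian bumps centred at the samples.
    return np.exp(-sq_dists / (2 * bandwidth ** 2)).sum(axis=1) / (n * norm)

# Toy 2-D data: 500 draws from a standard bivariate normal.
rng = np.random.default_rng(0)
data = rng.normal(size=(500, 2))
grid = np.array([[0.0, 0.0], [3.0, 3.0]])
density = gaussian_kde(data, grid, bandwidth=0.5)
```

The estimated density is higher near the centre of the data cloud than far from it, as one would expect from an average of Gaussian bumps centred at the samples.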
Le Liang, Southeast University, Nanjing; Shi Jin, Southeast University, Nanjing; Hao Ye, University of California, Santa Cruz; Geoffrey Ye Li, Imperial College of Science, Technology and Medicine, London
This chapter focuses on correlation, a key metric in data science that quantifies to what extent two quantities are linearly related. We begin by defining correlation between normalized and centered random variables. Then, we generalize the definition to all random variables and introduce the concept of covariance, which measures the average joint variation of two random variables. Next, we explain how to estimate correlation from data and analyze the correlation between the height of NBA players and different basketball stats. In addition, we study the connection between correlation and simple linear regression. We then discuss the differences between uncorrelation and independence. In order to gain better intuition about the properties of correlation, we provide a geometric interpretation of correlation, where the covariance is an inner product between random variables. Finally, we show that correlation does not imply causation, as illustrated by the spurious correlation between temperature and unemployment in Spain.
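The estimation of correlation from data described above, together with the inner-product view of covariance, can be sketched as follows. This is an illustrative example, not the chapter's code: the synthetic height/blocks/assists data standing in for the NBA example are assumptions.

```python
import numpy as np

def correlation(x, y):
    """Sample Pearson correlation: the covariance of x and y divided by
    the product of their standard deviations."""
    xc = x - x.mean()
    yc = y - y.mean()
    # Covariance as an inner product of the centred variables.
    cov = np.dot(xc, yc) / len(x)
    return cov / (xc.std() * yc.std())

# Synthetic data standing in for the chapter's NBA example.
rng = np.random.default_rng(1)
height = rng.normal(200, 10, size=1000)
blocks = 0.1 * height + rng.normal(0, 2, size=1000)   # linearly related to height
assists = rng.normal(5, 2, size=1000)                  # independent of height

r_blocks = correlation(height, blocks)    # noticeably positive
r_assists = correlation(height, assists)  # close to zero
```

Note that the normalization constants cancel in the ratio, so the estimate agrees with `np.corrcoef` regardless of whether one divides by `n` or `n - 1`.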