Increasing penetration of variable and intermittent renewable energy resources on the energy grid poses a challenge for reliable and efficient grid operation, necessitating the development of algorithms that are robust to this uncertainty. However, standard algorithms incorporating uncertainty for generation dispatch are computationally intractable when costs are nonconvex, and machine learning-based approaches lack worst-case guarantees on their performance. In this work, we propose a learning-augmented algorithm, RobustML, that exploits the good average-case performance of a machine-learned algorithm for minimizing dispatch and ramping costs of dispatchable generation resources while providing provable worst-case guarantees on cost. We evaluate the algorithm on a realistic model of a combined cycle cogeneration plant, where it exhibits robustness to distribution shift while enabling improved efficiency as renewables penetration increases.
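The worst-case guarantee described above is commonly obtained by hedging a machine-learned policy against a robust baseline. The following is a minimal sketch of that switching idea, not the paper's RobustML algorithm: follow the ML advice while its running cost stays within a chosen factor of the robust policy's running cost, and fall back permanently once it does not. All names and the single-cost interface are illustrative.

```python
def learning_augmented_dispatch(ml_actions, robust_actions, cost, r=2.0):
    """Follow the ML policy while its cumulative cost stays within a
    factor r of the robust baseline's cumulative cost; once it exceeds
    that budget, switch permanently to the robust policy."""
    chosen, ml_cost, robust_cost, use_ml = [], 0.0, 0.0, True
    for a_ml, a_rb in zip(ml_actions, robust_actions):
        ml_cost += cost(a_ml)
        robust_cost += cost(a_rb)
        if use_ml and ml_cost > r * robust_cost:
            use_ml = False  # permanent switch bounds the total cost
        chosen.append(a_ml if use_ml else a_rb)
    return chosen
```

With a budget factor like r = 2, the total cost stays within a constant factor of the robust policy's cost, while the dispatch matches the ML policy whenever it performs well — the consistency/robustness trade-off the abstract refers to.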
In recent years, passive motion paradigms (PMPs), derived from the equilibrium point hypothesis and impedance control, have been utilised as manipulation methods for humanoid robots and robotic manipulators. These paradigms are typically achieved by creating a kinematic chain that enables the manipulator to perform goal-directed actions without explicitly solving the inverse kinematics. This approach leverages a kinematic model constructed through the training of artificial neural networks, aligning well with principles of cybernetics and cognitive computation by enabling adaptive and flexible control. Specifically, these networks model the relationship between joint angles and end-effector positions, facilitating the computation of the Jacobian matrix. Although this method does not require an accurate robot model, traditional neural networks often suffer from drawbacks such as overfitting and inefficient training, which can compromise the accuracy of the final PMP model. In this paper, we implement the method using a deep neural network and investigate the impact of activation functions and network depth on the performance of the kinematic model. Additionally, we propose a transfer learning approach to fine-tune the pre-trained model, enabling it to be transferred to other manipulator arms with different kinematic properties. Finally, we implement and evaluate the deep neural network-based PMP on a Universal Robots arm, comparing it with traditional kinematic controllers and assessing its physical interaction capabilities and accuracy.
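The core computation described above — obtaining a Jacobian from a kinematic model and using it for goal-directed motion without solving inverse kinematics — can be sketched numerically. Here an analytic two-link forward-kinematics function stands in for the trained network, the Jacobian is taken by finite differences, and a transpose-Jacobian update plays the role of the passive-motion attractor step; all of this is an illustrative simplification, not the paper's implementation.

```python
import math

def fk(q, l1=1.0, l2=1.0):
    """Forward kinematics of a planar two-link arm; in the paper this map
    would be the trained neural network rather than an analytic formula."""
    x = l1 * math.cos(q[0]) + l2 * math.cos(q[0] + q[1])
    y = l1 * math.sin(q[0]) + l2 * math.sin(q[0] + q[1])
    return [x, y]

def jacobian(f, q, eps=1e-6):
    """Numerical Jacobian of f at q via central differences."""
    J = [[0.0] * len(q) for _ in range(2)]
    for j in range(len(q)):
        qp, qm = list(q), list(q)
        qp[j] += eps
        qm[j] -= eps
        fp, fm = f(qp), f(qm)
        for i in range(2):
            J[i][j] = (fp[i] - fm[i]) / (2 * eps)
    return J

def pmp_step(q, target, gain=0.1):
    """One passive-motion-style update: a virtual attractor force at the
    end effector is mapped to joint space through the Jacobian transpose,
    so no inverse kinematics is ever solved."""
    p = fk(q)
    force = [gain * (target[i] - p[i]) for i in range(2)]
    J = jacobian(fk, q)
    return [q[j] + sum(J[i][j] * force[i] for i in range(2))
            for j in range(len(q))]
```

Iterating `pmp_step` drives the end effector toward any reachable target; swapping `fk` for a learned model (with exact gradients) recovers the structure the paper builds on.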
A topological space has a domain model if it is homeomorphic to the maximal point space $\mbox{Max}(P)$ of a domain $P$. Lawson proved that every Polish space $X$ has an $\omega$-domain model $P$ and, for such a model $P$, $\mbox{Max}(P)$ is a $G_{\delta}$-set of the Scott space of $P$. Martin (2003) then asked whether it is true that for every $\omega$-domain $Q$, $\mbox{Max}(Q)$ is a $G_{\delta}$-set of the Scott space of $Q$. In this paper, we give a negative answer to Martin’s long-standing open problem by constructing a counterexample. The counterexample actually shows that the answer is no even for $\omega$-algebraic domains. In addition, we construct an $\omega$-ideal domain $\widetilde{Q}$ for the constructed $Q$ such that their maximal point spaces are homeomorphic. Therefore, $\mbox{Max}(Q)$ is a $G_{\delta}$-set of the Scott space of the new model $\widetilde{Q}$.
Smooth Infinitesimal Analysis (SIA) is a remarkable late twentieth-century theory of analysis. It is based on nilsquare infinitesimals, and does not rely on limits. SIA poses a challenge of motivating its use of intuitionistic logic beyond merely avoiding inconsistency. The classical-modal account(s) provided here attempt to do just that. The key is to treat the identity of an arbitrary nilsquare, e, in relation to 0 or any other nilsquare, as objectually vague or indeterminate—pace a famous argument of Evans [10]. Thus, we interpret the necessity operator of classical modal logic as “determinateness” in truth-value, naturally understood to satisfy the modal system, S4 (the accessibility relation on worlds being reflexive and transitive). Then, appealing to the translation due to Gödel et al., and its proof-theoretic faithfulness (“mirroring theorem”), we obtain a core classical-modal interpretation of SIA. Next we observe a close connection with Kripke semantics for intuitionistic logic. However, to avoid contradicting SIA’s non-classical treatment of identity relating nilsquares, we translate “=” with a non-logical surrogate, ‘E,’ with requisite properties. We then take up the interesting challenge of adding new axioms to the core CM interpretation. Two mutually incompatible ones are considered: one being the positive stability of identity and the other being a kind of necessity of indeterminate identity (among nilsquares). Consistency of the former is immediate, but the proof of consistency of the latter is a new result. Finally, we consider moving from CM to a three-valued, semi-classical framework, SCM, based on the strong Kleene axioms. This provides a way of expressing “indeterminacy” in the semantics of the logic, arguably improving on our CM. SCM is also proof-theoretically faithful, and the extensions by either of the new axioms are consistent.
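The Gödel translation invoked above maps each intuitionistic formula to a modal formula interpreted in S4, reading the box as "determinately". A minimal sketch of the standard clauses, with formulas encoded as nested tuples (the non-logical identity surrogate 'E' is not modelled here):

```python
def godel_translate(f):
    """Goedel-McKinsey-Tarski translation into S4.  Formulas are nested
    tuples: ('var', p), ('and', a, b), ('or', a, b), ('imp', a, b),
    ('not', a); ('box', a) is S4 necessity ('determinately')."""
    op = f[0]
    if op == 'var':
        return ('box', f)                      # T(p) = box p
    if op in ('and', 'or'):                    # conjunction/disjunction pass through
        return (op, godel_translate(f[1]), godel_translate(f[2]))
    if op == 'imp':                            # T(A -> B) = box(T(A) -> T(B))
        return ('box', ('imp', godel_translate(f[1]), godel_translate(f[2])))
    if op == 'not':                            # T(~A) = box ~T(A)
        return ('box', ('not', godel_translate(f[1])))
    raise ValueError('unknown connective: %r' % (op,))
```

The proof-theoretic faithfulness ("mirroring") mentioned in the abstract says a formula is intuitionistically provable exactly when its translation is provable in S4.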
On both global and local levels, one can observe a trend toward the adoption of algorithmic regulation in the public sector, with the Chinese social credit system (SCS) serving as a prominent and controversial example of this phenomenon. Within the SCS framework, cities play a pivotal role in its development and implementation, both as evaluators of individuals and enterprises and as subjects of evaluation themselves. This study engages in a comparative analysis of SCS scoring mechanisms for individuals and enterprises across diverse Chinese cities while also scrutinizing the scoring system applied to cities themselves. We investigate the extent of algorithmic regulation exercised through the SCS, elucidating its operational dynamics at the city level in China and assessing its interventionism, especially concerning the involvement of algorithms. Furthermore, we discuss ethical concerns surrounding the SCS’s implementation, particularly regarding transparency and fairness. By addressing these issues, this article contributes to two research domains: algorithmic regulation and discourse surrounding the SCS, offering valuable insights into the ongoing utilization of algorithmic regulation to tackle governance and societal challenges.
Africa had a busy election calendar in 2024, with at least 19 countries holding presidential or general elections. In a continent with a large youth population, a common theme across these countries is citizens’ desire to have their voices heard, and a busy election year offers an opportunity for the continent to redeem its democratic credentials and demonstrate its commitment to free and fair elections and to more responsive, democratic governance. Given the central role that governance plays in security in Africa, the stakes in many of these elections are high: the goal is not only a democratically elected government but also stability and development. Since governance norms, insecurity, and economic buoyancy are rarely contained by borders, the conduct and outcomes of each of these elections will also have implications for neighbouring countries and the continent overall. This article considers how the results of recent elections across Africa have been challenged in courts on the basis of mistrust in the technology platforms used, how the deployment of emerging technology, including AI, is casting a shadow on the integrity of elections in Africa, and the policy options for addressing these trends, with a particular focus on governance of AI technologies through a human rights-based approach and equitable public procurement practices.
This book applies rotation theory to problems involving vectors and coordinates, with an approach that combines easily visualised procedures with smart mathematics. It constructs rotation theory from the ground up, building from basic geometry through to the motion and attitude equations of rockets, and the tensor analysis of relativity. The author replaces complicated pictures of superimposed axes with a simple and intuitive procedure of rotating a model aircraft, to create rotation sequences that are easily turned into mathematics. He combines the best of the 'active' and 'passive' approaches to rotation into a single coherent theory, and discusses many potential traps for newcomers. This volume will be useful to astronomers and engineers sighting planets and satellites, computer scientists creating graphics for movies, and aerospace engineers designing aircraft; also to physicists and mathematicians who study its abstract aspects.
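The contrast between 'active' and 'passive' rotations mentioned above can be made concrete in a few lines: an active rotation turns a vector within a fixed frame, while the passive view re-expresses the same fixed vector in a rotated frame, which amounts to applying the transpose (inverse) matrix. A small illustration about the z-axis:

```python
import math

def rot_z(theta):
    """Active rotation about the z-axis: turns a vector within a fixed frame."""
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s, 0.0],
            [s,  c, 0.0],
            [0.0, 0.0, 1.0]]

def transpose(M):
    """Transpose of a square matrix; for rotations this is the inverse."""
    return [list(row) for row in zip(*M)]

def apply(M, v):
    """Matrix-vector product M @ v."""
    return [sum(M[i][j] * v[j] for j in range(3)) for i in range(3)]
```

An active 90° rotation carries the x-axis unit vector to the y-axis; the passive counterpart (the components of that same fixed vector in a frame rotated by 90°) uses the transpose and yields the negative y-axis instead — a frequent source of sign errors that the book's model-aircraft procedure is designed to avoid.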
Sustainable agricultural practices have become increasingly important due to growing environmental concerns and the urgent need to mitigate the climate crisis. Digital agriculture, through advanced data analysis frameworks, holds promise for promoting these practices. Pesticides are a common tool in agricultural pest control; they are key to ensuring food security but also contribute significantly to the climate crisis. Integrated Pest Management (IPM) stands as a climate-smart alternative. We propose a causal and explainable framework for enhancing digital agriculture, using pest management and its sustainable alternative, IPM, as a key example to highlight the contributions of causality and explainability. Despite its potential, IPM faces low adoption rates due to farmers’ skepticism about its effectiveness. To address this challenge, we introduce an advanced data analysis framework tailored to enhance IPM adoption. Our framework provides (i) robust pest population predictions across diverse environments with invariant and causal learning, (ii) explainable pest presence predictions using transparent models, (iii) actionable advice through counterfactual explanations for in-season IPM interventions, (iv) field-specific treatment effect estimations, and (v) assessments of the effectiveness of our advice using causal inference. By incorporating these features, our study illustrates the potential of causality and explainability to enhance digital agriculture in promoting climate-smart and sustainable practices, focusing on the specific case of pest management. Here, our framework aims to alleviate skepticism and encourage wider adoption of IPM practices among policymakers, agricultural consultants, and farmers.
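Item (iii) above — actionable advice via counterfactual explanations — can be illustrated with a toy model: given a positive pest-presence prediction, find the smallest change in one controllable feature that flips the prediction. The threshold model, coefficients, and feature names below are invented purely for illustration; the paper's framework uses learned, transparent models.

```python
def pest_presence(temp_c, humidity):
    """Toy stand-in classifier: predicts pest presence from two features.
    Both the linear form and the coefficients are invented."""
    return 0.04 * temp_c + 0.02 * humidity > 1.0

def counterfactual_humidity(temp_c, humidity, step=1.0):
    """Smallest humidity reduction (in increments of `step`) that flips a
    positive prediction to negative -- the 'what would need to change'
    form of advice referred to in (iii)."""
    if not pest_presence(temp_c, humidity):
        return 0.0
    delta = 0.0
    while pest_presence(temp_c, humidity - delta) and delta < humidity:
        delta += step
    return delta
```

The counterfactual turns an opaque prediction into an in-season intervention ("reduce greenhouse humidity by this much"), which is what makes the explanation actionable for a farmer.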
Navigation is an important skill for an autonomous robot, as information about the robot’s location is necessary for making decisions about upcoming events. The objective of a localization technique is “to know the location at which the data were collected.” In previous works, several deep learning methods were applied to localization, but none achieves sufficient accuracy. To address this issue, an Enhanced Capsule Generative Adversarial Network and an optimized Dual Interactive Wasserstein Generative Adversarial Network for landmark detection and localization of autonomous robots in outdoor environments (ECGAN-DIWGAN-RSO-LAR) is proposed in this manuscript. Here, the outdoor robot localization data are taken from the Virtual KITTI dataset. The method contains two phases: landmark detection and localization. The landmark detection phase uses the Enhanced Capsule Generative Adversarial Network to detect landmarks in the captured image. The robot localization phase then uses the Dual Interactive Wasserstein Generative Adversarial Network (DIWGAN) to determine the robot’s location coordinates and compass orientation from the identified landmarks. The weight parameters of the DIWGAN are optimized by the Rat Swarm Optimization (RSO) algorithm. The proposed ECGAN-DIWGAN-RSO-LAR is implemented in Python. The proposed technique achieves accuracy improvements of 22.67%, 12.45%, and 8.89% over existing methods.
The ubiquity of social media platforms allows individuals to easily share and curate their personal lives with friends, family, and the world. The selective nature of sharing one’s personal life may reinforce the memories and details of the shared experiences while simultaneously inducing the forgetting of related, unshared memories/experiences. This is a well-established psychological phenomenon known as retrieval-induced forgetting (RIF, Anderson et al.). To examine this phenomenon in the context of social media, two experiments were conducted using an adapted version of the RIF paradigm in which participants either shared experimenter-contrived (Study 1) or personal photographs (Study 2) on social media platforms. Study 1 revealed that participants had more accurate recall of the details surrounding the shared photographs as well as enhanced recognition of the shared photographs. Study 2 revealed that participants had more consistent recall of event details captured in the shared photographs than details captured or uncaptured in the unshared photographs. These results suggest that selectively sharing photographs on social media may specifically enhance the recollection of event details associated with the shared photographs. The novel and ecologically embedded methods provide fodder for future research to better understand the important role of social media in shaping how individuals remember their personal experiences.
We introduce the exponentially preferential recursive tree and study some properties related to the degree profile of nodes in the tree. The definition of the tree involves a radix $a\gt 0$. In a tree of size $n$ (nodes), the nodes are labeled with the numbers $1,2, \ldots ,n$. The node labeled $i$ attracts the future entrant $n+1$ with probability proportional to $a^i$.
We dedicate an early section to algorithms for generating and visualizing the trees in different regimes. We study the asymptotic distribution of the outdegree of node $i$, as $n\to \infty$, and find three regimes according to whether $0 \lt a \lt 1$ (subcritical regime), $a=1$ (critical regime), or $a\gt 1$ (supercritical regime). Within any regime, there are also phases depending on a delicate interplay between $i$ and $n$, ramifying the asymptotic distribution within the regime into “early,” “intermediate,” and “late” phases. In certain phases of certain regimes, we find asymptotic Gaussian laws. In certain phases of some other regimes, small oscillations in the asymptotic laws are detected by Poisson approximation techniques.
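The attachment rule — node $i$ attracts the entrant with probability proportional to $a^i$ — is straightforward to simulate, in the spirit of the generation algorithms mentioned above (this sketch is not the authors' code):

```python
import random

def epr_tree(n, a, seed=0):
    """Grow an exponentially preferential recursive tree on nodes 1..n:
    each entrant chooses existing node i as parent with probability
    proportional to a**i.  Returns a dict parent[v] for v = 2..n."""
    rng = random.Random(seed)
    parent = {}
    for new in range(2, n + 1):
        weights = [a ** i for i in range(1, new)]
        total = sum(weights)
        r = rng.random() * total
        acc = 0.0
        for i, w in enumerate(weights, start=1):
            acc += w
            if r <= acc:
                parent[new] = i
                break
    return parent

def outdegree(parent, i):
    """Number of children node i attracted."""
    return sum(1 for p in parent.values() if p == i)
```

Simulations make the three regimes visible: for $0 \lt a \lt 1$ early (small-label) nodes dominate attachment, $a=1$ recovers the uniform random recursive tree, and for $a\gt 1$ the most recently added node is heavily favoured, so the tree is close to a path.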
This paper proposes a generalized method for designing tendon-driven serial-chained manipulators with an arbitrary degree of tendon redundancy. First, a special class of tendon-driven structures is defined by introducing the controllable block triangular form (CBTF) of a null space matrix and its complementary CBTF of a structure matrix, satisfying physical constraints related to the minimal connection of tendons and to the placement of actuators. It is then shown that any general design of tendon-driven serial manipulators can be reduced to the design of such a special class of tendon-driven structures. Two associated design problems are derived and solved. The first is to find a complementary CBTF structure matrix for a given CBTF null space matrix using algebraic relations; the second, building on the first, seeks both matrices that optimize the desired structural characteristics. Numerical design examples are provided to show the validity of the proposed method.
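The role of the null space matrix can be illustrated on the minimal one-redundancy case: n+1 tendons driving n joints. A strictly positive vector in the null space of the structure matrix yields internal bias tensions that keep every tendon taut while producing zero net joint torque. The 2x3 structure matrix below is hypothetical, and the cross-product trick only works for this small case; the paper's CBTF construction handles the general problem systematically.

```python
def cross(u, v):
    """Cross product; for a 2x3 matrix, the cross product of its two
    rows spans the (one-dimensional) null space."""
    return [u[1] * v[2] - u[2] * v[1],
            u[2] * v[0] - u[0] * v[2],
            u[0] * v[1] - u[1] * v[0]]

# Hypothetical structure matrix: 2 joints driven by 3 tendons
# (the minimal n+1-tendon arrangement with one degree of redundancy).
A = [[1.0, -1.0,  0.5],
     [0.0,  1.0, -1.0]]

# Null-space direction of A: any multiple of h satisfies A @ h == 0.
h = cross(A[0], A[1])
```

Here h works out to [0.5, 1.0, 1.0], an all-positive vector, so scaling it gives bias tensions that keep all three tendons taut without moving the joints — the kind of physical feasibility condition the CBTF design problems encode.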
Failure mode and effects analysis (FMEA) is a critical but labor-intensive process in product development that aims to identify and mitigate potential failure modes to ensure product quality and reliability. In this paper, a novel framework to improve the FMEA process by integrating generative artificial intelligence (AI), in particular large language models (LLMs), is presented. By using these advanced AI tools, we aim to streamline collaborative work in FMEA, reduce manual effort and improve the accuracy of risk assessments. The proposed framework includes LLMs to support data collection, pre-processing, risk identification, and decision-making in FMEA. This integration enables a more efficient and reliable analysis process and leverages the strengths of human expertise and AI capabilities. To validate the framework, we conducted a case study where we first used GPT-3.5 as a proof of concept, followed by a comparison of the performance of three well-known LLMs: GPT-4, GPT-4o and Gemini. These comparisons show significant improvements in terms of speed, accuracy, and reliability of FMEA results compared to traditional methods. Our results emphasize the transformative potential of LLMs in FMEA processes and contribute to more robust design and quality assurance practices. The paper concludes with recommendations for future research focusing on data security and the development of domain-specific LLM training protocols.
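Downstream of the LLM-supported risk identification described above, prioritization in classical FMEA rests on the Risk Priority Number: RPN = severity x occurrence x detection, with each factor rated on a 1-10 scale. A minimal sketch of that calculation (the example failure modes in the test are invented):

```python
def rpn(severity, occurrence, detection):
    """Risk Priority Number of classical FMEA; each factor is rated 1-10."""
    for v in (severity, occurrence, detection):
        if not 1 <= v <= 10:
            raise ValueError("FMEA ratings are on a 1-10 scale")
    return severity * occurrence * detection

def rank_failure_modes(modes):
    """Sort (name, severity, occurrence, detection) tuples by descending
    RPN, so the riskiest failure modes are addressed first."""
    return sorted(modes, key=lambda m: rpn(*m[1:]), reverse=True)
```

In the framework, an LLM can propose failure modes and draft ratings, but the ranking that drives mitigation decisions remains this transparent, auditable arithmetic.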
This book explores the intersection of miracle cures and technology with a unique methodology. Unravelling the intricate connections between social, technological, biomedical and non-biomedical spheres, it makes a significant contribution to debates on technology and health.
We extend the recently introduced setting of coherent differentiation by taking into account not only differentiation but also Taylor expansion in categories which are not necessarily (left) additive. The main idea consists in extending summability into an infinitary functor which intuitively maps any object to the object of its countable summable families. This functor is endowed with a canonical structure of a bimonad. In a linear logical categorical setting, Taylor expansion is then axiomatized as a distributive law between this summability functor and the resource comonad (a.k.a. the exponential). This distributive law allows us to extend the summability functor into a bimonad on the coKleisli category of the resource comonad: this extended functor computes the Taylor expansion of the (nonlinear) morphisms of the coKleisli category. We also show how this categorical axiomatization of Taylor expansion can be generalized to arbitrary cartesian categories, leading to a general theory of Taylor expansion formally similar to that of cartesian differential categories, although it does not require the underlying cartesian category to be left additive. We provide several examples of concrete categories that arise in denotational semantics and feature such analytic structures.