Chapter 13 delves into the complex terrain of mixed-state entanglement, extending the discourse from pure-state entanglement to encompass the broader and more practical scenarios encountered in quantum systems. The chapter systematically explores the detection of entanglement in mixed states, introducing criteria and methods such as the Positive Partial Transpose (PPT) criterion and entanglement witnesses, which serve as diagnostic tools for identifying entanglement in a mixed quantum state. Furthermore, it addresses the quantification of entanglement in mixed states, discussing various measures like entanglement cost and distillable entanglement. These concepts highlight the operational aspects of entanglement, including its creation and extraction, within mixed-state frameworks. The chapter also introduces the notion of entanglement conversion distances, providing a quantitative approach to understanding the transformations between different entangled states.
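As an illustrative aside (not an example from the chapter itself), the PPT criterion mentioned above can be checked numerically. The sketch below uses a standard two-qubit Werner state rather than any state discussed in the text:

```python
import numpy as np

def partial_transpose(rho, dims=(2, 2)):
    """Transpose the second subsystem of a bipartite density matrix."""
    dA, dB = dims
    r = rho.reshape(dA, dB, dA, dB)                           # indices i, j, k, l
    return r.transpose(0, 3, 2, 1).reshape(dA * dB, dA * dB)  # swap j <-> l

# Werner state rho = p |Phi+><Phi+| + (1 - p) I/4
phi_plus = np.array([1, 0, 0, 1]) / np.sqrt(2)   # |Phi+> = (|00> + |11>)/sqrt(2)
bell = np.outer(phi_plus, phi_plus)

for p in (0.2, 0.5):
    rho = p * bell + (1 - p) * np.eye(4) / 4
    min_eig = np.linalg.eigvalsh(partial_transpose(rho)).min()
    print(f"p={p}: min eigenvalue of rho^T_B = {min_eig:.3f}")
```

For two qubits the PPT criterion is also sufficient, so a negative eigenvalue of the partial transpose certifies entanglement; here the Werner state is entangled exactly when p > 1/3.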
Several analogs to fans and triangulations of point configurations are introduced and motivated as representability issues. The equivalence of some of these analogs is established, while others remain open. Results on the topology of triangulations are proved, most notably for Euclidean oriented matroids.
Photovoltaic (PV) energy is growing rapidly and is crucial for the decarbonization of electric systems. However, centralized registries recording the technical characteristics of rooftop PV systems are often missing, making it difficult to monitor this growth accurately. The lack of monitoring could threaten the integration of PV energy into the grid. To avoid this situation, remote sensing of rooftop PV systems using deep learning has emerged as a promising solution. However, existing techniques are not reliable enough to be used by public authorities or transmission system operators (TSOs) to construct up-to-date statistics on the rooftop PV fleet. The lack of reliability stems from deep learning models being sensitive to distribution shifts. This work comprehensively evaluates the effects of distribution shifts on the classification accuracy of deep learning models trained to detect rooftop PV panels on overhead imagery. We construct a benchmark to isolate the sources of distribution shifts and introduce a novel methodology that leverages explainable artificial intelligence (XAI) and a scale decomposition of the input image and of the model’s decision to understand how distribution shifts affect deep learning models. Finally, based on our analysis, we introduce a data augmentation technique designed to improve the robustness of deep learning classifiers under varying acquisition conditions. Our proposed approach outperforms competing methods and can close the gap with more demanding unsupervised domain adaptation methods. We discuss practical recommendations for mapping PV systems using overhead imagery and deep learning models.
This chapter explores how artificial intelligence has enabled the automation of customer service in private industry, such as through online tools that assist customers in purchasing airline tickets, troubleshooting internet outages, and managing personal banking services. Private industry has used machine learning, as well as other forms of artificial intelligence, to develop chatbots and virtual assistants, which can respond to conversational oral or text-based commands. These tools have rapidly become standard customer service vehicles. Recent developments, such as large language models, suggest that automated customer service will become even more sophisticated in the future.
Compliant and safe human–robot interaction is an important requirement in lower limb exoskeleton design. Motivated by this need, this paper presents the design of a compatible lower limb exoskeleton with variable stiffness actuation and anthropomorphic joint mechanisms, for walking assistance and gait rehabilitation. A novel variable stiffness actuator (VSA) based on a guide-bar mechanism was designed to provide force and impedance controllability. By changing the crank length of the mechanism, the stiffness of the actuator can be adjusted over a wide range (from 0 to 1301 Nm/rad), at fast speed (about 2582 Nm/rad/s), and at low energy cost. These features make online stiffness adjustment within one gait cycle possible, changing the human–robot coupling behavior and improving the performance of the exoskeleton. An anthropomorphic hip joint mechanism was designed based on a parallelogram linkage and a passive joint compensation approach, which absorbs misalignment and improves kinematic compatibility between the human and the exoskeleton joint. Furthermore, a torque control-based multimode control strategy, consisting of passive, active, and hybrid modes, was developed for different disease stages. Finally, the torque control performance of the actuator was verified in a benchtop test, and experimental validation of the exoskeleton with a human subject was carried out, demonstrating that compliant human–robot interaction was achieved and that stiffness variation improves control performance when the control mode changes.
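A back-of-envelope check, using only the figures quoted above plus an assumed typical gait-cycle duration of about 1 s (not a value from the paper), suggests why stiffness adjustment within one gait cycle is feasible:

```python
# Sketch: time for the VSA to sweep its full stiffness range, compared with
# an assumed ~1 s gait cycle. Only the two quoted figures are from the paper.
stiffness_range = 1301.0   # Nm/rad, full adjustment range (quoted)
adjust_speed = 2582.0      # Nm/rad/s, adjustment speed (quoted)
gait_cycle = 1.0           # s, assumed typical walking cadence (not quoted)

sweep_time = stiffness_range / adjust_speed
print(f"Full-range stiffness sweep: {sweep_time:.2f} s "
      f"({sweep_time / gait_cycle:.0%} of an assumed {gait_cycle} s gait cycle)")
```

Sweeping the entire range takes about half an assumed gait cycle, so partial adjustments fit comfortably within one cycle.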
Chapter 11 delves into the manipulation of quantum resources, the core aspect of quantum resource theories that explore the transformation and conversion of quantum states within a given resource theory framework. The chapter introduces the generalized asymptotic equipartition property and the generalized quantum Stein’s lemma, both foundational to understanding the asymptotic behavior of quantum resources. These concepts pave the way for discussing the uniqueness of the Umegaki relative entropy in quantifying the efficiency of resource conversion processes. Furthermore, the text explores asymptotic interconversions, detailing the conditions and limits for converting one resource into another when multiple copies of quantum states are considered. This analysis is pivotal for establishing the reversible exchange rates between different resources in the asymptotic limit. By providing a comprehensive overview of resource manipulation strategies, the chapter equips readers with the theoretical tools needed for advanced study and research in quantum resource theories, emphasizing both the single-shot and asymptotic domains.
This chapter describes interviews the authors conducted with federal agency officials about their use of automated legal guidance. It offers insights gained from these interviews, including the different models that agencies use to develop such guidance, officials’ views on its usability, the ways that agencies evaluate it, and agencies’ views on the successes and challenges such guidance faces.
A proof of the Topological Representation Theorem, including an introduction to shelling, a topological interpretation of oriented matroid concepts, and an application to counting topes, is provided in this chapter.
Chapter 10 delves into the quantification of quantum resources, an essential aspect of quantum resource theories that determines the value of quantum states for specific applications. It begins by defining resource measures and investigating their fundamental properties such as monotonicity under free operations and convexity. The chapter discusses distance-based resource measures, which quantify how far a given quantum state is from the set of free states. Such measures often utilize divergences and metrics explored in earlier chapters. Techniques to compute the relative entropy of a resource are also covered.
To refine resource measures, the chapter introduces the concept of smoothing, which considers small deviations from the ideal state to make the measures more robust against perturbations. This approach is crucial in single-shot scenarios where finite resources are available. Furthermore, the chapter examines resource monotones and support functions, offering a comprehensive framework for the theoretical and practical assessment of quantum resources.
Chapter 7 discusses quantum conditional entropy, extending the concept of conditional majorization and introducing the notion of negative quantum conditional entropy. The chapter starts with the basic definition of conditional entropy, exploring its key properties like monotonicity and additivity. It further delves into the concepts of conditional min- and max-entropies, emphasizing their roles in quantifying uncertainty in quantum states and their operational significance in quantum information theory.
The text presents conditional entropy as a measure sensitive to the effects of entanglement, showing that negative conditional entropy is a distinctive feature of quantum systems, contrasting with the classical domain where entropy values are nonnegative. This negativity is particularly pronounced in the context of maximally entangled states and is connected to the fundamental differences between classical and quantum information processing. Moreover, the chapter includes theorems and exercises to solidify understanding, like the invariance of conditional entropy under local isometric channels and its reduction to entropy for product states. It concludes by underscoring the inevitability of negative conditional entropy in quantum systems, a topic of both theoretical and practical importance in the quantum domain.
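As a minimal numerical sketch (using the standard definition S(A|B) = S(AB) - S(B) rather than the chapter's notation), the negativity discussed above can be seen directly for a two-qubit maximally entangled state:

```python
import numpy as np

def von_neumann_entropy(rho):
    """Von Neumann entropy in bits, ignoring zero eigenvalues."""
    eigs = np.linalg.eigvalsh(rho)
    eigs = eigs[eigs > 1e-12]
    return float(-np.sum(eigs * np.log2(eigs)))

def reduced_B(rho, dims=(2, 2)):
    """Trace out subsystem A of a bipartite density matrix."""
    dA, dB = dims
    return np.trace(rho.reshape(dA, dB, dA, dB), axis1=0, axis2=2)

phi_plus = np.array([1, 0, 0, 1]) / np.sqrt(2)   # maximally entangled state
bell = np.outer(phi_plus, phi_plus)

# S(A|B) = S(AB) - S(B): zero for the pure joint state, one bit for B alone
cond_ent = von_neumann_entropy(bell) - von_neumann_entropy(reduced_B(bell))
print(f"S(A|B) for the Bell state: {cond_ent:.2f}")
```

The joint state is pure (S(AB) = 0) while the marginal is maximally mixed (S(B) = 1), so S(A|B) = -1, the most negative value attainable for two qubits.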
Chapter 8 explores the asymptotic regime of quantum information processing, beginning with quantum typicality, which illustrates the convergence of quantum states toward a typical form with increasing copies. This leads to the asymptotic equipartition property (AEP), indicating that, as the number of copies grows, the probability becomes effectively uniform over the typical set of sequences. The method of types is introduced next, a tool from classical information theory that classifies sequences based on their statistical properties. This is crucial for understanding the behavior of large quantum systems and has implications for quantum data compression. Advancing to quantum hypothesis testing, the chapter outlines efficient strategies for distinguishing between two quantum states through repeated measurements. Central to this is the Quantum Stein’s lemma, which asserts the exponential decline in the error probability of hypothesis testing as the sample size of quantum systems increases. The chapter highlights the deep interplay between typicality, statistical methods, and hypothesis testing, laying the groundwork for asymptotic interconversion of quantum resources.
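The classical flavor of the AEP can be sketched in a few lines (a toy illustration with an assumed biased binary source, not the chapter's quantum treatment): for i.i.d. draws, the normalized log-likelihood -log2 p(x^n)/n concentrates around the entropy H.

```python
import numpy as np

rng = np.random.default_rng(0)
p = np.array([0.8, 0.2])                 # assumed biased binary source
H = float(-np.sum(p * np.log2(p)))       # Shannon entropy, ~0.72 bits

n = 100_000
samples = rng.choice(2, size=n, p=p)     # one long i.i.d. sequence x^n
empirical_rate = -np.sum(np.log2(p[samples])) / n   # -log2 p(x^n) / n
print(f"H = {H:.4f} bits, -log2 p(x^n)/n = {empirical_rate:.4f} bits")
```

By the law of large numbers the empirical rate approaches H, which is exactly why typical sequences all carry probability close to 2^(-nH).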
Chapter 16, centered on the resource theory of nonuniformity, serves as an essential precursor to discussions on thermodynamics as a resource theory. It presents nonuniformity as a fundamental quantum resource, using it as a toy model to prepare for more complex thermodynamic concepts. In this model, free states are considered to be maximally mixed states, analogous to Gibbs states with a trivial Hamiltonian, providing a simplified context for exploring quantum thermodynamics. The chapter carefully outlines how nonuniformity is quantified, offering closed formulas for the conversion distance, nonuniformity cost, and distillable nonuniformity. These measures are explored both in the single-shot and the asymptotic domains. The availability of closed formulas makes this model particularly insightful, demonstrating clear, quantifiable relationships between various measures of nonuniformity.
This paper is an extended version of Bílková et al. ((2023b). Logic, Language, Information, and Computation. WoLLIC 2023, Lecture Notes in Computer Science, vol. 13923, Cham, Springer Nature Switzerland, 101–117.). We discuss two-layered logics formalising reasoning with probabilities and belief functions that combine the Łukasiewicz $[0,1]$-valued logic with Baaz $\triangle$ operator and the Belnap–Dunn logic. We consider two probabilistic logics – $\mathsf {Pr}^{{\mathsf {\unicode {x0141}}}^2}_\triangle$ (introduced by Bílková et al. 2023d. Annals of Pure and Applied Logic, 103338.) and $\mathbf {4}\mathsf {Pr}^{{\mathsf {\unicode {x0141}}}_\triangle }$ (from Bílková et al. 2023b. Logic, Language, Information, and Computation. WoLLIC 2023, Lecture Notes in Computer Science, vol. 13923, Cham, Springer Nature Switzerland, 101–117.) – that present two perspectives on the probabilities in the Belnap–Dunn logic. In $\mathsf {Pr}^{{\mathsf {\unicode {x0141}}}^2}_\triangle$, every event $\phi$ has independent positive and negative measures that denote the likelihoods of $\phi$ and $\neg \phi$, respectively. In $\mathbf {4}\mathsf {Pr}^{{\mathsf {\unicode {x0141}}}_\triangle }$, the measures of the events are treated as partitions of the sample into four exhaustive and mutually exclusive parts corresponding to pure belief, pure disbelief, conflict and uncertainty of an agent in $\phi$. In addition to that, we discuss two logics for the paraconsistent reasoning with belief and plausibility functions from Bílková et al. ((2023d). Annals of Pure and Applied Logic, 103338.) – $\mathsf {Bel}^{{\mathsf {\unicode {x0141}}}^2}_\triangle$ and $\mathsf {Bel}^{\mathsf {N}{\mathsf {\unicode {x0141}}}}$. 
Both these logics equip events with two measures (positive and negative), with their main difference being that in $\mathsf {Bel}^{{\mathsf {\unicode {x0141}}}^2}_\triangle$, the negative measure of $\phi$ is defined as the belief in $\neg \phi$, while in $\mathsf {Bel}^{\mathsf {N}{\mathsf {\unicode {x0141}}}}$, it is treated independently as the plausibility of $\neg \phi$. We provide a sound and complete Hilbert-style axiomatisation of $\mathbf {4}\mathsf {Pr}^{{\mathsf {\unicode {x0141}}}_\triangle }$ and establish faithful translations between it and $\mathsf {Pr}^{{\mathsf {\unicode {x0141}}}^2}_\triangle$. We also show that the validity problem in all the logics is $\mathsf {coNP}$-complete.
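The four-part reading described above can be sketched numerically; note that the aggregation rule used here (positive measure = belief + conflict, negative measure = disbelief + conflict) is an assumption for illustration, not a definition taken from the paper:

```python
# Hedged sketch: an agent's attitude to an event phi is split into four
# exhaustive, mutually exclusive parts summing to 1. The derived positive
# and negative measures need not sum to 1, unlike a classical probability.
def measures(belief, disbelief, conflict, uncertainty):
    parts = (belief, disbelief, conflict, uncertainty)
    assert all(x >= 0 for x in parts) and abs(sum(parts) - 1.0) < 1e-9
    positive = belief + conflict       # assumed: evidence for phi
    negative = disbelief + conflict    # assumed: evidence for not-phi
    return positive, negative

pos, neg = measures(belief=0.5, disbelief=0.125, conflict=0.25, uncertainty=0.125)
print(pos, neg, pos + neg)   # conflict is counted twice, so the sum exceeds 1
```

With nonzero conflict the two measures over-count, and with nonzero uncertainty they under-count, which is the non-classical behavior the four-valued setting is designed to capture.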
The newly introduced discipline of Population-Based Structural Health Monitoring (PBSHM) has been developed in order to circumvent the issue of data scarcity in “classical” SHM. PBSHM does this by using data across an entire population, in order to improve diagnostics for a single data-poor structure. The improvement of inferences across populations uses the machine-learning technology of transfer learning. In order that transfer makes matters better, rather than worse, PBSHM assesses the similarity of structures and only transfers if a threshold of similarity is reached. The similarity measures are implemented by embedding structures, as Irreducible-Element (IE) models, in a graph space. The problem with this approach is that the construction of IE models is subjective and can suffer from author bias, which may induce dissimilarity where there is none. This paper proposes that IE models be transformed to a canonical form through reduction rules, in which possible sources of ambiguity have been removed. Furthermore, in order that other variations, outside the control of the modeller, are correctly dealt with, the paper introduces the idea of a reality model, which encodes details of the environment and operation of the structure. Finally, the effects of the canonical form on similarity assessments are investigated via a numerical population study. A final novelty of the paper is the implementation of a neural-network-based similarity measure, which learns reduction rules from data; the results with the new graph-matching network (GMN) are compared with a previous approach based on the Jaccard index, from pure graph theory.
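As a toy illustration of the Jaccard-index baseline mentioned above (the component names and edge sets are hypothetical; the paper's measure operates on richer attributed IE-model graphs):

```python
# Jaccard index on the edge sets of two small, hypothetical structure graphs:
# |intersection| / |union| of the sets of connections.
def jaccard(a, b):
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 1.0

# Edges as frozensets so that direction does not matter
g1 = [frozenset(e) for e in [("deck", "girder"), ("girder", "pier"),
                             ("pier", "foundation")]]
g2 = [frozenset(e) for e in [("deck", "girder"), ("girder", "abutment"),
                             ("pier", "foundation")]]
print(f"Jaccard similarity: {jaccard(g1, g2):.2f}")   # 2 shared of 4 -> 0.50
```

Such a set-based measure is blind to node attributes and near-matches, which is one motivation for learning a richer similarity with a graph-matching network instead.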
We show that the principal types of the closed terms of the affine fragment of λ-calculus, with respect to a simple type discipline, are structurally isomorphic to their interpretations, as partial involutions, in a natural Geometry of Interaction model à la Abramsky. This makes it possible to explain, in elementary terms, the somewhat awkward notion of linear application arising in Geometry of Interaction, simply as the resolution between principal types using an alternate unification algorithm. As a consequence, we provide an answer, for the purely affine fragment, to the open problem raised by Abramsky of characterizing those partial involutions which are denotations of combinatory terms.
We present an axiom system for what we call Prior’s Ideal Language and prove its completeness and pure completeness with respect to general models. With this done, we explain, with examples, why this system provides a useful setting for exploring Arthur Prior’s work.
We revisit the communication primitive in ambient calculi. Previously, such communication was confined to first-order (FO) mode (e.g., merely names or capabilities of ambients can be sent), local mode (e.g., the communication only occurs inside an ambient), or particular cross-hierarchy mode (e.g., parent-child communication). In this work, we explore further higher-order (HO) communication in ambient calculi. Specifically, such a communication mechanism allows sending a whole piece of a program across the borders of ambients and is the only form of communication that can happen exactly between ambients. Since ambients are basically of HO nature (i.e., those being moved may be ambients themselves), in a sense, it appears more natural to have HO communication than FO communication. We stipulate that communications merely occur between equally positioned ambients in a peer-to-peer fashion (e.g., between sibling ambients). Following this line, we drop the local or other forms of communication that violate this criterion. As the workbench, we work on a variant of Fair Ambients extended with HO communication, FAHO. This variant also strengthens the original version in that entirely real-identity interaction is guaranteed. We study the semantics, bisimulation, and expressiveness of FAHO. Particularly, we provide the operational semantics using a labeled transition system. Over the semantics, we define the bisimulation in line with the standard notion of bisimulation for ambients and prove that the bisimulation equivalence (i.e., bisimilarity) is a congruence. In addition, we demonstrate that bisimilarity coincides with observational congruence (i.e., barbed congruence). Moreover, we show that FAHO can encode a minimal Turing-complete HO calculus and thus is computationally complete.