This paper examines how the design and realisation of a concert presentation entitled Drawing Sound in Space led to the concept of digital spatial notation. Seven Australian composers were commissioned to create digital scores for an electroacoustic chamber music ensemble, and scores were shared with the audience. The author argues that contemporary digital notation practices enhance live performances of new music by expanding concepts of the audiovisual to include alternative notational approaches engaging with space, creating a ‘spatial notation’. Further, Drawing Sound in Space aimed to transform musicianship and audience experience by offering a more immersive encounter with music notation as a multimodal, social practice where audience engagement and musical understanding are enhanced. A theoretical framework is provided to facilitate the analysis of each work, where semiotic expansion, temporal engagement, distributed agency and spatial reconfiguration are discussed. Through different approaches to presenting music notation in Drawing Sound in Space, the project sought to provide audiences with a novel concert experience whilst simultaneously challenging composers to design notation intended for audiences as well as performers.
Data-embedded instruments that couple sensing, modelling and sound production are increasingly used in electroacoustic practice, yet their ethical and cultural configurations remain under-analysed. This article develops an ethical-embodied framework for examining how particular data, sensing and mapping arrangements configure relations of care, listening and musical agency. Drawing on feminist and decolonial listening practices, disability and critical data studies, and accounts of embodied instrumentality, it combines a selective genealogy of electroacoustic and globally situated practices with a mid-level comparative lens that treats its technical axes as heuristic rather than taxonomic. Case vignettes analyse works using gesture tracking, electromyography (EMG) and brain–computer interfaces (BCI), audience-sensing installations and machine-learning vocal systems, alongside the author’s own data-embedded instrument. Across these examples, the analysis shows how similar technologies can reproduce or contest institutional surveillance, extractivism and aesthetic normativity, and it outlines implications for the design, evaluation and teaching of data-mediated musical systems that foreground situated listening and collective accountability.
Since the mid-20th century, approaches to musical notation have multiplied, giving rise to a multitude of terminologies and classifications. While there exists an extremely rich literature on new approaches to musical notation, it is easy to be confused by a nomenclature that is still under construction and has yet to be formalised. Based on a narrative review of the scientific literature comprising over 250 documents on new forms of notation, this article aims to present the main terminologies used to describe the different approaches to notation. This article proposes a framework illustrating what we observed as the most prominent notation approaches (action-based scores, animated scores, graphic scores, etc.) according to the types of indications (prescriptive and/or descriptive), the notation encoding (semantic and temporal encoding), and the media used for transmission (screen, printed, etc.). The contemporary notation framework aims to provide tools for the further analysis and classification of musical notation used in contemporary instrumental, electronic, and electroacoustic music.
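To make the framework’s three axes concrete, here is a minimal sketch of how a notation approach might be recorded as a data structure; the field names and the example classification are illustrative assumptions, not the article’s formal model.

    from dataclasses import dataclass
    from enum import Enum

    class Indication(Enum):
        PRESCRIPTIVE = "prescriptive"  # tells the performer what to do
        DESCRIPTIVE = "descriptive"    # describes the resulting sound

    @dataclass
    class NotationApproach:
        name: str
        indications: set[Indication]  # prescriptive and/or descriptive
        semantic_encoding: bool       # encodes symbolic meaning
        temporal_encoding: bool       # encodes the passage of time
        medium: str                   # e.g., "screen" or "printed"

    # Hypothetical classification of an animated score.
    animated = NotationApproach(
        name="animated score",
        indications={Indication.PRESCRIPTIVE},
        semantic_encoding=True,
        temporal_encoding=True,
        medium="screen",
    )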
The evaluation of idea sets for design solutions using Shah et al.’s criteria of quality, quantity, novelty and variety can help design teams understand the thoroughness of their ideation work and can help design researchers compare the performance of different ideation methods. However, existing methods for aggregating these metrics to obtain total set scores for quality, quantity, novelty and variety are problematic. The present paper proposes axioms for the desired behavior of aggregation functions for quality, quantity, variety and novelty, then defines functions that meet the axioms. These axioms are intended to ensure that scoring methods reflect best practices in ideation and appropriately reward preferred ideation behavior, such as promoting the contribution of all ideas. Further, this paper provides operational definitions for quality, novelty and quantity evaluations of ideas and draws from previous methods to provide expedient scoring methods of individual ideas. Evaluation mechanics are presented that allow repeatable evaluation of idea sets containing thousands of ideas. Software tools are provided to automatically calculate the aggregation functions for ideas evaluated according to the mechanics of this paper. Finally, a method for evaluating both the variety of complete sets of ideas and the contributions of individual ideas to the overall set variety is proposed. The evaluation of variety is sufficiently defined that it can be automatically evaluated for any genealogy tree of ideas. The operational definitions for evaluating quality, novelty and quantity are suitable for adoption in artificial intelligence tools to allow automated evaluation of idea sets for these quantities.
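As a flavour of such scoring, the sketch below aggregates set-level quantity, mean quality, and a rarity-based novelty score for a rated idea set; the formulas are simplified illustrations in the spirit of Shah et al.’s metrics, not the axiom-satisfying functions defined in the paper.

    from collections import Counter

    def aggregate(ideas):
        # ideas: list of dicts with a 'quality' score in [0, 1]
        # and a 'category' label from a genealogy of solution types.
        n = len(ideas)
        quality = sum(i["quality"] for i in ideas) / n  # mean quality
        # Novelty of one idea: rarity of its category within the set;
        # set novelty: mean of the individual novelties.
        counts = Counter(i["category"] for i in ideas)
        novelty = sum(1 - counts[i["category"]] / n for i in ideas) / n
        return {"quantity": n, "quality": quality, "novelty": novelty}

    print(aggregate([
        {"quality": 0.8, "category": "mechanical"},
        {"quality": 0.6, "category": "mechanical"},
        {"quality": 0.9, "category": "electrical"},
    ]))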
Quantum technologies have the potential to play a significant role in future technological and economic advancement. However, our understanding of the specific narratives and topics present in national quantum technology policies is limited, even though these policies are vital for shaping global strategies, progress, and responsible development in the field. In this study, we use narrative policy analysis together with computational topic modeling to examine 55 governmental documents from 24 countries, covering over a decade. Applying BERTopic modeling and the Narrative Policy Framework, we find that national initiatives primarily focus on technological leadership for security and economic prosperity, on assessing technological readiness, and, to a lesser extent, on commercialization and societal impacts. Over time, we see a trend toward greater alignment in the prevalence of these narratives, with different themes beginning to be considered more equally. Nevertheless, the narrative surrounding responsible quantum development and societal implications remains the least represented. The study shows the strategic priorities of the analyzed countries and introduces an innovative method for analyzing policy texts. Based on the results, we recommend a balanced regulatory approach for quantum technologies that promotes ethical innovation, supports inclusive technological ecosystems, and encourages global collaboration. Furthermore, we caution that an excessive emphasis on leadership and competition may lead to isolated innovation systems that could hinder progress, cooperation, and joint efforts.
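For readers unfamiliar with the method, a minimal BERTopic workflow looks roughly like the sketch below; the corpus and parameter choices are placeholders, not the study’s configuration.

    from bertopic import BERTopic

    # Placeholder corpus standing in for the 55 governmental documents.
    docs = [
        "National quantum strategy for security and economic leadership.",
        "Roadmap for assessing quantum technology readiness levels.",
        "Responsible and ethical development of quantum computing.",
    ] * 20

    topic_model = BERTopic(min_topic_size=5)
    topics, probs = topic_model.fit_transform(docs)
    print(topic_model.get_topic_info())  # topic sizes and keyword summaries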
The $q$-Weibull distribution, as a generalization of the Weibull distribution, plays an important role in fields such as reliability theory, survival analysis, finance, engineering, and medical science. In contrast to the Weibull distribution, which is limited to describing monotonic hazard rate functions, the $q$-Weibull distribution offers the flexibility to model various behaviors of the hazard rate function, including unimodal, bathtub-shaped, monotonic (both increasing and decreasing), and constant. In this article, we investigate the stochastic comparison of extreme order statistics derived from independent, heterogeneous $q$-Weibull random variables using various stochastic orderings, including the usual stochastic order, hazard rate order, reversed hazard rate order, and likelihood ratio order. Some of these results are further extended to dependent setups by incorporating Archimedean copulas to model the dependence structure. Finally, we explore the behavior of extreme order statistics when the random variables are subjected to random shocks.
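For reference, a common parameterization of the $q$-Weibull survival and density functions is the following (the article’s own notation may differ); it reduces to the ordinary Weibull as $q \to 1$:

$$\bar{F}(x) = \left[1-(1-q)\left(\frac{x}{\lambda}\right)^{\kappa}\right]^{\frac{2-q}{1-q}}, \qquad f(x) = (2-q)\,\frac{\kappa}{\lambda}\left(\frac{x}{\lambda}\right)^{\kappa-1}\left[1-(1-q)\left(\frac{x}{\lambda}\right)^{\kappa}\right]^{\frac{1}{1-q}},$$

for $\kappa, \lambda > 0$ and $q < 2$, with support $[0, \infty)$ when $1 < q < 2$ and $\big[0, \lambda(1-q)^{-1/\kappa}\big]$ when $q < 1$.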
A key factor in ensuring the accuracy of computer simulations that model physical systems is the proper calibration of their parameters based on real-world observations or experimental data. Bayesian methods provide a robust framework for quantifying and propagating the uncertainties that inevitably arise. Nevertheless, when paired with inexact models, they produce predictions that are unable to represent the observed data points. Additionally, the quantified uncertainties of these overconfident models cannot be reliably propagated to other Quantities of Interest (QoIs). A promising solution involves embedding a model inadequacy term in the inference parameters, allowing the quantified model-form uncertainty to influence non-observed QoIs. In this work, we revisit this embedded formulation and analyze how different likelihood constructions affect the inference of model-form uncertainty, particularly in the presence of prescribed measurement noise and unavoidable model discrepancies. Two additional likelihood formulations, the global moment-matching and relative global moment-matching likelihoods, are introduced to explore alternative ways of representing the residual distribution. The behavior of these likelihoods is examined alongside existing formulations to show how different treatments of measurement noise and discrepancies shape the inferred parameter posteriors, and thereby affect the uncertainty ultimately propagated to the QoIs. Particular attention is given to how the uncertainty associated with the model inadequacy term propagates to the QoIs for the posteriors obtained from different likelihood formulations, enabling a more comprehensive statistical analysis of the prediction’s reliability. Finally, the proposed approach is applied to estimate the uncertainty in the predicted heat flux from a transient thermal simulation using temperature observations.
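As a schematic of the moment-matching idea (a hedged illustration; the likelihood constructions analysed in the paper differ in detail), each observation $d_i$ is compared against the pushed-forward mean $\mu_i(\theta)$ and variance $\sigma_i^2(\theta)$ of the model output under the embedded inadequacy term, inflated by the prescribed measurement-noise variance $\sigma_n^2$:

$$\mathcal{L}(\theta) \;\propto\; \prod_{i=1}^{N} \frac{1}{\sqrt{\sigma_i^2(\theta) + \sigma_n^2}}\, \exp\!\left(-\,\frac{\big(d_i - \mu_i(\theta)\big)^2}{2\big(\sigma_i^2(\theta) + \sigma_n^2\big)}\right).$$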
Music notation has evolved to accommodate music and musical instruments as they have changed over time. However, the rapid advancement of musical technology has not been accompanied by a corresponding development and consensus in notation. This paper examines the challenges faced by notation in representing music written for augmented instruments. We contend that a novel understanding of musical works is necessary and propose a work-concept that recognises the significance of the technology – medium – that composers develop alongside their creations. We emphasise the role of the score within this work-concept model and present an instrumental augmentation system as a case study. Finally, we propose notation guidelines for augmented instruments and argue that standardising notation could facilitate the discovery of common ground that guides the development of augmented instruments and music written for them.
This paper investigates the complexity of residual lifetimes of live components in coherent systems through the lens of cumulative residual extropy and its divergence-based extension, Jensen-cumulative residual extropy. Unlike classical reliability metrics that focus on system inactivity or mean residual life, our framework quantifies the hidden informational structure of components that remain alive at the system failure time. We derive closed-form expressions for the cumulative residual extropy of conditional residual lifetimes using system signatures and establish stochastic bounds and comparisons that highlight the impact of structural configuration. A novel divergence measure, the Jensen-cumulative residual extropy, is introduced to capture discrepancies between coherent systems and benchmark $k$-out-of-$n$ structures. Numerical illustrations with gamma-distributed lifetimes demonstrate the sensitivity of cumulative residual extropy and Jensen-cumulative residual extropy to redundancy patterns and dependence structures. Furthermore, by integrating cost considerations into the divergence framework, we provide a rigorous optimization scheme for selecting system signatures that jointly minimize informational complexity and economic expenditure. The proposed approach enriches the theoretical foundation of reliability analysis and offers practical guidelines for designing resilient, cost-effective, and information-efficient engineering systems.
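For orientation, the cumulative residual extropy of a non-negative random variable $X$ with survival function $\bar{F}$ is commonly defined as

$$\mathcal{E}(X) = -\frac{1}{2}\int_0^{\infty} \bar{F}^2(x)\,dx,$$

and a Jensen-type divergence between survival functions $\bar{F}$ and $\bar{G}$ can be built from it schematically as $J(\bar{F}, \bar{G}) = \mathcal{E}\big(\tfrac{1}{2}(\bar{F}+\bar{G})\big) - \tfrac{1}{2}\big(\mathcal{E}(\bar{F}) + \mathcal{E}(\bar{G})\big)$, where $\mathcal{E}$ is applied to the distribution with the indicated survival function; the paper’s exact construction of the Jensen-cumulative residual extropy may differ.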
This tutorial guide introduces online nonstochastic control, an emerging paradigm in control of dynamical systems and differentiable reinforcement learning that applies techniques from online convex optimization and convex relaxations to obtain new methods with provable guarantees for classical settings in optimal and robust control. In optimal control, robust control, and other control methodologies that assume stochastic noise, the goal is to perform comparably to an offline optimal strategy. In online control, both cost functions and perturbations from the assumed dynamical model are chosen by an adversary. Thus, the optimal policy is not defined a priori and the goal is to attain low regret against the best policy in hindsight from a benchmark class of policies. The resulting methods are based on iterative mathematical optimization algorithms and are accompanied by finite-time regret and computational complexity guarantees. This book is ideal for graduate students and researchers interested in bridging classical control theory and modern machine learning.
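Concretely, for linear dynamics $x_{t+1} = A x_t + B u_t + w_t$ with adversarially chosen convex costs $c_t$ and perturbations $w_t$, the online controller seeks low regret against the best policy in hindsight from a benchmark class $\Pi$:

$$\mathrm{Regret}_T = \sum_{t=1}^{T} c_t(x_t, u_t) \;-\; \min_{\pi \in \Pi} \sum_{t=1}^{T} c_t\big(x_t^{\pi}, u_t^{\pi}\big),$$

where $(x_t^{\pi}, u_t^{\pi})$ is the trajectory the comparator policy $\pi$ would have generated against the same cost and perturbation sequences.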
Governing AI is about getting AI right. Building upon AI scholarship in science and technology studies, technology law, business ethics, and computer science, it documents potential risks and actual harms associated with AI, lists proposed solutions to AI-related problems around the world, and assesses their impact. The book presents a vast range of theoretical debates and empirical evidence to document how and how well technical solutions, business self-regulation, and legal regulation work. It is a call to think inside and outside the box. Technical solutions, business self-regulation, and especially legal regulation can mitigate and even eliminate some of the potential risks and actual harms arising from the development and use of AI. However, the long-term health of the relationship between technology and society depends on whether ordinary people are empowered to participate in making informed decisions to govern the future of technology – AI included.
This study assesses classification-based predictive maintenance (PdM) for aircraft engines on the NASA Commercial Modular Aero-Propulsion System Simulation dataset and addresses the lack of wide-scope, unified benchmarks. PdM is cast as a short-term binary task – predicting whether an engine will fail within the next 30 cycles – and a comparison is conducted across 10 machine-learning models (Logistic Regression, Decision Tree, Random Forest, Support Vector Machine, k-Nearest Neighbor, Naïve Bayes, Extreme Gradient Boosting, LightGBM, CatBoost, and Gradient Boosting) and 3 deep-learning models (Multilayer Perceptron, Gated Recurrent Unit, and Long Short-Term Memory). A leakage-aware pipeline applies Min–Max scaling; class imbalance is handled with Synthetic Minority Over-sampling Technique where appropriate; hyperparameters are tuned via GridSearchCV/BayesSearchCV; and performance is reported with accuracy, precision, recall, F1-score, and receiver operating characteristic–area under the curve (ROC–AUC), complemented by Shapley Additive Explanations (SHAP) explainability and nonparametric significance tests. Sequence models delivered the strongest performance: LSTM achieved Accuracy = 0.981 (Macro-F1 = 0.92; ROC–AUC = 0.96), and GRU achieved ROC–AUC = 0.97 with Accuracy = 0.975. Among classical learners, LightGBM reached Accuracy = 0.972 (Macro-F1 = 0.86; ROC–AUC = 0.93). These gains over weaker baselines were statistically significant across folds. Framing PdM as near-term failure classification yields operationally interpretable alerts. Models that explicitly capture temporal dependencies (GRU/LSTM) best track short-horizon failure dynamics, while gradient-boosted trees offer competitive, lightweight alternatives. The benchmark and analysis (including SHAP) provide a reproducible reference for model selection in aviation PdM.
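A minimal sketch of the labeling step, assuming the standard C-MAPSS text-file layout (unit number and cycle in the first two columns); the horizon follows the study, while the file name and column handling are illustrative.

    import pandas as pd

    # C-MAPSS layout: unit id, cycle, operational settings, sensor channels.
    df = pd.read_csv("train_FD001.txt", sep=r"\s+", header=None)
    df = df.rename(columns={0: "unit", 1: "cycle"})

    # Remaining useful life: cycles until the unit's final recorded cycle.
    df["RUL"] = df.groupby("unit")["cycle"].transform("max") - df["cycle"]

    # Binary target: does the engine fail within the next 30 cycles?
    HORIZON = 30
    df["label"] = (df["RUL"] <= HORIZON).astype(int)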
In Outline of a Theory of Truth, Kripke introduces many of the central concepts of the logical study of truth and paradox. He informally defines some of these—such as groundedness and paradoxicality—using modal locutions. We introduce a modal language for regimenting these informal definitions. Though groundedness and paradoxicality are expressible in the modal language, we prove that intrinsicality—which Kripke emphasizes but does not define modally—is not. This follows from a characterization of the modally definable sets and relations and an attendant axiomatization of the modal semantics.
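To give the flavour of such a regimentation (a sketch only; the modal language and semantics developed in the paper may differ), read $\Diamond\psi$ as “$\psi$ holds at some Kripkean fixed point” and $\Box\psi$ as “$\psi$ holds at every fixed point”. Then natural candidates are

$$\mathrm{Paradoxical}(\varphi) \;:\equiv\; \neg\Diamond T\varphi \wedge \neg\Diamond T\neg\varphi, \qquad \mathrm{Grounded}(\varphi) \;:\equiv\; \Box T\varphi \vee \Box T\neg\varphi,$$

that is, a sentence is paradoxical when it receives no truth value at any fixed point, and grounded when it receives the same value at every fixed point, equivalently a value already in the minimal fixed point.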
This chapter discusses empirical evidence for a range of decision procedures underlying individuals’ choices in games. These include (1) quantal response equilibria, stochastic choice models in which a player does not best reply to their beliefs but instead chooses actions with higher expected payoffs with higher probability. They also include (2) cognitive hierarchy models in which, roughly speaking, players possess different, discrete levels of cognition: a level-0 player chooses some fixed anchor strategy and, recursively, having fixed the strategies of players of levels 0, ..., k-1, a level-k player best responds to a belief that all other players are of a lower level. Lastly, they include (3) a variety of learning models.
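The most widely used parametric instance of (1) is the logit quantal response, under which, for precision parameter $\lambda \ge 0$, player $i$ chooses action $a_i$ with probability

$$P_i(a_i) = \frac{\exp\big(\lambda\, \bar{u}_i(a_i, \sigma_{-i})\big)}{\sum_{a_i' \in A_i} \exp\big(\lambda\, \bar{u}_i(a_i', \sigma_{-i})\big)},$$

where $\bar{u}_i(a_i, \sigma_{-i})$ is the expected payoff of $a_i$ against the other players’ mixed strategies; $\lambda = 0$ yields uniform randomization and $\lambda \to \infty$ recovers best response.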
As a direct consequence of liquid kerosene injection, aeroengine combustors may be categorized as non-premixed combustion systems, characterized by a swirl-stabilized and highly complex flow field. In addition to the flow of air through the fuel injector, there are a large number of other features through which the oxidizer can enter the heat release region. These can have an impact on local fuel–air mixing, inducing strong spatial and temporal variations in stoichiometry, thereby affecting emissions and combustion system performance. This article discusses a novel statistical methodology, based on principal component analysis (PCA) and K-means clustering, that aims to improve the understanding of fuel–air mixing in realistic aeroengine combustors. The method is applied in a post-processing step to data sampled from a large-eddy simulation, where every chamber inflow has been tagged with a unique passive scalar, which allows it to be traced across space and time. PCA is used to construct a low-dimensional, visually interpretable representation of a spatially localized fuel–air mixing process, while K-means clustering is employed to produce an unsupervised discretization of the flow field into regions of similar fuel–air mixing characteristics. The proposed methodology is computationally inexpensive, and the easily interpretable outputs can help the combustion engineer make better-informed decisions about combustor design.
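A compact sketch of the post-processing idea using scikit-learn; the tracer matrix, component count, and cluster count below are placeholder assumptions, not the study’s settings.

    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.cluster import KMeans

    # X: (n_cells, n_inlets) matrix of time-averaged passive-scalar values,
    # one tracer per tagged chamber inflow (random placeholder data here).
    rng = np.random.default_rng(0)
    X = rng.random((5000, 12))

    # Low-dimensional, visually interpretable view of local mixing composition.
    scores = PCA(n_components=2).fit_transform(X)

    # Unsupervised discretization into regions of similar fuel-air mixing.
    labels = KMeans(n_clusters=6, n_init=10, random_state=0).fit_predict(X)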
The prologue fleshes out the lessons drawn from this book. It offers best practices for a workable AI governance model that uses technical solutions, business self-regulation, and legal regulation. Then, it delves into some of the shortcomings of that model. The radical-democratic perspective that I advocate makes five general, practical suggestions for everyone concerned with AI risks and harms. (1) Organize: Build networks of support and civic organizations around technology-specific concerns as well as conventional rights considerations; (2) Learn: Acquire cross-disciplinary capabilities on the uses, practical applications, potential risks, and governance models associated with technologies like AI; (3) Participate: Push politicians and businesses to expand the boundaries of decision-making in the public and private sectors; (4) Care: Approach technological change from the perspective of vulnerable populations, and with an ethic of non-domination that refuses to treat nature and other people as instruments; and (5) Resist: Maintain an openness to contention with the producers and users of technologies that generate risks and harms.
We introduce the hyperarithmetic sets. We do so from the viewpoint of computable structure theory, defining them as the computably-infinitary definable sets. We prove many of the classical results and, in particular, their connection with jump hierarchies along computable ordinals.
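For context, the classical characterization in the background: a set $X \subseteq \mathbb{N}$ is hyperarithmetic iff it is $\Delta^1_1$, iff $X \le_T 0^{(\alpha)}$ for some computable ordinal $\alpha$, where $0^{(\alpha)}$ denotes the $\alpha$-th iterate of the Turing jump along a computable well-ordering.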