This paper describes the Continuous Quality Improvement (CQI) process of collecting and analyzing field-level qualitative data in an ongoing cycle. These data can be used to guide decision-making for effective emergency response. Integrating medical and community components from the earliest stages of a disaster allows for true collaboration and enables the CQI process to remain responsive to evolving data. Our CQI process identified and addressed gaps in communication and coordination, problems with strategy implementation and, on a conceptual level, gaps in the disaster response model. The 2015 Ebola crisis in Sierra Leone provides a case study demonstrating the improved effectiveness of a CQI approach in the humanitarian setting, both in reducing disease spread and in meeting the broader needs of the population served.
In this paper I offer a model-theoretic interpretation of Autonomy Theory as defended by Moreno, Mossio, Montévil, and Bich. I address accusations that Autonomy Theory is excessively liberal, such as those made by Garson (2017), arguing that these misunderstand the role of strategic abstractions and generalizations in theory construction. Conceiving of closure of constraints as a model-building effort that emphasizes generality—in the spirit of Levins (1966)—also clarifies its potential for application in empirical contexts.
We introduce two new notions, the Daugavet constant and the Δ-constant of a point, which measure quantitatively how far a point is from being a Daugavet point or a Δ-point and allow us to study Daugavet and Δ-points in Banach spaces from a quantitative viewpoint. We show that these notions can be viewed as localized versions of certain global estimates of the Daugavet and diametral local diameter two properties, such as the Daugavet indices of thickness. As an intriguing example, we present a Banach space X in which all points on the unit sphere have positive Daugavet constants even though the Daugavet indices of thickness of X are zero. Moreover, using the Daugavet and Δ-constants of points on the unit sphere, we describe the existence of almost Daugavet and Δ-points, as well as the set of denting points of the unit ball. We also present exact values of the Daugavet and Δ-constants on several classical Banach spaces, as well as on Lipschitz-free spaces. In particular, we show that there is a Lipschitz-free space with a Δ-point that is as far as possible from being a Daugavet point. Finally, we provide some related stability results concerning the Daugavet and Δ-constants.
We examine the changing patterns of knowledge production and diffusion in political science over the past five decades using a dataset of over 200,000 SSCI-indexed research articles from 1970 to 2020. We analyze how author identity and team diversity influence research outputs and outcomes. The results show that historically excluded groups of scholars have gradually improved their representation and expanded their collaboration networks over time. Although the publication gaps are narrowing, obscured gaps in evaluation and citation practices persist. Research specialties with higher proportions of minority researchers tend to have lower average citation impacts. The least cited research specialties are largely studied by women and racial/ethnic minority scholars. Papers written by racial/ethnic minorities and Global South scholars are significantly less cited. However, collaborating with outgroup scholars can effectively overcome this citation gap. We also find that papers written by women receive more citations than those written by men, after controlling for journal prestige and research topics. Furthermore, when we limit our investigation to leading universities, citation gaps diminish. However, scholars of African descent continue to experience entrenched citation disadvantages even if they are affiliated with highly prestigious universities. This study provides multidimensional measurements to advance diversity debates and adds nuances to our understanding of opportunity structures in political science.
In the groundbreaking paper [BD11] (which opened a wide avenue of research regarding unlikely intersections in arithmetic dynamics), Baker and DeMarco proved that for the family of polynomials $f_\lambda (x):=x^d+\lambda $ (parameterized by $\lambda \in \mathbb {C}$), given two starting points a and b in $\mathbb {C}$, if there exist infinitely many $\lambda \in \mathbb {C}$ such that both a and b are preperiodic under the action of $f_\lambda $, then $a^d=b^d$. In this paper, we study the same question, this time working in a field of characteristic $p>0$. The answer in positive characteristic is more nuanced, as there are three distinct cases: (i) both starting points a and b live in ${\overline {\mathbb F}_p}$; (ii) d is a power of p; and (iii) not both a and b live in ${\overline {\mathbb F}_p}$, while d is not a power of p. Only in case (iii) does one derive the same conclusion as in characteristic $0$ (i.e., that $a^d=b^d$). In case (i), for each $\lambda \in {\overline {\mathbb F}_p}$, both a and b are preperiodic under the action of $f_\lambda $. In case (ii), one obtains in addition that whenever $a-b\in {\overline {\mathbb F}_p}$, then for each parameter $\lambda $, a is preperiodic under the action of $f_\lambda $ if and only if b is preperiodic under the action of $f_\lambda $.
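Case (i) above rests on a simple pigeonhole fact: over a finite field, the orbit of any starting point under $f_\lambda $ can visit only finitely many values, so it must eventually cycle and every point is preperiodic. A minimal sketch of this check (the function name and parameter choices are illustrative, not taken from the paper):

```python
def is_preperiodic_mod_p(a, d, lam, p):
    """Iterate f(x) = x^d + lam over the prime field F_p. Because F_p is
    finite, the orbit must revisit a value within at most p steps, so
    every starting point is preperiodic -- illustrating case (i)."""
    seen = set()
    x = a % p
    while x not in seen:
        seen.add(x)
        x = (pow(x, d, p) + lam) % p
    return True  # always terminates after at most p iterations

# every starting point is preperiodic for every parameter lambda in F_p
p, d = 101, 3
print(all(is_preperiodic_mod_p(a, d, lam, p)
          for a in range(5) for lam in range(5)))  # → True
```

The same pigeonhole argument extends to any finite subfield of $\overline {\mathbb F}_p$ containing a, b, and $\lambda $, which is why case (i) requires no condition relating a and b.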
This article follows the early history of the Eastman Kodak Company, examining how the photographic company came to be led by experts in chemistry, who created manufacturing processes that were crucial to the mass manufacture of motion pictures. It argues that celluloid film, the substance necessary for motion pictures, was central to the evolution of Kodak into an industrial chemical company. Kodak’s work to manage the specific technological problems and risks created by this material was itself constitutive of the new industrial shape the firm took. In embracing an intraplant goal of purity of raw materials and finished goods, Kodak made it possible for cinema to become a mass medium, with moving images able to look the same way across time and space, over countless copies. Kodak’s transformation, however, was uneven, as the firm’s photosensitive emulsion continued to be made according to far more empirical, secretive, and artisanal procedures, developed by a photographer without a high school degree. These artisanal processes coexisted alongside a highly standardized plant regime, and both were required to make celluloid film. This history demonstrates one way in which broad cultural transformations of the early twentieth century were closely tied to material and practical transformations within industrial firms.
In numerous applications, extracting a single rotation component (termed “planar rotation”) from a 3D rotation is of significant interest. In biomechanics, for example, the analysis of joint angles within anatomical planes offers better clinical interpretability than spatial rotations. Moreover, in parallel-kinematics robotic machines, unwanted rotations about an axis – termed “parasitic motions” – need to be excluded. However, due to the non-Abelian nature of spatial rotations, these components cannot be extracted by simple projections as in a vector space. Despite extensive discussion in the literature of the non-uniqueness and distortion of their results due to the nonlinearity of the SO(3) group, existing methods continue to be used in the absence of alternatives. This paper reviews the existing methods for planar-rotation extraction from 3D rotations, showing their similarities, differences, and inconsistencies through mathematical analysis and two application cases, one of them from biomechanics (flexural knee angle in the sagittal plane). Moreover, a novel, simple, and efficient method based on a pseudo-projection of the quaternion rotation vector is introduced, which circumvents the ambiguity and distortion problems of existing approaches. In this context, a novel method for determining the orientation of a box from camera recordings based on a two-plane projection is also proposed, which yields more precise results than the existing Perspective 3-Point Problem solutions from the literature. This paper focuses exclusively on the case of finite rotations, as infinitesimal rotations within a single plane are non-holonomic and, through integration, produce rotation components orthogonal to the plane.
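The abstract does not spell out the pseudo-projection formula. As a point of reference, one widely used baseline for pulling out the rotation component about a fixed axis is the swing-twist decomposition of a unit quaternion; the sketch below shows that standard baseline, not the paper's new method:

```python
import math

def twist_angle(q, axis):
    """Signed rotation angle about the unit vector `axis`, extracted from
    the unit quaternion q = (w, x, y, z) by swing-twist decomposition:
    keep only the part of the quaternion's vector component parallel to
    the axis and read off the resulting angle."""
    w, x, y, z = q
    ax, ay, az = axis
    dot = x * ax + y * ay + z * az   # projection of vector part onto axis
    if abs(w) < 1e-12 and abs(dot) < 1e-12:
        return 0.0                   # pure swing: twist undefined, take 0
    return 2.0 * math.atan2(dot, w)

# a 90-degree rotation about z: full twist about z, none about x
q = (math.cos(math.pi / 4), 0.0, 0.0, math.sin(math.pi / 4))
print(round(math.degrees(twist_angle(q, (0.0, 0.0, 1.0))), 6))  # → 90.0
print(round(math.degrees(twist_angle(q, (1.0, 0.0, 0.0))), 6))  # → 0.0
```

As the abstract notes, any such extraction is a nonlinear operation on SO(3), not a vector-space projection, which is the source of the ambiguity and distortion issues the paper analyzes.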
The admissibility of the rule of proof $\gamma $ has played a significant role in the historical development of relevant logics. For first-order logics, however, there have been only a handful of $\gamma $-admissibility proofs, for a select few logics. Here we show that, for each logic L in a wide range of propositional relevant logics for which excluded middle is valid (with fusion and the Ackermann truth constant), the first-order extensions QL and LQ admit $\gamma $. Specifically, these are particular “conventionally normal” extensions of the logic $\mathbf {G}^{g,d}$, which is the least propositional relevant logic (with the usual relational semantics) that admits $\gamma $ by the method of normal models. We also note the circumstances in which our results apply to logics without fusion and the Ackermann truth constant.
Over a 2-year period, we identified Transmission from Room Environment Events (TREE) across the Johns Hopkins Health System: events in which a subsequent room occupant acquired the same organism, with the same antimicrobial susceptibilities, as the patient who had previously occupied that room. Overall, the TREE rate was 50/100,000 inpatient days.
I will clarify when and how a tension arises between epistemic environmentalism (a new focus on assessing and improving the epistemic environment) and respect for epistemic autonomy (allowing, empowering, and requiring people to each govern their own beliefs). Using the example of participatory conceptual engineering (improving the linguistic environment through rational discussion with broad participation), I will also identify an option for avoiding the tension—namely, participatory environmentalism. This means a new focus on how people can each contribute to improving the shared epistemic environment through rational deliberation and thereby govern their own beliefs that are shaped by that environment.
The advent of integrative taxonomy in plankton research, employing molecular and morphology-based identification, promotes the discovery of new biodiversity records, especially of larval stages. Slipper lobsters (family Scyllaridae) have planktonic phyllosoma larvae that persist for weeks to many months in the water column. High interspecific larval similarity and inconsistent delineation of stages have hindered the identification of scyllarid phyllosomata to the species level using morphological characteristics. Here we report the first record of the pygmy slipper lobster, Biarctus sordidus, in the Red Sea following the finding of its phyllosoma larva, extending its known distribution from the Persian Gulf to Australia and southern China. We identified the phyllosoma, collected from the Northern Gulf of Aqaba, as B. sordidus using the mitochondrial 16S and nuclear 18S rRNA genes, and described its morphology to determine the larval stage. We further discuss the potential factors contributing to the delayed detection of this species.
Task planning and its effect on the complexity of second language (L2) written production have been studied extensively. However, the results of these studies are inconclusive, and at times contradictory, potentially as a result of variation in metrics of linguistic complexity. This study is an extension of earlier research syntheses and quantitative meta-analyses on the effects of planning on oral and written L2 production. It examines the identification and selection of linguistic complexity metrics in previous research on planning and its subsequent effects on the linguistic complexity of written L2 production. This research-focused synthesis of studies surveys construct definitions and operational definitions of linguistic complexity in the research domain and provides an overview of rationales for metric selection in the included studies. Methodological implications for future research are discussed in light of the findings.
We analyze the disclosures of sustainable investing by Dutch pension funds in their annual reports by introducing a novel textual analysis approach using state-of-the-art natural language processing techniques to measure the awareness and implementation of sustainable investing. We find that a pension fund's size increases both the awareness and implementation of sustainable investing. Moreover, we analyze the role of signing a sustainable investment initiative. Although signing this initiative increases the specificity of pension fund statements about sustainable investing, we do not find an effect on the implementation of sustainable investing.
Modern machine-learning techniques are generally considered data-hungry. However, this may not be the case for turbulence, as each of its snapshots can hold more information than a single data file in general machine-learning settings. This study asks whether nonlinear machine-learning techniques can effectively extract physical insights from as little as a single snapshot of turbulent flow. As an example, we consider machine-learning-based super-resolution analysis, which reconstructs a high-resolution field from low-resolution data, for two cases: two-dimensional isotropic turbulence and three-dimensional turbulent channel flow. First, we reveal that a carefully designed machine-learning model trained with flow tiles sampled from only a single snapshot can reconstruct vortical structures across a range of Reynolds numbers for two-dimensional decaying turbulence. Successful flow reconstruction indicates that nonlinear machine-learning techniques can leverage scale-invariance properties to learn turbulent flows. We also show that training data for turbulent flows can be cleverly collected from a single snapshot by considering the characteristics of the rotation and shear tensors. Second, we perform the single-snapshot super-resolution analysis for turbulent channel flow, showing that physical insights can be extracted from a single flow snapshot even in the presence of inhomogeneity. The present findings suggest that embedding prior knowledge in model design and data collection is important for a range of data-driven analyses of turbulent flows. More broadly, this work aims to discourage machine-learning practitioners from being wasteful with turbulent-flow data.
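The tile-sampling idea, cutting many overlapping patches from one snapshot and pairing each with a coarsened copy to form training data, can be sketched as follows (the function name, tile size, stride, and the average-pooling downsampler are illustrative assumptions, not the authors' exact sampling scheme):

```python
import numpy as np

def sample_tiles(field, tile=16, stride=8, pool=4):
    """From a single 2-D snapshot, cut overlapping square tiles and pair
    each high-resolution tile with an average-pooled low-resolution copy,
    yielding (low, high) training pairs from one field."""
    pairs = []
    n = field.shape[0]
    for i in range(0, n - tile + 1, stride):
        for j in range(0, n - tile + 1, stride):
            hi = field[i:i + tile, j:j + tile]
            lo = hi.reshape(tile // pool, pool,
                            tile // pool, pool).mean(axis=(1, 3))
            pairs.append((lo, hi))
    return pairs

rng = np.random.default_rng(0)
snap = rng.standard_normal((64, 64))   # stand-in for, e.g., a vorticity field
pairs = sample_tiles(snap)
print(len(pairs), pairs[0][0].shape, pairs[0][1].shape)  # → 49 (4, 4) (16, 16)
```

A single 64 x 64 snapshot already yields dozens of overlapping training pairs here; the abstract's point is that, with sampling guided by flow physics (rotation and shear characteristics), one snapshot can carry enough information to train a super-resolution model.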