During the COVID-19 pandemic, the United States Centers for Disease Control and Prevention provided strategies, such as extended use and reuse, to preserve N95 filtering facepiece respirators (FFRs). We aimed to assess the prevalence of N95 FFR contamination with SARS-CoV-2 among healthcare personnel (HCP) in the Emergency Department (ED).
Design:
Real-world, prospective, multicenter cohort study. N95 FFR contamination (primary outcome) was measured by real-time quantitative polymerase chain reaction. Multiple logistic regression was used to assess factors associated with contamination.
Setting:
Six academic medical centers.
Participants:
ED HCP who practiced N95 FFR reuse and extended use during the COVID-19 pandemic between April 2021 and July 2022.
Primary exposure:
Total number of COVID-19-positive patients treated.
Results:
Two hundred forty-five N95 FFRs were tested. Forty-four N95 FFRs (18.0%, 95% CI 13.4, 23.3) were contaminated with SARS-CoV-2 RNA. The number of patients seen with COVID-19 was associated with N95 FFR contamination (adjusted odds ratio, 2.3 [95% CI 1.5, 3.6]). Wearing either surgical masks or face shields over FFRs was not associated with FFR contamination, and FFR contamination prevalence remained high with these adjuncts [face shields: 25% (16/64); surgical masks: 22% (23/107)].
Conclusions:
Exposure to patients with known COVID-19 was independently associated with N95 FFR contamination. Face shields and overlying surgical masks were not associated with N95 FFR contamination. N95 FFR reuse and extended use should be avoided due to the increased risk of contact exposure from contaminated FFRs.
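For readers unfamiliar with how an adjusted odds ratio and its confidence interval relate to a logistic-regression coefficient, the following is a minimal sketch. The coefficient and standard error below are hypothetical values back-calculated to reproduce the reported aOR of 2.3 (95% CI 1.5, 3.6); they are not taken from the study:

```python
import math

def wald_or_ci(beta: float, se: float, z: float = 1.96):
    """Convert a logistic-regression coefficient (log-odds) to an
    odds ratio with a Wald 95% confidence interval."""
    return (math.exp(beta),
            math.exp(beta - z * se),
            math.exp(beta + z * se))

# Hypothetical coefficient/SE chosen to match the reported aOR 2.3 (1.5, 3.6)
or_, lo, hi = wald_or_ci(0.833, 0.223)
print(round(or_, 1), round(lo, 1), round(hi, 1))  # 2.3 1.5 3.6
```

The same exponentiation applies to each covariate in a multiple logistic model, which is how "adjusted" odds ratios per additional COVID-19 patient seen are obtained.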
Nutritional intervention is an effective way to improve the flesh quality of fish. The effect of feed supplementation with glutamate (Glu) on the flesh quality of gibel carp (Carassius gibelio) was investigated. In trial 1, fish (initial weight: 37.49 ± 0.08 g) were fed two practical diets with 0 and 2% Glu supplementation. In trial 2, fish (37.26 ± 0.04 g) were fed two purified diets with 0 and 3% Glu supplementation. The feeding trials showed that dietary Glu supplementation increased the hardness and springiness of muscle with both practical and purified diets. Glu-supplemented diets increased the thickness and density of myofibres and the collagen content between myofibres. Furthermore, Glu promoted muscle protein deposition by regulating the IGF-1-AKT-mTOR signalling pathway and enhanced myofibre hypertrophy by upregulating genes related to myofibre growth and development (mef2a, mef2d, myod, myf5, mlc, tpi and pax7α). Protein deposition and myofibre hypertrophy in turn improved flesh texture. In addition, IMP content in flesh increased with Glu supplementation of both the practical and the purified diet. Metabolomics confirmed that Glu promoted the deposition of flavour-related substances in muscle, with the purine metabolic pathway most affected, echoed by the upregulation of key genes (ampd, ppat and adsl) in purine metabolism. Sensory testing also confirmed that dietary Glu improved flesh quality by enhancing muscle texture and flavour. In conclusion, dietary Glu supplementation can improve flesh quality in this fish, adding to evidence that dietary interventions can improve the flesh quality of cultured fish more generally.
Bentonite is mined globally for use in commercial and industrial applications. In these applications, smectite content and composition are the paramount factors of the bentonite material and control its properties. As bentonite composition and properties can vary significantly over a large mining district or within a single mine, quality control is required, including: mineral composition, especially smectite content; cation exchange capacity (CEC); exchangeable cation composition; and smectite crystallochemical features. Differences in bentonite composition locally or over a spatial area stem from the different geological settings present throughout bentonitization. The study aims were to: (1) determine the layer charge (LC) variation of dioctahedral smectite over the Bavarian mining district and within individual mines in the area; and (2) assess the error in smectite content calculations based on CEC data resulting from the actual range of experimentally determined LC values. This information has been missing from the scientific literature, as previous LC methods were laborious or subject to assumptions, making a comprehensive study over a large spatial area impractical. This study employed the recently developed, efficient and precise spectroscopic ‘O-D method’, which enabled the LC measurement of 40 samples from eight mines in the Bavarian bentonite mining district, covering an area of 250 km2, within the North Alpine Foreland Basin. LC values calibrated against the alkylammonium method (LC (AAM)) generally ranged between 0.29 and 0.30 eq per formula unit (FU), with only 10% of samples showing LC values >0.31 eq/FU. This narrow LC range has positive implications for the accuracy of smectite contents calculated from CEC data during routine quality control of Bavarian and other bentonites. The error in CEC-based smectite contents resulting from LC variations was, on average, ±3 wt.%.
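The dependence of a CEC-based smectite content on the assumed layer charge can be sketched numerically. The formula weight (taken here as ~367 g/mol per O10(OH)2 formula unit) and the bulk CEC value are illustrative assumptions, not data from the study; the sketch only shows why a narrow LC range bounds the error at roughly ±3 wt.%:

```python
def smectite_cec(lc_eq_per_fu: float, fw_g_per_mol: float = 367.0) -> float:
    """Theoretical CEC (meq/100 g) of a pure smectite with layer charge
    `lc_eq_per_fu` (eq per O10(OH)2 formula unit; formula weight assumed)."""
    return lc_eq_per_fu / fw_g_per_mol * 1e5

def smectite_content(cec_measured: float, lc: float) -> float:
    """Smectite wt.% estimated from a measured bulk CEC (meq/100 g):
    the bulk CEC divided by the CEC of the pure smectite end-member."""
    return 100.0 * cec_measured / smectite_cec(lc)

# Hypothetical bulk CEC of 65 meq/100 g; LC range 0.29-0.31 eq/FU as in the study
contents = [smectite_content(65.0, lc) for lc in (0.29, 0.30, 0.31)]
```

With these assumptions the estimate spans roughly 77–82 wt.% over the LC range, i.e. a spread on the order of the ±3 wt.% reported.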
Ferroåkermanite, Ca2FeSi2O7, a new member of the melilite group, has been found in coarse-grained kirschsteinite-bearing paralava in the Hatrurim Basin outcrop between the Zohar and Halamish Wadis of the Hatrurim Complex in Israel. Ferroåkermanite rarely forms single subhedral light-yellow crystals up to 30–50 μm in size with a prismatic habit; more common are irregular grains, aggregates and intergrowths with gehlenite, or ferroåkermanite crystals with perovskite inclusions. The mineral is transparent, exhibits vitreous lustre and has a distinct cleavage on (001). It is non-fluorescent, brittle and has a conchoidal fracture, a Mohs hardness of ∼4.5–5 and a calculated density of 3.20 g/cm3. Ferroåkermanite is uniaxial (–), ω = 1.652(3) and ε = 1.643(3) (λ = 589 nm), and exhibits visible pleochroism from light-yellow (ω) to intense yellow (ε). The empirical formula of ferroåkermanite calculated on the basis of 7 O is (Ca1.77Na0.18Sr0.02Ba0.02K0.02)Σ2.01(Fe2+0.68Al0.28Mg0.04)Σ1.00(Si1.93Al0.07)Σ2.00O7. The chemical data obtained confirm the presence of a ferroåkermanite–gehlenite solid solution (Fe2+ + Si4+ ↔ 2Al3+) in the studied rock, which was verified by Raman spectroscopy. The crystal structure of the new mineral was refined to R = 0.0617 in the space group P$\overline4$21m with the following unit-cell parameters: a = 7.7813(7) Å, c = 5.0114(5) Å, V = 303.43(6) Å3, Z = 2. Ferroåkermanite has a melilite-type structure with layers consisting of (Si2O7)6– disilicate units and (Fe2+O4)6– tetrahedra, intercalated by layers of eightfold-coordinated Ca atoms. Moreover, the T1 site in the holotype specimen shows a mixed occupancy refined to 0.63(3) Fe2+ and 0.37(3) Al. The presence of rock-forming minerals such as gehlenite or rankinite suggests that the analysed paralava formed under high-temperature conditions, confirming that the new mineral ferroåkermanite is indeed a high-temperature phase.
Furthermore, the presence of Fe2+-bearing phases such as kirschsteinite, ferroåkermanite, chromite, ulvöspinel and bennesherite indicates reducing conditions.
We apply a synthesis review to revisit the concept, measurement, and operationalisation of social inclusion and exclusion in the context of comparative social policy. Integrating the vast literature on these concepts, we aim to elucidate a clearer understanding of them for scholars and policymakers worldwide. In turn, we outline the conceptual development of the concepts, how they have been operationalised through social policy, and how they have been measured at the national and individual levels. Through our review, we identify limitations in extant conceptualisation and measurement approaches and suggest directions for refining conceptual and measurement frameworks to enhance their utility in social inclusion policy, emphasising the concepts’ multidimensional, multilevel, dynamic, and relational essence and highlighting their connection to related concepts such as social capital, social integration, and social citizenship.
Executive dysfunction is prevalent in early stroke and can predict long-term outcomes. Impairments can be subtle and undetected in cognitive stroke screens. To better assess executive functions, this study introduced a novel sentence completion test, which assesses multiple executive processes in <5 minutes (Brief Executive Language Screen – Sentence Completion; BELS-SC). The aim was to determine construct, convergent and divergent validity, sensitivity and specificity of the BELS-SC, and to explore differences between left and right hemisphere stroke patients (LHS and RHS, respectively) on the BELS-SC and standard executive function tests.
Method:
Eighty-eight acute/early sub-acute stroke patients and 116 age-matched healthy controls were included.
Results:
Principal Component Analysis (PCA) suggested four to five factors of the BELS-SC: Initiation, Selection, Inhibition (with strategy loading on Inhibition), Inhibition Response Time, and Semantic Retrieval Response Time. The BELS-SC had good sensitivity (.84) but poorer specificity (.66) in differentiating controls from stroke patients, and good sensitivity (.83) and specificity (.80) in differentiating executive-function-impaired from executive-function-intact groups. BELS-SC Initiation and Inhibition subtests demonstrated convergent and divergent validity with the corresponding Hayling subtests. LHS and RHS patients showed impairment across initiation, selection, inhibition and strategy; however, the greatest deficits were shown by RHS patients on Inhibition items requiring suppression of one dominant response. More patients were impaired on the BELS-SC than on other executive function tests.
Conclusions:
The BELS-SC demonstrated convergent, divergent, and construct validity, good sensitivity and specificity, taps multiple executive processes, and provides insight into strategy. Use in early stroke may aid in targeted and timely cognitive rehabilitation.
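The sensitivity and specificity figures above follow directly from a screen's confusion counts. The counts below are hypothetical, chosen only to reproduce the reported .84/.66 values for the 88 patients and 116 controls:

```python
def sens_spec(tp: int, fn: int, tn: int, fp: int):
    """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical counts: 74 of 88 stroke patients flagged (true positives),
# 77 of 116 controls correctly passed (true negatives).
sens, spec = sens_spec(74, 14, 77, 39)
print(round(sens, 2), round(spec, 2))  # 0.84 0.66
```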
Neurodegenerative diseases (NDDs) are a group of complex disorders marked by pathophysiological mechanisms involving protein aggregation, mitochondrial dysfunction, oxidative stress and neuroinflammation. Despite extensive research advances, NDDs have become a serious global concern and persist as a major therapeutic challenge. In recent years, microRNAs (miRNAs), a class of small non-coding RNAs, have emerged as pivotal players in NDDs. Altered expression of miRNAs is associated with the progression of various NDDs. This review aims to discuss miRNA biogenesis; dysregulation in NDDs, specifically Alzheimer’s disease, Parkinson’s disease (PD) and amyotrophic lateral sclerosis; their potential as biomarkers; and promising therapeutic targets. Additionally, we discuss emerging technologies that offer advanced approaches to enhance miRNA-based diagnostics and therapeutics.
To investigate the effect of physical exercise intensity on state anxiety symptoms and affective responses.
Methods:
Twenty-one healthy women (mean age: 23.6 ± 5.4 years) participated in three sessions: self-selected intensity exercise, moderate-intensity prescribed exercise, and a nonexercise control session. Before each session, participants were exposed to unpleasant stimuli. State anxiety symptoms and affective responses were assessed pre- and post-stimulus exposure and pre- and post-sessions. A two-way repeated measures ANOVA tested state anxiety, while the Friedman test analysed affective responses.
Results:
Time significantly affected state anxiety symptoms [F(2,0) = 25.977; P < 0.001; η2p = 0.565]. Anxiety increased post-stimulus (P < 0.001) and decreased after all sessions. No significant differences were found between exercise and control conditions. Time also significantly influenced affective responses [χ2(8.0) = 62.953; P < 0.001; Kendall’s W = 0.375]. Affective responses decreased post-stimulus (P = 0.029) and significantly increased after both exercise sessions (P < 0.001) but remained unchanged in the control session (P = 0.183).
Conclusions:
Although state anxiety increased after unpleasant stimuli in all conditions, reductions following exercise sessions were comparable to the nonexercise session. However, both exercise sessions uniquely improved affective responses, highlighting their potential for emotional recovery after unpleasant stimuli.
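For readers unfamiliar with the nonparametric test used for the affective data, the Friedman statistic and its Kendall's W effect size can be sketched as follows. This is a tie-free illustration on toy data, not the study's dataset:

```python
def friedman_chi2(data):
    """Friedman chi-square for n subjects x k related conditions (no ties).
    Each row is one subject's scores across the k conditions."""
    n, k = len(data), len(data[0])
    rank_sums = [0.0] * k
    for row in data:
        # Rank the k conditions within this subject (1 = smallest score)
        order = sorted(range(k), key=lambda j: row[j])
        for rank, j in enumerate(order, start=1):
            rank_sums[j] += rank
    chi2 = 12.0 / (n * k * (k + 1)) * sum(r * r for r in rank_sums) - 3.0 * n * (k + 1)
    w = chi2 / (n * (k - 1))  # Kendall's W: 0 (no agreement) to 1 (perfect)
    return chi2, w

# Five subjects who all rank the three conditions the same way:
# perfect agreement gives chi2 = n*(k-1) and W = 1
chi2, w = friedman_chi2([[1, 2, 3]] * 5)
```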
Let $A\ \mathrm{and}\ B$ be subsets of $(\mathbb {Z}/p^r\mathbb {Z})^2$. In this note, we provide conditions on the densities of A and B such that $|gA-B|\gg p^{2r}$ for a positive proportion of $g\in SO_2(\mathbb {Z}/p^r\mathbb {Z})$. The conditions are sharp up to constant factors in the unbalanced case, and the proof makes use of tools from discrete Fourier analysis and results in restriction/extension theory.
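The objects in this statement can be made concrete with a small brute-force computation (an illustration of the setup, not of the proof). The sketch below enumerates $SO_2(\mathbb{Z}/p\mathbb{Z})$ for $r = 1$ as the matrices $\begin{pmatrix} a & -b \\ b & a\end{pmatrix}$ with $a^2 + b^2 \equiv 1 \pmod p$, and computes $|gA - B|$ directly; the sets chosen are arbitrary toy examples:

```python
def so2(p):
    """All rotations [[a, -b], [b, a]] over Z/pZ with a^2 + b^2 = 1 (mod p),
    each encoded as the pair (a, b)."""
    return [(a, b) for a in range(p) for b in range(p) if (a * a + b * b) % p == 1]

def g_A_minus_B(g, A, B, p):
    """The difference set gA - B = {g.x - y : x in A, y in B} in (Z/pZ)^2."""
    a, b = g
    out = set()
    for (x, y) in A:
        gx, gy = (a * x - b * y) % p, (b * x + a * y) % p
        for (u, v) in B:
            out.add(((gx - u) % p, (gy - v) % p))
    return out

# |SO_2(Z/pZ)| is p-1 for p = 1 (mod 4) and p+1 for p = 3 (mod 4)
print(len(so2(5)), len(so2(7)))  # 4 8
```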
Legacy collections frequently originate from Indigenous archaeological sites with extensive histories of investigation and removal by numerous institutions and individuals. These “split” collections complicate institutional compliance with the Native American Graves Protection and Repatriation Act (NAGPRA; 25 U.S.C. § 3001-13), in part by hindering the identification of cultural items, including associated and unassociated funerary objects. In aligning with the spirit of NAGPRA and following guidance received during consultation with Tribal Nations, institutional NAGPRA practitioners strive to repatriate the Ancestors removed from these locations whole, both in body and cultural items, facilitating a respectful return to living communities. Moreover, collaborating across institutions and in coordination with Tribal Nations has the potential both to lessen the burden on Tribal Nations and minimize repetitive trauma brought about by multiple repatriations of Ancestors and cultural items from the same site. Accomplishment of this repatriation goal often requires cross-institutional collaboration to reconcile these legacy “split” collections. In this article, we present the roadmap developed and used by the University of Wisconsin–Oshkosh and the University of Wisconsin–Milwaukee for repatriation with split collections, with some considerations for fruitful interinstitutional collaboration.
In March 1933, the United States Congress declared beer up to 3.2 percent alcohol by weight to be “non-intoxicating,” thus allowing it to be produced and sold while the nation was still under the 18th Amendment’s ban of intoxicating liquors. Brewers had long argued that beer was a temperance beverage that should be regulated with a lighter touch than harder liquor. In fact, the declaration that 3.2 beer was non-intoxicating opened several markets that would otherwise have been closed to brewers. In the decades that followed Repeal, 3.2 beer continued to be treated differently than stronger alcohol with respect to who, when, where, and how it was legally available. This paper explores the important—and continuing—role that 3.2 beer has played in the post-Prohibition United States.
Cryptosporidium parvum is a well-established cause of gastrointestinal illness in both humans and animals and often causes outbreaks at animal contact events, despite the availability of a code of practice that provides guidance on the safe management of these events. We describe a large C. parvum outbreak following a lamb-feeding event at a commercial farm in Wales in 2024, alongside findings from a cohort study to identify high-risk exposures. Sixty-seven cases were identified; 57 were laboratory-confirmed C. parvum with similar genotypes. Environmental investigations found a lack of adherence to established guidance. The cohort study identified 168 individuals with cryptosporidiosis-like illness from 540 exposure questionnaires (distributed via email to 790 lead bookers). Cases were more likely to have had closer contact with lambs (odds ratio (OR) for kissing lambs = 2.4, 95% confidence interval (95% CI): 1.2–4.8). A multivariable analysis found cases were more likely to be under 10 years (adjusted OR (aOR) = 4.5, 95% CI: 2.0–10.0) and to have had visible faeces on their person (aOR = 3.6, 95% CI: 2.1–6.2). We provide evidence that close contact at lamb-feeding events presents an increased likelihood of illness, suggesting that farms should limit animal contact at these events and that revisions to established codes of practice may be necessary. Enhancing risk awareness among farmers and visitors is needed, particularly regarding children.
Many countries struggle to heal the wounds caused by past governmental discrimination against minorities, a process sometimes made difficult by continuing instances of injustice. One tool to improve majority-minority relations is Truth and Reconciliation Commissions (TRCs), which document historical injustices. We collect data before and after the release of the Norwegian TRC report on the treatment of the Sámi and other national minorities, which allows us to investigate its effects on reconciliatory attitudes. We further leverage the unrelated outbreak of demonstrations against current injustices, allowing us to examine responses to both past and current injustices. We find greater support for some aspects of reconciliation, but mainly in areas with a small presence of national minorities. Our results show the limits of TRCs when current conflicts shape the interpretation of historical injustice.
This article aims to facilitate the widespread application of Energy Management Systems (EMSs), especially in buildings and cities, in order to support the realization of future carbon-neutral energy systems. We claim that economic viability is a serious obstacle to utilizing EMSs at scale and that provisioning forecasting and optimization algorithms as a service can make a major contribution to achieving it. To this end, we present the Energy Service Generics software framework, which allows fully functional services to be derived from existing forecasting or optimization code with ease. This work documents the strictly systematic development of the framework, beginning with a requirements analysis, from which a sophisticated design concept is derived, followed by a description of the framework’s implementation. Furthermore, we present the concept of the Open Energy Services community, our effort to continuously maintain the service framework and to provide ready-to-use forecasting and optimization services. Finally, we present an evaluation of our framework and community concept, as well as a demarcation between our work and the current state of the art.
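The "derive a service from existing code" idea can be sketched generically. The wrapper below is a hypothetical minimal pattern, not the actual Energy Service Generics API: it turns an existing forecast function into a JSON-in, JSON-out handler, the shape a service framework would expose over HTTP. All names (`make_service`, `input_data`, `output_data`) are illustrative assumptions:

```python
import json
from typing import Any, Callable

def make_service(fit_or_forecast: Callable[[dict], Any]) -> Callable[[str], str]:
    """Wrap an existing forecasting/optimization function into a JSON-in,
    JSON-out callable -- a minimal stand-in for a service request handler."""
    def handle(request_body: str) -> str:
        payload = json.loads(request_body)
        result = fit_or_forecast(payload["input_data"])
        return json.dumps({"output_data": result})
    return handle

# Toy forecaster: naive persistence forecast (repeat the last observed value)
def persistence_forecast(data: dict) -> list:
    return [data["history"][-1]] * data["horizon"]

service = make_service(persistence_forecast)
print(service('{"input_data": {"history": [1, 2, 3], "horizon": 2}}'))
# {"output_data": [3, 3]}
```

In a real framework, `handle` would sit behind an HTTP endpoint with request validation and authentication; the point is that the domain code itself needs no modification.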
This study uniquely explores the impact of militarization on carbon emissions in North Atlantic Treaty Organization (NATO) countries from 1985 to 2019 using panel econometric techniques. NATO countries, characterized by substantial defense budgets, advanced technologies, high industrialization, and significant energy consumption, offer a unique context for examining these factors. Employing the Pooled Mean Group Autoregressive Distributed Lag (PMG-ARDL) and Fully Modified Ordinary Least Squares (FMOLS) models, the research analyzes the long-term and short-term dynamics across three groups: traditional NATO members (Group 1), new NATO members (Group 2), and a combined group (Group 3). Relevant variables used in the estimation are industrialization, technological innovation, energy consumption, and economic growth. Findings reveal that in Group 1, military expenditure and energy consumption significantly increase carbon emissions, while industrialization and technological innovation reduce them. In Group 2, increased military spending and industrialization reduce emissions, but energy consumption and technological innovation increase them. For Group 3, economic growth significantly drives emissions, whereas industrial advancements and selective technological innovations mitigate them. The study underscores the need for tailored environmental policies and technological advancements to reduce carbon emissions, contributing to sustainable development within military alliances. These insights are crucial for policymakers aiming to balance defense needs with environmental sustainability in NATO countries.
While the Sustainable Development Goals (SDGs) were being negotiated, global policymakers assumed that advances in data technology and statistical capabilities, what was dubbed the “data revolution”, would accelerate development outcomes by improving policy efficiency and accountability. The 2014 report to the United Nations Secretary General, “A World That Counts” framed the data-for-development agenda, and proposed four pathways to impact: measuring for accountability, generating disaggregated and real-time data supplies, improving policymaking, and implementing efficiency. The subsequent experience suggests that while many recommendations were implemented globally to advance the production of data and statistics, the impact on SDG outcomes has been inconsistent. Progress towards SDG targets has stalled despite advances in statistical systems capability, data production, and data analytics. The coherence of the SDG policy agenda has undoubtedly improved aspects of data collection and supply, with SDG frameworks standardizing greater indicator reporting. However, other events, including the response to COVID-19, have played catalytic roles in statistical system innovation. Overall, increased financing for statistical systems has not materialized, though planning and monitoring of these national systems may have longer-term impacts. This article reviews how assumptions about the data revolution have evolved and where new assumptions are necessary to advance the impact across the data value chain. These include focusing on measuring what matters most for decision-making needs across polycentric institutions, leveraging the SDGs for global data standardization and strategic financial mobilization, closing data gaps while enhancing policymaker analytic capabilities, and fostering collective intelligence to drive data innovation, credible information, and sustainable development outcomes.
Life cycle assessment (LCA) reports are commonly used for sustainability documentation, but extracting useful information from them is challenging and requires expert oversight. Designers frequently face technical obstacles and time constraints when interpreting LCA documents. As AI-driven tools become increasingly integrated into design workflows, there is an opportunity to improve access to sustainability data. This study used a mixed-methods approach to develop life cycle design heuristics to help non-LCA experts acquire relevant design knowledge from LCA reports. Developed through in-depth interviews with LCA experts (n = 9), these heuristics revealed five prominent categories of information: (1) scope of analysis, (2) priority components, (3) eco hotspots, (4) key metrics, and (5) design strategies. The utility of these heuristics was tested in a need-finding study with designers (n = 17), who annotated an LCA report using the heuristics. Findings suggest a need for additional support to help designers contextualize quantitative metrics (e.g., carbon footprints) and suggest relevant design strategies. A follow-up reflective interview study with LCA experts gathered feedback on the heuristics. These heuristics offer designers a framework for engaging with sustainability data, supporting product redesign, and a foundation for AI-assisted knowledge extraction to integrate life cycle information into design workflows efficiently.
Older adults often experience a decline in functional abilities, affecting their independence and mobility at home. Wearable lower-limb exoskeletons (LLEs) have the potential to serve as both assistive devices to support mobility and training tools to enhance physical capabilities. However, active end-user involvement is crucial to ensure LLEs align with users’ needs and preferences. This study employed a co-design methodology to explore home-based LLE requirements from the perspectives of older adults with mobility impairments and physiotherapists. Four older adults with self-reported mobility limitations participated by creating personas to represent different user needs and experiences (i.e., PERCEPT methodology), alongside four experienced physiotherapists who contributed their professional insights. As assistive devices, LLEs were seen as valuable for promoting independence, supporting mobility, and facilitating social participation, with essential activities including shopping, toileting, and outdoor walking. Physiotherapists expressed enthusiasm for integrating LLEs into remote rehabilitation programs, particularly to improve strength, balance, coordination, and walking speed. Key design considerations included a lightweight, discreet device that is easy to don and doff and comfortable for extended wear. Physiotherapists highlighted the potential of digital monitoring to assess physical parameters and personalize therapy. Fatigue emerged as a significant challenge for older adults, reinforcing the need for assistive LLEs to alleviate exhaustion and enhance functional independence. A shortlist of LLE features was drafted and scored, covering activity and design applications. These findings provide valuable insights into the design and usability of home-based LLEs, offering a foundation for developing devices that improve acceptance, usability, and long-term impact on healthy ageing.
World-historical analyses often view the “Asian” empires that survived into the twentieth century (the Russian, Qing, and Ottoman empires) as anomalies: sovereign “archaic” formations that remained external to the capitalist system. They posit an antagonistic relationship between state and capital and assume that modern capitalism failed to emerge in these empires because local merchants could not take over their states, as they did in Europe. Ottoman economic actors, and specifically the sarraf as state financier, have accordingly been portrayed as premodern intermediaries serving a “predatory” fiscal state, and thus, as external to capitalist development. This article challenges these narratives by uncovering the central role of Ottoman sarrafs, tax-farmers, and other merchant-financiers in the expanding credit economy of the mid-nineteenth century, focusing on their investment in the treasury bonds of Damascus. I show how fiscal change and new laws on interest facilitated the expansion of credit markets while attempting to regulate them by distinguishing between legitimate interest and usury. I also discuss Ottoman efforts to mitigate peasant indebtedness and the abuse of public debt by foreigners, amid the treasury bonds’ growing popularity. In this analysis, global capitalism was forged in the encounter between Ottoman imperial structures, geo-political concerns, and diverse, interacting traditions of credit, while the boundaries between public and private finance were being negotiated and redefined. Ultimately, Ottoman economic policies aimed to retain imperial sovereignty against European attempts to dominate regional credit markets—efforts often recast by the latter as “fanatical” Muslim resistance.