The tattoos of the Pazyryk ice mummies are of paramount importance for the archaeology of Iron Age Siberia and are often discussed from a broad stylistic and symbolic perspective. However, deeper investigations into this cultural practice were hindered by the inaccessibility of quality data. Here, the authors use high-resolution, near-infrared data in conjunction with experimental evidence to re-examine the tools and techniques employed in Early Iron Age tattooing. The high-quality data allow for the previously unfeasible distinction of artist hands and enable us to put the individual back into the picture of a widespread but rarely preserved prehistoric practice.
A key step toward understanding psychiatric disorders that disproportionately impact female mental health is delineating the emergence of sex-specific patterns of brain organisation at the critical transition from childhood to adolescence. Prior work suggests that individual differences in the spatial organisation of functional brain networks across the cortex are associated with psychopathology and differ systematically by sex.
Aims
We aimed to evaluate the impact of sex on the spatial organisation of person-specific functional brain networks.
Method
We leveraged person-specific atlases of functional brain networks, defined using non-negative matrix factorisation, in a sample of n = 6437 youths from the Adolescent Brain Cognitive Development Study. Across independent discovery and replication samples, we used generalised additive models to uncover associations between sex and the spatial layout (topography) of personalised functional networks (PFNs). We also trained support vector machines to classify participants’ sex from multivariate patterns of PFN topography.
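The classification step described above can be illustrated with a brief, hypothetical sketch: a linear support vector machine trained on flattened PFN loading matrices with cross-validation. This is not the study's code; the data shapes, variable names, and hyperparameters below are placeholders.

```python
# Illustrative sketch (not the study's code): classifying participant sex from
# flattened personalised functional network (PFN) loadings with a linear SVM.
# All variable names and dimensions here are hypothetical placeholders.
import numpy as np
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score, StratifiedKFold

rng = np.random.default_rng(0)
n_subjects, n_vertices, n_networks = 200, 1000, 17            # toy sizes, not ABCD dimensions
pfn_loadings = rng.random((n_subjects, n_vertices * n_networks))  # vertex-by-network loadings, flattened
sex = rng.integers(0, 2, n_subjects)                          # binary labels

clf = make_pipeline(StandardScaler(), SVC(kernel="linear", C=1.0))
cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
acc = cross_val_score(clf, pfn_loadings, sex, cv=cv, scoring="accuracy")
print(f"cross-validated accuracy: {acc.mean():.2f} +/- {acc.std():.2f}")
```

A real analysis of ABCD data would likely need cross-validation folds that respect family structure and acquisition site, which this toy example ignores.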
Results
Sex differences in PFN topography were greatest in association networks including the frontoparietal, ventral attention and default mode networks. Machine learning models trained on participants’ PFNs were able to classify participant sex with high accuracy.
Conclusions
Sex differences in PFN topography are robust, and replicate across large-scale samples of youth. These results suggest a potential contributor to the female-biased risk in depressive and anxiety disorders that emerge at the transition from childhood to adolescence.
We present the Evolutionary Map of the Universe (EMU) survey conducted with the Australian Square Kilometre Array Pathfinder (ASKAP). EMU aims to deliver the touchstone radio atlas of the southern hemisphere. We introduce EMU and review its science drivers and key science goals, updated and tailored to the current ASKAP five-year survey plan. The development of the survey strategy and planned sky coverage is presented, along with the operational aspects of the survey and associated data analysis, together with a selection of diagnostics demonstrating the imaging quality and data characteristics. We give a general description of the value-added data pipeline and data products before concluding with a discussion of links to other surveys and projects and an outline of EMU’s legacy value.
Antibiotic overuse leads to bacterial resistance. The biomarker procalcitonin rises in bacterial pneumonia and remains normal in viral respiratory tract infections. Its use can distinguish between these etiologies and thus guide antibiotic use. We aimed to quantify the effect of procalcitonin use on clinical decision-making.
Design:
A retrospective study spanning one year at a tertiary care center, in which 348 patients hospitalized with aspiration pneumonia and 824 with non-aspiration pneumonia were evaluated with regard to procalcitonin use, length of stay (LOS), and antibiotic prescribing practices. Descriptive statistics and univariate analyses were applied to the ensemble data. Subsets of cases were manually reviewed and analyzed with descriptive statistics. P < 0.05 indicated statistical significance.
Results:
Procalcitonin was checked in 21% of both the aspiration and non-aspiration pneumonia cases. In the ensemble analyses, procalcitonin was more likely to be checked during prolonged hospitalizations for aspiration pneumonia. LOS was statistically the same regardless of the procalcitonin result (elevated or normal) in both the aspiration and non-aspiration pneumonia cohorts. Overall antibiotic use was not affected by the procalcitonin results. After excluding two extreme outliers, per-person antibiotic cost was not affected by the procalcitonin results. Detailed chart reviews of 33 cases revealed that, in the vast majority, clinicians did not use the procalcitonin results to guide the duration of antibiotic use.
Conclusions:
Despite its promise as a biomarker for antibiotic stewardship, procalcitonin results did not appear to be used by clinicians as a decision-making tool in the management of pneumonia.
Objectives/Goals:
We hypothesized that bulk transcriptomic profiling of blood collected from within the ischemic vasculature during an acute ischemic stroke with large vessel occlusion (LVO) would contain unique biomarkers that differ from the peripheral circulation and may provide much-needed insight into the underlying pathogenesis of LVO in humans.
Methods/Study Population:
The Transcriptomic Biomarkers of Inflammation in Large Vessel Ischemic Stroke pilot study prospectively enrolled patients ≥18 years of age with an anterior circulation LVO treated with endovascular thrombectomy (EVT). Two periprocedural arterial blood samples were obtained (DNA/RNA Shield™ tubes, Zymo Research): (1) proximal to the thrombus, from the internal carotid artery, and (2) immediately downstream from the thrombus, by puncturing through the thrombus with the microcatheter. Bulk RNA sequencing was performed, and differential gene expression was identified using the Wilcoxon signed rank test for paired data, adjusting for age, sex, use of thrombolytics, last known well to EVT, and thrombolysis in cerebral infarction score. Bioinformatic pathway analyses were computed using MCODE and Reactome.
Results/Anticipated Results:
From May to October 2022, 20 patients were screened and 13 were enrolled (median age 68 [SD 10.1], 47% male, 100% white). A total of 608 differentially expressed genes were found to be significant (p-value)
Discussion/Significance of Impact:
These results provide evidence of significant gene expression changes occurring within the ischemic vasculature of the brain during LVO, which may correlate with larger ischemic infarct volumes and worse functional outcomes at 90 days. Future studies with larger sample sizes are supported by this work.
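As a rough illustration of the paired differential-expression approach described above, the following sketch applies a Wilcoxon signed-rank test per gene to paired (proximal vs. downstream) samples with a Benjamini–Hochberg correction; it omits the covariate adjustment used in the study, and all data are synthetic.

```python
# Illustrative sketch only: paired differential expression using the Wilcoxon
# signed-rank test with Benjamini-Hochberg correction, on synthetic counts.
# Gene identities, sample sizes, and the study's covariate adjustment are omitted.
import numpy as np
from scipy.stats import wilcoxon
from statsmodels.stats.multitest import multipletests

rng = np.random.default_rng(1)
n_genes, n_pairs = 500, 13
proximal = rng.poisson(50, size=(n_genes, n_pairs)).astype(float)   # samples proximal to the thrombus
distal = proximal + rng.normal(0, 5, size=(n_genes, n_pairs))       # paired downstream samples

pvals = np.array([wilcoxon(proximal[g], distal[g]).pvalue for g in range(n_genes)])
reject, qvals, _, _ = multipletests(pvals, alpha=0.05, method="fdr_bh")
print(f"{reject.sum()} genes significant at FDR < 0.05 (synthetic data)")
```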
Glacier collapse features, linked to subglacial cavities, are increasingly common on retreating Alpine glaciers. These features are hypothesized to result from glacier downwasting and subsurface ablation processes, but understanding of their distribution, formation and contribution to glacier mass loss remains limited. We present a Swiss-wide inventory of 223 collapse features observed over the past 50 years, revealing a sharp increase in their occurrence since the early 2000s. Using high-resolution digital elevation models, we derive a relationship between collapse feature area and ice ablation and estimate the Swiss-wide contribution of collapse features to glacier mass loss to be $19.8\times 10^6\,\text{m}^3$ of ice between 1971 and 2023. Based on extensive observations at Rhonegletscher, including surface displacement, ground-penetrating radar and drone-based elevation models, we quantify subsurface ablation rates of up to 27 cm d$^{-1}$ and provide a detailed description of the collapse processes. We propose that glacier downwasting, enhanced energy supply through subglacial conduits and locally increased basal melt are key components of subglacial cavity growth. Our results highlight the importance of collapse features in the ongoing retreat of Alpine glaciers, stressing the need for further research to understand their formation and long-term implications for glacier dynamics under climate change.
The severe ice losses observed for European glaciers in recent years have increased the interest in monitoring short-term glacier changes. Here, we present a method for constraining modelled glacier mass balance at the sub-seasonal scale and apply it to ten selected glaciers in the Swiss Alps over the period 2015–23. The method relies on observations of the snow-covered area fraction (SCAF) retrieved from Sentinel-2 imagery and long-term mean glacier mass balances. The additional information provided by the SCAF observations is shown to improve winter mass balance estimates by 22% on average over the study sites and by up to 70% in individual cases. Our approach exhibits good performance, with a mean absolute deviation (MAD) to the observed seasonal mass balances of 0.28 m w.e. and an MAD to the observed SCAFs of 6%. The results highlight the importance of accurately constraining winter accumulation when aiming to reproduce the evolution of glacier mass balance over the melt season and to better separate accumulation and ablation components. Since our method relies on remotely sensed observations and avoids the need for in situ measurements, we conclude that it holds potential for regional-scale glacier monitoring.
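For reference, the mean absolute deviation reported above is simply the mean magnitude of the model-observation misfit; in our notation (not necessarily the paper's),

$$\mathrm{MAD} = \frac{1}{n}\sum_{i=1}^{n}\left|x_i^{\mathrm{mod}} - x_i^{\mathrm{obs}}\right|,$$

where $x$ denotes either the seasonal mass balance (in m w.e.) or the SCAF (in %).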
To quantify the impact of patient- and unit-level risk adjustment on infant hospital-onset bacteremia (HOB) standardized infection ratio (SIR) ranking.
Design:
A retrospective, multicenter cohort study.
Setting and participants:
Infants admitted to 284 neonatal intensive care units (NICUs) in the United States between 2016 and 2021.
Methods:
Expected HOB rates and SIRs were calculated using four adjustment strategies: birthweight (model 1), birthweight and postnatal age (model 2), birthweight and NICU complexity (model 3), and birthweight, postnatal age, and NICU complexity (model 4). Sites were ranked according to the unadjusted HOB rate, and these rankings were compared to rankings based on the four adjusted SIR models.
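As a hedged illustration of how an expected HOB count and SIR can be formed under indirect standardization (here only the birthweight adjustment of model 1), consider the following sketch; the reference rates and patient-day totals are invented for illustration and are not taken from the study.

```python
# Hedged sketch of a standardized infection ratio (SIR) under indirect
# standardization: expected events come from stratum-specific reference rates
# (here, hypothetical birthweight strata) applied to each unit's exposure time.
# All rates and patient-day counts below are made up for illustration.
reference_rates = {            # HOB events per 1,000 patient-days, by birthweight stratum (hypothetical)
    "<750 g": 2.5,
    "750-1500 g": 1.2,
    ">1500 g": 0.4,
}
unit_patient_days = {"<750 g": 800, "750-1500 g": 2000, ">1500 g": 5000}
observed_hob = 9

expected_hob = sum(reference_rates[s] / 1000 * unit_patient_days[s] for s in reference_rates)
sir = observed_hob / expected_hob
print(f"expected = {expected_hob:.2f}, SIR = {sir:.2f}")   # SIR > 1 means more events than expected
```

Ranking units by such SIRs, rather than by crude rates, is what drives the quartile shifts reported in the Results.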
Results:
Compared to unadjusted HOB rate ranking (smallest to largest), the number and proportion of NICUs that left the fourth (worst-performing) quartile following adjustment were as follows: birthweight (16, 22.5%); birthweight and postnatal age (19, 26.8%); birthweight and NICU complexity (22, 31.0%); and birthweight, postnatal age, and NICU complexity (23, 32.4%). Comparing NICUs that moved into better-performing quartiles after birthweight adjustment with those that remained in better-performing quartiles regardless of adjustment, the median percentage of low-birthweight infants was 17.1% (interquartile range [IQR]: 15.8, 19.2) vs 8.7% (IQR: 4.8, 12.6), and the median percentage of infants who died was 2.2% (IQR: 1.8, 3.1) vs 0.5% (IQR: 0.01, 12.0), respectively.
Conclusion:
Adjusting for patient and unit-level complexity moved one-third of NICUs in the worst-performing quartile into a better-performing quartile. Risk adjustment may allow for a more accurate comparison across units with varying levels of patient acuity and complexity.
Guided-jet waves have been shown to close resonance loops in a myriad of problems such as screech and impingement tones in jets. These discrete, upstream-travelling waves have long been identified in linear-stability models of jet flows, but in this work they are instead considered in the context of an acoustic-scattering problem. It is shown that the guided-jet mode results from total internal reflection and transmission of acoustic waves, arising from the shear layer behaving like a duct with some given wall impedance. After total reflection, only discrete streamwise wavenumbers may be supported by the flow, with these wavenumbers dictated by the fact that the standing wave formed inside the jet must fit between the two shear layers. Close to the sonic line, the transmission of this mode to the outside is maximal, leading to a net energy flux directed upstream, which dictates the direction of propagation of this mode and provides a clear connection to the better-understood soft-duct mode (Towne et al., J. Fluid Mech., vol. 825, 2017, pp. 1113–1152). The model also indicates that these waves are generated in the core of the flow and can only be efficiently transmitted to the quiescent region under certain conditions, providing an explanation as to why screech is only observed at conditions where the discrete mode is supported by the flow. The present results explain, for the first time, the nature and characteristics of the guided-jet waves.
In response to the COVID-19 pandemic, we implemented a plasma coordination center within two months to support transfusions for two outpatient randomized controlled trials. The center design was based on an investigational drug services model and a Food and Drug Administration-compliant database to manage blood product inventory and trial safety.
Methods:
A core investigational team adapted a cloud-based platform to randomize patient assignments and track inventory distribution of control plasma and high-titer COVID-19 convalescent plasma of different blood groups from 29 donor collection centers directly to blood banks serving 26 transfusion sites.
Results:
We performed 1,351 transfusions in 16 months. The transparency of the digital inventory at each site was critical to facilitate qualification, randomization, and overnight shipment of blood group-compatible plasma for transfusion into trial participants. While inventory challenges were heightened with COVID-19 convalescent plasma, the cloud-based system and the flexible approach of the plasma coordination center staff across the blood bank network enabled decentralized procurement and distribution of investigational products, maintaining inventory thresholds and overcoming local supply chain constraints at the sites.
Conclusion:
The rapid creation of a plasma coordination center for outpatient transfusions is infrequent in the academic setting. Distributing more than 3,100 plasma units to blood banks charged with managing investigational inventory across the U.S. in a decentralized manner posed operational and regulatory challenges while providing opportunities for the plasma coordination center to contribute to research of global importance. This program can serve as a template in subsequent public health emergencies.
We evaluated SARS-CoV-2 anti-nucleocapsid (anti-N) seroconversion and seroreversion rates, risk factors associated with SARS-CoV-2 seroconversion, and COVID-19 risk perceptions among academic healthcare center employees in a rural state.
Methods:
Among employees aged ≥18 years who completed a screening survey (n = 1,377), we invited all respondents reporting previous COVID-19 (n = 85; 82 accepted) and a random selection of respondents not reporting previous COVID-19 (n = 370; 220 accepted) to participate. Participants completed surveys and provided blood samples at 3-month intervals (T0, T3, T6, T9). We used logistic regression to identify risk factors for seropositivity at T0.
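A minimal sketch of the kind of logistic regression described above, reporting odds ratios and 95% confidence intervals, is shown below; the variable names and simulated data are hypothetical, and the model omits any additional covariates the study may have included.

```python
# Illustrative sketch (hypothetical variable names): logistic regression for
# baseline (T0) seropositivity, reporting odds ratios with 95% CIs.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 302
df = pd.DataFrame({
    "seropositive": rng.integers(0, 2, n),
    "nursing_staff": rng.integers(0, 2, n),
    "close_contact_outside_work": rng.integers(0, 2, n),
    "vaccinated": rng.integers(0, 2, n),
    "mask_use": rng.integers(0, 2, n),
})

model = smf.logit(
    "seropositive ~ nursing_staff + close_contact_outside_work + vaccinated + mask_use",
    data=df,
).fit(disp=False)
odds_ratios = np.exp(model.params)      # exponentiated coefficients = odds ratios
conf_int = np.exp(model.conf_int())     # 95% confidence intervals on the OR scale
print(pd.concat([odds_ratios.rename("OR"), conf_int], axis=1))
```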
Results:
The cohort was primarily direct patient caregivers (205/302; 67.9%), white (278/302; 92.1%), and female (212/302; 70.2%). At T0, 86/302 (28.4%) participants were seropositive. Of the seronegative participants, 6/198 (3.0%), 6/183 (3.3%), and 14/180 (7.8%) had seroconverted at T3, T6, and T9, respectively. The overall seroreversion rate was 6.98% at T9. At T0, being a member of the nursing staff (odds ratio [OR], 2.37; 95% confidence interval [CI], 1.08, 5.19) and being within six feet of a non-household member outside of work (OR, 2.91; 95% CI, 1.02, 8.33) were associated with significantly higher odds of seropositivity. Vaccination (OR, 0.05; 95% CI, 0.02, 0.12) and face mask use (OR, 0.36; 95% CI, 0.17, 0.78) were protective.
Conclusions:
The seroconversion and seroreversion rates were low among participants. Public health and infection prevention measures implemented early in the COVID-19 pandemic – vaccination, face mask use, and social distancing – were associated with significantly lower odds of SARS-CoV-2 seropositivity among participants.
BrighT STAR was a diagnostic stewardship collaborative of 14 pediatric intensive care units (PICUs) across the United States designed to standardize blood culture practices, reduce unnecessary blood cultures, and study the impact on patient outcomes and broad-spectrum antibiotic use. We now examine the implementation process in detail to understand how sites facilitated this diagnostic stewardship program in their PICUs.
Design:
A multi-center electronic survey of the 14 BrighT STAR sites, based on qualitative data about the implementation process collected during the primary phase of BrighT STAR.
Setting:
14 PICUs enrolled in BrighT STAR.
Participants:
Site leads at each enrolled site.
Methods:
An electronic survey guided by implementation science literature and based on data collected during BrighT STAR was administered to all 14 sites after completion of the primary phase of the collaborative.
Results:
Ten specific tasks appear critical to implementing blood culture diagnostic stewardship, with variability in the site-level strategies employed to accomplish those tasks. Sites rated certain tasks and strategies as highly important. Strategies used in top-performing sites were distinct from those used in lower-performing sites. Certain strategies may be linked to drivers of culture overuse and represent key targets for changing clinician behavior.
Conclusions:
BrighT STAR offers important insights into the tasks and strategies used to facilitate successful diagnostic stewardship in the PICU. More work is needed to compare specific strategies and optimize stewardship outcomes in this complex environment.
To estimate the risk of household transmission of methicillin-resistant Staphylococcus aureus (MRSA) following exposure to infected family members or to family members recently discharged from a hospital.
Design:
Analysis of monthly MRSA incidence from longitudinal insurance claims using the Merative MarketScan Commercial and Medicare (2001–2021) databases.
Setting:
Visits to inpatient, emergency department, and outpatient settings.
Patients:
Households with ≥2 family members enrolled in the same insurance plan for the entire month.
Methods:
We estimated a monthly incidence model, where enrollees were binned into monthly enrollment strata defined by demographic, patient, and exposure characteristics. Monthly incidence within each stratum was computed, and a regression analysis was used to estimate the incidence rate ratio (IRR) associated with household exposures of interest while accounting for potential confounding factors.
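One plausible way to read the stratified incidence model described above is as a Poisson regression of stratum-level case counts with log person-time as an offset, so that the exponentiated coefficient on the exposure indicator is the IRR. The sketch below follows that reading with synthetic strata; it is not the authors' code and omits the confounder adjustments mentioned in the text.

```python
# Hedged sketch of a stratified incidence model: Poisson regression of monthly
# MRSA case counts on an exposure indicator, with log person-months as the
# offset, so exp(coefficient) is the incidence rate ratio (IRR).
# The strata and counts below are synthetic and purely illustrative.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
strata = pd.DataFrame({
    "exposed": rng.integers(0, 2, 200),                 # household exposure indicator
    "person_months": rng.integers(1_000, 50_000, 200),  # enrollee-months in the stratum
})
base_rate = 5e-4
strata["cases"] = rng.poisson(
    base_rate * strata["person_months"] * np.where(strata["exposed"] == 1, 1.5, 1.0)
)

model = smf.glm(
    "cases ~ exposed",
    data=strata,
    family=sm.families.Poisson(),
    offset=np.log(strata["person_months"]),
).fit()
print(f"IRR for exposure: {np.exp(model.params['exposed']):.2f}")
```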
Results:
A total of 157,944,708 enrollees were included and 424,512 cases of MRSA were identified. Across all included enrollees, exposure to a family member with MRSA in the prior 30 days was associated with significantly increased risk of infection (IRR: 71.03 [95% CI, 67.73–74.50]). After removing enrollees who were hospitalized or exposed to a family member with MRSA, exposure to a family member who was recently discharged from the hospital was associated with increased risk of infection (IRR: 1.44 [95% CI, 1.39–1.49]) and the risk of infection increased with the duration of the family member’s hospital stay (P value < .001).
Conclusions:
Exposure to a recently hospitalized and discharged family member increased the risk of MRSA infection in a household even when the hospitalized family member was not diagnosed with MRSA.
Bigotry distractions are strategic invocations of racism, transphobia, or negative stigma toward other marginalized groups to shape political discourse. Although the vast majority of Americans agree on large policy issues ranging from reducing air pollution to prosecuting corporate crime, bigotry distractions divert attention from areas of agreement toward divisive identity issues. This article explores how the nefarious targeting of identity groups through bigotry distractions may be the tallest barrier to health reform, and social change more broadly. The discussion extends the literature on dog whistles, strategic racism, and scapegoating.
A number of nearby dwarf galaxies have globular cluster (GC) candidates that require spectroscopic confirmation. Here, we present Keck telescope spectra for 15 known GCs and GC candidates that may be associated with a host dwarf galaxy and an additional 3 GCs in the halo of M31 that are candidates for accretion from a now-disrupted dwarf galaxy. We confirm six star clusters (of intermediate-to-old age) to be associated with NGC 247. The vast bulk of its GC system remains to be studied spectroscopically. We also confirm the GC candidates in F8D1 and DDO190, finding both to be young star clusters. The three M31 halo GCs all have radial velocities consistent with M31 and are old and very metal-poor. Their ages and metallicities are consistent with accretion from a low-mass satellite galaxy. Finally, three objects are found to be background galaxies – two are projected near NGC 247 and one (candidate GCC7) is near the IKN dwarf. The IKN dwarf thus has only five confirmed GCs but still a remarkable specific frequency of 124.
Although the link between alcohol involvement and behavioral phenotypes (e.g. impulsivity, negative affect, executive function [EF]) is well-established, the directionality of these associations, specificity to stages of alcohol involvement, and extent of shared genetic liability remain unclear. We estimate longitudinal associations between transitions among alcohol milestones, behavioral phenotypes, and indices of genetic risk.
Methods
Data came from the Collaborative Study on the Genetics of Alcoholism (n = 3681; ages 11–36). Alcohol transitions (first: drink, intoxication, alcohol use disorder [AUD] symptom, AUD diagnosis), internalizing, and externalizing phenotypes came from the Semi-Structured Assessment for the Genetics of Alcoholism. EF was measured with the Tower of London and Visual Span Tasks. Polygenic scores (PGS) were computed for alcohol-related and behavioral phenotypes. Cox models estimated associations among PGS, behavior, and alcohol milestones.
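To make the modelling step concrete, here is a hedged sketch of a Cox proportional hazards model relating a polygenic score and a behavioral covariate to the age at a single alcohol milestone, using the lifelines library; the column names and simulated data are hypothetical, and the published models include additional covariates and milestones.

```python
# Illustrative sketch (not the study's code): a Cox proportional hazards model
# relating a polygenic score and a behavioral covariate to time-to-first-drink.
# Column names and data are hypothetical.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(4)
n = 1000
df = pd.DataFrame({
    "age_at_event_or_censor": rng.uniform(11, 36, n),  # follow-up time in years
    "drank": rng.integers(0, 2, n),                    # 1 = milestone reached, 0 = censored
    "drinks_per_week_pgs": rng.normal(0, 1, n),        # standardized polygenic score
    "externalizing": rng.normal(0, 1, n),              # standardized behavioral phenotype
})

cph = CoxPHFitter()
cph.fit(df, duration_col="age_at_event_or_censor", event_col="drank")
cph.print_summary()   # hazard ratios appear as exp(coef)
```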
Results
Externalizing phenotypes (e.g. conduct disorder symptoms) were associated with future initiation and drinking problems (hazard ratio (HR)⩾1.16). Internalizing (e.g. social anxiety) was associated with hazards for progression from first drink to severe AUD (HR⩾1.55). Initiation and AUD were associated with increased hazards for later depressive symptoms and suicidal ideation (HR⩾1.38), and initiation was associated with increased hazards for future conduct symptoms (HR = 1.60). EF was not associated with alcohol transitions. Drinks per week PGS was linked with increased hazards for alcohol transitions (HR⩾1.06). Problematic alcohol use PGS increased hazards for suicidal ideation (HR = 1.20).
Conclusions
Behavioral markers of addiction vulnerability precede and follow alcohol transitions, highlighting dynamic, bidirectional relationships between behavior and emerging addiction.
Objectives/Goals:
This work aims to explore how citizen science serves as a transformative framework to bridge scientific knowledge, focusing on its potential to enhance transdisciplinary learning in artificial intelligence (AI), biomedical and clinical sciences by facilitating near-peer mentoring.
Methods/Study Population:
Our group of eight friends comprises a multicultural and multidisciplinary cohort including students from the USA, Philippines, Indonesia, and Guatemala pursuing PhD degrees in electrical and computer engineering, epidemiology, and physics, as well as MD, PharmD, and DMD degrees. We engage in shared online courses, collaborative projects, and abstract submissions. Employing our collective knowledge, we design interactive learning experiences, support each other’s initiatives, and collaboratively develop lectures and presentations. We intend to expand collaborations in biomedical AI education while fostering principles of experiential and collaborative learning, constructivism, and authentic inquiry.
Results/Anticipated Results:
Our recent successes include submitted conference abstracts on data science and AI education in pharmacy and the facilitation of a guest lecture in health informatics. Additionally, we are currently collaborating on seven biomedical machine learning projects in radio frequency engineering, aiming for conference submissions. Moving forward, our goal is to expand our group, support the formation of similar communities, and promote data science and AI literacy in biomedical and clinical contexts. We aspire to extend this knowledge to families, classmates, and eventually patients, facilitating a broader understanding of the role of AI in healthcare.
Discussion/Significance:
We believe diverse expertise and pedagogical theories can help demonstrate the potential of citizen science to democratize scientific experience. By nurturing collaborative networks, our efforts aim to bridge gaps between disciplines and enhance the broader public’s understanding of AI in healthcare.
Compare the effectiveness of multiple mitigation measures designed to protect nursing home residents from infectious disease outbreaks.
Design:
Agent-based simulation study.
Setting:
Simulation environment of a small nursing home.
Methods:
We collected temporally detailed and spatially fine-grained location information from nursing home healthcare workers (HCWs) using sensor motes. We used these data to power an agent-based simulation of a COVID-19 outbreak using realistic time-varying estimates of infectivity and diagnostic sensitivity. Under varying community prevalence and transmissibility, we compared the mitigating effects of (i) regular screening and isolation, (ii) inter-resident contact restrictions, (iii) reduced HCW presenteeism, and (iv) modified HCW scheduling.
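As a loose illustration of how screening-and-isolation can be layered onto an agent-based outbreak model, the toy simulation below tracks susceptible, infectious, isolated, and recovered residents with random daily mixing; its parameters are invented and it does not reproduce the study's calibrated, sensor-informed model.

```python
# Highly simplified, hypothetical sketch of the kind of agent-based comparison
# described above: a toy nursing-home outbreak with random daily resident mixing
# and optional regular screening-and-isolation. Parameters are invented.
import numpy as np

def simulate(n_residents=50, days=90, beta=0.04, contacts_per_day=8,
             screen_interval=None, test_sensitivity=0.85, seed=0):
    rng = np.random.default_rng(seed)
    # 0 = susceptible, 1 = infectious, 2 = isolated, 3 = recovered
    state = np.zeros(n_residents, dtype=int)
    state[rng.integers(n_residents)] = 1              # one index case
    infectious_days = np.zeros(n_residents, dtype=int)

    for day in range(days):
        # random resident-to-resident contacts made by each infectious resident
        for i in np.flatnonzero(state == 1):
            partners = rng.integers(0, n_residents, contacts_per_day)
            for j in partners:
                if state[j] == 0 and rng.random() < beta:
                    state[j] = 1
        # imperfect screening and isolation of positives
        if screen_interval and day % screen_interval == 0:
            detected = (state == 1) & (rng.random(n_residents) < test_sensitivity)
            state[detected] = 2
        # non-isolated cases recover after ~10 infectious days
        infectious_days[state == 1] += 1
        state[(state == 1) & (infectious_days >= 10)] = 3

    return np.mean(state != 0)   # attack rate: fraction ever infected (incl. isolated/recovered)

print("no screening :", simulate(screen_interval=None))
print("screen q2d   :", simulate(screen_interval=2))
```

Comparing attack rates across screen_interval settings (and, in an extended version, across contact-restriction and presenteeism scenarios) mirrors, in spirit, the comparisons reported in the Results.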
Results:
Across all configurations tested, screening every other day and isolating positive cases decreased the attack rate by an average of 27%, to 0.501, while contact restrictions decreased the attack rate by an average of 35%, resulting in an attack rate of 0.240, approximately half that of screening/isolation. Combining both interventions produced an attack rate of only 0.029. Halving the observed presenteeism rate led to an 18% decrease in the attack rate, but when combined with screening every 6 days, the effect of reducing presenteeism was negligible. Altering work schedules had negligible effects on the attack rate.
Conclusions:
Universal contact restrictions are highly effective for protecting vulnerable nursing home residents, yet adversely affect physical and mental health. In high-transmission and/or high community prevalence situations, restricting inter-resident contact to groups of 4 was effective and became highly effective when paired with weekly testing.
Diagnostic stewardship seeks to improve the ordering, collection, performance, and reporting of tests. Test results play an important role in reportable healthcare-associated infections (HAIs). The inclusion of HAIs in public reporting and pay-for-performance programs has highlighted the value of diagnostic stewardship as part of infection prevention initiatives. Inappropriate testing should be discouraged, and approaches that seek to alter testing solely to impact a reportable metric should be avoided. HAI definitions should be further adapted to new testing technologies, with a focus on actionable and clinically relevant test results that will improve patient care.