Impaired social functioning is commonly observed in youth at clinical high risk (CHR) for psychosis. Interpersonal synchrony, defined as the temporal alignment of movement between interacting partners, is a key component of successful social interactions. This study aimed to investigate interpersonal head synchrony in naturalistic virtual settings among CHR individuals using automated video analysis tools.
Methods
We analyzed short video recordings from virtual clinical interviews with 116 participants: 50 CHR participants, 36 individuals with sub-threshold positive symptoms (SUB), and 30 healthy controls (HC). Vertical head movement time series were extracted using an open-access video-based head-tracking tool. Interpersonal head synchrony was computed using Windowed Cross-Correlation to assess group differences and associations with clinical symptoms and functioning.
Results
CHR participants showed significantly reduced strength of synchrony compared to HC (β = −0.05, 95% CI [−0.09, −0.02], p = .004), although 14% of variance in strength of synchrony was attributable to assessor identity. No significant group differences were found for delay of synchrony. Within the CHR group, delay of synchrony was positively associated with social anhedonia (r = 0.29). Strength of synchrony correlated with better social (r = 0.33) and role (r = 0.28) functioning.
Conclusion
Our findings suggest that impaired interpersonal head synchrony is already present in the psychosis-risk state and relates to negative symptoms and social and role functioning. These findings support the utility of nonverbal synchrony as a potential biomarker and demonstrate the feasibility of automated tools and virtual assessments to study social processes in at-risk populations.
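The windowed cross-correlation used to quantify synchrony can be sketched in a few lines: slide a window along the two head-movement series, correlate one window against lagged copies of the other, and keep the peak correlation (strength) and its lag (delay). This is an illustrative sketch only; the window length, step, and lag range are assumptions, not the study's settings.

```python
import numpy as np

def windowed_cross_correlation(x, y, win=30, step=15, max_lag=10):
    """Windowed cross-correlation between two movement time series.

    For each window, x is correlated against lagged copies of y; the peak
    absolute correlation is that window's strength of synchrony and its lag
    the delay. Window length, step, and lag range here are illustrative
    assumptions, not the study's settings.
    """
    strengths, delays = [], []
    # Leave max_lag samples of slack on both sides so every lagged slice fits.
    for start in range(0, len(x) - win - 2 * max_lag, step):
        xs = x[start + max_lag : start + max_lag + win]
        best_r, best_lag = 0.0, 0
        for lag in range(-max_lag, max_lag + 1):
            ys = y[start + max_lag + lag : start + max_lag + lag + win]
            r = np.corrcoef(xs, ys)[0, 1]
            if abs(r) > abs(best_r):
                best_r, best_lag = r, lag
        strengths.append(abs(best_r))
        delays.append(best_lag)
    return float(np.mean(strengths)), float(np.mean(delays))
```

For two identical signals offset by a few samples, the recovered mean delay matches the offset and the mean strength approaches 1.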
The Chalbi Desert, located in eastern Africa, is a significant but overlooked archive of the Pleistocene and Holocene periods that could add insight to investigations of human evolution. We revisited southeastern Chalbi Desert landforms between the towns of Kargi and Maikona to improve the chronostratigraphy and provide paleoenvironmental context. Direct U-series and electron spin resonance dating of various fossil teeth recovered from a deflated dune (Qzs) landform at the Farre locality return a mean age of ∼545 ka, which is compatible with biostratigraphic inferences. While this numerical age result should probably be regarded as mostly indicative given the existing uncertainty in the environmental dose rate evaluation, the available data set nevertheless strongly suggests a Middle Pleistocene age for at least some of the fauna. Sedimentology, luminescence, and 14C dating further suggest that this Qzs landform and its contents were modified by alluvial fan development and weathering during denudation in a proximal fan setting through the late Pleistocene into the Holocene. The Qzs landform currently experiences aeolian additions, erosion, and salt-affected soil development in an arid climate. Pedogenic carbonate isotope geochemistry suggests that deflated sand dunes were covered by woody grasslands during Marine Isotope Stage (MIS) 4 and 3 pluvials, consistent with nearby fan progradation constrained at >35 ka. The desert experienced increased hydrologic activity during late Pleistocene and African Humid Period pluvials, as evidenced by additional optically stimulated luminescence and 14C dating from fan, dune, and playa contexts. The last significant pluvial episode ended after 4.4 ± 0.3 cal ka BP, which coincides with the final regression of nearby Lake Turkana.
This study extends the chronology of Quaternary sediments in the Chalbi Desert to the Middle Pleistocene and offers paleoenvironmental insights into the conditions experienced by Middle Stone Age tool users in the region.
Paleontology provides insights into the history of the planet, from the origins of life billions of years ago to the biotic changes of the Recent. The scope of paleontological research is as vast as it is varied, and the field is constantly evolving. In an effort to identify “Big Questions” in paleontology, experts from around the world came together to build a list of priority questions the field can address in the years ahead. The 89 questions presented herein (grouped within 11 themes) represent contributions from nearly 200 international scientists. These questions touch on common themes including biodiversity drivers and patterns, integrating data types across spatiotemporal scales, applying paleontological data to contemporary biodiversity and climate issues, and effectively utilizing innovative methods and technology for new paleontological insights. In addition to these theoretical questions, discussions touch upon structural concerns within the field, advocating for an increased valuation of specimen-based research, protection of natural heritage sites, and the importance of collections infrastructure, along with a stronger emphasis on human diversity, equity, and inclusion. These questions offer a starting point—an initial nucleus of consensus that paleontologists can expand on—for engaging in discussions, securing funding, advocating for museums, and fostering continued growth in shared research directions.
In this article, we evaluate several large language models (LLMs) on a word-level translation alignment task between Ancient Greek and English. Comparing model performance to a human gold standard, we examine four different LLMs, two open-weight and two proprietary. We then take the best-performing model and generate examples of word-level alignments for further finetuning of the open-weight models. We observe significant improvement in the open-weight models after finetuning on this synthetic data. These findings suggest that open-weight models, though initially unable to perform a given task, can be bolstered through finetuning to achieve impressive results. We believe that this work can help inform the development of more such tools in the digital classics and the computational humanities at large.
Changes like the shift of tropical forests into savannah in the Amazon highlight the potential for deforestation to drive ecosystems past potentially irreversible tipping points. Reforestation may avert or delay tipping points, but its success depends on the degree to which secondary and primary forests are substitutes in the production of ecosystem services. This article explores how deforestation, reforestation and substitutability between forest types affect the likelihood that a forest system will cross a tipping point. Efforts to ensure that secondary forests better mimic primary forests only yield a small improvement in terms of delaying ecosystem collapse. The most significant effects on tipping points arise from an increase in the relative costs of clearing primary forests or a decrease in the costs of protecting land tenure in secondary forests. Our results highlight the importance of the latter, which are often ignored as a policy target, to reduce the risk of ecosystem collapse.
Identifying early-life risk factors for chronic depression symptomatology in young people is essential to informing early targeted interventions. One highly prevalent symptom (and potential risk factor) in depression is sleep problems, such as insomnia or hypersomnia. However, most studies have measured sleep disturbances and depression symptoms at only one time point, and the prospective relationship between persistent shorter or longer sleep duration in childhood and chronic depression symptoms in adolescence through to adulthood has not been explored.
Objectives
To identify whether longitudinal trajectories of persistent shorter and persistent longer night-time sleep duration between 6 months and 7 years of age are associated with increased risk of developing chronic depression symptoms between 13 and 22 years of age.
Methods
Prospective associations were explored using the Avon Longitudinal Study of Parents and Children (ALSPAC) in the UK. Childhood night-time sleep duration was parent-reported at 6, 18, and 30 months and at 3.5, 4 to 5, 5 to 6, and 6 to 7 years. Depression symptoms were self-reported via the Short Mood and Feelings Questionnaire (SMFQ) at 12.5, 13.5, 16, 17.5, 18, 21, and 22 years of age. Latent Growth Curve Analysis was used to identify longitudinal trajectories of night-time sleep duration from 6 months to 7 years of age (i.e. longer (63%), shorter (2%), average-shorter (22%), and average-longer (13%) sleep) and depression symptoms (i.e. chronic (5%), non-chronic (95%)) from 13 to 22 years. Logistic regressions were conducted to identify the prospective associations between persistent shorter and persistent longer sleep trajectories and chronic depression symptoms.
Results
Preliminary results revealed that persistent shorter sleep duration across childhood was associated with increased likelihood of presenting with chronic depression symptoms, even after adjusting for the effects of sex, birthweight, maternal age, child ethnicity, family adversity, and maternal socioeconomic status (OR = 1.94, 95% CI [1.01, 3.73], p = .046). Persistent longer sleep, however, did not show significant associations.
Conclusions
A persistent pattern of shorter sleep duration across childhood is associated with chronic depression symptoms in adolescence through to adulthood. Sleep is a modifiable risk factor, and targeted interventions for those presenting a sustained pattern of shorter sleep duration across childhood are suggested to prevent future mental health problems, such as depression.
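An odds ratio with a Wald confidence interval, like the OR = 1.94, 95% CI [1.01, 3.73] reported above, can be computed from a 2×2 table of exposure by outcome. The sketch below is illustrative only; the counts are invented and are not the ALSPAC data, and the study's actual estimate came from adjusted logistic regression, not a raw table.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table:
    exposed (e.g. persistent shorter sleep):   a cases, b non-cases
    unexposed (reference trajectory):          c cases, d non-cases
    """
    or_ = (a * d) / (b * c)
    # Wald standard error of log(OR)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Invented illustrative counts, not study data.
or_, lo, hi = odds_ratio_ci(10, 90, 50, 950)
```

A CI whose lower bound exceeds 1.0 corresponds to a nominally significant positive association, as in the result above.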
Termination of an existing failed corn stand before replanting is essential. Two studies were conducted in Stoneville and Verona, MS, from 2020 to 2021 to evaluate timing of corn or soybean replanting following different herbicide treatments applied to simulated failed stands of corn. Treatments included paraquat alone at 841 g ai ha−1, paraquat at 841 g ha−1 + metribuzin at 211 g ai ha−1, and clethodim at 51 g ai ha−1 + glyphosate at 1,121 g ae ha−1 applied at the V2 growth stage. Replant timings were 1 and 7 d after herbicide treatment (DAT). Pooled across replant timings, paraquat + metribuzin provided the greatest control 3 DAT compared with other treatments in both studies. At 14 and 21 DAT, clethodim + glyphosate controlled more corn than did paraquat + metribuzin and paraquat alone. Control of a simulated failed corn stand with paraquat alone never exceeded 50% at 3 to 21 DAT. Soybean yield in all plots receiving herbicide treatment targeting simulated failed corn stands was similar and ≥2,150 kg ha−1. When applied at the V2 corn growth stage, both clethodim + glyphosate and paraquat + metribuzin controlled a simulated failed stand of corn. This study demonstrated the importance of terminating failed stands of corn before replanting because of dramatic reductions in yield in the plots not treated with herbicide.
Healthcare-associated infections (HAIs) result in substantial patient harm and avoidable costs. Pay-for-performance programs (PFP) through the Centers for Medicare and Medicaid Services (CMS) have resulted in reductions of HAIs like central line-associated bloodstream infections (CLABSI) and methicillin-resistant Staphylococcus aureus bacteremia, through robust infection prevention programs and practices. Hospital Onset Bacteremia and Fungemia (HOB) is proposed as an alternative quality measure for public reporting and PFP, and was endorsed by the National Quality Forum in 2022. This broad measure is designed as an electronic quality measure that avoids manual abstraction and excludes risk adjustment. HOB would substantially expand the scope of focus of existing bloodstream infection measurement, and is currently being considered for voluntary reporting in 2025. In this article, we provide arguments for and against adopting HOB as a PFP measure linked to CMS payments.
Exposure to environmentally transmitted parasites should increase with population density due to accumulation of infective parasites in space. However, resource competition also increases with density, lowering immunity and increasing susceptibility, offering an alternative pathway for density-dependent infection. To test the relationships between these two processes and parasitism, we examined associations between host density, resource availability, immunity, and counts of 3 common helminth parasites using a long-term study of red deer. We found evidence that immunity increased with resource availability while parasite counts declined with immunity. We also found that greater density correlated with reduced resource availability, and while density was positively associated with both strongyle and tissue worm burdens, resource availability was independently and negatively associated with the same burdens. Our results support separate roles of density-dependent exposure and susceptibility in driving infection, providing evidence that resource competition is an important driver of infection, exacerbating effects of density-dependent increases in exposure.
High density should drive greater parasite exposure. However, evidence linking density with infection generally uses density proxies or measures of population size, rather than measures of individuals per space within a continuous population. We used a long-term study of wild sheep to link within-population spatiotemporal variation in host density with individual parasite counts. Although four parasites exhibited strong positive relationships with local density, these relationships were mostly restricted to juveniles and faded in adults. Furthermore, one ectoparasite showed strong negative relationships across all age classes. In contrast, population size – a measure of global density – had limited explanatory power, and its effects did not remove those of spatial density, but were distinct. These results indicate that local and global density can exhibit diverse and contrasting effects on infection within populations. Spatial measures of within-population local density may provide substantial additional insight to temporal metrics based on population size, and investigating them more widely could be revealing.
Healthcare-prescribed opioids are a known contributor to the opioid epidemic. Locally, there was an identified opportunity to improve opioid prescribing practices in cardiac surgical patients. The cardiac surgical team sought to standardise prescribing practices in postoperative patients and reduce opioid prescriptions at discharge. The improvement was undertaken at a large midwestern freestanding children’s hospital with over 400 beds and 120 cardiac surgeries annually. A multidisciplinary team was formed, using the Model for Improvement to guide the improvement work. The key improvement interventions included standardised evidence-based prescribing guidelines based on patient age and surgical approach, enhanced pain management with non-opioid medications, and integration of the prescribing guidelines into the electronic health record. The primary outcome measure was the rate of compliance with the prescribing guidelines; secondary measures included morphine equivalent dosing at discharge, opioid-free discharge, and length of stay. A balancing measure of opioid re-prescriptions was tracked. There were 289 patients included in the primary study period (January 2019 through December 2021). Sustainability of key outcomes was tracked through December 2022. Guideline compliance increased from 24% to 100%. Morphine equivalent dosing decreased from a baseline of 36.25 in 2019 to 22.5 in 2021 and 0 in 2022. The proportion of discharges with an opioid prescription decreased from 8% (2019) to 1.5% (2021) and 0% in 2022. Establishment of and compliance with standardised guidelines for post-operative cardiac surgical pain management yielded a reduction in morphine equivalent dosing, an increase in opioid-free discharges, and no increase in length of stay or opioid re-prescriptions.
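Morphine equivalent dosing, the secondary measure above, is computed by weighting each prescribed opioid's dose by a conversion factor. A minimal sketch using CDC-style oral morphine milligram equivalent (MME) factors; the factor table is an illustrative subset, not the institution's actual guideline.

```python
# CDC-style oral morphine milligram equivalent (MME) conversion factors.
# Illustrative subset only, not the institution's guideline table.
MME_FACTOR = {
    "morphine": 1.0,
    "oxycodone": 1.5,
    "hydrocodone": 1.0,
    "codeine": 0.15,
    "tramadol": 0.1,
}

def discharge_mme(prescriptions):
    """Total MME for a discharge prescription list of (drug, total mg) pairs."""
    return sum(mg * MME_FACTOR[drug] for drug, mg in prescriptions)
```

For example, a discharge prescription of 15 mg oxycodone plus 10 mg morphine totals 32.5 MME under these factors.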
Mass Gathering Medicine focuses on mitigating issues at Mass Gathering Events. Medical skills can vary substantially among staff, and the literature provides no specific guidance on staff training. This study highlights expert opinions on minimum training for medical staff to formalize preparation for a mass gathering.
Methods
This was a 3-round Delphi study. Experts were recruited at Mass Gathering conferences, and researchers emailed participation requests through Stat59 software. Consent was obtained verbally and within the Stat59 software, and all responses were anonymous. In the first round, experts generated open-ended opinion statements. The second and third rounds used a 7-point linear ranking scale, and statements reached consensus if responses had a standard deviation (SD) of 1.0 or less.
Results
Round 1 generated 137 open-ended statements, of which 73 proceeded to round 2; 28.7% (21/73) of these reached consensus. In round 3, 40.3% (21/52) of the remaining statements reached consensus. Priority themes included venue-specific information, staff orientation to operations and capabilities, and community coordination. Mass casualty preparation and triage were also highlighted as a critical focus.
Conclusions
This expert consensus framework emphasizes core training areas, including venue-specific operations, mass casualty response, triage, and life-saving skills. The heterogeneity of Mass Gatherings makes instituting universal standards challenging. The conclusions highlight recurrent themes of priority among multiple experts.
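The consensus rule described in the Methods (SD ≤ 1.0 on a 7-point scale) is simple to operationalise. A minimal sketch; sample standard deviation is assumed, since the abstract does not specify sample vs. population SD.

```python
from statistics import stdev

def reaches_consensus(ratings, sd_threshold=1.0):
    """Consensus check for one Delphi statement.

    A statement reaches consensus when the standard deviation of its
    7-point ratings is at or below the threshold. Sample SD is assumed;
    the study does not specify sample vs. population SD.
    """
    return stdev(ratings) <= sd_threshold
```

Tightly clustered ratings (e.g. all 5s and 6s) pass this check, while polarised ratings (a mix of 1s and 7s) fail it even though both can share the same mean.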
The best prehospital transport strategy for patients with suspected stroke due to possible large vessel occlusion varies by jurisdiction and available resources. A foundational problem is the lack of a definitive diagnosis at the scene. Rural stroke presentations pose the most problematic triage destination decision-making. In Alberta, Canada, the implementation and 5-year experience of a rural field consultation approach for serving rural patients with acute stroke are described.
Methods:
The protocols established through the rural field consultation system and the subsequent transport patterns for suspected stroke patients during the first 5 years of implementation are presented. Outcomes are reported using home time and data are summarized using descriptive statistics.
Results:
From April 2017 to March 2022, 721 patients met the definition for a rural field consultation, and 601 were included in the analysis. Most patients (n = 541, 90%) were transported by ground ambulance. Intravenous thrombolysis was provided for 65 patients (10.8%), and 106 (17.6%) underwent endovascular thrombectomy. The median time from first medical contact to arterial access was 3.2 h (range 1.3–7.6) for direct transfers, compared to 6.5 h (range 4.6–7.9) for patients arriving indirectly at the comprehensive stroke center (CSC). Only a small proportion of patients (n = 5, 0.8%) were routed suboptimally to a primary stroke center and then to a CSC where they underwent endovascular therapy.
Conclusions:
The rural field consultation system was associated with shortened delays to recanalization and demonstrated that it is feasible to improve access to acute stroke care for rural patients.
This study aimed to assess the impact of hypertensive disorders of pregnancy on infant neurodevelopment by comparing 6-month and 2-year psychomotor development outcomes of infants exposed to gestational hypertension (GH) or preeclampsia (PE) versus normotensive pregnancy (NTP). Participating infants were children of women enrolled in the Postpartum Physiology, Psychology and Paediatric (P4) cohort study who had NTPs, GH, or PE. Six-month and 2-year Ages and Stages Questionnaire (ASQ-3) scores were categorised as passes or fails according to domain-specific values. For the 2-year Bayley Scales of Infant and Toddler Development (BSID-III) assessment, scores > 2 standard deviations below the mean in a domain were defined as developmental delay. Among the infants (n = 369, male = 190), those exposed to PE (n = 75), compared with those exposed to GH (n = 20) or NTP (n = 274), were more likely to be born small for gestational age and premature. After adjustment, at 2 years, prematurity status was significantly associated with failing any domain of the ASQ-3 (p = 0.015), and maternal tertiary education with increased cognitive scores on the BSID-III (p = 0.013). However, PE and GH exposure were not associated with clinically significant risks of delayed infant neurodevelopment in this study. Larger, multicentre studies are required to further clarify early childhood neurodevelopmental outcomes following hypertensive pregnancies.
Chemical, biological, radiological, and nuclear (CBRN) incidents pose increasing transborder risks globally, necessitating enhanced health sector preparedness.
Objectives:
This study aimed to develop a comprehensive CBRN preparedness assessment tool (PAT), operational response guidelines (ORG), and tabletop simulation scenarios for the health sectors of the Middle East and North Africa (MENA) region.
Method/Description:
A mixed-methods approach comprised a systematic review of the literature up to 2022 in English and French, modified expert interviews (MIM), and an online Delphi questionnaire. Content analysis was performed on the interview data. Consensus metrics and artificial intelligence techniques, including natural language processing, sentiment analysis, and unsupervised machine learning (ML) clustering algorithms, were deployed in R-Studio™ for advanced data analysis across all phases.
Results/Outcomes:
The literature review identified 63 relevant studies illustrating various preparedness strategies. The MIM’s thematic analysis, reinforced by AI-driven content analysis, emphasized the need for stronger inter-regional cooperation facilitated by organizations such as WHO and standardized tabletop simulation training. A robust consensus was achieved on the proposed assessment tool and operational response guidelines. ML analysis identified distinct expert clusters, providing additional consensus perspectives.
Conclusion:
The study emphasized the urgency for collaborative CBRN response strategies within MENA, valuing the innovative aspect of our suggested PAT, ORG, and simulation scenarios. This work advocates a dynamic, resilient approach to disaster medicine preparedness, which is crucial for regional security and global health resilience, especially in the MENA. It also highlights the significant role of AI analysis methods in enriching analytical outcomes in disaster medicine research and promoting data-informed preparedness strategies.
The Red Cross Red Crescent Health Information System (RCHIS) is an electronic health record (EHR) and health information management system (HIS) designed for international disaster responses, with a cloud-based server and a local server to bridge temporary internet outages. This architecture allows for remote information management and operational support, should data processing agreements allow it.
Objectives:
To describe the adaptation of a cloud-based health information system to a fully offline setting and the improvement of business continuity in case of a system failure.
Method/Description:
An analysis of the existing architecture of RCHIS was conducted to identify components and procedures that only work on the cloud-based server with an existing internet connection. Offline alternatives were identified and developed to ensure full offline operational capacity and redundancy.
Results/Outcomes:
A mechanism to set up a second local server for redundancy improves business continuity planning, and a locally stored backup allows the recovery of data without an internet connection. Instead of creating new user accounts in the cloud and emailing a one-time password (OTP), a mechanism to create accounts on the local server and display the OTP was added. Offline generation of the WHO EMT MDS report was also embedded.
Conclusion:
Adding the capability to work fully offline to RCHIS meant significant software architecture changes. Despite losing some of the benefits, such as remote information management, RCHIS is now a robust offline tool for deployment in settings without any internet connectivity. Having a local server also means that we can comply with data sovereignty rules where they exist.
Historically, medical response efforts to large-scale disaster events have highlighted significant variability in the capabilities of responding medical providers and emergency medical teams (EMTs). Analysis of the 2010 Haiti earthquake response found that a number of medical teams were poorly prepared, inexperienced, or lacked the competencies to provide the level of medical care required, highlighting the need for medical team standards.
The World Health Organization (WHO) EMT initiative that followed created minimum team standards for responding international EMTs to improve the quality and timeliness of medical services. At the present time, however, there remains a lack of globally recognized minimum competency standards at the level of the individual disaster medical responder, allowing for continued variability in patient care.
Objectives:
This study examines existing competencies for physicians, nurses, and paramedics who are members of deployable disaster response teams.
Method/Description:
A scoping review of published English-language articles on existing competencies for physicians, nurses, and paramedics who are members of deployable disaster response teams was performed in Ovid MEDLINE, Ovid Embase, CINAHL, Scopus, and Web of Science Core Collection. A total of 3,474 articles will be reviewed.
Results/Outcomes:
Data to be analyzed by October 1, 2024.
Conclusion:
There is a need to develop minimum standards for healthcare providers on disaster response teams. Identification of key existing competencies for disaster responders will provide the foundation for the creation of globally recognized minimum competency standards for individuals seeking to join an EMT in the future and will guide training and curricula development.
Objectives/Goals: The second highest fear of the aging population is cognitive decline. Diet is associated with brain aging; therefore, the objective is to determine the effects of a Western diet (WD) on cognitive decline and the efficacy of a Mediterranean diet (MeDi) fecal microbiota transplant (FMT) in WD-induced cognitive deficit progression in aged rats. Methods/Study Population: For Study 1, 12-month-old Fischer344 rats (NIA Aging Colony) will be randomly assigned to a WD, MeDi, or control (positive control) for 6 or 12 months. Microbiota composition, blood pressure, and body composition (DXA scan) will be longitudinally assessed. Groups will undergo a battery of neurobehavioral assessments to measure cognitive performance. At the end of the study, mitochondria bioenergetic assays in isolated cerebral microvessels will be used to determine changes in cerebrovascular function. For Study 2, 18-month-old Fischer344 rats (NIA Aging Colony) will be randomly assigned to a WD, MeDi, or control for 6 months. At month 4, the WD + MeDi-FMT group will receive once-weekly MeDi-FMT for two months. Assessments will be performed as described in Study 1. Results/Anticipated Results: It is anticipated that the WD-related gut dysbiosis will increase blood pressure, fat-free mass, and neurovascular dysfunction, and will induce cognitive impairment relative to a MeDi. When using a MeDi-FMT as an intervention, it is anticipated that there will be measurable improvements in cognitive function relative to a WD through the regulation of gut dysbiosis, blood pressure, fat-free mass, and neurovascular dysfunction. Discussion/Significance of Impact: These results are expected to have an important positive impact because they will provide insights into the WD-induced, gut dysbiosis-associated cognitive impairments and evaluate the roles and mechanisms of MeDi-FMT as a therapeutic intervention in aged rats.
Objectives/Goals: The timing of neurosurgery is highly variable for post-hemorrhagic hydrocephalus (PHH) of prematurity. We sought to utilize microvascular imaging (MVI) in ultrasound (US) to identify biomarkers to discern the opportune time for intervention and to analyze the cerebrospinal fluid (CSF) characteristics as they pertain to neurosurgical outcome. Methods/Study Population: The inclusion criteria for the study are admission to the neonatal intensive care unit (NICU) with a diagnosis of Papile grade III or IV. Exclusion criteria are congenital hydrocephalus and hydrocephalus secondary to myelomeningocele/brain tumor/vascular malformation. We are a level IV tertiary referral center. Our current clinical care pathway utilizes brain US at admission and at weekly intervals. Patients who meet certain clinical and radiographic parameters undergo temporary or permanent CSF diversion. Results/Anticipated Results: NEL was implemented at our institution for PHH of prematurity in fall 2022. To date, we have had 20 patients who were diagnosed with grade III or IV IVH, of which 12 qualified for NEL. Our preliminary safety and feasibility results as well as the innovative bedside technique pioneered at our institution are currently in revision stages for publication. Preliminary results of the MVI data have yielded that hyperemia may confer venous congestion in the germinal matrix, which should then alert the neurosurgeon to delay any intervention to avoid progression of intraventricular blood. With regard to CSF characteristics, we anticipate that protein, cell count, hemoglobin, iron, and ferritin will decrease with NEL. Discussion/Significance of Impact: The timing of PHH of prematurity is highly variable. We expect that MVI will offer radiographic biomarkers to guide optimal timing of neurosurgical intervention. 
A better understanding of CSF characteristics could potentially educate the neurosurgeon with regard to optimal timing of permanent CSF diversion based on specific CSF parameters.
We examine repetition as an institution that affects coordination failure in a game with and without pre-play communication. We use probit regression with random effects to test hypotheses regarding the frequency and form of coordination failure in the presence of repeated play versus one-shot games. Our results indicate that repetition without pre-play communication results in a lower frequency of coordination failure relative to one-shot game outcomes. This result is reversed when pre-play communication is allowed. Our evidence also suggests that repeated play coordination failures tend to be suboptimal Nash equilibria, whereas one-shot game coordination failures are disequilibria regardless of the presence of pre-play communication.
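The probit specification behind these regressions maps a linear index through the standard-normal CDF, with a random effect capturing subject- or session-level heterogeneity across repeated plays. Below is a minimal data-generating sketch under assumed parameter values; it is an illustration of the model structure, not the paper's estimator or data.

```python
import math
import random

def probit_prob(xb):
    """P(y = 1 | x) = Phi(x'beta): the standard-normal CDF of the linear index."""
    return 0.5 * (1.0 + math.erf(xb / math.sqrt(2.0)))

def simulate_probit_re(n_groups=50, n_per=20, beta=0.8, sigma_u=0.5, seed=1):
    """Simulate a random-effects probit: each group (e.g. a subject observed
    across repeated plays) draws an intercept u_g ~ N(0, sigma_u^2), and
    y = 1 if beta * x + u_g + eps > 0 with eps ~ N(0, 1).
    All parameter values here are illustrative assumptions.
    """
    rng = random.Random(seed)
    data = []
    for g in range(n_groups):
        u = rng.gauss(0.0, sigma_u)  # group-level random intercept
        for _ in range(n_per):
            x = rng.gauss(0.0, 1.0)
            y = 1 if beta * x + u + rng.gauss(0.0, 1.0) > 0 else 0
            data.append((g, x, y))
    return data
```

The random intercept induces within-group correlation in outcomes, which is why pooled (non-random-effects) standard errors would be misleading for repeated-play data like this.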