The opportunity to increase soybean yield has prompted Illinois farmers to plant soybean earlier than historical norms. Extending the growing season with an earlier planting date might alter the relationship between soybean growth and weed emergence timings, potentially altering the optimal herbicide application timings to minimize crop yield loss due to weed interference and ensure minimal weed seed production. The objective of this research was to examine various herbicide treatments applied at different timings and rates to assess their effect on weed control and yield in early-planted soybean. Field experiments were conducted in 2021 at three locations across central Illinois to determine effective chemical strategies for weed management in early-planted soybean. PRE treatments consisted of an S-metolachlor + metribuzin premix applied at planting or just prior to soybean emergence at 0.5X (883 + 210 g ai ha−1) or 1X (1,766 + 420 g ai ha−1) label-recommended rates. POST treatments were applied when weeds reached 10 cm tall and consisted of 1X rates of glufosinate (655 g ai ha−1) + glyphosate (1,260 g ae ha−1) + ammonium sulfate, without or with pyroxasulfone at a 0.5X (63 g ai ha−1) or 1X (126 g ai ha−1) rate. Treatments consisting of a full-rate PRE followed by a POST resulted in the greatest and most consistent weed control at the final evaluation timing. The addition of pyroxasulfone to POST treatments did not consistently reduce late-season weed emergence; this lack of a consistent effect could be attributed to suppression of weeds by soybean canopy closure resulting from earlier soybean development. The full rate of PRE extended the timing of POST application by 2 to 3 wk for all treatments at all locations except Urbana. Full-rate PRE treatments also reduced the time between the POST application and soybean canopy closure. Overall, a full-rate PRE reduced early-season weed interference and minimized soybean yield loss.
Clostridioides difficile infection (CDI) may be misdiagnosed if testing is performed in the absence of signs or symptoms of disease. This study sought to support appropriate testing by estimating the impact of signs, symptoms, and healthcare exposures on pre-test likelihood of CDI.
Methods:
A panel of fifteen experts in infectious diseases participated in a modified UCLA/RAND Delphi study to estimate likelihood of CDI. Consensus, defined as agreement by >70% of panelists, was assessed via a REDCap survey. Items without consensus were discussed in a virtual meeting followed by a second survey.
Results:
All fifteen panelists completed both surveys (100% response rate). In the initial survey, consensus was present on 6 of 15 (40%) items related to risk of CDI. After panel discussion and clarification of questions, consensus (>70% agreement) was reached on all remaining items in the second survey. Antibiotics were identified as the primary risk factor for CDI and grouped into three categories: high-risk (likelihood ratio [LR] 7, 93% agreement among panelists in first survey), low-risk (LR 3, 87% agreement in first survey), and minimal-risk (LR 1, 71% agreement in first survey). Other major factors included new or unexplained severe diarrhea (e.g., ≥ 10 liquid bowel movements per day; LR 5, 100% agreement in second survey) and severe immunosuppression (LR 5, 87% agreement in second survey).
Conclusion:
Infectious disease experts concurred on the importance of signs, symptoms, and healthcare exposures for diagnosing CDI. The resulting risk estimates can be used by clinicians to optimize CDI testing and treatment.
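As a worked illustration of how such likelihood ratios could be applied at the bedside, the sketch below converts a pre-test probability of CDI into a post-test probability using the panel's consensus LRs. The baseline prevalence and the multiplication of LRs (which assumes the risk factors are conditionally independent) are illustrative assumptions, not part of the study.

```python
# Sketch: updating a pre-test probability of CDI with the panel's
# likelihood ratios. Multiplying LRs assumes the risk factors are
# conditionally independent, which is a simplification.

from math import prod

# LRs reported in the abstract (panel consensus estimates)
LR = {
    "high_risk_antibiotics": 7,
    "low_risk_antibiotics": 3,
    "minimal_risk_antibiotics": 1,
    "severe_diarrhea": 5,          # e.g., >= 10 liquid bowel movements/day
    "severe_immunosuppression": 5,
}

def post_test_probability(pre_test_prob: float, factors: list[str]) -> float:
    """Convert probability to odds, apply each LR, convert back."""
    odds = pre_test_prob / (1.0 - pre_test_prob)
    odds *= prod(LR[f] for f in factors)
    return odds / (1.0 + odds)

# Hypothetical example: 5% baseline prevalence, patient on high-risk
# antibiotics with new severe diarrhea.
p = post_test_probability(0.05, ["high_risk_antibiotics", "severe_diarrhea"])
print(f"Post-test probability: {p:.1%}")  # ~64.8%
```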
Develop and implement a system in the Veterans Health Administration (VA) to alert local medical center personnel in real time when an acute- or long-term care patient/resident is admitted to their facility with a history of colonization or infection with a multidrug-resistant organism (MDRO) previously identified at any VA facility across the nation.
Methods:
An algorithm was developed to extract clinical microbiology and local facility census data from the VA Corporate Data Warehouse, initially targeting carbapenem-resistant Enterobacterales (CRE) and methicillin-resistant Staphylococcus aureus (MRSA). The algorithm was validated with chart review of CRE cases from 2010–2018, trialed and refined in 24 VA healthcare systems over two years, expanded to other MDROs, and implemented nationwide in April 2022 as “VA Bug Alert” (VABA). Use through August 2023 was assessed.
Results:
VABA performed well for CRE, with recall of 96.3%, precision of 99.8%, and an F1 score of 98.0%. At the 24 trial sites, feedback was recorded for 1,011 admissions with a history of CRE (130), MRSA (814), or both (67). Among Infection Preventionists and MDRO Prevention Coordinators, 338 (33%) reported being unaware of the MDRO history before the alert, and of these, 271 (80%) reported they would not otherwise have learned of it. By fourteen months after nationwide implementation, 113/130 (87%) VA healthcare systems had at least one VABA subscriber.
Conclusions:
A national system for alerting facilities in real-time of patients admitted with an MDRO history was successfully developed and implemented in VA. Next steps include understanding facilitators and barriers to use and coordination with non-VA facilities nationwide.
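For reference, the F1 score reported above is the harmonic mean of precision and recall; the minimal sketch below reproduces the reported value from the published precision and recall. The function is generic, not the study's validation code.

```python
# Sketch: F1 is the harmonic mean of precision and recall.

def f1_score(precision: float, recall: float) -> float:
    return 2 * precision * recall / (precision + recall)

precision, recall = 0.998, 0.963  # values reported in the abstract
print(f"F1 = {f1_score(precision, recall):.3f}")  # ~0.980, matching the reported 98.0%
```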
Suicide is a leading cause of death in the United States, particularly among adolescents. In recent years, suicidal ideation, attempts, and fatalities have increased. Systems maps can effectively represent complex issues such as suicide, thus providing decision-support tools for policymakers to identify and evaluate interventions. While network science has served to examine systems maps in fields such as obesity, there is limited research at the intersection of suicidology and network science. In this paper, we apply network science to a large causal map of adverse childhood experiences (ACEs) and suicide to address this gap. The National Center for Injury Prevention and Control (NCIPC) within the Centers for Disease Control and Prevention recently created a causal map that encapsulates ACEs and adolescent suicide in 361 concept nodes and 946 directed relationships. In this study, we examine this map and three similar models through three related questions: (Q1) How do existing network-based models of suicide differ in terms of node- and network-level characteristics? (Q2) Using the NCIPC model as a unifying framework, how do current suicide intervention strategies align with prevailing theories of suicide? (Q3) How can the use of network science on the NCIPC model guide suicide interventions?
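As a sketch of the kind of node- and network-level analysis Q1 refers to, the toy example below applies standard network metrics to a small hypothetical causal fragment. The node names are invented for illustration; the real NCIPC map has 361 nodes and 946 directed edges.

```python
# Sketch: node- and network-level characteristics of a directed causal map,
# computed with networkx on a toy fragment (ACEs -> mediators -> ideation).

import networkx as nx

G = nx.DiGraph()
G.add_edges_from([
    ("childhood_abuse", "depression"),
    ("childhood_abuse", "substance_use"),
    ("depression", "suicidal_ideation"),
    ("substance_use", "suicidal_ideation"),
    ("social_support", "depression"),   # hypothetical protective pathway
])

# Node-level characteristics: which concepts are most central?
print(nx.in_degree_centrality(G))
print(nx.betweenness_centrality(G))

# Network-level characteristics: overall structure of the map
print("density:", nx.density(G))
print("weakly connected:", nx.is_weakly_connected(G))
```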
Stable water isotope records of six firn cores retrieved from two adjacent plateaus on the northern Antarctic Peninsula between 2014 and 2016 are presented and investigated for their connections with firn-core glacio-chemical data, meteorological records and modelling results. Average annual accumulation rates of 2500 kg m−2 a−1 largely reduce the modification of isotopic signals in the snowpack by post-depositional processes, allowing excellent signal preservation in space and time. Comparison of firn-core and ECHAM6-wiso modelled δ18O and d-excess records reveals strong agreement on annual and sub-annual scales, suggesting that firn-core stable water isotopes are representative of specific synoptic situations. The six firn cores exhibit highly similar isotopic patterns in the overlapping period (2013), which seem to be related to temporal changes in moisture sources rather than local near-surface air temperatures. Backward trajectories calculated with the HYSPLIT model suggest that prominent δ18O minima in 2013 associated with elevated sea salt concentrations are related to long-range moisture transport dominated by westerly winds during positive SAM phases. In contrast, a broad δ18O maximum in the same year accompanied by increased concentrations of black carbon and mineral dust corresponds to the advection of more locally derived moisture with northerly flow components (South America) when the SAM is negative.
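For readers unfamiliar with the d-excess records mentioned above, deuterium excess is conventionally defined from the two stable water isotope ratios; the sketch below uses illustrative values, not firn-core data.

```python
# Deuterium excess (d-excess) as conventionally defined (Dansgaard, 1964);
# the input values here are illustrative only.

def d_excess(delta_D: float, delta_18O: float) -> float:
    """d = dD - 8 * d18O, both in per mil."""
    return delta_D - 8.0 * delta_18O

print(d_excess(delta_D=-150.0, delta_18O=-20.0))  # 10.0 per mil
```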
Incorporating emerging knowledge into Emergency Medical Service (EMS) competency assessments is critical to reflect current evidence-based out-of-hospital care. However, a standardized approach is needed to incorporate new evidence into EMS competency assessments because of the rapid pace of knowledge generation.
Objective:
The objective was to develop a framework to evaluate and integrate new source material into EMS competency assessments.
Methods:
The National Registry of Emergency Medical Technicians (National Registry) and the Prehospital Guidelines Consortium (PGC) convened a panel of experts. A Delphi method, consisting of virtual meetings and electronic surveys, was used to develop a Table of Evidence matrix that defines sources of EMS evidence. In Round One, participants listed all potential sources of evidence available to inform EMS education. In Round Two, participants categorized these sources into: (a) levels of evidence quality; and (b) type of source material. In Round Three, the panel revised a proposed Table of Evidence. Finally, in Round Four, participants provided recommendations on how each source should be incorporated into competency assessments depending on type and quality. Descriptive statistics were calculated, and qualitative analyses were conducted by two independent reviewers and a third arbitrator.
Results:
In Round One, 24 sources of evidence were identified. In Round Two, these were classified into high- (n = 4), medium- (n = 15), and low-quality (n = 5) evidence, then categorized by purpose into providing recommendations (n = 10), primary research (n = 7), and educational content (n = 7). In Round Three, the Table of Evidence was revised based on participant feedback. In Round Four, the panel developed a tiered system of evidence integration, from immediate incorporation of high-quality sources to more stringent requirements for lower-quality sources.
Conclusion:
The Table of Evidence provides a framework for the rapid and standardized incorporation of new source material into EMS competency assessments. Future goals are to evaluate the application of the Table of Evidence framework in initial and continued competency assessments.
Obesity is highly prevalent and disabling, especially in individuals with severe mental illness including bipolar disorders (BD). The brain is a target organ for both obesity and BD. Yet, we do not understand how cortical brain alterations in BD and obesity interact.
Methods:
We obtained body mass index (BMI) and MRI-derived regional cortical thickness and surface area from 1231 individuals with BD and 1601 control individuals from 13 countries within the ENIGMA-BD Working Group. We jointly modeled the statistical effects of BD and BMI on brain structure using mixed-effects models and tested for interaction and mediation. We also investigated the impact of medications on the BMI-related associations.
Results:
BMI and BD additively impacted the structure of many of the same brain regions. Both BMI and BD were negatively associated with cortical thickness, but not surface area. In most regions, the number of jointly used psychiatric medication classes remained associated with lower cortical thickness when controlling for BMI. In a single region, the fusiform gyrus, about a third of the negative association between the number of jointly used psychiatric medications and cortical thickness was mediated by the association between the number of medications and higher BMI.
Conclusions:
We confirmed consistent associations between higher BMI and lower cortical thickness, but not surface area, across the cerebral mantle, in regions which were also associated with BD. Higher BMI in people with BD indicated more pronounced brain alterations. BMI is important for understanding the neuroanatomical changes in BD and the effects of psychiatric medications on the brain.
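A minimal sketch of the product-of-coefficients logic behind the fusiform-gyrus mediation result is given below, with X = number of medication classes, M = BMI, and Y = cortical thickness. The data are simulated and the ordinary-least-squares setup is a simplification; the study itself used mixed-effects models, which this sketch does not reproduce.

```python
# Sketch: proportion mediated = (a * b) / total effect, where
# a = effect of medications on BMI, b = effect of BMI on thickness
# holding medications fixed. Simulated data, illustrative coefficients.

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500
n_meds = rng.poisson(1.5, n).astype(float)
bmi = 25 + 0.8 * n_meds + rng.normal(0, 3, n)                    # a-path
thickness = 2.6 - 0.004 * n_meds - 0.003 * bmi + rng.normal(0, 0.05, n)

a = sm.OLS(bmi, sm.add_constant(n_meds)).fit().params[1]
model_y = sm.OLS(thickness, sm.add_constant(np.column_stack([n_meds, bmi]))).fit()
b = model_y.params[2]                                            # b-path
total = sm.OLS(thickness, sm.add_constant(n_meds)).fit().params[1]

print(f"proportion mediated ~ {a * b / total:.2f}")  # ~1/3 with these settings
```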
Multiple micronutrient deficiencies are widespread in Ethiopia. However, the distribution of Se and Zn deficiency risks has previously shown evidence of spatially dependent variability, warranting exploration of this aspect for a wider range of micronutrients. Here, blood serum concentrations for Ca, Mg, Co, Cu and Mo were measured (n 3102) on samples from the Ethiopian National Micronutrient Survey. Geostatistical modelling was used to test spatial variation of these micronutrients for women of reproductive age, who represent the largest demographic group surveyed (n 1290). Median serum concentrations were 8·6 mg dl−1 for Ca, 1·9 mg dl−1 for Mg, 0·4 µg l−1 for Co, 98·8 µg dl−1 for Cu and 0·2 µg dl−1 for Mo. The prevalence of Ca, Mg and Co deficiency was 41·6 %, 29·2 % and 15·9 %, respectively; Cu and Mo deficiency prevalence was 7·6 % and 0·3 %, respectively. A higher prevalence of Ca, Cu and Mo deficiency was observed in north-western, Co deficiency in central, and Mg deficiency in north-eastern parts of Ethiopia. Serum Ca, Mg and Mo concentrations showed spatial dependencies up to 140–500 km; however, there was no evidence of spatial correlation for serum Co and Cu concentrations. These new data indicate the scale of multiple mineral micronutrient deficiencies in Ethiopia, and the geographical differences in the prevalence of deficiencies suggest the need to consider targeted responses during the planning of nutrition intervention programmes.
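The spatial dependence reported above (ranges of roughly 140–500 km) is typically assessed with a semivariogram. The sketch below computes an empirical semivariogram on synthetic coordinates and serum values, purely to illustrate the technique rather than the survey's actual geostatistical modelling.

```python
# Sketch: empirical semivariogram, gamma(h) = 0.5 * mean squared difference
# of value pairs whose separation falls in lag bin h. Synthetic data.

import numpy as np

def empirical_semivariogram(coords, values, bins):
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    sq = 0.5 * (values[:, None] - values[None, :]) ** 2
    i, j = np.triu_indices(len(values), k=1)      # each pair once
    dist, gamma = d[i, j], sq[i, j]
    which = np.digitize(dist, bins)
    return [gamma[which == k].mean() for k in range(1, len(bins))]

rng = np.random.default_rng(1)
coords = rng.uniform(0, 1000, size=(200, 2))      # km, hypothetical locations
values = rng.normal(8.6, 1.0, size=200)           # e.g., serum Ca, mg/dl
print(empirical_semivariogram(coords, values, bins=np.arange(0, 600, 100)))
```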
Social anxiety (SA), a prevalent comorbid condition in psychotic disorders with a negative impact on functioning, requires adequate intervention relatively early. Using a randomized controlled trial, we tested the efficacy of a group cognitive-behavioral therapy intervention for SA (CBT-SA) that we developed for youth who had experienced a first episode of psychosis (FEP). For our primary outcome, we hypothesized that, compared to the active control of group cognitive remediation (CR), the CBT-SA group would show a reduction in SA that would be maintained at 3- and 6-month follow-ups. For secondary outcomes, we hypothesized that the CBT-SA group would show a reduction in positive and negative symptoms and improvements in recovery and functioning.
Method
Ninety-six patients with an FEP and SA, recruited from five different FEP programs in the Montreal area, were randomized to 13 weekly group sessions of either CBT-SA or CR intervention.
Results
Linear mixed models revealed that multiple measures of SA decreased significantly over time, with no significant group differences. Positive and negative symptoms, as well as functioning, improved over time, with negative symptoms and functioning showing greater improvement in the CBT-SA group.
Conclusions
While SA decreased over time with both interventions, the positive effect of CBT-SA on measures of negative symptoms, functioning, and self-reported recovery at follow-up suggests that the intervention's benefits extended beyond symptoms specific to SA.
We summarize some of the past year's most important findings within climate change-related research. New research has improved our understanding of the remaining options to achieve the Paris Agreement goals: overcoming political barriers to carbon pricing, accounting for non-CO2 factors, well-designed implementation of demand-side and nature-based solutions, building the resilience of ecosystems, and recognizing that climate change mitigation costs can be justified by benefits to the health of humans and nature alone. We also consider new insights about what to expect if we fail, including a new dimension of fire extremes and the prospect of cascading climate tipping elements.
Technical summary
A synthesis is made of 10 topics within climate research, where there have been significant advances since January 2020. The insights are based on input from an international open call with broad disciplinary scope. Findings include: (1) the options to still keep global warming below 1.5 °C; (2) the impact of non-CO2 factors in global warming; (3) a new dimension of fire extremes forced by climate change; (4) the increasing pressure on interconnected climate tipping elements; (5) the dimensions of climate justice; (6) political challenges impeding the effectiveness of carbon pricing; (7) demand-side solutions as vehicles of climate mitigation; (8) the potentials and caveats of nature-based solutions; (9) how building resilience of marine ecosystems is possible; and (10) that the costs of climate change mitigation policies can be more than justified by the benefits to the health of humans and nature.
Social media summary
How do we limit global warming to 1.5 °C and why is it crucial? See highlights of latest climate science.
We present the data and initial results from the first pilot survey of the Evolutionary Map of the Universe (EMU), observed at 944 MHz with the Australian Square Kilometre Array Pathfinder (ASKAP) telescope. The survey covers $270\,\mathrm{deg}^2$ of an area covered by the Dark Energy Survey, reaching a depth of 25–30 $\mu\mathrm{Jy\ beam}^{-1}$ rms at a spatial resolution of $\sim$11–18 arcsec, resulting in a catalogue of $\sim$220 000 sources, of which $\sim$180 000 are single-component sources. Here we present the catalogue of single-component sources, together with (where available) optical and infrared cross-identifications, classifications, and redshifts. This survey explores a new region of parameter space compared to previous surveys. Specifically, the EMU Pilot Survey has a high density of sources, and also a high sensitivity to low surface brightness emission. These properties result in the detection of types of sources that were rarely seen in or absent from previous surveys. We present some of these new results here.
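As a simplified illustration of the positional cross-identification step, the sketch below matches a hypothetical radio source list against an optical catalogue with astropy. The coordinates and the 5-arcsec match radius are invented for illustration; the survey's actual cross-identification procedure is more involved.

```python
# Sketch: nearest-neighbour sky cross-match between a radio catalogue and
# an optical catalogue. All positions here are made up.

import astropy.units as u
from astropy.coordinates import SkyCoord

radio = SkyCoord(ra=[150.10, 150.25] * u.deg, dec=[-30.05, -30.20] * u.deg)
optical = SkyCoord(ra=[150.1001, 150.40] * u.deg, dec=[-30.0501, -30.60] * u.deg)

idx, sep2d, _ = radio.match_to_catalog_sky(optical)
matched = sep2d < 5 * u.arcsec   # accept matches within an assumed radius
print(idx, sep2d.to(u.arcsec), matched)
```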
The objectives of this study were to develop and refine EMPOWER (Enhancing and Mobilizing the POtential for Wellness and Resilience), a brief manualized cognitive-behavioral, acceptance-based intervention for surrogate decision-makers of critically ill patients and to evaluate its preliminary feasibility, acceptability, and promise in improving surrogates’ mental health and patient outcomes.
Method
Part 1 involved obtaining qualitative stakeholder feedback from 5 bereaved surrogates and 10 critical care and mental health clinicians. Stakeholders were provided with the manual and prompted for feedback on its content, format, and language. Feedback was organized and incorporated into the manual, which was then re-circulated until consensus was reached. In Part 2, surrogates of critically ill patients admitted to an intensive care unit (ICU) who reported moderate anxiety or close attachment were enrolled in an open trial of EMPOWER. Surrogates completed six 15- to 20-min modules, totaling 1.5–2 h. Surrogates were administered measures of peritraumatic distress, experiential avoidance, prolonged grief, distress tolerance, anxiety, and depression at pre-intervention, post-intervention, and at 1-month and 3-month follow-up assessments.
Results
Part 1 resulted in changes to the EMPOWER manual, including reducing jargon, improving navigability, making EMPOWER applicable for a range of illness scenarios, rearranging the modules, and adding further instructions and psychoeducation. Part 2 findings suggested that EMPOWER is feasible, with 100% of participants completing all modules. The acceptability of EMPOWER appeared strong, with high ratings of effectiveness and helpfulness (M = 8/10). Results showed immediate post-intervention improvements in anxiety (d = −0.41), peritraumatic distress (d = −0.24), and experiential avoidance (d = −0.23). At the 3-month follow-up assessments, surrogates exhibited improvements in prolonged grief symptoms (d = −0.94), depression (d = −0.23), anxiety (d = −0.29), and experiential avoidance (d = −0.30).
Significance of results
Preliminary data suggest that EMPOWER is feasible, acceptable, and associated with notable improvements in psychological symptoms among surrogates. Future research should examine EMPOWER with a larger sample in a randomized controlled trial.
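One common way to obtain within-subject effect sizes like the Cohen's d values reported above is to standardize the mean pre-to-post change by the standard deviation of the change scores. The sketch below uses simulated scores, and the study's exact formula may differ.

```python
# Sketch: Cohen's d for paired pre/post measurements, using the SD of the
# change scores as the standardizer. Simulated data, illustrative only.

import numpy as np

def cohens_d_paired(pre: np.ndarray, post: np.ndarray) -> float:
    diff = post - pre
    return diff.mean() / diff.std(ddof=1)

rng = np.random.default_rng(2)
pre = rng.normal(60, 10, 30)          # e.g., anxiety scores at baseline
post = pre - rng.normal(4, 9, 30)     # modest average improvement
print(f"d = {cohens_d_paired(pre, post):.2f}")  # negative ~ symptom reduction
```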
The first demonstration of laser action in ruby was made in 1960 by T. H. Maiman of Hughes Research Laboratories, USA. Many laboratories worldwide began the search for lasers using different materials, operating at different wavelengths. In the UK, academia, industry and the central laboratories took up the challenge from the earliest days to develop these systems for a broad range of applications. This historical review looks at the contribution the UK has made to the advancement of the technology, the development of systems and components and their exploitation over the last 60 years.
Chronic inflammatory demyelinating polyradiculoneuropathy (CIDP) refractory to conventional therapy can lead to marked disability and represents a therapeutic challenge.
Objective:
To report five cases of treatment-refractory disabling CIDP treated with autologous hematopoietic stem cell transplantation (AHSCT).
Methods:
This was a retrospective cohort study from a tertiary care referral center for both neuromuscular disease and AHSCT. Patients with CIDP treated with AHSCT between 2008 and 2020 were included. All patients had major persistent and disabling neuropathic deficits despite combinations of intensive immunosuppressive therapy. The primary outcome measures were: Medical Research Council sum score, Overall Neuropathy Limitations Scale and requirement for ongoing CIDP immunotherapy after transplantation. We also analyzed safety outcomes by documenting all severe AHSCT-related complications.
Results:
Five patients with refractory CIDP underwent AHSCT. Three were classified as manifesting a typical syndrome, and two as the multifocal Lewis–Sumner variant. The mean age at the time of CIDP diagnosis was 33.4 years (range 24–46 years), with a median delay of 46 months (range 21–135 months) between diagnosis and AHSCT. The median follow-up period was 41 months. All five patients were able to wean off CIDP-related immunotherapy. Marked improvements in the Medical Research Council sum score and Overall Neuropathy Limitations Scale were noted in 4/5 patients. One patient with longstanding neurogenic atrophy showed no improvement in disability scales. There were no treatment-related deaths or critical illnesses.
Conclusions:
AHSCT can achieve marked sustained clinical improvement of refractory CIDP and may allow for weaning off long-term complex immunotherapies.
The sustainability concept seeks to balance how present and future generations of humans meet their needs. But because nature is viewed only as a resource, sustainability fails to recognize that humans and other living beings depend on each other for their well-being. We therefore argue that true sustainability can only be achieved if the interdependent needs of all species of current and future generations are met, and propose calling this ‘multispecies sustainability’. We explore the concept through visualizations and scenarios, then consider how it might be applied through case studies involving bees and healthy green spaces.
An inflammation-induced imbalance in the kynurenine pathway (KP) has been reported in major depressive disorder, but the utility of these metabolites as predictive or therapeutic biomarkers of behavioral activation (BA) therapy is unknown.
Methods
Serum samples were provided by 56 depressed individuals before BA therapy and 29 of these individuals also provided samples after 10 weeks of therapy to measure cytokines and KP metabolites. The PROMIS Depression Scale (PROMIS-D) and the Sheehan Disability Scale were administered weekly and the Beck depression inventory was administered pre- and post-therapy. Data were analyzed with linear mixed-effect, general linear, and logistic regression models. The primary outcome for the biomarker analyses was the ratio of kynurenic acid to quinolinic acid (KynA/QA).
Results
BA decreased depression and disability scores (p's < 0.001, Cohen's d's > 0.5). KynA/QA significantly increased at post-therapy relative to baseline (p < 0.001, d = 2.2), an effect driven by a decrease in QA post-therapy (p < 0.001, uncorrected, d = 3.39). A trend towards a decrease in the ratio of kynurenine to tryptophan (KYN/TRP) was also observed (p = 0.054, uncorrected, d = 0.78). Neither the change in KynA/QA, nor baseline KynA/QA were associated with response to BA therapy.
Conclusion
The current findings, together with previous research showing that electroconvulsive therapy, escitalopram, and ketamine decrease concentrations of the neurotoxin QA, raise the possibility that a common therapeutic mechanism underlies diverse forms of antidepressant treatment, but future controlled studies are needed to test this hypothesis.
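A minimal sketch of the linear mixed-effects setup described in the Methods, with weekly PROMIS-D scores regressed on time and a random intercept per participant, is given below. The data are simulated and the variable names are placeholders, not the study's.

```python
# Sketch: linear mixed-effects model for repeated weekly symptom scores,
# with a random intercept per participant. Simulated long-format data.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
subjects, weeks = 56, 10
df = pd.DataFrame({
    "subject": np.repeat(np.arange(subjects), weeks),
    "week": np.tile(np.arange(weeks), subjects),
})
intercepts = rng.normal(60, 8, subjects)          # person-level baselines
df["promis_d"] = (intercepts[df["subject"]] - 1.2 * df["week"]
                  + rng.normal(0, 4, len(df)))

model = smf.mixedlm("promis_d ~ week", df, groups=df["subject"]).fit()
print(model.summary())  # negative week coefficient = symptom decline
```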
Antarctica's ice shelves modulate the grounded ice flow, and weakening of ice shelves due to climate forcing will decrease their ‘buttressing’ effect, causing a response in the grounded ice. While the processes governing ice-shelf weakening are complex, uncertainties in the response of the grounded ice sheet are also difficult to assess. The Antarctic BUttressing Model Intercomparison Project (ABUMIP) compares ice-sheet model responses to decrease in buttressing by investigating the ‘end-member’ scenario of total and sustained loss of ice shelves. Although unrealistic, this scenario enables gauging the sensitivity of an ensemble of 15 ice-sheet models to a total loss of buttressing, hence exhibiting the full potential of marine ice-sheet instability. All models predict that this scenario leads to multi-metre (1–12 m) sea-level rise over 500 years from present day. West Antarctic ice sheet collapse alone leads to a 1.91–5.08 m sea-level rise due to the marine ice-sheet instability. Mass loss rates are a strong function of the sliding/friction law, with plastic laws causing a further destabilization of the Aurora and Wilkes Subglacial Basins, East Antarctica. Improvements to marine ice-sheet models have greatly reduced variability between modelled ice-sheet responses to extreme ice-shelf loss, e.g. compared to the SeaRISE assessments.
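To make the sliding/friction-law dependence concrete, the sketch below contrasts a Weertman-type power law, in which basal shear stress grows with sliding speed, against a plastic (Coulomb-type) law, in which stress is capped at a yield value regardless of speed. The coefficients are illustrative only and are not taken from any ABUMIP model.

```python
# Sketch: two families of basal sliding laws. Under a power law, faster
# sliding raises basal resistance; under a plastic law it does not, so a
# retreating grounding line gains no extra resistance and can destabilize.

import numpy as np

u = np.linspace(1, 1000, 5)          # sliding speed, m/yr (illustrative)

def weertman(u, C=7.0e4, m=3.0):
    """Weertman power law: tau_b = C * u**(1/m), in Pa."""
    return C * u ** (1.0 / m)

def plastic(u, tau_c=5.0e4):
    """Plastic (Coulomb-type) law: tau_b capped at yield stress tau_c, in Pa."""
    return np.full_like(u, tau_c)

print(weertman(u))   # stress rises as the ice accelerates
print(plastic(u))    # constant stress: no speed-dependent braking
```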