Saflufenacil, atrazine, and pyroxasulfone represent herbicides with low, medium, and high relative field persistence, respectively. Field studies were conducted over 2 yr with herbicides and rates assembled in a factorial arrangement of treatments; herbicides were applied at rates of 100, 1,000, and 10,000 g ai ha−1. Soil samples were collected over the course of 365 d and analyzed to track dissipation of the herbicides. Regression analysis was used to quantify the dissipation of each herbicide. The initial herbicide concentration had no effect on the observed dissipation rates of atrazine or saflufenacil; however, pyroxasulfone dissipation was slower at the highest field dosage in both years. Soils from Georgia, Illinois, and Tennessee were fortified with known concentrations of the three herbicides dissolved in water and incubated at 22 C for 154 d. Laboratory studies generally demonstrated slower dissipation than field studies, which is plausible because the important loss mechanisms of volatilization and photodegradation do not occur in the laboratory test system. Half-lives of pyroxasulfone and saflufenacil were unaffected by initial concentration, but atrazine degraded more slowly at lower initial concentrations. Findings from these studies suggest that initial herbicide concentration affects the dissipation of only some herbicides: pyroxasulfone in the field and atrazine in the laboratory. This finding is important for researchers who use herbicide degradation rates in simulation modeling because herbicide degradation is often assumed to be independent of the rate applied. Another aspect of this research was the application of each herbicide alone and in combination with the others. Under field and laboratory conditions, dissipation did not differ whether the herbicides were applied alone or in combination.
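The abstract does not state which regression model was fitted; herbicide dissipation is commonly described by first-order decay, C(t) = C0·e^(−kt), with half-life ln 2 / k. A minimal sketch of such a fit on log-transformed concentrations (function name and synthetic data are illustrative, not from the study):

```python
import math

import numpy as np

def fit_first_order(days, conc):
    """Least-squares fit of ln(C) = ln(C0) - k*t; returns (k, half-life)."""
    slope, _intercept = np.polyfit(np.asarray(days, float),
                                   np.log(np.asarray(conc, float)), 1)
    k = -slope  # dissipation rate constant, d^-1
    return k, math.log(2.0) / k

# Synthetic sampling over 365 d with an assumed true rate of 0.02 d^-1
t = np.array([0.0, 30.0, 60.0, 120.0, 240.0, 365.0])
c = 1000.0 * np.exp(-0.02 * t)  # hypothetical dose of 1,000 g ai ha^-1
k, dt50 = fit_first_order(t, c)
```

Because the synthetic data are noise-free, the fit recovers the assumed rate constant and the corresponding half-life (about 35 d) exactly.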
Patients with posttraumatic stress disorder (PTSD) exhibit smaller regional brain volumes in commonly reported regions, including the amygdala and hippocampus, which are associated with fear and memory processing. In the current study, we conducted a voxel-based morphometry (VBM) meta-analysis using whole-brain statistical maps with neuroimaging data from the ENIGMA-PGC PTSD working group.
Methods
T1-weighted structural neuroimaging scans from 36 cohorts (PTSD n = 1309; controls n = 2198) were processed using a standardized VBM pipeline (ENIGMA-VBM tool). We meta-analyzed the resulting statistical maps for voxel-wise differences in gray matter (GM) and white matter (WM) volumes between PTSD patients and controls, performed subgroup analyses considering the trauma exposure of the controls, and examined associations between regional brain volumes and clinical variables including PTSD (CAPS-4/5, PCL-5) and depression severity (BDI-II, PHQ-9).
Results
PTSD patients exhibited smaller GM volumes across the frontal and temporal lobes and cerebellum, with the most significant effect in the left cerebellum (Hedges’ g = 0.22, p(corrected) = .001), and smaller cerebellar WM volume (peak Hedges’ g = 0.14, p(corrected) = .008). We observed similar regional differences when comparing patients to trauma-exposed controls, suggesting these structural abnormalities may be specific to PTSD. Regression analyses revealed PTSD severity was negatively associated with GM volumes within the cerebellum (p(corrected) = .003), while depression severity was negatively associated with GM volumes within the cerebellum and superior frontal gyrus in patients (p(corrected) = .001).
Conclusions
PTSD patients exhibited widespread, regional differences in brain volumes where greater regional deficits appeared to reflect more severe symptoms. Our findings add to the growing literature implicating the cerebellum in PTSD psychopathology.
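The Hedges' g effect sizes reported above are Cohen's d with a small-sample bias correction. A minimal sketch of the standard computation (the group summary values below are hypothetical, chosen only to match the study's sample sizes):

```python
import math

def hedges_g(mean1, sd1, n1, mean2, sd2, n2):
    """Cohen's d from the pooled SD, times Hedges' correction factor J."""
    pooled_sd = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2)
                          / (n1 + n2 - 2))
    d = (mean1 - mean2) / pooled_sd
    # Common approximation J = 1 - 3 / (4*df - 1), df = n1 + n2 - 2
    j = 1.0 - 3.0 / (4.0 * (n1 + n2 - 2) - 1.0)
    return d * j

# Hypothetical group means/SDs at the cohort sizes reported in Methods
g = hedges_g(1.0, 1.0, 1309, 0.0, 1.0, 2198)
```

The correction J shrinks d slightly toward zero, which matters most for small cohorts within a meta-analysis.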
Our overall goal was to enhance the usability and interactivity of the RE-AIM website (re-aim.org) and improve resources to support the application of the RE-AIM framework within the context of dissemination & implementation (D&I) research and practice.
Methods:
We applied a mixed-methods approach to obtain user feedback from 24 D&I researchers and practitioners. Usability (System Usability Scale) and interactivity (Interactivity Scale) were assessed through validated surveys, at baseline and after two iterative rounds of website modifications (Phase 1 and Phase 2). We also conducted qualitative assessments at each phase.
Results:
Qualitative baseline and Phase 1 findings indicated a need to simplify organization, enhance information accessibility, provide concrete guidance on applying RE-AIM, and clarify contextual factors related to RE-AIM constructs. After streamlining website and homepage organization, Phase 2 qualitative results suggested an improved user navigation experience; users also requested greater interactivity. Modifications included a new interactive planning tool and a video introduction to contextual factors influencing RE-AIM outcomes. The SUS score improved significantly from baseline (64.2 [SD 18.7]) to Phase 1 (80.8 [SD 12.1]; p < .05) and remained higher in Phase 2 (77.1 [SD 15.0]; p = .08). Interactivity also improved from baseline (3.5 [SD 1.2]) to Phase 2 (4.1 [SD 0.9]), though not statistically significantly.
Conclusion:
User-centered feedback on online resources, as exemplified by this use case of enhancements to the RE-AIM website, is important in bridging the gap between research and practice, and the revised website should be more accessible and useful to users.
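The SUS scores reported in the Results follow the standard System Usability Scale scoring rule: 10 items rated 1–5, where odd (positively worded) items contribute response − 1, even items contribute 5 − response, and the sum is scaled by 2.5 to a 0–100 range. A sketch:

```python
def sus_score(responses):
    """Standard SUS scoring for a single respondent's 10 item ratings."""
    if len(responses) != 10:
        raise ValueError("SUS requires exactly 10 item responses")
    # Odd items: response - 1; even items: 5 - response
    total = sum((r - 1) if i % 2 == 1 else (5 - r)
                for i, r in enumerate(responses, start=1))
    return total * 2.5  # scale the 0-40 sum to 0-100
```

For example, a respondent answering 3 ("neutral") to every item scores 50, the conventional midpoint; the study's Phase 1 mean of 80.8 sits well above the commonly cited acceptability benchmark of 68.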
Inadequate recruitment and retention impede clinical trial goals. Emerging decentralized clinical trials (DCTs) leveraging digital health technologies (DHTs) for remote recruitment and data collection aim to address barriers to participation in traditional trials. The ACTIV-6 trial is a DCT using DHTs, but participants’ experiences of such trials remain largely unknown. This study explored participants’ perspectives of the ACTIV-6 DCT that tested outpatient COVID-19 therapeutics.
Methods:
Participants in the ACTIV-6 study were recruited via email to share their day-to-day trial experiences during 1-hour virtual focus groups. Two human factors researchers guided group discussions through a semi-structured script that probed expectations and perceptions of study activities. Qualitative data analysis was conducted using a grounded theory approach with open coding to identify key themes.
Results:
Twenty-eight ACTIV-6 study participants aged 30+ years completed virtual focus groups of 1–4 participants each. Analysis yielded three major themes: perceptions of the DCT experience, study activity engagement, and trust. Participants perceived the use of remote DCT procedures supported by DHTs as an acceptable and efficient method of organizing and tracking study activities, communicating with study personnel, and managing study medications at home. Use of social media was effective in supporting geographically dispersed participant recruitment but also raised issues with trust and study legitimacy.
Conclusions:
While participants in this qualitative study viewed the DCT-with-DHT approach as reasonably efficient and engaging, they also identified challenges to address. Understanding facilitators and barriers to DCT participation and DHT interaction can help improve future research design.
For near-future missions planned for Mars Sample Return (MSR), an international working group organized by the Committee on Space Research (COSPAR) developed the sample safety assessment framework (SSAF). For the SSAF, analytical instruments were selected by taking into account the practical limitations of hosting them within a facility with the highest level of biosafety precautions (biosafety level 4) and the precious nature of returned samples. To prepare for MSR, analytical instruments of high sensitivity need to be tested on effective Mars analogue materials. As an analogue material, we selected a rock core of basalt, a prominent rock type on the Martian surface. Two basalt samples with aqueous alteration cached in Jezero crater by the Perseverance rover are planned to be returned to Earth. Our previously published analytical procedures using destructive but spatially sensitive instruments such as nanoscale secondary ion mass spectrometry (NanoSIMS) and transmission electron microscopy coupled to energy-dispersive spectroscopy revealed microbial colonization at clay-filled fractures. With the aim of testing the capability of an analytical instrument listed in the SSAF, we now extend that work to conventional Fourier transform infrared (FT-IR) microscopy with a spatial resolution of 10 μm. Although the Fe-rich smectite nontronite was identified after crushing a portion of the rock core sample into powder, conventional FT-IR microscopy is limited to sample thicknesses of <30 μm. To obtain IR-based spectra without destructive preparation, a new technique called optical-photothermal infrared (O-PTIR) spectroscopy, with a spatial resolution of 0.5 μm, was applied to a 100 μm thick section of the rock core. By O-PTIR spectroscopic analysis of the clay-filled fracture, we obtained in-situ spectra diagnostic of microbial cells, consistent with our previously published NanoSIMS data.
Nontronite could also be identified by O-PTIR spectroscopic analysis. From these results, O-PTIR spectroscopy is suggested to be superior to deep ultraviolet fluorescence microscopy/μ-Raman spectroscopy, particularly for smectite identification. The simultaneous acquisition of the spatial distribution of structural motifs associated with biomolecules and smectites is critical for distinguishing biological material in samples as well as characterizing an abiotic background.
Mean levels of cognitive functioning typically do not show an association with self-reported cognitive fatigue in persons with multiple sclerosis (PwMS), but some studies indicate that cognitive variability has an association with cognitive fatigue. Additionally, coping has been shown to be a powerful moderator of some outcomes in multiple sclerosis (MS). To date, however, coping has not been considered as a possible moderator of the relationship between cognitive fatigue and cognitive variability in MS. The current study examined this relationship.
Method:
We examined 52 PwMS. All participants were administered the Fatigue Impact Scale, the Coping Orientation to Problems Experienced Questionnaire, and cognitive tests. Indices of variability for memory and attention/executive functioning tests were used as outcome variables. Avoidant coping, active coping, and composite coping indices were used as moderators.
Results:
The interaction analyses for the avoidant coping and composite coping indices were significant and accounted for 8% and 11% of the attention/executive functioning variability outcome, respectively. The interactions revealed that at low levels of cognitive fatigue, attention/executive functioning variability was comparable between the low and high avoidant and composite coping groups. However, at high levels of cognitive fatigue, PwMS using lower levels of avoidant coping (less maladaptive coping) showed less variable attention/executive functioning scores compared with those using higher levels of avoidant coping. We found a similar pattern for the composite coping groups.
Conclusion:
At high levels of cognitive fatigue, PwMS using adaptive coping showed less attention/executive functioning variability. These findings should be considered in the context of treatment implications.
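Moderation analyses of the kind described above are typically tested as the R² gained by adding a fatigue × coping product term to a main-effects regression (the 8% and 11% figures reported correspond to such increments). A minimal sketch with synthetic data, not the study's data or exact procedure:

```python
import numpy as np

def r2(X, y):
    """Coefficient of determination for an OLS fit of y on columns of X."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1.0 - np.sum(resid**2) / np.sum((y - y.mean()) ** 2)

def interaction_r2_change(fatigue, coping, outcome):
    """R^2 gained by adding the fatigue x coping interaction term."""
    ones = np.ones(len(outcome))
    main = np.column_stack([ones, fatigue, coping])
    full = np.column_stack([ones, fatigue, coping, fatigue * coping])
    return r2(full, outcome) - r2(main, outcome)

# Synthetic data with a genuine interaction effect built in
rng = np.random.default_rng(7)
fatigue = rng.normal(size=200)
coping = rng.normal(size=200)
outcome = fatigue + coping + 0.5 * fatigue * coping
delta_r2 = interaction_r2_change(fatigue, coping, outcome)
```

A significant, non-trivial delta_r2 is the evidence that coping moderates the fatigue–variability relationship; probing then proceeds at low and high coping levels as the abstract describes.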
Accurate diagnosis of bipolar disorder (BPD) is difficult in clinical practice, with an average delay between symptom onset and diagnosis of about 7 years. A depressive episode often precedes the first manic episode, making it difficult to distinguish BPD from unipolar major depressive disorder (MDD).
Aims
We use genome-wide association analyses (GWAS) to identify differential genetic factors and to develop predictors based on polygenic risk scores (PRS) that may aid early differential diagnosis.
Method
Based on individual genotypes from case–control cohorts of BPD and MDD shared through the Psychiatric Genomics Consortium, we compile case–case–control cohorts, applying a careful quality control procedure. In a resulting cohort of 51 149 individuals (15 532 BPD patients, 12 920 MDD patients and 22 697 controls), we perform a variety of GWAS and PRS analyses.
Results
Although our GWAS is not well powered to identify genome-wide significant loci, we find significant chip heritability and demonstrate the ability of the resulting PRS to distinguish BPD from MDD, including BPD cases with depressive onset (BPD-D). We replicate our PRS findings in an independent Danish cohort (iPSYCH 2015, N = 25 966). We observe strong genetic correlation between our case–case GWAS and that of case–control BPD.
Conclusions
We find that MDD and BPD, including BPD-D, are genetically distinct. Our findings support the view that controls, MDD patients and BPD patients primarily lie on a continuum of genetic risk. Future studies with larger and richer samples will likely yield a better understanding of these findings and enable the development of better genetic predictors distinguishing BPD and, importantly, BPD-D from MDD.
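At its core, a polygenic risk score of the kind used in these analyses is a weighted sum of an individual's risk-allele dosages, with per-variant weights taken from a discovery GWAS (the study's actual pipeline, with clumping and thresholding, is more involved). A minimal sketch with illustrative numbers:

```python
import numpy as np

def polygenic_risk_score(dosages, weights):
    """PRS as a weighted sum of risk-allele dosages (0, 1, or 2 per variant);
    weights are per-variant effect sizes, e.g. GWAS log odds ratios."""
    return np.asarray(dosages, float) @ np.asarray(weights, float)

# Two hypothetical individuals genotyped at three variants
scores = polygenic_risk_score([[0, 1, 2], [2, 0, 1]], [0.1, 0.2, 0.3])
```

A PRS trained on a case–case BPD-versus-MDD GWAS, as here, is then evaluated on how well it separates the two diagnoses in held-out cohorts such as iPSYCH 2015.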
Employment and relationships are crucial for social integration. However, individuals with major psychiatric disorders often face challenges in these domains.
Aims
We investigated employment and relationship status changes among patients across the affective and psychotic spectrum in comparison with healthy controls, examining whether diagnostic groups or functional levels influence these transitions.
Method
The sample from the longitudinal multicentric PsyCourse Study comprised 1260 patients with affective and psychotic spectrum disorders and 441 controls (mean age ± s.d., 39.91 ± 12.65 years; 48.9% female). Multistate models (Markov) were used to analyse transitions in employment and relationship status, focusing on transition intensities. Analyses contained multiple multistate models adjusted for age, gender, job or partner, diagnostic group and Global Assessment of Functioning (GAF) in different combinations to analyse the impact of the covariates on the hazard ratio of changing employment or relationship status.
Results
The clinical group had a higher hazard of losing a partner (hazard ratio 1.46, P < 0.001) and a job (hazard ratio 4.18, P < 0.001) than the control group (corrected for age/gender). Compared with controls, both clinical groups had a higher hazard of losing a partner (affective group, hazard ratio 2.69, P = 0.003; psychotic group, hazard ratio 3.06, P = 0.001) and a job (affective group, hazard ratio 3.43, P < 0.001; psychotic group, hazard ratio 4.11, P < 0.001). Adjusting for GAF, the hazard ratios for losing a partner and a job decreased in both clinical groups compared with controls.
Conclusion
Patients face an increased hazard of job loss and relationship dissolution compared with healthy controls, and this is partially conditioned by the diagnosis and functional level. These findings underscore a high demand for destigmatisation and support for individuals in managing their functional limitations.
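The multistate (Markov) models described above rest on transition intensities, and the reported hazard ratios compare those intensities between groups. For a two-state employment model with constant intensities, the transition probabilities have a closed form. A sketch with illustrative rates, not the study's estimates:

```python
import math

def two_state_probs(rate_lose, rate_gain, t):
    """Closed-form transition matrix for a two-state continuous-time Markov
    chain (e.g. employed <-> unemployed) with constant intensities, time t."""
    s = rate_lose + rate_gain
    decay = math.exp(-s * t)
    p_keep = rate_gain / s + (rate_lose / s) * decay      # stay employed
    p_lose = (rate_lose / s) * (1.0 - decay)              # employed -> not
    p_gain = (rate_gain / s) * (1.0 - decay)              # not -> employed
    p_stay_out = rate_lose / s + (rate_gain / s) * decay  # stay unemployed
    return ((p_keep, p_lose), (p_gain, p_stay_out))

# A hazard ratio of 4.18 means the patient group's job-loss intensity is
# 4.18 times the control group's (rates below are hypothetical)
controls = two_state_probs(0.05, 0.10, 2.0)
patients = two_state_probs(0.05 * 4.18, 0.10, 2.0)
```

Each row of the returned matrix sums to 1, and scaling the loss intensity by the hazard ratio translates directly into a higher probability of having lost the job over any follow-up interval.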
To understand healthcare workers’ (HCWs) beliefs and practices toward blood culture (BCx) use.
Design:
Cross-sectional electronic survey and semi-structured interviews.
Setting:
Academic hospitals in the United States.
Participants:
HCWs involved in BCx ordering and collection in adult intensive care units (ICU) and wards.
Methods:
We administered an anonymous electronic survey to HCWs and conducted semi-structured interviews with unit staff and quality improvement (QI) leaders in these institutions to understand their perspectives regarding BCx stewardship between February and November 2023.
Results:
Of 314 HCWs who responded to the survey, most (67.4%) were physicians and were involved in BCx ordering (82.3%). Most survey respondents reported that clinicians had a low threshold to culture patients for fever (84.4%) and agreed they could safely reduce the number of BCx obtained in their units (65%). However, only half of them believed BCx was overused. Although most made BCx decisions as a team (74.1%), a minority reported these team discussions occurred daily (42.4%). A third of respondents reported not usually collecting the correct volume per BCx bottle, half were unaware of the improved sensitivity of 2 BCx sets, and most were unsure of the nationally recommended BCx contamination threshold (87.5%). Knowledge regarding the utility of BCx for common infections was limited.
Conclusions:
HCWs’ understanding of best collection practices and yield of BCx was limited.
In response to the COVID-19 pandemic, we implemented a plasma coordination center within two months to support transfusion for two outpatient randomized controlled trials. The center design was based on an investigational drug services model and a Food and Drug Administration-compliant database to manage blood product inventory and trial safety.
Methods:
A core investigational team adapted a cloud-based platform to randomize patient assignments and track inventory distribution of control plasma and high-titer COVID-19 convalescent plasma of different blood groups from 29 donor collection centers directly to blood banks serving 26 transfusion sites.
Results:
We performed 1,351 transfusions in 16 months. The transparency of the digital inventory at each site was critical to facilitate qualification, randomization, and overnight shipments of blood group-compatible plasma for transfusions into trial participants. While inventory challenges were heightened with COVID-19 convalescent plasma, the cloud-based system, and the flexible approach of the plasma coordination center staff across the blood bank network enabled decentralized procurement and distribution of investigational products to maintain inventory thresholds and overcome local supply chain restraints at the sites.
Conclusion:
The rapid creation of a plasma coordination center for outpatient transfusions is infrequent in the academic setting. Distributing more than 3,100 plasma units to blood banks charged with managing investigational inventory across the U.S. in a decentralized manner posed operational and regulatory challenges while providing opportunities for the plasma coordination center to contribute to research of global importance. This program can serve as a template in subsequent public health emergencies.
Tiafenacil is a new nonselective protoporphyrinogen IX oxidase (PPO)–inhibiting herbicide with both grass and broadleaf activity labeled for preplant application to corn, cotton, soybean, and wheat. Early-season corn emergence and growth often coincide in the mid-South with preplant herbicide application in cotton and soybean, thereby increasing the opportunity for off-target herbicide movement from adjacent fields. Field studies were conducted in 2022 to identify the impacts of reduced rates of tiafenacil (12.5% to 0.4% of the lowest labeled application rate of 24.64 g ai ha–1) applied to two- or four-leaf corn. Corn injury 1 wk after treatment (WAT) for the two- and four-leaf growth stages ranged from 31% to 6% and 37% to 9%, respectively, whereas at 2 WAT these respective ranges were 21.7% to 4% and 22.5% to 7.2%. By 4 WAT, visible injury following the two- and four-leaf exposure timing was no greater than 8% in all instances except the highest tiafenacil rate applied at the four-leaf growth stage (13%). Tiafenacil had no negative season-long impact, as the early-season injury observed did not manifest as a reduction in corn height at 2 WAT or in yield. Application of tiafenacil directly adjacent to corn in early vegetative stages of growth should be avoided. In cases where off-target movement does occur, however, affected corn should be expected to fully recover with no impact on growth and yield, assuming adequate growing conditions and agronomic/pest management practices are provided.
To assess the potential contribution of large-scale food fortification (LSFF) towards meeting dietary micronutrient requirements in Tanzania.
Design:
We used household food consumption data from the National Panel Survey 2014–15 to estimate fortifiable food vehicle coverage and consumption (standardised using the adult female equivalent approach) and the prevalence at risk of inadequate apparent intake of five micronutrients included in Tanzania’s fortification legislation. We modelled four LSFF scenarios: no fortification, status quo (i.e. compliance with current fortification contents) and full fortification with and without maize flour fortification.
Setting:
Tanzania.
Participants:
A nationally representative sample of 3290 Tanzanian households.
Results:
The coverage of edible oils and maize and wheat flours (including products of wheat flour and oil such as bread and cakes) was high, with 91 percent, 88 percent and 53 percent of households consuming these commodities, respectively. We estimated that vitamin A-fortified oil could reduce the prevalence of inadequate apparent intake of vitamin A (retinol activity equivalent) from 92 percent without LSFF to 80 percent with LSFF at current fortification levels. Low industry LSFF compliance of flour fortification limits the contribution of other micronutrients, but a hypothetical full fortification scenario shows that LSFF of cereal flours could substantially reduce the prevalence at risk of inadequate intakes of iron, zinc, folate and vitamin B12.
Conclusions:
The current Tanzania LSFF programme likely contributes to reducing vitamin A inadequacy. Policies that support increased compliance could improve the supply of multiple nutrients, but the prominence of small-scale maize mills restricts this theoretical benefit.
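The prevalence at risk of inadequate apparent intake in analyses like this one is commonly estimated with the EAR cut-point method: the proportion of individuals whose usual intake falls below the Estimated Average Requirement. A minimal sketch (the paper's exact estimation procedure, including the adult female equivalent standardisation, may differ):

```python
import numpy as np

def prevalence_inadequate(usual_intakes, ear):
    """EAR cut-point method: share of individuals whose usual intake of a
    micronutrient falls below the Estimated Average Requirement."""
    return float(np.mean(np.asarray(usual_intakes, float) < ear))

# Hypothetical usual vitamin A intakes (mcg RAE/day) against an assumed EAR
prev = prevalence_inadequate([300, 600, 400, 700], 500)
```

Fortification scenarios are then modelled by adding each vehicle's fortificant contribution to individual intakes and recomputing this proportion, which is how figures such as 92% falling to 80% arise.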
The Accelerating COVID-19 Therapeutic Interventions and Vaccines (ACTIV) Cross-Trial Statistics Group gathered lessons learned from statisticians responsible for the design and analysis of the 11 ACTIV therapeutic master protocols to inform contemporary trial design as well as preparation for a future pandemic. The ACTIV master protocols were designed to rapidly assess what treatments might save lives, keep people out of the hospital, and help them feel better faster. Study teams initially worked without knowledge of the natural history of disease and thus without key information for design decisions. Moreover, the science of platform trial design was in its infancy. Here, we discuss the statistical design choices made and the adaptations forced by the changing pandemic context. Lessons around critical aspects of trial design are summarized, and recommendations are made for the organization of master protocols in the future.
This manuscript addresses a critical topic: navigating the complexities of conducting clinical trials during a pandemic. Central to this discussion is engaging communities to ensure diverse participation. The manuscript elucidates deliberate strategies employed to recruit minority communities with adverse social drivers of health for participation in COVID-19 trials. The paper adopts a descriptive approach, eschewing analysis of the data-driven efficacy of these efforts, and instead provides a comprehensive account of the strategies utilized. The Accelerating COVID-19 Therapeutic Interventions and Vaccines (ACTIV) public–private partnership launched early in the COVID-19 pandemic to develop clinical trials to advance SARS-CoV-2 treatments. In this paper, ACTIV investigators share challenges in conducting research during an evolving pandemic and approaches selected to engage communities when traditional strategies were infeasible. Lessons from this experience include the importance of involving community representatives early in study design and implementation and of integrating well-developed public outreach and communication strategies with trial launch. Centralization and coordination of outreach will allow for efficient use of resources and the sharing of best practices. Insights gleaned from the ACTIV program, as outlined in this paper, shed light on effective strategies for involving communities in treatment trials amidst rapidly evolving public health emergencies. This underscores the critical importance of establishing community engagement initiatives well in advance of a pandemic.
There is a relative lack of research, targeted models and tools to manage beaches in estuaries and bays (BEBs). Many estuaries and bays have been highly modified and urbanised, for example by port developments and coastal revetments. This paper outlines the complications and opportunities for conserving and managing BEBs in modified estuaries. To do this, we focus on eight diverse case studies from North and South America, Asia, Europe, Africa and Australia combined with the broader global literature. Our key findings are as follows: (1) BEBs are diverse and exist under a great variety of tide and wave conditions that differentiate them from open-coast beaches; (2) BEBs often lack statutory protection and many have already been sacrificed to development; (3) BEBs lack specific management tools and are often managed using tools developed for open-coast beaches; and (4) BEBs have the potential to become important in “nature-based” management solutions. We set the future research agenda for BEBs, which should include broadening research to include greater diversity of BEBs than in the past, standardising monitoring techniques, including the development of global databases using citizen science and developing specific management tools for BEBs. We must recognise BEBs as unique coastal features and develop the required fundamental knowledge and tools to effectively manage them, so they can continue providing their unique ecosystem services.
Knowledge of sex differences in risk factors for posttraumatic stress disorder (PTSD) can contribute to the development of refined preventive interventions. Therefore, the aim of this study was to examine if women and men differ in their vulnerability to risk factors for PTSD.
Methods
As part of the longitudinal AURORA study, 2924 patients seeking emergency department (ED) treatment in the acute aftermath of trauma provided self-report assessments of pre-, peri-, and post-traumatic risk factors, as well as 3-month PTSD severity. We systematically examined sex-dependent effects of 16 risk factors that have previously been hypothesized to show different associations with PTSD severity in women and men.
Results
Women reported higher PTSD severity at 3 months post-trauma. Z-score comparisons indicated that for five of the 16 examined risk factors, the association with 3-month PTSD severity was stronger in men than in women. In multivariable models, interaction effects with sex were observed for pre-traumatic anxiety symptoms and acute dissociative symptoms; both showed stronger associations with PTSD in men than in women. Subgroup analyses suggested trauma type-conditional effects.
Conclusions
Our findings indicate mechanisms to which men might be particularly vulnerable, demonstrating that known PTSD risk factors might behave differently in women and men. Analyses did not identify any risk factors to which women were more vulnerable than men, pointing toward further mechanisms to explain women's higher PTSD risk. Our study illustrates the need for a more systematic examination of sex differences in contributors to PTSD severity after trauma, which may inform refined preventive interventions.
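The z-score comparisons described above are commonly implemented as Fisher's r-to-z test for the difference between two independent correlations; whether AURORA used exactly this variant is an assumption here. A sketch with hypothetical values:

```python
import math

def correlation_difference_z(r1, n1, r2, n2):
    """Fisher r-to-z test statistic for the difference between two
    independent correlations (e.g. risk factor vs PTSD severity by sex)."""
    z1, z2 = math.atanh(r1), math.atanh(r2)  # Fisher transformation
    se = math.sqrt(1.0 / (n1 - 3) + 1.0 / (n2 - 3))
    return (z1 - z2) / se

# Hypothetical: a risk factor correlating 0.40 with PTSD severity in men
# but only 0.20 in women, at illustrative subsample sizes
z = correlation_difference_z(0.40, 1200, 0.20, 1700)
```

A positive z indicates a stronger association in the first group; values beyond roughly ±1.96 are significant at the two-sided .05 level.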
Understanding characteristics of healthcare personnel (HCP) with SARS-CoV-2 infection supports the development and prioritization of interventions to protect this important workforce. We report detailed characteristics of HCP who tested positive for SARS-CoV-2 from April 20, 2020 through December 31, 2021.
Methods:
CDC collaborated with Emerging Infections Program sites in 10 states to interview HCP with SARS-CoV-2 infection (case-HCP) about their demographics, underlying medical conditions, healthcare roles, exposures, personal protective equipment (PPE) use, and COVID-19 vaccination status. We grouped case-HCP by healthcare role. To describe residential social vulnerability, we merged geocoded HCP residential addresses with CDC/ATSDR Social Vulnerability Index (SVI) values at the census tract level. We defined highest and lowest SVI quartiles as high and low social vulnerability, respectively.
Results:
Our analysis included 7,531 case-HCP. Most case-HCP with roles as certified nursing assistant (CNA) (444, 61.3%), medical assistant (252, 65.3%), or home healthcare worker (HHW) (225, 59.5%) reported their race and ethnicity as either non-Hispanic Black or Hispanic. More than one third of HHWs (166, 45.2%), CNAs (283, 41.7%), and medical assistants (138, 37.9%) reported a residential address in the high social vulnerability category. The proportion of case-HCP who reported using recommended PPE at all times when caring for patients with COVID-19 was lowest among HHWs compared with other roles.
Conclusions:
To mitigate SARS-CoV-2 infection risk in healthcare settings, infection prevention and control interventions should be specific to HCP roles and educational backgrounds. Additional interventions are needed to address high social vulnerability among HHWs, CNAs, and medical assistants.