Patients with posttraumatic stress disorder (PTSD) exhibit smaller brain volumes in commonly reported regions, including the amygdala and hippocampus, which are associated with fear and memory processing. In the current study, we conducted a voxel-based morphometry (VBM) meta-analysis using whole-brain statistical maps with neuroimaging data from the ENIGMA-PGC PTSD working group.
Methods
T1-weighted structural neuroimaging scans from 36 cohorts (PTSD n = 1309; controls n = 2198) were processed using a standardized VBM pipeline (the ENIGMA-VBM tool). We meta-analyzed the resulting statistical maps for voxel-wise differences in gray matter (GM) and white matter (WM) volumes between PTSD patients and controls, performed subgroup analyses considering the trauma exposure of the controls, and examined associations between regional brain volumes and clinical variables, including PTSD severity (CAPS-4/5, PCL-5) and depression severity (BDI-II, PHQ-9).
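For readers unfamiliar with the meta-analytic step, the following minimal Python sketch illustrates the core computation: a per-cohort Hedges' g at a single voxel (standardized mean difference with small-sample bias correction), pooled across cohorts by inverse-variance weighting. The function names, example values, and pooling model are illustrative assumptions; the actual ENIGMA-VBM pipeline additionally handles covariates, random effects, and voxel-wise multiple-comparison correction.

```python
import numpy as np

def hedges_g(m1, s1, n1, m2, s2, n2):
    """Hedges' g for one voxel in one cohort: Cohen's d with the
    small-sample bias-correction factor J applied."""
    sp = np.sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2))
    d = (m1 - m2) / sp
    j = 1 - 3 / (4 * (n1 + n2) - 9)  # bias-correction factor
    g = j * d
    var_g = j**2 * ((n1 + n2) / (n1 * n2) + d**2 / (2 * (n1 + n2)))
    return g, var_g

def pool(gs, vs):
    """Inverse-variance weighted pooled effect across cohorts."""
    w = 1.0 / np.asarray(vs)
    g_pooled = np.sum(w * np.asarray(gs)) / np.sum(w)
    se = np.sqrt(1.0 / np.sum(w))
    return g_pooled, se, g_pooled / se  # effect, SE, z-statistic

# Example: three hypothetical cohorts at one voxel
stats = [hedges_g(0.48, 0.10, 40, 0.50, 0.10, 60),
         hedges_g(0.47, 0.11, 35, 0.50, 0.12, 55),
         hedges_g(0.49, 0.09, 50, 0.51, 0.10, 70)]
print(pool(*zip(*stats)))
```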
Results
PTSD patients exhibited smaller GM volumes across the frontal and temporal lobes and cerebellum, with the most significant effect in the left cerebellum (Hedges' g = 0.22, corrected p = .001), and smaller cerebellar WM volume (peak Hedges' g = 0.14, corrected p = .008). We observed similar regional differences when comparing patients to trauma-exposed controls, suggesting these structural abnormalities may be specific to PTSD. Regression analyses revealed PTSD severity was negatively associated with GM volumes within the cerebellum (corrected p = .003), while depression severity was negatively associated with GM volumes within the cerebellum and superior frontal gyrus in patients (corrected p = .001).
Conclusions
PTSD patients exhibited widespread regional differences in brain volumes, with greater regional deficits appearing to reflect more severe symptoms. Our findings add to the growing literature implicating the cerebellum in PTSD psychopathology.
Background: Nipocalimab is a human IgG1 monoclonal antibody targeting FcRn that selectively reduces IgG levels without impacting antigen presentation or T- and B-cell functions. This study describes the effect of nipocalimab on vaccine response. Methods: This open-label, parallel, interventional study randomized participants 1:1 to receive intravenous nipocalimab at 30 mg/kg at Week 0 and 15 mg/kg at Weeks 2 and 4 (active) or no drug (control). On Day 3, participants received Tdap and PPSV®23 vaccinations and were followed through Week 16. Results: Twenty-nine participants completed the study and are included (active, n=15; control, n=14). The proportion of participants with a positive anti-tetanus IgG response was comparable between groups at Weeks 2 and 16 but lower at Week 4 (nipocalimab 3/15 [20%] vs control 7/14 [50%]; P=0.089). All participants maintained anti-tetanus IgG above the protective threshold (0.16 IU/mL) through Week 16. While anti-pneumococcal-capsular-polysaccharide (PCP) IgG levels were lower during nipocalimab treatment, the percent increase from baseline at Weeks 2 and 16 was comparable between groups. Post-vaccination, anti-PCP IgG remained above 50 mg/L and showed a 2-fold increase from baseline throughout the study in both groups. Nipocalimab co-administration with vaccines was safe and well tolerated. Conclusions: These findings suggest that nipocalimab does not impact the development of an adequate IgG response to T-cell-dependent/independent vaccines and that nipocalimab-treated patients can follow recommended vaccination schedules.
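As an aside on the responder comparison above (3/15 vs 7/14 at Week 4), here is a quick sketch of one way such a 2×2 comparison could be run; the abstract does not name the test behind P=0.089, so the use of Fisher's exact test here is an assumption.

```python
from scipy.stats import fisher_exact

# Week 4 anti-tetanus IgG responders vs non-responders (from the abstract)
table = [[3, 12],   # nipocalimab: 3 responders, 12 non-responders
         [7, 7]]    # control:     7 responders,  7 non-responders
odds_ratio, p = fisher_exact(table, alternative="two-sided")
print(f"OR = {odds_ratio:.2f}, two-sided p = {p:.3f}")
```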
Recent changes to US research funding are having far-reaching consequences that imperil the integrity of science and the provision of care to vulnerable populations. Resisting these changes, the BJPsych Portfolio reaffirms its commitment to publishing mental science and advancing psychiatric knowledge that improves the mental health of one and all.
Metabolic dysfunction-associated fatty liver disease (MAFLD) is the most common liver disease globally, affecting 1 in 3 Australian adults and up to 39% of adults in rural communities(1). Behaviour changes targeting diet and physical activity to achieve weight loss are considered the cornerstones of MAFLD management. A Mediterranean diet (MedDiet) rich in wholegrains, vegetables, fruits, fish, olives, raw nuts and seeds is recommended in key global guidelines as the optimal dietary pattern for MAFLD(2). Additionally, research evidence indicates moderate-intensity aerobic exercise is effective in reducing liver fat and improving cardiometabolic health(3). Given the higher rates of MAFLD in rural communities and their limited access to healthcare services, digital health interventions present a valuable opportunity to improve the accessibility, availability and personalisation of healthcare services to address this important unmet need. However, no digital interventions addressing health risk behaviours in MAFLD, including diet and physical activity, are currently available. This research aimed to use best-practice co-design methodology to develop a web-based healthy living intervention for people with MAFLD. An iterative co-design process using the Double Diamond Framework, comprising four key phases, was undertaken over 12 months. Twenty-seven adults (≥ 18 years) were recruited from The Alfred Hospital, Australia, including people with MAFLD (n = 10; 50% female; mean age: 63.6 years) and healthcare professionals (HCPs) (n = 17; 59% female; mean age: 37.1 years) [dietitians (n = 5), exercise professionals (n = 6), and clinicians/hepatologists (n = 6)]. Phase 1–discover: barriers and facilitators were explored through semi-structured interviews to understand the needs of the target population regarding accessibility, appearance, resources and application of the web-based intervention. Interviews were conducted one-on-one via Zoom™, transcribed and inductively analysed using NVivo. Phase 2–define: a reflexive thematic analysis identified five key themes within the data: i) web-based functionality, navigation and formatting, ii) holistic behaviour change including MedDiet and physical activity, iii) digital health accessibility, iv) knowledge and resources, and v) intervention duration and reminders. Phase 3–develop: the knowledge gained from this process led to the development of the web-based intervention, taking into consideration expressed preferences for features that enhance knowledge about the condition, offer dietary and physical activity support via targeted resources and videos, and increase engagement via a chat group and frequent reminders. Phase 4–deliver: the co-design process produced a web-based healthy living intervention that will be evaluated for feasibility and implementation in a pilot trial. The resulting intervention aims to achieve behavioural change and promote healthier living amongst Australians with MAFLD. This knowledge has the potential to drive strategies to reduce barriers to accessing healthcare remotely, making the web-based intervention a valuable tool for both patients and professionals.
Objectives/Goals: We describe the prevalence of individuals with household exposure to SARS-CoV-2 who subsequently reported symptoms consistent with COVID-19 while having PCR results persistently negative for SARS-CoV-2 (S[+]/P[-]), and assess whether paired serology can assist in identifying the true infection status of such individuals. Methods/Study Population: In a multicenter household transmission study, index patients with SARS-CoV-2 were identified and enrolled together with their household contacts within 1 week of the index's illness onset. For 10 consecutive days, enrolled individuals provided daily symptom diaries and nasal specimens for polymerase chain reaction (PCR) testing. Contacts were categorized into 4 groups based on presence of symptoms (S[+/-]) and PCR positivity (P[+/-]). Acute and convalescent blood specimens from these individuals (30 days apart) were subjected to quantitative serologic analysis for SARS-CoV-2 anti-nucleocapsid, spike, and receptor-binding domain antibodies. The antibody change in S[+]/P[-] individuals was assessed using thresholds derived from receiver operating characteristic (ROC) analysis of S[+]/P[+] (infected) versus S[-]/P[-] (uninfected) individuals. Results/Anticipated Results: Among 1,433 contacts, 67% had ≥1 SARS-CoV-2 PCR[+] result, while 33% remained PCR[-]. Among the latter, 55% (n = 263) reported symptoms for at least 1 day, most commonly congestion (63%), fatigue (63%), headache (62%), cough (59%), and sore throat (50%). A history of both previous infection and vaccination was present in 37% of S[+]/P[-] individuals, 38% of S[-]/P[-], and 21% of S[+]/P[+] (P<0.05). Vaccination alone was present in 37%, 41%, and 52%, respectively. ROC analyses of paired serologic testing of S[+]/P[+] (n = 354) vs. S[-]/P[-] (n = 103) individuals found that anti-nucleocapsid data had the highest area under the curve (0.87). Based on the 30-day antibody change, 6.9% of S[+]/P[-] individuals demonstrated an increased convalescent antibody signal, although a similar seroresponse was observed in 7.8% of the S[-]/P[-] group. Discussion/Significance of Impact: Reporting respiratory symptoms was common among household contacts with persistent PCR[-] results. Paired serology analyses found similar seroresponses between S[+]/P[-] and S[-]/P[-] individuals. The symptomatic-but-PCR-negative phenomenon, while frequent, is thus unlikely attributable to true SARS-CoV-2 infections missed by PCR.
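A minimal sketch of the ROC-threshold approach described above, with simulated antibody-change values standing in for the study's serologic data (sample sizes are taken from the abstract; the distributions and the Youden threshold rule are assumptions):

```python
import numpy as np
from sklearn.metrics import roc_curve, auc

rng = np.random.default_rng(0)
# Simulated 30-day change in anti-nucleocapsid signal (placeholder values)
infected = rng.normal(2.0, 1.0, 354)    # S[+]/P[+] contacts
uninfected = rng.normal(0.0, 1.0, 103)  # S[-]/P[-] contacts

y = np.r_[np.ones(354), np.zeros(103)]
x = np.r_[infected, uninfected]

fpr, tpr, thresholds = roc_curve(y, x)
print("AUC:", round(auc(fpr, tpr), 2))

# One common rule: Youden's J picks the threshold maximizing
# sensitivity + specificity - 1; apply it to classify S[+]/P[-] contacts.
threshold = thresholds[np.argmax(tpr - fpr)]
print("seroconversion threshold:", round(threshold, 2))
```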
Duchenne muscular dystrophy is a devastating neuromuscular disorder characterized by the loss of dystrophin, inevitably leading to cardiomyopathy. Despite publications on prophylaxis and treatment with cardiac medications to mitigate cardiomyopathy progression, gaps remain in the specifics of medication initiation and optimization.
Method:
This document is an expert opinion statement addressing a critical gap in cardiac care for Duchenne muscular dystrophy. It provides thorough recommendations for the initiation and titration of cardiac medications based on disease progression and patient response. Recommendations are derived from the expertise of the Advanced Cardiac Therapies Improving Outcomes Network and are informed by established guidelines from the American Heart Association, American College of Cardiology, and Duchenne Muscular Dystrophy Care Considerations. These expert-derived recommendations aim to navigate the complexities of Duchenne muscular dystrophy-related cardiac care.
Results:
Comprehensive recommendations for initiation, titration, and optimization of critical cardiac medications are provided to address Duchenne muscular dystrophy-associated cardiomyopathy.
Discussion:
The management of Duchenne muscular dystrophy requires a multidisciplinary approach. However, the diversity of healthcare providers involved in Duchenne muscular dystrophy can result in variations in cardiac care, complicating treatment standardization and potentially affecting patient outcomes. The aim of this report is to provide a roadmap for managing Duchenne muscular dystrophy-associated cardiomyopathy by elucidating the timing and dosage nuances crucial for optimal therapeutic efficacy, ultimately improving cardiac outcomes and quality of life for individuals with Duchenne muscular dystrophy.
Conclusion:
This document seeks to establish a standardized framework for cardiac care in Duchenne muscular dystrophy, aiming to improve cardiac prognosis.
Mica particles approximately 10 or 25 mm square and 0.5 mm thick were placed in NaCl-NaTPB solutions to make visual observations of the changes that occur in micas when the interlayer K is replaced by Na. Samples of muscovite, biotite, phlogopite, lepidolite, and lepidomelane were used, and the effects of different degradation periods were photographed.
An increase in the thickness of the particles due to basal planes splitting apart was observed with all micas. This exfoliation released interlayer K and in some cases caused the particles to cleave into separate flakes. Lepidomelane particles remained intact despite a 20-fold increase in thickness in 7 days. Even muscovite and lepidolite exfoliated and cleaved, but much longer degradation periods were needed.
There was a distinct change in the color of the dark biotite, phlogopite and lepidomelane particles when K was removed. Therefore, the initial stages of K depletion at holes, scratches, and edges of the particles were easily followed. As the degradation of the mica particles progressed, however, the color of the mica became a less reliable index of the stage of K depletion. Visual evidence of K depletion at the edges of particles was also obtained with muscovite, but not with lepidolite.
Transverse sections of 25-mm particles of K-depleted biotite were photographed to show the edge expansion that occurred when interlayer K was replaced by Na.
Interlayer K in muscovite, biotite, phlogopite, illite and vermiculite-hydrobiotite samples was replaced by cation exchange with Na. The rate and amount of exchange varied with the mineral and the level of K in solution.
Essentially all the K in muscovite, biotite, phlogopite and vermiculite was exchangeable when the mass-action effect of the replaced K+ was reduced by maintaining a very low level of K in solution. The time required for this exchange varied from < 10 hr with vermiculite to > 45 weeks with muscovite. Only 66% of the K in the illite was exchangeable under these conditions. When the replaced K was allowed to accumulate in the solution, the amount of exchange was determined by the level of K in solution required for equilibrium. These levels decreased with the degree of K-depletion and with the selectivity of the mica for K. The order of selectivity was muscovite > illite > biotite > phlogopite > vermiculite. Decreasing the K in solution from 10 to 7 ppm increased the exchangeable K in biotite from 30 to 100%. A K level of only 0.1 ppm restricted the exchange of K in muscovite to 17%.
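The mass-action effect invoked above corresponds to the standard cation-exchange equilibrium (the explicit formulation below is not part of the abstract): keeping dissolved K+ low, as the tetraphenylboron does by precipitating it, pulls the reaction to the right and frees interlayer K.

```latex
\text{K-mica} + \text{Na}^{+}_{(aq)} \;\rightleftharpoons\; \text{Na-mica} + \text{K}^{+}_{(aq)},
\qquad
K_s \;=\; \frac{[\text{Na-mica}]\,[\text{K}^{+}]}{[\text{K-mica}]\,[\text{Na}^{+}]}
```

Here the selectivity coefficient K_s quantifies each mica's preference for Na over K; the reported selectivity order then corresponds to the progressively lower equilibrium K levels required for complete exchange.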
A decrease in layer charge was not required for K exchange, but a decrease did occur in K-depleted biotite and vermiculite. Muscovite, with the highest layer charge (247 meq/100 g), least expansion with Na (12.3 Å), and least sensitivity to solution pH, had the highest selectivity for K and the slowest rate of exchange. The K in vermiculite was the most readily exchangeable.
Samples of several naturally fine-grained micaceous minerals were heated at 450°C for 24 hr (after the effects of other temperatures and heating periods were evaluated with the < 2 μm fraction of Grundite) and then characterized in terms of their release of K to NaCl-NaTPB (sodium tetraphenylboron) solutions and other potentially related properties.
This heat treatment produced a substantial increase in the amount of K that each mineral released when first placed in the NaCl-NaTPB solution (the greatest increase being 22 m-equiv K/100 g in Marblehead illite). Depending upon the mineral heated, the subsequent rate of K release was increased, decreased or unchanged. Also, all the minerals except glauconite exhibited an increase (ranging from 4 to 38 m-equiv K/100 g) in their maximum degree of K release if they were heated. Thus, it was established that the K release behavior of these minerals is not only subject to appreciable alteration by heat treatments but is altered in a manner that varies with the mineral. The nature of these alterations, however, did not clearly identify an involvement of the other mineral properties that were examined. An increase in NH4- and Cs-exchangeable K occurred when these minerals were heated—presumably as a result of exfoliation. With Morris illite samples, this increase was nearly 28 m-equiv/100 g. Thus, heated samples of these minerals may be useful sinks for the removal of NH4 and Cs in various wastes.
Knowledge of sex differences in risk factors for posttraumatic stress disorder (PTSD) can contribute to the development of refined preventive interventions. Therefore, the aim of this study was to examine if women and men differ in their vulnerability to risk factors for PTSD.
Methods
As part of the longitudinal AURORA study, 2924 patients seeking emergency department (ED) treatment in the acute aftermath of trauma provided self-report assessments of pre-, peri-, and post-traumatic risk factors, as well as 3-month PTSD severity. We systematically examined sex-dependent effects of 16 risk factors that have previously been hypothesized to show different associations with PTSD severity in women and men.
Results
Women reported higher PTSD severity at 3 months post-trauma. Z-score comparisons indicated that for five of the 16 examined risk factors, the association with 3-month PTSD severity was stronger in men than in women. In multivariable models, interaction effects with sex were observed for pre-traumatic anxiety symptoms and acute dissociative symptoms; both showed stronger associations with PTSD in men than in women. Subgroup analyses suggested trauma type-conditional effects.
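One plausible form of the 'Z-score comparison' mentioned above is a Fisher r-to-z test for independent correlations; below is a small sketch under that assumption (the function and example values are illustrative, not the study's):

```python
import numpy as np
from scipy.stats import norm

def compare_correlations(r_w, n_w, r_m, n_m):
    """Two-sample z-test for independent correlations via Fisher's
    r-to-z transform."""
    z_w, z_m = np.arctanh(r_w), np.arctanh(r_m)
    se = np.sqrt(1 / (n_w - 3) + 1 / (n_m - 3))
    z = (z_w - z_m) / se
    return z, 2 * norm.sf(abs(z))  # z-statistic, two-sided p

# Hypothetical: a risk factor correlating r = .25 with 3-month PTSD
# severity in women (n = 1800) and r = .40 in men (n = 1100)
z, p = compare_correlations(0.25, 1800, 0.40, 1100)
print(f"z = {z:.2f}, p = {p:.4f}")
```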
Conclusions
Our findings indicate mechanisms to which men might be particularly vulnerable, demonstrating that known PTSD risk factors might behave differently in women and men. Analyses did not identify any risk factors to which women were more vulnerable than men, suggesting that additional mechanisms are needed to explain women's higher PTSD risk. Our study illustrates the need for a more systematic examination of sex differences in contributors to PTSD severity after trauma, which may inform refined preventive interventions.
Understanding characteristics of healthcare personnel (HCP) with SARS-CoV-2 infection supports the development and prioritization of interventions to protect this important workforce. We report detailed characteristics of HCP who tested positive for SARS-CoV-2 from April 20, 2020 through December 31, 2021.
Methods:
CDC collaborated with Emerging Infections Program sites in 10 states to interview HCP with SARS-CoV-2 infection (case-HCP) about their demographics, underlying medical conditions, healthcare roles, exposures, personal protective equipment (PPE) use, and COVID-19 vaccination status. We grouped case-HCP by healthcare role. To describe residential social vulnerability, we merged geocoded HCP residential addresses with CDC/ATSDR Social Vulnerability Index (SVI) values at the census tract level. We defined highest and lowest SVI quartiles as high and low social vulnerability, respectively.
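The residential-SVI linkage described above amounts to a tract-level join plus a quartile cut; here is a minimal pandas sketch under assumed column names (the real analysis used geocoded addresses and the full CDC/ATSDR SVI file):

```python
import pandas as pd

# Assumed inputs: case-HCP records keyed to census tract FIPS codes,
# and overall SVI percentile ranks per tract (column names hypothetical)
hcp = pd.DataFrame({"case_id": range(8),
                    "tract_fips": [f"0100102010{i}" for i in range(8)]})
svi = pd.DataFrame({"tract_fips": [f"0100102010{i}" for i in range(8)],
                    "svi_overall": [0.91, 0.12, 0.55, 0.78,
                                    0.33, 0.05, 0.67, 0.99]})

merged = hcp.merge(svi, on="tract_fips", how="left")
# Highest quartile = high social vulnerability, lowest = low
merged["svi_group"] = pd.qcut(merged["svi_overall"], 4,
                              labels=["low", "q2", "q3", "high"])
print(merged[["case_id", "svi_group"]])
```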
Results:
Our analysis included 7,531 case-HCP. Most case-HCP with roles as certified nursing assistant (CNA) (444, 61.3%), medical assistant (252, 65.3%), or home healthcare worker (HHW) (225, 59.5%) reported their race and ethnicity as either non-Hispanic Black or Hispanic. More than one third of HHWs (166, 45.2%), CNAs (283, 41.7%), and medical assistants (138, 37.9%) reported a residential address in the high social vulnerability category. The proportion of case-HCP who reported using recommended PPE at all times when caring for patients with COVID-19 was lowest among HHWs compared with other roles.
Conclusions:
To mitigate SARS-CoV-2 infection risk in healthcare settings, infection prevention and control interventions should be specific to HCP roles and educational backgrounds. Additional interventions are needed to address high social vulnerability among HHWs, CNAs, and medical assistants.
Identifying long-term care facility (LTCF)-exposed inpatients is important for infection control research and practice, but ascertaining LTCF exposure is challenging. In a large validation study, electronic health record data fields identified 76% of LTCF-exposed patients compared to manual chart review.
Cohort studies demonstrate that people who later develop schizophrenia, on average, present with mild cognitive deficits in childhood and endure a decline in adolescence and adulthood. Yet tremendous heterogeneity exists during the course of psychotic disorders, including the prodromal period. Individuals identified as being in this period (known as CHR-P) are at heightened risk for developing psychosis (~35%) and begin to exhibit cognitive deficits. Cognitive impairments in CHR-P (as a singular group) appear to be relatively stable or to ameliorate over time, although a sizeable proportion has been described as declining on measures related to processing speed or verbal learning. The purpose of this analysis is to use data-driven approaches to identify latent subgroups among CHR-P based on cognitive trajectories. This will yield a clearer understanding of the timing and presentation of both general and domain-specific deficits.
Participants and Methods:
Participants included 684 young people at CHR-P (ages 12–35) from the second cohort of the North American Prodrome Longitudinal Study (NAPLS). Performance on the MATRICS Consensus Cognitive Battery (MCCB) and the Wechsler Abbreviated Scale of Intelligence (WASI-I) was assessed at baseline, 12, and 24 months. Tested MCCB domains included verbal learning, speed of processing, working memory, and reasoning & problem-solving. Sex- and age-based norms were utilized. The Oral Reading subtest of the Wide Range Achievement Test (WRAT4) indexed premorbid IQ at baseline. Latent class mixture models were used to identify distinct trajectories of cognitive performance across two years. One- to five-class solutions were compared to determine the best-fitting solution, based on goodness-of-fit metrics, interpretability of latent trajectories, and proportion of subgroup membership (>5%).
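As a simplified stand-in for the class-enumeration step, the sketch below fits Gaussian mixtures with 1-5 components to complete three-visit score vectors and compares BIC. This is not the full growth mixture model used in the study (no polynomial growth factors or missing-data handling), just an illustration of how candidate solutions are compared; all values are simulated.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Simulated scores at baseline, 12, and 24 months (rows = participants)
rng = np.random.default_rng(42)
scores = np.vstack([rng.normal(50, 10, (500, 3)),   # unimpaired majority
                    rng.normal(30, 10, (180, 3))])  # impaired subgroup

for k in range(1, 6):  # 1- to 5-class solutions
    gmm = GaussianMixture(n_components=k, n_init=5, random_state=0).fit(scores)
    share = np.bincount(gmm.predict(scores), minlength=k) / len(scores)
    print(f"{k} classes: BIC = {gmm.bic(scores):.0f}, "
          f"smallest class = {share.min():.1%}")  # retain solutions > 5%
```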
Results:
A one-class solution was found for WASI-I Full-Scale IQ, as people at CHR-P predominantly demonstrated an average IQ that increased gradually over time. For individual domains, one-class solutions also best fit the trajectories for speed of processing, verbal learning, and working memory domains. Two distinct subgroups were identified on one of the executive functioning domains, reasoning and problem-solving (NAB Mazes). The sample divided into unimpaired performance with mild improvement over time (Class I, 74%) and persistent performance two standard deviations below average (Class II, 26%). Between these classes, no significant differences were found for biological sex, age, years of education, or likelihood of conversion to psychosis (OR = 1.68, 95% CI 0.86 to 3.14). Individuals assigned to Class II did demonstrate a lower WASI-I IQ at baseline (96.3 vs. 106.3) and a lower premorbid IQ (100.8 vs. 106.2).
Conclusions:
Youth at CHR-P demonstrate relatively homogeneous trajectories across time in terms of general cognition and most individual domains. In contrast, two distinct subgroups were observed on a task involving planning and foresight, and these subgroups notably exist independent of conversion outcome. Overall, these findings replicate and extend results from a recently published latent class analysis that examined 12-month trajectories among CHR-P using a different cognitive battery (Allott et al., 2022). Findings inform which individuals at CHR-P may be most likely to benefit from cognitive remediation and can inform about the substrates of deficits by establishing meaningful subtypes.
Late Life Major Depressive Disorder (LLD) and Hoarding Disorder (HD) are common in older adults, with prevalence estimates up to 29% and 7%, respectively. Both LLD and HD are characterized by executive dysfunction and disability. There is evidence of overlapping neurobiological dysfunction in LLD and HD, suggesting potential for compounded executive dysfunction and disability in the context of comorbid HD and LLD. Yet the prevalence of HD in primary presenting LLD has not been examined, and the potential compounded impact on executive functioning, disability, and treatment response remains unknown. Thus, the present study aimed to determine the prevalence of co-occurring HD in primary presenting LLD and to examine hoarding symptom severity as a contributor to executive dysfunction, disability, and response to treatment for LLD.
Participants and Methods:
Eighty-three adults ages 65-90 participating in a psychotherapy study for LLD completed measures of hoarding symptom severity (Savings Inventory-Revised: SI-R), executive functioning (WAIS-IV Digit Span, Letter-Number Sequencing, Coding; Stroop Interference; Trail Making Test-Part B; Letter Fluency), functional ability (World Health Organization Disability Assessment Schedule-II-Short), and depression severity (Hamilton Depression Rating Scale) at post-treatment. Pearson's chi-squared tests evaluated group differences in cognitive and functional impairment rates and depression treatment response between participants with (LLD+HD) and without (LLD-only) clinically significant hoarding symptoms. Linear regressions examined the associations of hoarding symptom severity with executive function performance and functional ability, with participant age, years of education, gender, and concurrent depression severity included as covariates.
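A sketch of the regression specification described above, using statsmodels on simulated data; all variable names (sir_total, hamd, etc.) and values are assumptions standing in for the study's measures.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 83
df = pd.DataFrame({
    "sir_total": rng.uniform(0, 80, n),       # SI-R hoarding severity
    "age": rng.integers(65, 91, n),
    "educ_years": rng.integers(8, 21, n),
    "female": rng.integers(0, 2, n),
    "hamd": rng.uniform(0, 30, n),            # depression severity
})
# Simulated outcome with a mild negative hoarding effect
df["digit_span"] = 10 - 0.05 * df["sir_total"] + rng.normal(0, 2, n)

model = smf.ols("digit_span ~ sir_total + age + educ_years + female + hamd",
                data=df).fit()
print(model.summary().tables[1])  # per-covariate betas, t, and p values
```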
Results:
At post-treatment, 24.1% (20/83) of participants with LLD met criteria for clinically significant hoarding symptoms (SI-R ≥ 41). Relative to LLD-only, the LLD+HD group demonstrated greater impairment rates in Letter-Number Sequencing (χ2(1)=4.0, p=.045) and Stroop Interference (χ2(1)=4.8, p=.028). Greater hoarding symptom severity was associated with poorer executive functioning performance on Digit Span (t(71)=-2.4, β=-0.07, p=.019), Letter-Number Sequencing (t(70)=-2.1, β=-0.05, p=.044), and Letter Fluency (t(71)=-2.8, β=-0.24, p=.006). Rates of functional impairment were significantly higher in the LLD+HD group (88.0%) compared to the LLD-only group (62.3%) (χ2(1)=5.41, p=.020). Additionally, higher hoarding symptom severity was related to greater disability (t(72)=2.97, β=0.13, p=.004). Furthermore, depression treatment response rates were significantly lower in the LLD+HD group at 24.0% (6/25) compared to 48.3% (28/58) in the LLD-only group (χ2(1)=4.26, p=.039).
Conclusions:
The present study is among the first to report prevalence of clinically significant hoarding symptoms in primary presenting LLD. The findings of 24.1% co-occurrence of HD in primary presenting LLD and increased burden on executive functioning, disability, and depression treatment outcomes have important implications for intervention and prevention efforts. Hoarding symptoms are likely under-evaluated, and thus may be overlooked, in clinical settings where LLD is identified as the primary diagnosis. Taken together with results indicating poorer depression treatment response in LLD+HD, these findings underscore the need for increased screening of hoarding behaviors in LLD and tailored interventions for this LLD+HD group. Future work examining the course of hoarding symptomatology in LLD (e.g., onset age of hoarding behaviors) may provide insights into the mechanisms associated with greater executive dysfunction and disability.
The GINI project investigates the dynamics of inequality among populations over the long term by synthesising global archaeological housing data. This project brings archaeologists together from around the world to assess hypotheses concerning the causes and consequences of inequality that are of relevance to contemporary societies globally.
Emergency departments are high-risk settings for severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) surface contamination. Environmental surface samples were obtained in rooms with patients suspected of having COVID-19 who did or did not undergo aerosol-generating procedures (AGPs). SARS-CoV-2 RNA surface contamination was most frequent in rooms occupied by coronavirus disease 2019 (COVID-19) patients who received no AGPs.
Clinical implementation of risk calculator models in the clinical high-risk for psychosis (CHR-P) population has been hindered by heterogeneous risk distributions across study cohorts, which could be attributed to pre-ascertainment illness progression. To examine this, we tested whether the duration of attenuated psychotic symptom (APS) worsening prior to baseline moderated performance of the North American Prodrome Longitudinal Study 2 (NAPLS2) risk calculator. We also examined whether rates of cortical thinning, another marker of illness progression, bolstered clinical prediction models.
Methods
Participants from both the NAPLS2 and NAPLS3 samples were classified as either ‘long’ or ‘short’ symptom duration based on time since APS increase prior to baseline. The NAPLS2 risk calculator model was applied to each of these groups. In a subset of NAPLS3 participants who completed follow-up magnetic resonance imaging scans, change in cortical thickness was combined with the individual risk score to predict conversion to psychosis.
Results
The risk calculator models achieved similar performance across the combined NAPLS2/NAPLS3 sample [area under the curve (AUC) = 0.69], the long duration group (AUC = 0.71), and the short duration group (AUC = 0.71). The shorter duration group was younger and had higher baseline APS than the longer duration group. The addition of cortical thinning improved the prediction of conversion significantly for the short duration group (AUC = 0.84), with a moderate improvement in prediction for the longer duration group (AUC = 0.78).
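In spirit, the augmented model adds the neuroimaging slope to the clinical risk score and re-scores discrimination; below is a toy sketch with simulated data (predictor names, effect sizes, and the in-sample AUC shortcut are all assumptions, not the study's validated procedure):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(7)
n = 300
risk_score = rng.uniform(0, 1, n)             # baseline calculator output
thinning = rng.normal(-0.01, 0.02, n)         # annualized cortical change
logit = 3 * risk_score - 50 * thinning - 2.5  # more thinning -> higher risk
converted = rng.binomial(1, 1 / (1 + np.exp(-logit)))

for name, X in [("risk score only", risk_score.reshape(-1, 1)),
                ("risk score + thinning",
                 np.column_stack([risk_score, thinning]))]:
    clf = LogisticRegression().fit(X, converted)
    aucv = roc_auc_score(converted, clf.predict_proba(X)[:, 1])
    print(f"{name}: AUC = {aucv:.2f}")
```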
Conclusions
These results suggest that early illness progression differs among CHR-P patients, is detectable with both clinical and neuroimaging measures, and could play an essential role in the prediction of clinical outcomes.
The U.S. Department of Agriculture–Agricultural Research Service (USDA-ARS) has been a leader in weed science research covering topics ranging from the development and use of integrated weed management (IWM) tactics to basic mechanistic studies, including biotic resistance of desirable plant communities and herbicide resistance. ARS weed scientists have worked in agricultural and natural ecosystems, including agronomic and horticultural crops, pastures, forests, wild lands, aquatic habitats, wetlands, and riparian areas. Through strong partnerships with academia, state agencies, private industry, and numerous federal programs, ARS weed scientists have made contributions to discoveries in the newest fields of robotics and genetics, as well as the traditional and fundamental subjects of weed–crop competition and physiology and integration of weed control tactics and practices. Weed science at ARS is often overshadowed by other research topics; thus, few are aware of the long history of ARS weed science and its important contributions. This review is the result of a symposium held at the Weed Science Society of America’s 62nd Annual Meeting in 2022 that included 10 separate presentations in a virtual Weed Science Webinar Series. The overarching themes of management tactics (IWM, biological control, and automation), basic mechanisms (competition, invasive plant genetics, and herbicide resistance), and ecosystem impacts (invasive plant spread, climate change, conservation, and restoration) represent core ARS weed science research that is dynamic and efficacious and has been a significant component of the agency’s national and international efforts. This review highlights current studies and future directions that exemplify the science and collaborative relationships both within and outside ARS. Given the constraints of weeds and invasive plants on all aspects of food, feed, and fiber systems, there is an acknowledged need to face new challenges, including agriculture and natural resources sustainability, economic resilience and reliability, and societal health and well-being.
Several hypotheses may explain the association between substance use, posttraumatic stress disorder (PTSD), and depression. However, few studies have utilized a large multisite dataset to understand this complex relationship. Our study assessed the relationship between alcohol and cannabis use trajectories and PTSD and depression symptoms across 3 months in recently trauma-exposed civilians.
Methods
In total, 1618 (1037 female) participants provided self-report data on past 30-day alcohol and cannabis use and PTSD and depression symptoms during their emergency department (baseline) visit. We reassessed participants' substance use and clinical symptoms 2, 8, and 12 weeks posttrauma. Latent class mixture modeling determined alcohol and cannabis use trajectories in the sample. Changes in PTSD and depression symptoms were assessed across alcohol and cannabis use trajectories via mixed-model repeated-measures analysis of variance.
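For the symptom-change step, here is a minimal statsmodels sketch of a class-by-time model on simulated long-format data; the abstract specifies a mixed-model repeated-measures ANOVA, so a random-intercept linear mixed model is used as a close stand-in (all names and values are illustrative).

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
rows = []
for pid in range(200):
    cls = rng.choice(["low", "high", "increasing"])
    base = {"low": 20.0, "high": 40.0, "increasing": 30.0}[cls]
    for week in (0, 2, 8, 12):  # baseline plus follow-up visits
        rows.append({"pid": pid, "week": week, "use_class": cls,
                     "ptsd": base + 0.3 * week + rng.normal(0, 5)})
df = pd.DataFrame(rows)

# Random intercept per participant; the class x week interaction tests
# whether symptom change over time differs across use trajectories
m = smf.mixedlm("ptsd ~ C(use_class) * week", df, groups=df["pid"]).fit()
print(m.summary())
```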
Results
Three trajectory classes (low, high, and increasing use) provided the best model fit for both alcohol and cannabis use. The low alcohol use class exhibited lower PTSD symptoms at baseline than the high use class; the low cannabis use class exhibited lower PTSD and depression symptoms at baseline than the high and increasing use classes; these symptoms greatly increased at week 8 and declined at week 12. Participants who already used alcohol and cannabis exhibited greater PTSD and depression symptoms at baseline, which increased at week 8 and decreased at week 12.
Conclusions
Our findings suggest that alcohol and cannabis use trajectories are associated with the intensity of posttrauma psychopathology. These findings could potentially inform the timing of therapeutic strategies.