Background: Anti-CD20 monoclonal antibodies are highly effective for the treatment of relapsing multiple sclerosis (RMS). Ocrelizumab (OCR) is standard, while rituximab (RTX) is an alternative. The impact of anti-CD20 therapies on immune markers remains understudied, though deficiencies are frequently observed and have been associated with increased infection risk. Our objective was to characterize and compare lymphocyte, neutrophil, and immunoglobulin levels in OCR- versus RTX-treated persons with RMS. Methods: This retrospective chart review included RMS patients on OCR or RTX (2017–2023). Pre- and post-treatment levels of lymphocytes, neutrophils, and immunoglobulins (IgG, IgA, IgM) were analyzed. Kaplan-Meier curves, log-rank tests, and Cox proportional hazards models were used for survival analysis. Results: 350 patients (OCR = 175, RTX = 175) were included. Mean treatment length was 60.9 (SD 19.1) months for OCR and 42.7 (SD 19.5) months for RTX. RTX was associated with a significantly shorter time to IgM deficiency (29.6 vs. 40.0 months, p = 0.02). Cox analysis confirmed that RTX increased the risk of IgM deficiency (HR = 1.54, 95% CI: 1.06–2.23, p = 0.02). No differences were seen for lymphocytes, neutrophils, IgG, or IgA. Conclusions: RTX was associated with a shorter time to and increased risk of IgM hypogammaglobulinemia compared with OCR, highlighting the importance of long-term monitoring. Further research is needed to guide treatment decisions.
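As a rough illustration of the survival workflow named in the abstract above (Kaplan-Meier curves, a log-rank test, and a Cox proportional hazards model), the following Python sketch uses the lifelines library. It is not the study's code: the file name and columns (months_to_igm_deficiency, igm_deficiency, drug) are hypothetical.

```python
# Minimal sketch of a time-to-IgM-deficiency analysis; all data/columns are hypothetical.
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter
from lifelines.statistics import logrank_test

df = pd.read_csv("rms_cohort.csv")  # hypothetical file
ocr = df[df["drug"] == "OCR"]
rtx = df[df["drug"] == "RTX"]

# Kaplan-Meier curves per treatment group
km = KaplanMeierFitter()
for name, grp in (("OCR", ocr), ("RTX", rtx)):
    km.fit(grp["months_to_igm_deficiency"], grp["igm_deficiency"], label=name)
    print(name, "median time to IgM deficiency:", km.median_survival_time_)

# Log-rank test comparing the two groups
lr = logrank_test(
    ocr["months_to_igm_deficiency"], rtx["months_to_igm_deficiency"],
    event_observed_A=ocr["igm_deficiency"], event_observed_B=rtx["igm_deficiency"],
)
print("log-rank p =", lr.p_value)

# Cox proportional hazards model with treatment as a covariate
cph = CoxPHFitter()
cph.fit(
    df.assign(rtx=(df["drug"] == "RTX").astype(int))[
        ["months_to_igm_deficiency", "igm_deficiency", "rtx"]
    ],
    duration_col="months_to_igm_deficiency",
    event_col="igm_deficiency",
)
cph.print_summary()  # the hazard ratio for rtx corresponds to the kind of HR reported
```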
Background: Amyotrophic Lateral Sclerosis (ALS) leads to progressive functional decline and reduced survival. Identifying clinical predictors like ALSFRS-R and FVC is essential for prognosis and disease management. Understanding progression profiles based on diagnostic characteristics supports clinical trial design and assessment of treatment response. This study evaluates disease progression and survival predictors in ALS patients from the CNDR. Methods: 1565 ALS patients in the CNDR were analyzed to assess baseline ALSFRS-R, FVC, time from symptom onset to diagnosis, and their association with disease progression and survival. Results: At diagnosis, ALSFRS-R was 44.7 (SD = 5.46), with 72.3% scoring ≥44. Mean FVC was 84.2% (SD = 23.3), with 78.3% of patients having FVC ≥65%. ALSFRS-R declined at 1.06 points/month (SD = 1.33), with faster progression in patients diagnosed within 24 months (1.61 points/month). Patients with ALSFRS-R ≥44 had a median survival of 41.8 months, compared to 30.9 months for those <44 (p < 0.001). Similarly, FVC ≥65% was associated with longer survival (35.4 vs. 29.5 months, p = 0.002). Conclusions: ALSFRS-R and FVC at diagnosis predict survival and inform clinical decision-making. These findings highlight the importance of early diagnosis and targeted interventions to slow disease progression and improve patient outcomes.
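The decline rate quoted above (points/month on the ALSFRS-R) is the kind of per-patient slope that can be estimated from repeated scores over time; a minimal sketch follows, assuming hypothetical long-format visit data rather than the CNDR dataset.

```python
# Minimal sketch of per-patient ALSFRS-R decline rates; columns are hypothetical.
import numpy as np
import pandas as pd

visits = pd.read_csv("als_visits.csv")  # hypothetical file with patient_id, months_since_diagnosis, alsfrs_r

def slope_per_month(g: pd.DataFrame) -> float:
    """Least-squares slope of ALSFRS-R against time, in points per month."""
    if len(g) < 2:
        return np.nan
    return np.polyfit(g["months_since_diagnosis"], g["alsfrs_r"], deg=1)[0]

decline = visits.groupby("patient_id").apply(slope_per_month)
# Slopes are negative for declining scores; flip the sign to report decline as positive.
print("Mean decline (points/month):", -decline.mean())
```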
Diagnosing HIV-Associated Neurocognitive Disorders (HAND) requires attributing neurocognitive impairment and functional decline at least partly to HIV-related brain effects. Depressive symptom severity, whether attributable to HIV or not, may influence self-reported functioning. We examined longitudinal relationships among objective global cognition, depressive symptom severity, and self-reported everyday functioning in people with HIV (PWH).
Methods:
Longitudinal data from 894 PWH were collected at a university-based research center (2002–2016). Participants completed self-report measures of everyday functioning to assess both dependence in instrumental activities of daily living (IADL) and subjective cognitive difficulties at each visit, along with depressive symptom severity (BDI-II). Multilevel modeling examined within- and between-person predictors of self-reported everyday functioning outcomes.
Results:
Participants averaged 6 visits over 5 years. Multilevel regression showed a significant interaction between visit-specific global cognitive performance and mean depressive symptom severity on the likelihood of dependence in IADL (p = 0.04), such that the within-person association between worse cognition and greater likelihood of IADL dependence was strongest among individuals with lower mean depressive symptom severity. In contrast, participants with higher mean depressive symptom severity had a higher likelihood of IADL dependence regardless of cognition. Multilevel modeling of subjective cognitive difficulties showed no significant interaction between global cognition and mean depressive symptom severity (p > 0.05).
Conclusions:
The findings indicate a link between cognitive abilities and IADL dependence in PWH with low to moderate depressive symptom severity. However, those with higher depressive symptom severity report IADL dependence regardless of cognitive status. This is clinically significant because everyday functioning is measured through self-report rather than performance-based assessments.
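The within- versus between-person effects described above are typically obtained by person-mean centering the time-varying predictors before fitting a multilevel model. The sketch below illustrates that decomposition with hypothetical column names; it fits a random-intercept model for the continuous outcome only, since the binary IADL outcome would require a mixed-effects logistic model instead.

```python
# Minimal sketch of a within/between-person decomposition for multilevel modeling;
# file and columns (pid, global_cognition, bdi_ii, subjective_difficulties) are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("pwh_long.csv")  # hypothetical long-format data, one row per visit

# Between-person component: each participant's mean across visits
person_means = df.groupby("pid")[["global_cognition", "bdi_ii"]].transform("mean")
df["cog_between"] = person_means["global_cognition"]
df["bdi_between"] = person_means["bdi_ii"]

# Within-person component: visit-specific deviation from the person's own mean
df["cog_within"] = df["global_cognition"] - df["cog_between"]
df["bdi_within"] = df["bdi_ii"] - df["bdi_between"]

# Random-intercept model; the cog_within:bdi_between term mirrors the
# cognition-by-mean-depression interaction described in the abstract.
model = smf.mixedlm(
    "subjective_difficulties ~ cog_within * bdi_between + cog_between + bdi_within",
    data=df,
    groups=df["pid"],
).fit()
print(model.summary())
```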
Threat sensitivity, an individual difference construct reflecting variation in responsiveness to threats of various types, predicts physiological reactivity to aversive stimuli and shares heritable variance with anxiety disorders in adults. However, no research has been conducted yet with youth to examine the heritability of threat sensitivity or evaluate the role of genetic versus environmental influences in its relations with mental health problems. The current study addressed this gap by evaluating the psychometric properties of a measure of this construct, the 20-item Trait Fear scale (TF-20), and examining its phenotypic and genotypic correlations with different forms of psychopathology in a sample of 346 twin pairs (121 monozygotic), aged 9–14 years. Analyses revealed high internal consistency and test-retest reliability for the TF-20. Evidence was also found for its convergent and discriminant validity in terms of phenotypic and genotypic correlations with measures of fear-related psychopathology. By contrast, the TF-20’s associations with depressive conditions were largely attributable to environmental influences. Extending prior work with adults, current study findings provide support for threat sensitivity as a genetically-influenced liability for phobic fear disorders in youth.
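As background for the genetic analyses summarized above, Falconer's classical twin formulas give a quick approximation of the variance components that a full biometric (ACE) model estimates from MZ and DZ twin correlations. The sketch below uses made-up correlations purely for illustration, not values from the study.

```python
# Minimal sketch of Falconer's classical twin estimates (an approximation to ACE modeling).
def falconer_ace(r_mz: float, r_dz: float) -> dict:
    """Approximate additive genetic (a2), shared-environment (c2),
    and non-shared-environment (e2) variance components."""
    a2 = 2 * (r_mz - r_dz)   # heritability
    c2 = 2 * r_dz - r_mz     # shared environment
    e2 = 1 - r_mz            # non-shared environment + measurement error
    return {"a2": a2, "c2": c2, "e2": e2}

# Illustrative (made-up) intraclass correlations of TF-20 scores within twin pairs
print(falconer_ace(r_mz=0.60, r_dz=0.30))  # {'a2': 0.6, 'c2': 0.0, 'e2': 0.4}
```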
Multicenter clinical trials are essential for evaluating interventions but often face significant challenges in study design, site coordination, participant recruitment, and regulatory compliance. To address these issues, the National Institutes of Health’s National Center for Advancing Translational Sciences established the Trial Innovation Network (TIN). The TIN offers a scientific consultation process, providing access to clinical trial and disease experts who provide input and recommendations throughout the trial’s duration, at no cost to investigators. This approach aims to improve trial design, accelerate implementation, foster interdisciplinary teamwork, and spur innovations that enhance multicenter trial quality and efficiency. The TIN leverages resources of the Clinical and Translational Science Awards (CTSA) program, complementing local capabilities at the investigator’s institution. The Initial Consultation process focuses on the study’s scientific premise, design, site development, recruitment and retention strategies, funding feasibility, and other support areas. As of 6/1/2024, the TIN has provided 431 Initial Consultations to increase efficiency and accelerate trial implementation by delivering customized support and tailored recommendations. Across a range of clinical trials, the TIN has developed standardized, streamlined, and adaptable processes. We describe these processes, provide operational metrics, and include a set of lessons learned for consideration by other trial support and innovation networks.
Avocado is a delicious fruit crop of great economic importance. Understanding the extent of variability present in the existing germplasm is important for identifying genotypes with specific traits and utilizing them in crop improvement. Information on genetic variability with respect to morphological and biochemical traits in Indian avocados is limited, which has hindered genetic improvement of the crop. In the current study, 83 avocado accessions from different regions of India were assessed for 17 important morphological and 8 biochemical traits. The results showed wide variability for traits such as fruit weight (75.88–934.12 g), pulp weight (48.08–736.19 g), seed weight (6.37–32.62 g), FRAP activity (27.65–119.81 mg AEAC/100 g), total carotenoids (0.96–7.17 mg/100 g), oil content (4.91–25.49%) and crude fibre (6.85–20.75%) across the studied accessions. The first three principal components explained 54.79 per cent of the total variance. Traits such as fruit weight, pulp weight, seed weight, moisture and oil content contributed more to the total variance than other traits. The dendrogram constructed using Euclidean distance and Ward's minimum-variance method divided the 83 accessions into two major groups and nine sub-clusters, suggesting wide variability in the accessions with respect to the studied traits. In this study, superior accessions for important traits such as fruit size (PA-102, PA-012), high pulp recovery (PA-036, PA-082), thick peel (PA-084, PA-043, PA-011, PA-008), high carotenoids (PA-026, PA-096) and high oil content (PA-044, PA-043, PA-046, PA-045) were identified, which have potential utility in further crop improvement programmes.
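A minimal sketch of the multivariate analysis described above (PCA on standardized traits, then Ward's minimum-variance clustering on Euclidean distances) is given below; the input file and trait table are hypothetical stand-ins, not the study's data.

```python
# Minimal sketch: PCA plus Ward's hierarchical clustering on standardized trait data.
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from scipy.cluster.hierarchy import linkage, fcluster

traits = pd.read_csv("avocado_traits.csv", index_col="accession")  # hypothetical: rows = accessions, columns = traits
X = StandardScaler().fit_transform(traits)

# Principal component analysis: variance explained by the first three components
pca = PCA(n_components=3).fit(X)
print("Explained variance (PC1-PC3):", pca.explained_variance_ratio_.sum())

# Trait loadings indicate which traits contribute most to each component
loadings = pd.DataFrame(pca.components_.T, index=traits.columns,
                        columns=["PC1", "PC2", "PC3"])
print(loadings.abs().sort_values("PC1", ascending=False).head())

# Ward's minimum-variance hierarchical clustering on Euclidean distances
Z = linkage(X, method="ward")                     # dendrogram linkage matrix
groups = fcluster(Z, t=2, criterion="maxclust")   # cut into two major groups
print(pd.Series(groups, index=traits.index).value_counts())
```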
Recent changes to US research funding are having far-reaching consequences that imperil the integrity of science and the provision of care to vulnerable populations. Resisting these changes, the BJPsych Portfolio reaffirms its commitment to publishing mental science and advancing psychiatric knowledge that improves the mental health of one and all.
The First Large Absorption Survey in H i (FLASH) is a large-area radio survey for neutral hydrogen in and around galaxies in the intermediate redshift range $0.4\lt z\lt1.0$, using the 21-cm H i absorption line as a probe of cold neutral gas. The survey uses the ASKAP radio telescope and will cover 24,000 deg$^2$ of sky over the next five years. FLASH breaks new ground in two ways – it is the first large H i absorption survey to be carried out without any optical preselection of targets, and we use an automated Bayesian line-finding tool to search through large datasets and assign a statistical significance to potential line detections. Two Pilot Surveys, covering around 3000 deg$^2$ of sky, were carried out in 2019-22 to test and verify the strategy for the full FLASH survey. The processed data products from these Pilot Surveys (spectral-line cubes, continuum images, and catalogues) are public and available online. In this paper, we describe the FLASH spectral-line and continuum data products and discuss the quality of the H i spectra and the completeness of our automated line search. Finally, we present a set of 30 new H i absorption lines that were robustly detected in the Pilot Surveys, almost doubling the number of known H i absorption systems at $0.4\lt z\lt1$. The detected lines span a wide range in H i optical depth, including three lines with a peak optical depth $\tau\gt1$, and appear to be a mixture of intervening and associated systems. Interestingly, around two-thirds of the lines found in this untargeted sample are detected against sources with a peaked-spectrum radio continuum, which are only a minor (5–20%) fraction of the overall radio-source population. The detection rate for H i absorption lines in the Pilot Surveys (0.3 to 0.5 lines per 40 deg$^2$ ASKAP field) is a factor of two below the expected value. One possible reason for this is the presence of a range of spectral-line artefacts in the Pilot Survey data that have now been mitigated and are not expected to recur in the full FLASH survey. A future paper in this series will discuss the host galaxies of the H i absorption systems identified here.
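For context, the peak optical depths quoted above relate to the observed absorption depth through a standard radiative-transfer relation (not specific to FLASH): with $\Delta S$ the depth of the absorption feature, $S_{\rm cont}$ the background continuum flux density and $c_f$ the covering factor, $\tau_{\rm peak} = -\ln\left(1 - \Delta S/(c_f\,S_{\rm cont})\right)$, and the H i column density follows from the velocity-integrated optical depth as $N_{\rm HI} \simeq 1.823\times10^{18}\,(T_{\rm spin}/c_f)\int\tau(v)\,\mathrm{d}v\ \mathrm{cm^{-2}}$ (with $v$ in km s$^{-1}$), so lines with $\tau\gt1$ imply substantial columns of cold gas for plausible spin temperatures.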
The Australian SKA Pathfinder (ASKAP) offers powerful new capabilities for studying the polarised and magnetised Universe at radio wavelengths. In this paper, we introduce the Polarisation Sky Survey of the Universe’s Magnetism (POSSUM), a groundbreaking survey with three primary objectives: (1) to create a comprehensive Faraday rotation measure (RM) grid of up to one million compact extragalactic sources across the southern $\sim50$% of the sky (20,630 deg$^2$); (2) to map the intrinsic polarisation and RM properties of a wide range of discrete extragalactic and Galactic objects over the same area; and (3) to contribute interferometric data with excellent surface brightness sensitivity, which can be combined with single-dish data to study the diffuse Galactic interstellar medium. Observations for the full POSSUM survey commenced in May 2023 and are expected to conclude by mid-2028. POSSUM will achieve an RM grid density of around 30–50 RMs per square degree with a median measurement uncertainty of $\sim$1 rad m$^{-2}$. The survey operates primarily over a frequency range of 800–1088 MHz, with an angular resolution of 20” and a typical RMS sensitivity in Stokes Q or U of 18 $\mu$Jy beam$^{-1}$. Additionally, the survey will be supplemented by similar observations covering 1296–1440 MHz over 38% of the sky. POSSUM will enable the discovery and detailed investigation of magnetised phenomena in a wide range of cosmic environments, including the intergalactic medium and cosmic web, galaxy clusters and groups, active galactic nuclei and radio galaxies, the Magellanic System and other nearby galaxies, galaxy halos and the circumgalactic medium, and the magnetic structure of the Milky Way across a very wide range of scales, as well as the interplay between these components. This paper reviews the current science case developed by the POSSUM Collaboration and provides an overview of POSSUM’s observations, data processing, outputs, and its complementarity with other radio and multi-wavelength surveys, including future work with the SKA.
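The rotation measures referred to above are defined by the linear dependence of polarization angle on wavelength squared, $\chi(\lambda^2) = \chi_0 + {\rm RM}\,\lambda^2$. The short Python sketch below is purely illustrative (a noiseless simulated source and a simple linear fit, not the POSSUM RM-synthesis/QU-fitting pipeline), with an assumed 800–1088 MHz band.

```python
# Minimal sketch: recover a rotation measure as the slope of polarization angle vs lambda^2.
import numpy as np

c = 299_792_458.0                                  # speed of light, m/s
freqs = np.linspace(800e6, 1088e6, 288)            # assumed POSSUM-like band, Hz
lam2 = (c / freqs) ** 2                            # wavelength squared, m^2

# Synthetic source with a known RM, for illustration only
rm_true, chi0 = 25.0, 0.3                          # rad/m^2, rad
chi = chi0 + rm_true * lam2
q, u = np.cos(2 * chi), np.sin(2 * chi)            # noiseless fractional Stokes Q/U

# Recover the polarization angle and fit RM as the slope versus lambda^2
chi_obs = 0.5 * np.arctan2(u, q)
chi_unwrapped = np.unwrap(2 * chi_obs) / 2         # undo the n*pi ambiguity
rm_fit, chi0_fit = np.polyfit(lam2, chi_unwrapped, 1)
print(f"RM = {rm_fit:.2f} rad/m^2 (true {rm_true})")
```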
A key step toward understanding psychiatric disorders that disproportionately impact female mental health is delineating the emergence of sex-specific patterns of brain organisation at the critical transition from childhood to adolescence. Prior work suggests that individual differences in the spatial organisation of functional brain networks across the cortex are associated with psychopathology and differ systematically by sex.
Aims
We aimed to evaluate the impact of sex on the spatial organisation of person-specific functional brain networks.
Method
We leveraged person-specific atlases of functional brain networks, defined using non-negative matrix factorisation, in a sample of n = 6437 youths from the Adolescent Brain Cognitive Development Study. Across independent discovery and replication samples, we used generalised additive models to uncover associations between sex and the spatial layout (topography) of personalised functional networks (PFNs). We also trained support vector machines to classify participants’ sex from multivariate patterns of PFN topography.
Results
Sex differences in PFN topography were greatest in association networks including the frontoparietal, ventral attention and default mode networks. Machine learning models trained on participants’ PFNs were able to classify participant sex with high accuracy.
Conclusions
Sex differences in PFN topography are robust, and replicate across large-scale samples of youth. These results suggest a potential contributor to the female-biased risk in depressive and anxiety disorders that emerge at the transition from childhood to adolescence.
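A minimal sketch of the classification step described above (a linear support vector machine with cross-validation over multivariate PFN topography features) is shown below; the feature matrix and labels are random placeholders, not ABCD Study data.

```python
# Minimal sketch: cross-validated linear SVM classification of sex from PFN features.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import LinearSVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 1000))        # placeholder participants x PFN-loading matrix
y = rng.integers(0, 2, size=500)        # placeholder binary sex labels

clf = make_pipeline(StandardScaler(), LinearSVC(C=1.0, max_iter=5000))
acc = cross_val_score(clf, X, y, cv=5, scoring="accuracy")
print("Cross-validated accuracy: %.2f ± %.2f" % (acc.mean(), acc.std()))
```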
Neuropsychological disorders, including anxiety, depression, and dementia, are significant public health problems among older adults. While psychotropics are effective treatments, long-term treatment often has adverse side effects(1). Many patients seek healthy food consumption as an alternative preventive strategy. Dietary fibre has been suggested to confer many health benefits, including improved cardiometabolic health and reduced inflammation, which may influence neurological health through the gut–brain axis(2). However, fibre's role in neuropsychological health outcomes in older people is unclear. This study examined the potential role of dietary fibre intake and consumption of fibre-rich foods in neurological health outcomes in older Australians. We utilised data from the Sydney Memory and Ageing Study (MAS) of 1,037 participants aged 70–90(3). At baseline, consumption of dietary fibre, whole grains, fresh fruit, vegetables, and nuts and legumes was estimated using the Cancer Council of Victoria food frequency questionnaire. Intake amounts were further divided into tertiles (T), with T1 the lowest and T3 the highest tertile. Depressive symptoms (Geriatric Depression Scale), anxiety symptoms (Goldberg Anxiety Scale), and psychological distress (Kessler Psychological Distress Scale) were assessed. Linear regression models were used to estimate beta coefficients for the cross-sectional associations. Incident dementia was defined using diagnostic criteria, clinical assessments, and consensus panel review. Nine hundred and sixty-three participants were followed from baseline (2005) until wave 4 (2011) [median: 5.8 (IQR: 3.1–5.9) years; 97 incident cases]. Incident depression was defined as diagnosis by a healthcare professional and treatment for depression. Eight hundred and nine participants were followed from baseline (2005) until wave 3 (2009) [median: 3.9 (IQR: 1.9–4.0) years; 109 incident cases]. Cox proportional hazards models were used to estimate hazard ratios (95% CIs). All models were adjusted for demographic characteristics, lifestyle factors, and health history. Among 963 participants (mean age: 78.5; 5.8% females) in the cross-sectional analysis, higher vegetable intake, compared with T1, was associated with fewer depressive symptoms (T2: β = 0.52; T3: β = −0.53; both p < 0.05), less psychological distress (T2: β = −0.59; T3: β = −1.13; both p < 0.05), and fewer anxiety symptoms (T3: β = −0.37; p = 0.03). Combined intake of vegetables and fruit was associated with fewer psychological distress symptoms (T2: β = −0.55; p = 0.06; T3: β = −1.3; p < 0.05). In the highest tertile, dietary fibre was associated with fewer depressive symptoms (T3: β = −0.47; p = 0.04). In the longitudinal analysis, dietary fibre intake was associated with a 43–56% lower risk of incident dementia (T2 vs T1: adj.HR = 0.57; 95% CI: 0.31–1.03; T3 vs T1: adj.HR = 0.44; 95% CI: 0.19–1.01). Intakes of whole grains, fruit, nuts and legumes were not associated with the outcomes assessed. In this cohort of older Australians, dietary fibre intake appeared to be protective against depressive symptoms cross-sectionally and against incident dementia longitudinally. Additionally, vegetable consumption was associated with fewer symptoms of depression, anxiety, and distress cross-sectionally.
Depression and dementia represent significant public health issues, affecting approximately 1 in 10 and 1 in 12 older Australians, respectively. While current pharmacological treatments are effective in relieving symptoms, they often entail undesirable adverse effects, including gastrointestinal issues and bradycardia(1,2). This highlights the need for primary preventative measures, including food- and nutrition-based approaches. Chronic brain inflammation is believed to interfere with the gut–brain axis(3). Consumption of fermented dairy products rich in beneficial gut microbes may attenuate this inflammation and offer protective health benefits. This study aimed to examine whether fermented dairy intake could mitigate the risk of incident depression and dementia. Using data from the Sydney Memory and Ageing Study I of 1037 participants aged 70–90 years, we followed 816 participants (mean age: 76.7) from 2005 until 2012 for incident depression and 974 participants (mean age: 80.7) from 2005 until 2014 for incident dementia. Fermented dairy intake was assessed using the Dietary Questionnaire for Epidemiological Studies version 2; yoghurt and regular cheese intakes were categorised into quartiles (Q) and low-fat cheese into consumers/non-consumers, with no consumption as the reference group. Depression diagnoses were assessed via self-reported physician-diagnosed history, medication use, service utilisation, and heavy alcohol use. Dementia diagnoses followed the criteria in the fourth edition of the Diagnostic and Statistical Manual of Mental Disorders. Cox proportional hazards models examined the associations between fermented dairy intake and the risk of incident depression/dementia. Additionally, linear regression models were applied to assess depressive symptom scores (measured by the Geriatric Depression Scale-15) and psychological distress scores (measured by the Kessler Psychological Distress Scale-10). All models were adjusted for sociodemographic factors, lifestyle factors, and medical histories. Over median follow-ups of 3.9 and 5.8 years, 120 incident depression and 100 incident dementia cases occurred, respectively. High yoghurt intake (Q4: 145.8–437.4 g/day) and low-fat cheese consumption (0.4–103.1 g/day) were both associated with a lower risk of incident depression compared with non-consumers (yoghurt: adj.HR: 0.38, 95% CI: 0.19–0.77; low-fat cheese: adj.HR: 0.50; 95% CI: 0.29–0.86), and with lower depressive symptom scores (yoghurt: adj.β = −0.46; 95% CI: −0.84, −0.07; low-fat cheese: adj.β = −0.42; 95% CI: −0.73, −0.11). However, those with a higher intake of regular cheese (Q4: 14.7–86.1 g/day) had an elevated risk of incident depression (adj.HR: 1.88; 95% CI: 1.02, 3.47), and those in Q2 (0.1–7.2 g/day) had significantly higher depressive symptom scores (adj.β = 0.42; 95% CI: 0.05, 0.78). No significant associations were found for psychological distress scores or incident dementia. Our findings in a cohort of older Australians suggest that higher yoghurt and low-fat cheese intakes may reduce the risk of incident depression and depressive symptoms, while a higher intake of regular cheese may increase these risks.
Improving neonatal piglet survival is key to improving pig production and enhancing animal welfare. Gestational diabetes is a risk factor for neonatal morbidities in humans, such as hypoglycaemia and respiratory distress(1). There is limited knowledge on the association of gestational diabetes with neonatal survival in commercial pigs. An early study suggested that the diabetic condition of late-gestating sows was positively correlated with first-week newborn piglet mortality(2). Genetic selection in recent decades for heavier birth weight may have increased the prevalence or severity of gestational diabetes in pigs, given the positive correlation between gestational diabetes and birth weight. We hypothesised that the diabetic condition of late-gestating sows is positively correlated with the neonatal piglet mortality rate in sows of modern genetics. Mixed-parity sows (parity 1.5 ± 1.6, mean ± standard deviation (SD); Large White × Landrace) from a commercial piggery in Australia were randomly selected to undergo an oral glucose tolerance test (OGTT) during two seasons (118 sows in winter and 118 in summer). On day 109 of gestation, sows were fed 3.0 g dextrose per kg of metabolic body weight after overnight fasting. Tail blood glucose concentrations were measured using a glucometer (Accu-Chek®, Roche Diabetes Care Australia Pty) at −10, 0, 10, 20, 30, 40, 50, 60, 70, 80, 90, 105 and 120 minutes relative to dextrose feeding. The glucose increment (2.5 ± 1.29 mM, mean ± SD) during the OGTT was calculated as the maximum concentration minus the fasting concentration of blood glucose. The 24-hour piglet mortality rate (5% ± 8.8%, mean ± SD) was calculated as the ratio of piglets that died during the first 24 hours to the total number born alive, on a litter basis. The effects of sow glucose increment, season (winter vs summer), glucose increment × season, number of piglets born alive, and sow parity on the 24-hour piglet mortality rate were analysed using a Generalised Linear Model (SPSS version 27, IBM SPSS Statistics, Armonk). The 24-hour piglet mortality rate was numerically higher in winter than in summer, although the difference was not significant (5.7% vs 4.2%, p = 0.41). The glucose increment of gestating sows was positively correlated with the 24-hour piglet mortality rate in winter but not in summer, as evidenced by a trend toward an interaction between glucose increment and season (p = 0.059). The regression coefficient suggested that every extra unit (mM) of glucose increment during the OGTT corresponded to a 1.4% increase in the 24-hour piglet mortality rate in winter. In conclusion, the diabetic condition of late-gestating sows is a risk factor for neonatal piglet mortality in winter. Developing nutritional strategies to mitigate the diabetic condition of late-gestating sows may benefit neonatal piglet survival.
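A minimal sketch of the two derived variables and the interaction model described above is given below, with hypothetical column names; the original analysis used SPSS, and because the GLM family/link is not stated, an ordinary least-squares model is used here as a stand-in.

```python
# Minimal sketch: glucose increment, litter mortality rate, and the increment-by-season model.
import pandas as pd
import statsmodels.formula.api as smf

sows = pd.read_csv("ogtt_sows.csv")  # hypothetical file
timepoints = [f"glucose_{t}" for t in (-10, 0, 10, 20, 30, 40, 50, 60, 70, 80, 90, 105, 120)]

# Glucose increment: maximum post-dextrose concentration minus the fasting (pre-feeding)
# concentration; time 0 is taken here as the fasting sample (an assumption).
sows["glucose_increment"] = sows[timepoints].max(axis=1) - sows["glucose_0"]

# 24-hour piglet mortality rate on a litter basis
sows["mortality_24h"] = sows["died_24h"] / sows["born_alive"]

# Linear model with the glucose-increment-by-season interaction and covariates
fit = smf.ols(
    "mortality_24h ~ glucose_increment * season + born_alive + parity",
    data=sows,
).fit()
print(fit.summary())
```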
In Australia and other high-income countries, communities are experiencing diet-related diseases due to social inequities and food systems that promote the production and consumption of unhealthy foods(1). Community food hubs have the potential to strengthen local food systems and improve access to healthy, affordable, culturally appropriate food by selling local food to local people(2). The primary aim of this rapid review was to identify short- and medium-term outcomes and long-term impacts associated with community food hubs. In January 2024, four databases and the grey literature were searched for relevant studies and reports published in English between 2013 and 2023. Empirical evaluations of food hubs in high-income countries that included a physical market selling healthy local food were eligible for inclusion. A narrative synthesis was conducted, and descriptive statistics were used to summarise outcomes and impacts under five categories: economic development and viability; ecological sustainability; access to and demand for healthy local food; personal and community wellbeing; and agency and re-localisation of power(3,4). A total of 16 studies/reports were included, reporting on 24 community food hubs (USA n = 16; Australia n = 7; Canada n = 1). Food hubs were often described as farmers’ markets (n = 9, 37% of food hubs), some of which offered financial incentives/subsidies to people living on low incomes. Some food hubs also sold food wholesale and/or provided nutrition education and community gardens. Across the 24 food hubs, a total of 83 short- and medium-term outcomes were assessed. No long-term impacts were evaluated. Outcomes were considered ‘positive’ if evaluation results reflected desirable changes. Overall, 86% of outcomes were positive (n = 71). Within the personal and community wellbeing category, 42 outcomes were assessed, and 83% (n = 35) were positive (e.g., increased fruit and vegetable consumption, increased community connection). Within the access to and demand for healthy local food category, 25 outcomes were assessed, and 96% (n = 24) were positive (e.g., increased access to and/or demand for affordable local produce). Outcomes under the remaining three categories were assessed less frequently. Within the economic development and viability category, 6 outcomes were assessed, and 50% (n = 3) were positive (e.g., access to new markets for food hub suppliers). Within the ecological sustainability category, 6 outcomes were assessed, and 100% (n = 6) were positive (e.g., reduction in food packaging and food waste). Within the agency and re-localisation of power category, 4 outcomes were assessed, and 75% (n = 3) were positive (e.g., integration of community members from low income and cultural minority groups into local food systems). Community food hubs can promote personal and community wellbeing, access to and demand for healthy local food, economic development and viability, ecological sustainability, and agency and re-localisation of power. Future research should focus on methods for evaluating long-term impacts under all five categories.
Posttraumatic stress disorder (PTSD) has been associated with advanced epigenetic age cross-sectionally, but the association between these variables over time is unclear. This study conducted meta-analyses to test whether new-onset PTSD diagnosis and changes in PTSD symptom severity over time were associated with changes in two metrics of epigenetic aging over two time points.
Methods
We conducted meta-analyses of the association between change in PTSD diagnosis and symptom severity and change in epigenetic age acceleration/deceleration (age-adjusted DNA methylation age residuals as per the Horvath and GrimAge metrics) using data from 7 military and civilian cohorts participating in the Psychiatric Genomics Consortium PTSD Epigenetics Workgroup (total N = 1,367).
Results
Meta-analysis revealed that the interaction between Time 1 (T1) Horvath age residuals and new-onset PTSD over time was significantly associated with Horvath age residuals at T2 (meta β = 0.16, meta p = 0.02, p-adj = 0.03). The interaction between T1 Horvath age residuals and changes in PTSD symptom severity over time was significantly related to Horvath age residuals at T2 (meta β = 0.24, meta p = 0.05). No associations were observed for GrimAge residuals.
Conclusions
Results indicated that individuals who developed new-onset PTSD or showed increased PTSD symptom severity over time evidenced greater epigenetic age acceleration at follow-up than would be expected based on baseline age acceleration. This suggests that PTSD may accelerate biological aging over time and highlights the need for intervention studies to determine if PTSD treatment has a beneficial effect on the aging methylome.
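A minimal sketch of the two analysis steps summarized above (age-adjusting DNA methylation age within each cohort, testing the baseline-residual-by-PTSD-change interaction, then combining cohort estimates by inverse-variance meta-analysis) is given below; cohort files and column names are hypothetical, not the PGC workgroup pipeline.

```python
# Minimal sketch: per-cohort interaction betas and a fixed-effect inverse-variance meta-analysis.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

def cohort_beta(df: pd.DataFrame):
    """Return (beta, se) for the T1-residual x new-onset-PTSD interaction in one cohort."""
    df = df.copy()
    # Age-adjusted DNAm age residuals at each time point
    df["resid_t1"] = smf.ols("dnam_age_t1 ~ age_t1", data=df).fit().resid
    df["resid_t2"] = smf.ols("dnam_age_t2 ~ age_t2", data=df).fit().resid
    fit = smf.ols("resid_t2 ~ resid_t1 * new_onset_ptsd", data=df).fit()
    term = "resid_t1:new_onset_ptsd"
    return fit.params[term], fit.bse[term]

def fixed_effect_meta(betas, ses):
    """Inverse-variance weighted meta-analytic beta and standard error."""
    w = 1.0 / np.asarray(ses) ** 2
    beta = np.sum(w * np.asarray(betas)) / np.sum(w)
    return beta, np.sqrt(1.0 / np.sum(w))

# cohorts = [pd.read_csv(p) for p in cohort_files]      # hypothetical per-cohort data
# results = [cohort_beta(c) for c in cohorts]
# meta_beta, meta_se = fixed_effect_meta(*zip(*results))
```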
Objectives/Goals: To explore caregivers' lived experiences related to facilitators of and barriers to effective primary care or neurology follow-up for children discharged from the pediatric emergency department (PED) with headaches. Methods/Study Population: We used a descriptive phenomenological qualitative study design to ascertain caregivers' lived experiences with making follow-up appointments after their child's PED visit. We conducted semi-structured interviews with caregivers of children with headaches from 4 large urban PEDs over a HIPAA-compliant Zoom conferencing platform. A facilitator/co-facilitator team (JH and SL) guided all interviews, the audio of which was transcribed using TRINT software. Conventional content analysis was performed by two coders (JH and AS) to generate themes, and coding disputes were resolved by team members using Atlas.ti (version 24). Results/Anticipated Results: We interviewed a total of 11 caregivers (9 mothers, 1 grandmother, and 1 father). Among interviewees, 45% identified as White non-Hispanic, 45% as Hispanic, and 9% as African-American; 37% were publicly insured. Participants described similar experiences in obtaining follow-up care, including long waits for neurology appointments. Participants also described opportunities to overcome wait times, such as offering alternative healthcare provider types as well as telehealth options. Last, participants described desired actions while awaiting neurology appointments, such as obtaining testing and setting treatment plans. Discussion/Significance of Impact: Caregivers perceived the time to appointment as too long and identified practical solutions to ease frustrations while waiting. Future research should explore sharing caregiver experiences with primary care providers, PED physicians, and neurologists while developing plans to implement caregiver-informed interventions.
The Early Minimally Invasive Removal of Intracerebral Hemorrhage (ENRICH) trial demonstrated that minimally invasive surgery to treat spontaneous lobar intracerebral hemorrhage (ICH) improved functional outcomes. We aimed to explore current management trends for spontaneous lobar ICH in Canada to assess practice patterns and determine whether further randomized controlled trials are needed to clarify the role of surgical intervention.
Methods:
Neurologists, neurosurgeons, physiatrists and trainees in these specialties were invited to complete a 16-question survey exploring three areas: (1) current management for spontaneous lobar ICH at their institution, (2) perceived influence of ENRICH on their practice and (3) perceived need for additional clinical trial data. Standard descriptive statistics were used to report categorical variables. The χ2 test was used to compare responses across specialties and career stages.
Results:
The survey was sent to 433 physicians, and 101 (23.3%) responded. Sixty-eight percent of participants reported that prior to publication of the ENRICH trial, spontaneous lobar ICH was primarily managed conservatively, with surgery reserved for life-threatening situations. Forty-three percent of participants did not foresee a significant increase in surgical intervention at their institution. Of neurosurgical respondents, 33% remained hesitant to offer surgical intervention beyond lifesaving operations. Only 5% reported routinely using specifically designed technologies to evacuate ICH. Seventy percent reported that another randomized controlled trial comparing nonsurgical to surgical management for spontaneous lobar ICH is needed.
Conclusions:
There is significant practice variability in the management of spontaneous lobar ICH across Canadian institutions, stressing the need for additional clinical trial data to determine the role of surgical intervention.
Duchenne muscular dystrophy is a devastating neuromuscular disorder characterized by the loss of dystrophin, inevitably leading to cardiomyopathy. Despite publications on prophylaxis and treatment with cardiac medications to mitigate cardiomyopathy progression, gaps remain in the specifics of medication initiation and optimization.
Method:
This document is an expert opinion statement, addressing a critical gap in cardiac care for Duchenne muscular dystrophy. It provides thorough recommendations for the initiation and titration of cardiac medications based on disease progression and patient response. Recommendations are derived from the expertise of the Advanced Cardiac Therapies Improving Outcomes Network and are informed by established guidelines from the American Heart Association, American College of Cardiology, and Duchenne Muscular Dystrophy Care Considerations. These expert-derived recommendations aim to navigate the complexities of Duchenne muscular dystrophy-related cardiac care.
Results:
Comprehensive recommendations for initiation, titration, and optimization of critical cardiac medications are provided to address Duchenne muscular dystrophy-associated cardiomyopathy.
Discussion:
The management of Duchenne muscular dystrophy requires a multidisciplinary approach. However, the diversity of healthcare providers involved in Duchenne muscular dystrophy care can result in variations in cardiac management, complicating treatment standardization and patient outcomes. The aim of this report is to provide a roadmap for managing Duchenne muscular dystrophy-associated cardiomyopathy by elucidating the timing and dosage nuances crucial for optimal therapeutic efficacy, ultimately improving cardiac outcomes and quality of life for individuals with Duchenne muscular dystrophy.
Conclusion:
This document seeks to establish a standardized framework for cardiac care in Duchenne muscular dystrophy, aiming to improve cardiac prognosis.