Posttraumatic stress disorder (PTSD) has been associated with advanced epigenetic age cross-sectionally, but the association between these variables over time is unclear. This study conducted meta-analyses to test whether new-onset PTSD diagnosis and changes in PTSD symptom severity over time were associated with changes in two metrics of epigenetic aging over two time points.
Methods
We conducted meta-analyses of the association between change in PTSD diagnosis and symptom severity and change in epigenetic age acceleration/deceleration (age-adjusted DNA methylation age residuals as per the Horvath and GrimAge metrics) using data from 7 military and civilian cohorts participating in the Psychiatric Genomics Consortium PTSD Epigenetics Workgroup (total N = 1,367).
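The age-acceleration metric used here (age-adjusted DNA methylation age residuals) can be sketched as a simple least-squares residualization. This is a minimal illustration, not the study's pipeline; the toy ages below are hypothetical.

```python
def age_acceleration_residuals(dnam_age, chrono_age):
    """Epigenetic age acceleration: residuals from regressing
    DNA-methylation age (e.g., Horvath or GrimAge) on chronological age."""
    n = len(chrono_age)
    mx = sum(chrono_age) / n
    my = sum(dnam_age) / n
    sxx = sum((x - mx) ** 2 for x in chrono_age)
    sxy = sum((x - mx) * (y - my) for x, y in zip(chrono_age, dnam_age))
    slope = sxy / sxx
    intercept = my - slope * mx
    # Positive residual = epigenetic age acceleration; negative = deceleration
    return [y - (intercept + slope * x) for x, y in zip(chrono_age, dnam_age)]

# Hypothetical chronological and DNAm ages for five participants
chrono = [35.0, 42.0, 50.0, 58.0, 65.0]
dnam = [38.2, 41.0, 53.5, 56.9, 67.0]
print([round(r, 2) for r in age_acceleration_residuals(dnam, chrono)])
```

By construction, the residuals sum to zero and are uncorrelated with chronological age, which is why they capture aging faster or slower than expected for one's age.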
Results
Meta-analysis revealed that the interaction between Time 1 (T1) Horvath age residuals and new-onset PTSD over time was significantly associated with Horvath age residuals at T2 (meta β = 0.16, meta p = 0.02, p-adj = 0.03). The interaction between T1 Horvath age residuals and changes in PTSD symptom severity over time was significantly related to Horvath age residuals at T2 (meta β = 0.24, meta p = 0.05). No associations were observed for GrimAge residuals.
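A minimal sketch of how per-cohort estimates like these can be pooled, assuming a fixed-effect inverse-variance meta-analysis; the per-cohort betas and standard errors below are hypothetical, not the study's data.

```python
import math

def fixed_effect_meta(betas, ses):
    """Inverse-variance-weighted fixed-effect meta-analysis.

    Returns the pooled beta, its standard error, and a two-sided p-value
    (normal approximation).
    """
    weights = [1.0 / se ** 2 for se in ses]
    pooled_beta = sum(w * b for w, b in zip(weights, betas)) / sum(weights)
    pooled_se = math.sqrt(1.0 / sum(weights))
    z = pooled_beta / pooled_se
    # Two-sided p-value from the standard normal survival function
    p = 2.0 * (1.0 - 0.5 * (1.0 + math.erf(abs(z) / math.sqrt(2.0))))
    return pooled_beta, pooled_se, p

# Hypothetical interaction betas and SEs for seven cohorts
betas = [0.21, 0.12, 0.18, 0.09, 0.25, 0.14, 0.11]
ses = [0.10, 0.08, 0.12, 0.07, 0.15, 0.09, 0.11]
b, se, p = fixed_effect_meta(betas, ses)
print(f"meta beta = {b:.3f}, SE = {se:.3f}, p = {p:.4f}")
```

Precisely estimated cohorts (small SEs) dominate the pooled estimate; a random-effects model would additionally widen the SE for between-cohort heterogeneity.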
Conclusions
Results indicated that individuals who developed new-onset PTSD or showed increased PTSD symptom severity over time evidenced greater epigenetic age acceleration at follow-up than would be expected based on baseline age acceleration. This suggests that PTSD may accelerate biological aging over time and highlights the need for intervention studies to determine if PTSD treatment has a beneficial effect on the aging methylome.
Objectives/Goals: We describe the prevalence of individuals with household exposure to SARS-CoV-2 who subsequently report symptoms consistent with COVID-19 while having PCR results persistently negative for SARS-CoV-2 (S[+]/P[-]). We assess whether paired serology can assist in identifying the true infection status of such individuals. Methods/Study Population: In a multicenter household transmission study, index patients with SARS-CoV-2 were identified and enrolled together with their household contacts within 1 week of the index’s illness onset. For 10 consecutive days, enrolled individuals provided daily symptom diaries and nasal specimens for polymerase chain reaction (PCR) testing. Contacts were categorized into 4 groups based on the presence of symptoms (S[+/-]) and PCR positivity (P[+/-]). Acute and convalescent blood specimens from these individuals (30 days apart) were subjected to quantitative serologic analysis for SARS-CoV-2 anti-nucleocapsid, spike, and receptor-binding domain antibodies. The antibody change in S[+]/P[-] individuals was assessed against thresholds derived from receiver operating characteristic (ROC) analysis of S[+]/P[+] (infected) versus S[-]/P[-] (uninfected) individuals. Results/Anticipated Results: Among 1,433 contacts, 67% had ≥1 SARS-CoV-2 PCR[+] result, while 33% remained PCR[-]. Among the latter, 55% (n = 263) reported symptoms for at least 1 day, most commonly congestion (63%), fatigue (63%), headache (62%), cough (59%), and sore throat (50%). A history of both previous infection and vaccination was present in 37% of S[+]/P[-] individuals, 38% of S[-]/P[-], and 21% of S[+]/P[+] (P < 0.05). Vaccination alone was present in 37%, 41%, and 52%, respectively. ROC analyses of paired serologic testing of S[+]/P[+] (n = 354) vs. S[-]/P[-] (n = 103) individuals found that anti-nucleocapsid data had the highest area under the curve (0.87).
Based on the 30-day antibody change, 6.9% of S[+]/P[-] individuals demonstrated an increased convalescent antibody signal, although a similar seroresponse was observed in 7.8% of the S[-]/P[-] group. Discussion/Significance of Impact: Reporting respiratory symptoms was common among household contacts with persistent PCR[-] results. Paired serology analyses found similar seroresponses between S[+]/P[-] and S[-]/P[-] individuals. The symptomatic-but-PCR-negative phenomenon, while frequent, is unlikely to be attributable to true SARS-CoV-2 infections missed by PCR.
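The ROC-based thresholding step can be illustrated with a small self-contained sketch. The antibody fold-change values below are hypothetical, and the Youden-J cutoff is one common choice, not necessarily the rule the study applied.

```python
def roc_auc(pos_scores, neg_scores):
    """AUC as the probability that a random positive outranks a random
    negative (equivalent to the Mann-Whitney U statistic)."""
    wins = 0.0
    for p in pos_scores:
        for n in neg_scores:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(pos_scores) * len(neg_scores))

def best_threshold(pos_scores, neg_scores):
    """Pick the cutoff maximizing Youden's J = sensitivity + specificity - 1."""
    best_t, best_j = None, -1.0
    for t in sorted(set(pos_scores) | set(neg_scores)):
        sens = sum(p >= t for p in pos_scores) / len(pos_scores)
        spec = sum(n < t for n in neg_scores) / len(neg_scores)
        j = sens + spec - 1.0
        if j > best_j:
            best_t, best_j = t, j
    return best_t, best_j

# Hypothetical 30-day anti-nucleocapsid fold-changes
infected = [4.1, 3.3, 5.0, 2.8, 6.2, 3.9]    # S[+]/P[+]
uninfected = [1.0, 1.2, 0.9, 1.4, 1.1, 2.9]  # S[-]/P[-]
print(roc_auc(infected, uninfected))
print(best_threshold(infected, uninfected))
```

The derived cutoff is then applied to the S[+]/P[-] group to ask whether their seroresponse rate exceeds that of uninfected controls.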
Inappropriate diagnosis and treatment of urinary tract infections (UTIs) contribute to antibiotic overuse. The Inappropriate Diagnosis of UTI (ID-UTI) measure uses a standard definition of asymptomatic bacteriuria (ASB) and was validated in large hospitals. Critical access hospitals (CAHs) have different resources, which may make ASB stewardship challenging. To address this inequity, we adapted the ID-UTI metric for use in CAHs and assessed the adapted measure’s feasibility, validity, and reliability.
Design:
Retrospective observational study
Participants:
10 CAHs
Methods:
From October 2022 to July 2023, CAHs submitted clinical information for adults admitted to or discharged from the emergency department who received antibiotics for a positive urine culture. Feasibility of case submission was assessed as the number of CAHs achieving the goal of 59 cases. Validity (sensitivity/specificity) and reliability of the ID-UTI definition were assessed by dual-physician review of a random sample of submitted cases.
Results:
Among the 10 CAHs able to participate throughout the study period, only 40% (4/10) submitted >59 cases (goal); an additional 3 submitted >35 cases (secondary goal). Per the ID-UTI metric, 28% (16/58) of cases were ASB. Compared to physician review, the ID-UTI metric had 100% specificity (ie, all cases called ASB were ASB on clinical review) but poor sensitivity (48.5%; ie, it did not identify all ASB cases). Measure reliability was high (93% [54/58] agreement).
Conclusions:
Similar to measure performance in non-CAHs, the ID-UTI measure had high reliability and specificity—all cases identified as ASB were considered ASB—but poor sensitivity. Though feasible for a subset of CAHs, barriers remain.
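The validity figures above follow directly from a 2x2 comparison against physician review. The counts below are reconstructed from the reported rates (16/58 flagged as ASB, 48.5% sensitivity, 100% specificity) and are illustrative rather than taken verbatim from the study.

```python
def diagnostic_performance(tp, fp, fn, tn):
    """Sensitivity and specificity of a measure against a reference standard."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return sensitivity, specificity

# Reconstructed counts: 33 ASB per physician review, 16 caught by the metric,
# no false positives among the 25 non-ASB cases
sens, spec = diagnostic_performance(tp=16, fp=0, fn=17, tn=25)
print(f"sensitivity = {sens:.1%}, specificity = {spec:.1%}")
# → sensitivity = 48.5%, specificity = 100.0%
```

High specificity with low sensitivity means every case the metric flags is a true ASB case, but roughly half of ASB cases go unflagged, which matches the stated trade-off.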
Asymptomatic bacteriuria (ASB) treatment is a common form of antibiotic overuse and diagnostic error. Antibiotic stewardship using the inappropriate diagnosis of urinary tract infection (ID-UTI) measure has reduced ASB treatment in diverse hospitals. However, critical access hospitals (CAHs) have differing resources that could impede stewardship. We aimed to determine if stewardship including the ID-UTI measure could reduce ASB treatment in CAHs.
Methods:
From October 2022 to July 2023, ten CAHs participated in an Intensive Quality Improvement Cohort (IQIC) program including 3 interventions to reduce ASB treatment: 1) learning labs (ie, didactics with shared learning), 2) mentoring, and 3) data-driven performance reports including hospital peer comparison based on the ID-UTI measure. To assess effectiveness of the IQIC program, change in the ID-UTI measure (ie, percentage of patients treated for a UTI who had ASB) was compared to two non-equivalent control outcomes (antibiotic duration and unjustified fluoroquinolone use).
Results:
Ten CAHs abstracted a total of 608 positive urine culture cases. Over the cohort period, the percentage of patients treated for a UTI who had ASB declined (aOR per month = 0.935, 95% CI: 0.873, 1.001, P = 0.055) from 28.4% (range across hospitals, 0%-63%) in the first to 18.6% (range, 0%-33%) in the final month. In contrast, antibiotic duration and unjustified fluoroquinolone use were unchanged (P = 0.768 and 0.567, respectively).
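As a rough consistency check on the reported trend, a per-month adjusted odds ratio can be compounded on the odds scale; projecting the 28.4% first-month rate forward 9 months at aOR = 0.935 gives about 17.8%, close to the observed final-month 18.6%. This back-of-envelope sketch ignores covariate adjustment and clustering.

```python
def project_proportion(p0, or_per_month, months):
    """Project a proportion forward under a constant per-month odds ratio,
    as implied by a logistic trend model: odds scale, then back-transform."""
    odds = p0 / (1.0 - p0) * or_per_month ** months
    return odds / (1.0 + odds)

# Reported: aOR = 0.935 per month, 28.4% ASB treatment in the first month;
# 9 monthly steps separate the first and final months of Oct 2022-Jul 2023
print(round(project_proportion(0.284, 0.935, 9), 3))
# → 0.178
```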
Conclusions:
The IQIC intervention, including learning labs, mentoring, and performance reports using the ID-UTI measure, was associated with a non-significant decrease in treatment of ASB, while control outcomes (duration and unjustified fluoroquinolone use) did not change.
Complications following the Fontan procedure include prolonged pleural drainage and readmission for effusions. To address these complications, a post-Fontan management pathway was implemented with primary goals of reducing chest tube duration/reinsertion rates and decreasing hospital length of stay and readmissions.
Methods:
Fontan patients were identified by retrospective chart review (2017–2019) to obtain baseline data for chest tube duration/reinsertion rates, hospital length of stay, and readmission rates for effusion. A post-Fontan management pathway was implemented (2020–2021) utilising post-operative vasopressin, nasal cannula oxygen until chest tube removal, and a discharge regimen of three-times-daily diuretics, sildenafil, and afterload-reducing medications. Patients were followed to evaluate primary outcomes.
Results:
The pre- and post-pathway groups were similar in single-ventricle morphology, demographics, and pre-operative haemodynamics. Forty-three and 36 patients were included in the pre- and post-pathway cohorts, respectively. There were statistically significant reductions in chest tube duration (8 vs. 5 days, p ≤ 0.001), chest tube output on post-operative day 4 (20.4 vs. 9.9 mL/kg/day, p = 0.003), and hospital readmission rates for effusion (13 [30%] vs. 3 [8%], p = 0.02) compared to baseline. There was an absolute reduction in hospital length of stay (11 vs. 9.5 days, p = 0.052). When combining average cost savings for the Fontan hospitalisations, readmissions for effusion, and cardiac catheterisations within 6 months of Fontan completion, there was a $325,144 total cost savings for the 36 patients following pathway implementation.
Conclusion:
Implementation of a post-Fontan management pathway resulted in significant reductions in chest tube duration and output, and readmission rates for effusion in the perioperative period.
Diagnostic criteria for major depressive disorder allow for heterogeneous symptom profiles but genetic analysis of major depressive symptoms has the potential to identify clinical and etiological subtypes. There are several challenges to integrating symptom data from genetically informative cohorts, such as sample size differences between clinical and community cohorts and various patterns of missing data.
Methods
We conducted genome-wide association studies of major depressive symptoms in three cohorts that were enriched for participants with a diagnosis of depression (Psychiatric Genomics Consortium, Australian Genetics of Depression Study, Generation Scotland) and three community cohorts who were not recruited on the basis of diagnosis (Avon Longitudinal Study of Parents and Children, Estonian Biobank, and UK Biobank). We fit a series of confirmatory factor models with factors that accounted for how symptom data was sampled and then compared alternative models with different symptom factors.
Results
The best fitting model had a distinct factor for Appetite/Weight symptoms and an additional measurement factor that accounted for the skip-structure in community cohorts (use of Depression and Anhedonia as gating symptoms).
Conclusion
The results show the importance of assessing the directionality of symptoms (such as hypersomnia versus insomnia) and of accounting for study and measurement design when meta-analyzing genetic association data.
Epilepsy is one of the most common serious brain illnesses; its symptoms are influenced by multiple risk factors and a strong genetic predisposition rather than having a single expression and cause¹. Neuropsychiatric symptoms in epilepsy can encompass manifestations such as mood alterations, anxiety, sleep disturbances, psychosis, and behavioral disorders. While the motor and sensory manifestations of epileptic seizures are widely recognized, the neuropsychiatric symptoms accompanying epilepsy are often underestimated. It is therefore essential to understand the epidemiological profile of these patients to improve the diagnosis and management of these symptoms.
Objectives
Our goal was to evaluate the neuropsychiatric behavior of epilepsy patients in Brazil over the past 3 years through hospitalization data in order to outline an epidemiological and behavioral profile.
Methods
A cross-sectional, descriptive, retrospective, and quantitative study was conducted on hospitalizations of individuals simultaneously diagnosed with epilepsy, schizotypal and delusional disorders, and mood disorders in all five regions of Brazil (South, Southeast, Midwest, North, and Northeast) between February 2020 and December 2022. Data from January 2020 were not available. The data used were collected through the Department of Health Informatics of the Brazilian Unified Health System (DATASUS) in the “Hospital Information System of SUS” section, gathering information regarding the nature of care, age range, gender, and ethnicity of the patients.
Results
The analysis covers the years 2020 to 2022, totaling 503,045 hospitalizations. In 2022, the highest number of cases occurred (≈ 37.55%), followed by 2021 (≈ 33.62%) and 2020 (≈ 28.81%). Urgent hospitalizations represented ≈ 90.85% of the total. The most affected age group was 30 to 39 years old (≈ 18.30%). Men were more affected than women (≈ 52.03% and ≈ 47.96%, respectively), and Caucasians accounted for ≈ 36.07% of the hospitalizations. The average length of stay was 19.1 days, and the mortality rate was 1.4%.
Conclusions
Thus, there was a gradual, annual increase in the number of hospitalizations during the observed period. While there is minimal disparity between the affected genders, it is evident that the profile of male, Caucasian, adult patients is the most prevalent. Moreover, the predominantly urgent nature of hospitalizations points to an alarming scenario regarding this issue. From the analysis of the data obtained in this study, there is a clear need for interventions capable of reducing the prevalence of hospitalizations for neuropsychiatric symptoms in epilepsy patients in Brazil.
Neuropsychiatric disorders are the leading cause of disability worldwide, as seen in conditions such as depression, anxiety, bipolar mood disorder, and schizophrenia, which can be caused or exacerbated by the use of psychoactive substances. Most mental disorders have an early onset, often leading to early and/or permanent disability and increasing the need for and cost of healthcare. It is therefore necessary to improve the identification of the epidemiological profile of these cases in the South of Brazil in order to enhance diagnosis and reduce the costs associated with managing these disorders.
Objectives
The present study aimed to analyze statistical data regarding hospitalizations related to mental disorders caused by the use of psychoactive substances and alcohol in the southern region of Brazil, highlighting the pathological scenario and identifying the most prevalent profiles of these disorders in this region.
Methods
A cross-sectional, descriptive, retrospective, and quantitative study was conducted on hospitalizations of individuals diagnosed with mental and behavioral disorders due to the use of psychoactive substances and alcohol in the states of the Southern region of Brazil (Paraná, Santa Catarina, and Rio Grande do Sul) between February 2020 and December 2022. Data of January 2020 were not available. The data used were collected through the Department of Health Informatics of the Brazilian Unified Health System (DATASUS) in the “Hospital Information System of SUS” section, gathering information regarding the nature of the care, age range, gender, and ethnicity of the patients.
Results
The study covers the years 2020 to 2022, indicating a total of 81,608 hospitalizations, with the year 2022 having the highest number of cases (≈ 37.13%), followed by 2021 (≈ 33.30%) and 2020 (≈ 29.55%). The states with the highest number of hospitalizations were Rio Grande do Sul (≈ 54.90%), Paraná (≈ 29.29%), and Santa Catarina (≈ 15.79%). Urgent hospitalizations accounted for ≈ 87.29% of the total. The most affected age group was 30 to 39 years old (≈ 25.61%). Men were more affected than women (≈ 81.70% and ≈ 18.28%, respectively). Caucasians accounted for ≈ 64.29% of the hospitalizations. The average length of stay was 20.8 days, and the mortality rate was 0.32%.
Conclusions
There is a clear increase in the number of hospitalizations related to mental disorders caused by the use of psychoactive substances in the period from 2020 to 2022 in the southern region of Brazil, with the highest number of cases in the state of Rio Grande do Sul. The most affected population consisted of Caucasian men aged 30 to 39 years old. Furthermore, these results may be related to the increasing trend of psychoactive substance use among the Brazilian population and also the COVID-19 pandemic, which led to a period of underreporting due to social isolation.
In recent years, mental health has gained prominence in public health, prompting thorough investigations into psychiatric condition trends. This study conducts a comprehensive epidemiological analysis of hospitalizations for Schizophrenia, Schizotypal, and Delusional Disorders in Rio Grande do Sul (RS) over the past five years. By revealing these patterns, it enhances our understanding of regional mental health dynamics and offers insights for intervention strategies, resource planning, and improved mental healthcare. The ultimate goal is to advance more effective and accessible mental healthcare in RS and beyond.
Objectives
This study aims to analyze the prevalence and epidemiological profile of hospitalizations due to psychiatric disorders to assist in the diagnosis and outcome of affected patients.
Methods
A cross-sectional, descriptive, retrospective, and quantitative study was conducted regarding hospitalizations for Schizophrenia, Schizotypal, and Delusional Disorders in the state of RS between January 2018 and November 2022. Data were collected from the Department of Informatics of the Brazilian Unified Health System (DATASUS) in the “Hospital Information System of SUS” section, focusing on the nature of care, age group, gender, and ethnicity of the patients. The information was aggregated over the five-year period based on the four mentioned descriptors and subsequently analyzed to establish a profile of hospitalizations during that period.
Results
The analysis spans from 2018 to 2022, encompassing a total of 28,345 hospitalizations. In 2019, there was the highest number of cases (22.21%), followed by 2018 (21.08%). Urgent care admissions constituted 85.34% of the total. The age group most affected was 35 to 39 years (11.8%). Men were more affected than women (60.18%), and the majority of hospitalizations were among the Caucasian ethnicity (75.12%). The average length of stay was 23.7 days, and the mortality rate stood at 0.26%.
Conclusions
The increasing trend in hospitalizations, peaking in 2019, highlights the need for preventive measures. Urgent admissions (85.34%) underscore the demand for accessible mental health resources. Men in the 35 to 39 age group are disproportionately affected, suggesting specific risk factors. The predominance of Caucasian ethnicity emphasizes the need for culturally sensitive care. A longer average length of stay (23.7 days) underscores treatment complexity, while a low mortality rate (0.26%) signals effective medical care. In essence, these findings inform tailored mental health policies to enhance service quality and prioritize patient-centered approaches.
Digital Mental Health Interventions (DMHIs) that meet the definition of a medical device are regulated by the Medicines and Healthcare products Regulatory Agency (MHRA) in the UK. The MHRA uses procedures that were originally developed for pharmaceuticals to assess the safety of DMHIs. There is recognition that this may not be ideal, as is evident by an ongoing consultation for reform led by the MHRA and the National Institute for Health and Care Excellence.
Aims
The aim of this study was to generate an experts’ consensus on how the medical regulatory method used for assessing safety could best be adapted for DMHIs.
Method
An online Delphi study containing three rounds was conducted with an international panel of 20 experts with experience/knowledge in the field of UK digital mental health.
Results
Sixty-four items were generated, of which 41 achieved consensus (64%). Consensus emerged around ten recommendations, falling into five main themes: Enhancing the quality of adverse events data in DMHIs; Re-defining serious adverse events for DMHIs; Reassessing short-term symptom deterioration in psychological interventions as a therapeutic risk; Maximising the benefit of the Yellow Card Scheme; and Developing a harmonised approach for assessing the safety of psychological interventions in general.
Conclusion
The implementation of the recommendations provided by this consensus could improve the assessment of safety of DMHIs, making them more effective in detecting and mitigating risk.
Medical researchers are increasingly prioritizing the inclusion of underserved communities in clinical studies. However, mere inclusion is not enough. People from underserved communities frequently experience chronic stress that may lead to accelerated biological aging and early morbidity and mortality. It is our hope and intent that the medical community come together to engineer improved health outcomes for vulnerable populations. Here, we introduce Health Equity Engineering (HEE), a comprehensive scientific framework to guide research on the development of tools to identify individuals at risk of poor health outcomes due to chronic stress, the integration of these tools within existing healthcare system infrastructures, and a robust assessment of their effectiveness and sustainability. HEE is anchored in the premise that strategic intervention at the individual level, tailored to the needs of the most at-risk people, can pave the way for achieving equitable health standards at a broader population level. HEE provides a scientific framework guiding health equity research to equip the medical community with a robust set of tools to enhance health equity for current and future generations.
The acid-catalyzed reaction between methanol and isobutene to give methyl-t-butyl ether may be carried out using a cation-exchanged smectite as the catalyst. In 1,4-dioxan solvent at 60°C, smectites exchanged with Al3+, Fe3+, or Cr3+ give yields of ∼60% after 4 hr, whereas smectites exchanged with Cu2+, Pb2+, Ni2+, Co2+, Ca2+, and Na+ give less than ∼8% yield. The reaction is efficient only when certain solvents are used; e.g., with Al3+-smectite the yield is ∼5% when using 1,2-dimethoxyethane, diethyleneglycol diethylether, n-pentane, tetrahydropyran, N-methylmorpholine, or tetrahydrofuran solvents, compared with ∼60% using 1,4-dioxan solvent (4 hr). Moreover, the effective solvents depend somewhat on the clay interlayer cation. The use of tetrahydrofuran and tetrahydropyran gives ∼35% yields at 60°C (4 hr) with Fe3+- or Cr3+-smectites but ∼4% yield with Al3+-smectite.
The reaction of 2-methyl pent-2-ene with primary alcohols (C1-C18) at 95°C over an Al-montmorillonite gave yields of 20–90% of ethers of the type R-O-C(CH3)2C3H7. Lower yields were produced if secondary alcohols were employed, and tertiary alcohols gave only a trace of this ether. When a variety of alkenes was reacted with butan-1-ol at 95°C over a similar catalyst, no reaction occurred unless the alkene was capable of forming a tertiary carbonium ion immediately upon protonation. In this case the product was the tertiary ether t-R-O-nC4H9. However, at a reaction temperature of 150°C a variety of products were formed including (1) ether by the attack of butanol on the carbonium ions produced either directly from protonation of the alkenes or by hydride shift from such an ion, (2) alkenes by the attack of n-C4H9+ ions (derived from protonation and dehydration of butanol) on the alkene, (3) di-(but-1-yl) ether by dehydration of the butanol, and (4) small amounts of alcohol by hydration of the alkene. The differences in reactivity below and above 100°C are related directly to the amount of water present in the interlayer space of the clay and the degree of acidity found there. Although the clay behaves as an acid catalyst, the reactions are far cleaner (more selective) than comparable reactions catalyzed by sulfuric acid.
Biodiversity monitoring programmes should be designed with sufficient statistical power to detect population change. Here we evaluated the statistical power of monitoring to detect declines in the occupancy of forest birds on Christmas Island, Australia. We fitted zero-inflated binomial models to 3 years of repeat detection data (2011, 2013 and 2015) to estimate single-visit detection probabilities for four species of concern: the Christmas Island imperial pigeon Ducula whartoni, Christmas Island white-eye Zosterops natalis, Christmas Island thrush Turdus poliocephalus erythropleurus and Christmas Island emerald dove Chalcophaps indica natalis. We combined detection probabilities with maps of occupancy to simulate data collected over the next 10 years for alternative monitoring designs and for different declines in occupancy (10–50%). Specifically, we explored how the number of sites (60, 128, 300, 500), the interval between surveys (1–5 years), the number of repeat visits (2–4 visits) and the location of sites influenced power. Power was high (> 80%) for the imperial pigeon, white-eye and thrush for most scenarios, except for when only 60 sites were surveyed or a 10% decline in occupancy was simulated over 10 years. For the emerald dove, which is the rarest of the four species and has a patchy distribution, power was low in almost all scenarios tested. Prioritizing monitoring towards core habitat for this species only slightly improved power to detect declines. Our study demonstrates how data collected during the early stages of monitoring can be analysed in simulation tools to fine-tune future survey design decisions.
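The simulation approach described can be sketched as a Monte Carlo power calculation. This simplified version uses a naive occupancy estimator and a one-sided two-proportion z-test rather than refitting the zero-inflated binomial model at each iteration, and all parameter values below are illustrative rather than taken from the study.

```python
import math
import random

def simulate_power(n_sites, psi0, decline, p_det, n_visits,
                   n_sims=500, z_crit=1.6449):
    """Monte Carlo power to detect a decline in occupancy.

    Each iteration simulates a baseline and a final survey: a site is truly
    occupied with probability psi, and an occupied site is scored as occupied
    if the species is detected on >=1 of n_visits (per-visit detection
    probability p_det). The drop in naive occupancy estimates is tested with
    a one-sided two-proportion z-test at the 5% level.
    """
    def survey(psi):
        hits = 0
        for _ in range(n_sites):
            if random.random() < psi:  # site truly occupied
                if any(random.random() < p_det for _ in range(n_visits)):
                    hits += 1
        return hits / n_sites

    psi1 = psi0 * (1.0 - decline)
    detections = 0
    for _ in range(n_sims):
        est0, est1 = survey(psi0), survey(psi1)
        pooled = (est0 + est1) / 2.0
        se = math.sqrt(2.0 * pooled * (1.0 - pooled) / n_sites)
        if se > 0 and (est0 - est1) / se > z_crit:
            detections += 1
    return detections / n_sims

random.seed(1)
# Hypothetical scenario: 128 sites, 3 visits, 50% decline over the interval
print(simulate_power(n_sites=128, psi0=0.6, decline=0.5, p_det=0.4, n_visits=3))
```

Varying `n_sites`, `n_visits`, and `decline` over a grid reproduces the kind of design comparison the study describes: power rises with sites and visits and falls for small declines or rare, patchily distributed species.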
Cognitive training is a non-pharmacological intervention aimed at improving cognitive function across a single or multiple domains. Although the underlying mechanisms of cognitive training and transfer effects are not well-characterized, cognitive training has been thought to facilitate neural plasticity to enhance cognitive performance. Indeed, the Scaffolding Theory of Aging and Cognition (STAC) proposes that cognitive training may enhance the ability to engage in compensatory scaffolding to meet task demands and maintain cognitive performance. We therefore evaluated the effects of cognitive training on working memory performance in older adults without dementia. This study will help begin to elucidate non-pharmacological intervention effects on compensatory scaffolding in older adults.
Participants and Methods:
48 participants were recruited for a Phase III randomized clinical trial (Augmenting Cognitive Training in Older Adults [ACT]; NIH R01AG054077) conducted at the University of Florida and University of Arizona. Participants across sites were randomly assigned to complete cognitive training (n=25) or an education training control condition (n=23). Cognitive training and the education training control condition were each completed during 60 sessions over 12 weeks for 40 hours total. The education training control condition involved viewing educational videos produced by the National Geographic Channel. Cognitive training was completed using the Posit Science Brain HQ training program, which included 8 cognitive training paradigms targeting attention/processing speed and working memory. All participants also completed demographic questionnaires, cognitive testing, and an fMRI 2-back task at baseline and at 12-weeks following cognitive training.
Results:
Repeated measures analysis of covariance (ANCOVA), adjusted for training adherence, transcranial direct current stimulation (tDCS) condition, age, sex, years of education, and Wechsler Test of Adult Reading (WTAR) raw score, revealed a significant 2-back by training group interaction (F[1,40]=6.201, p=.017, η2=.134). Examination of simple main effects revealed baseline differences in 2-back performance (F[1,40]=.568, p=.455, η2=.014). After controlling for baseline performance, training group differences in 2-back performance was no longer statistically significant (F[1,40]=1.382, p=.247, η2=.034).
Conclusions:
After adjusting for baseline performance differences, there were no significant training group differences in 2-back performance, suggesting that the randomization was not sufficient to ensure adequate distribution of participants across groups. Results may indicate that cognitive training alone is not sufficient for significant improvement in working memory performance on a near-transfer task. Additional improvement may occur with the next phase of this clinical trial, such that tDCS augments the effects of cognitive training and results in enhanced compensatory scaffolding even within this high-performing cohort. Limitations of the study include a highly educated sample with high literacy levels, and the small sample size was not powered for transfer-effects analysis. Future analyses will include evaluation of the combined intervention effects of cognitive training and tDCS on n-back performance in a larger sample of older adults without dementia.
Previous research established that white matter hyperintensities (WMH), a biomarker of small vessel cerebrovascular disease, are strong predictors of cognitive function in older adults and associated with clinical presentation of Alzheimer’s disease (AD), particularly when distributed in posterior brain regions. Secondary prevention clinical trials, such as the Anti-Amyloid Treatment in Asymptomatic Alzheimer’s (A4) study, target amyloid accumulation in asymptomatic amyloid positive individuals, but it is unclear the extent to which small vessel cerebrovascular disease accounts for performance on the primary cognitive outcomes in these trials. The purpose of this study was to examine the relationship between regional WMH volume and performance on the Preclinical Alzheimer Cognitive Composite (PACC) among participants screened for participation in the A4 trial. We also determined whether the association between WMH and cognition is moderated by amyloid positivity status.
Participants and Methods:
We assessed demographic, amyloid PET status, cognitive screening, and raw MRI data for participants in the A4 trial and quantitated regional (by cerebral lobe) WMH volumes from T2-weighted FLAIR in amyloid positive and amyloid negative participants at screening. Cognition was assessed using PACC scores, a z-score sum of four cognitive tests: the Mini-Mental State Examination (MMSE), the Free and Cued Selective Reminding Test, the Logical Memory Test, and the Digit Symbol Substitution Test. We included 1329 amyloid positive and 329 amyloid negative individuals (981 women; mean age=71.79 years; mean education=16.58 years) at the time of the analysis. The sample included Latinx (n=50; 3%), non-Latinx (n=1590; 95.9%), or unspecified ethnicity (n=18; 1.1%) individuals who identified as American Indian/Alaskan Native (n=7; 0.4%), Asian (n=38; 2.3%), Black/African American (n=41; 2.5%), White (n=1551; 93.5%), or unspecified (n=21; 1.3%) race. We first examined the associations of total and regional WMH volume and amyloid positivity on PACC scores (the primary cognitive outcome measure for A4) using separate general linear models and then determined whether amyloid positivity status and regional WMH statistically interacted for those WMH regions that showed significant main effects.
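The PACC scoring described (a z-score sum of four component tests) can be sketched as follows; the reference means and SDs below are hypothetical stand-ins for the normative values a study would actually use.

```python
def pacc_composite(scores, norms):
    """PACC-style composite: z-score each component test against a
    reference mean/SD, then sum the z-scores."""
    return sum((s - m) / sd for s, (m, sd) in zip(scores, norms))

# Hypothetical component scores: MMSE, FCSRT, Logical Memory, Digit Symbol
scores = [28.0, 44.0, 11.0, 40.0]
# Hypothetical reference (mean, SD) pairs for each test
norms = [(27.0, 2.0), (42.0, 5.0), (10.0, 3.0), (38.0, 8.0)]
print(round(pacc_composite(scores, norms), 3))
```

Because each component is standardized before summing, tests on very different raw scales (a 30-point MMSE vs. a 48-point FCSRT) contribute comparably to the composite.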
Results:
Increased WMH volume, particularly in the frontal and parietal lobes, and amyloid positivity were each independently associated with poorer performance on the PACC, with effects of similar magnitude. In subsequent models, WMH volume did not interact with amyloid positivity status on PACC scores.
Conclusions:
Regionally distributed WMH are independently associated with cognitive functioning in participants typical of those enrolled in a secondary prevention clinical trial for AD. These effects are of similar magnitude to the effects of amyloid positivity on cognition, highlighting the extent to which small vessel cerebrovascular disease may drive AD-related cognitive profiles. Measures of small vessel cerebrovascular disease should be considered explicitly when evaluating outcomes in these trials, both as potential effect modifiers and as possible targets for intervention or prevention. The findings from this study cannot be generalized widely, as the participants are not representative of the overall population.
Cognitive training has shown promise for improving cognition in older adults. Aging involves a variety of neuroanatomical changes that may affect response to cognitive training. White matter hyperintensities (WMH), visible on T2-weighted and Fluid Attenuated Inversion Recovery (FLAIR) MRI, are one common age-related brain change. WMH are associated with older age, are suggestive of cerebral small vessel disease, and reflect decreased white matter integrity. Higher WMH load is associated with a reduced threshold for the clinical expression of cognitive impairment and dementia. However, the effects of WMH on response to cognitive training interventions are largely unknown. The current study assessed (a) proximal cognitive training performance following a 3-month randomized controlled trial and (b) the contribution of baseline whole-brain WMH load, defined as total lesion volume (TLV), to pre-post proximal training change.
Participants and Methods:
Sixty-two healthy older adults ages 65-84 completed either an adaptive cognitive training (CT; n=31) or an educational training control (ET; n=31) intervention. Participants assigned to CT completed 20 hours of attention/processing speed training and 20 hours of working memory training delivered through the commercially available Posit Science BrainHQ platform. ET participants completed 40 hours of educational videos. All participants also underwent sham or active transcranial direct current stimulation (tDCS) as an adjunctive intervention, although tDCS was not a variable of interest in the current study. Multimodal MRI scans were acquired during the baseline visit. T1- and T2-weighted FLAIR images were processed using the Lesion Segmentation Tool (LST) for SPM12. The Lesion Prediction Algorithm of LST automatically segmented brain tissue and calculated lesion maps. A lesion threshold of 0.30 was applied to calculate TLV, and a log transformation was applied to TLV to normalize the distribution of WMH. Repeated-measures analysis of covariance (RM-ANCOVA) assessed pre/post change on proximal composite (Total Training Composite) and sub-composite (Processing Speed Training Composite, Working Memory Training Composite) measures in the CT group relative to their ET counterparts, controlling for age, sex, years of education, and tDCS group. Linear regression assessed the effect of TLV on post-intervention proximal composite and sub-composite scores, controlling for baseline performance, intervention assignment, age, sex, years of education, multisite scanner differences, estimated total intracranial volume, and binarized cardiovascular disease risk.
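The TLV derivation described above can be sketched as follows, under stated assumptions: the array names are hypothetical rather than LST's actual outputs, the voxel size is illustrative, and the log(x + 1) form is one common convention (the abstract specifies only "a log transformation").

```python
import numpy as np

# Simulated voxelwise lesion probability map standing in for LST's
# Lesion Prediction Algorithm output (an assumption for illustration).
rng = np.random.default_rng(0)
lesion_prob_map = rng.random((64, 64, 32))

voxel_volume_ml = 0.001                       # 1 mm^3 voxels expressed in mL

lesion_mask = lesion_prob_map >= 0.30         # binarize at the 0.30 threshold
tlv_ml = lesion_mask.sum() * voxel_volume_ml  # total lesion volume in mL
log_tlv = np.log(tlv_ml + 1)                  # +1 guards against zero-lesion cases

print(round(tlv_ml, 3), round(log_tlv, 3))
```

The log transform compresses the long right tail typical of lesion-volume distributions so that the regression residuals behave better; the transformed value, not raw TLV, would then enter the linear model.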
Results:
RM-ANCOVA revealed two-way group × time interactions such that participants assigned to cognitive training demonstrated greater improvement on proximal composite (Total Training Composite) and sub-composite (Processing Speed Training Composite, Working Memory Training Composite) measures than their ET counterparts. Multiple linear regression showed that higher baseline TLV was associated with smaller pre-post change on the Processing Speed Training sub-composite (β = -0.19, p = 0.04) but not on the other composite measures.
Conclusions:
These findings demonstrate the utility of cognitive training for improving post-intervention proximal performance in older adults. Additionally, pre-post change in proximal processing speed training appears to be more sensitive to white matter hyperintensity load than change in working memory training. These data suggest that TLV may be an important factor to consider when planning processing speed-based cognitive training interventions for remediation of cognitive decline in older adults.
Cognitive training using a visual speed-of-processing task, the Useful Field of View (UFOV) task, reduced dementia risk and slowed decline in activities of daily living at a 10-year follow-up in older adults. However, the level of cognitive gains after cognitive training varies across studies. One potential explanation for this variability is moderating factors. Prior studies suggest that variables moderating cognitive training gains share features with the training task. The learning trials of the Hopkins Verbal Learning Test-Revised (HVLT-R) and Brief Visuospatial Memory Test-Revised (BVMT-R) recruit similar cognitive abilities and have overlapping neural correlates with the UFOV task and with speed-of-processing/working memory tasks, and therefore could serve as potential moderators. Exploring moderating factors of cognitive training gains may boost the efficacy of interventions, improve rigor in the cognitive training literature, and eventually help provide tailored treatment recommendations. This study explored the association between HVLT-R and BVMT-R learning and the UFOV task, and assessed the moderation of HVLT-R and BVMT-R learning on UFOV improvement after a 3-month speed-of-processing/attention and working memory cognitive training intervention in cognitively healthy older adults.
Participants and Methods:
Seventy-five healthy older adults (M age = 71.11, SD = 4.61) were recruited as part of a larger clinical trial through the Universities of Florida and Arizona. Participants were randomized into a cognitive training (n=36) or education control (n=39) group and underwent a 40-hour, 12-week intervention. The cognitive training intervention consisted of practicing 4 attention/speed-of-processing tasks (including the UFOV task) and 4 working memory tasks. The education control intervention consisted of watching 40-minute educational videos. The HVLT-R and BVMT-R were administered at the pre-intervention timepoint as part of a larger neurocognitive battery. The learning ratio was calculated as (trial 3 total - trial 1 total)/(12 - trial 1 total). UFOV performance was measured at pre- and post-intervention time points via the Posit BrainHQ Double Decision Assessment. Multiple linear regressions predicted baseline Double Decision performance from HVLT-R and BVMT-R learning ratios, controlling for study site, age, sex, and education. A repeated-measures moderation analysis assessed the moderation of HVLT-R and BVMT-R learning ratios on Double Decision change from pre- to post-intervention for the cognitive training and education control groups.
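The learning-ratio formula above can be sketched directly; the function name and worked values are illustrative only.

```python
def learning_ratio(trial1, trial3, max_score=12):
    """Learning ratio: (trial 3 total - trial 1 total) / (max possible - trial 1 total).

    max_score=12 matches the 12-item HVLT-R/BVMT-R learning trials; the ratio
    expresses gains as a fraction of the headroom remaining after trial 1.
    Undefined when trial 1 is already at ceiling (division by zero).
    """
    return (trial3 - trial1) / (max_score - trial1)

# Example: recalling 5 of 12 items on trial 1 and 9 on trial 3
# captures 4 of the 7 remaining items.
print(learning_ratio(5, 9))  # 4/7 ≈ 0.571
```

Normalizing by remaining headroom, rather than using a raw trial 3 minus trial 1 difference, keeps the metric comparable across participants with different starting points.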
Results:
Baseline Double Decision performance was significantly associated with the BVMT-R learning ratio (β=-.303, p=.008) but not the HVLT-R learning ratio (β=-.142, p=.238). The BVMT-R learning ratio moderated gains in Double Decision performance (p<.01); for each unit increase in BVMT-R learning ratio, there was a .6173 unit decrease in training gains. The HVLT-R learning ratio did not moderate gains in Double Decision performance (p>.05). There were no significant moderation effects in the education control group.
Conclusions:
Better visuospatial learning was associated with faster Double Decision performance at baseline. Those with poorer visuospatial learning improved most on the Double Decision task after training, suggesting that healthy older adults who perform below expectation may show the greatest training gains. Future cognitive training research studying visual speed-of-processing interventions should account for differing levels of visuospatial learning at baseline, as this could affect the magnitude of training outcomes.