Background: The WHO grading of meningioma was updated in 2021 to include homozygous deletion of CDKN2A/B and TERT promoter mutations. Previous work, including the recent cIMPACT-NOW statement, has discussed the potential value of including chromosomal copy number alterations (CNAs) to help refine the current grading system. Methods: Chromosomal copy number profiles were inferred from 1964 meningiomas using DNA methylation. Regularized Cox regression was used to identify CNAs independently associated with post-surgical and post-RT PFS. Outcomes were stratified by WHO grade and novel CNAs to assess their potential value in WHO criteria. Results: Patients with WHO grade 1 tumours and chromosome 1p loss had similar outcomes to those with WHO grade 2 tumours (median PFS 5.83 [95% CI 4.36-Inf] vs 4.48 [4.09-5.18] years). Those with chromosome 1p loss and 1q gain had outcomes similar to WHO grade 3 cases regardless of initial grade (median PFS 2.23 [1.28-Inf] years in WHO grade 1 and 1.90 [1.23-2.25] years in WHO grade 2, compared with 2.27 [1.68-3.05] years in WHO grade 3 cases overall). Conclusions: We advocate adding chromosome 1p loss as a criterion for CNS WHO grade 2 meningioma and 1q gain as a criterion for CNS WHO grade 3.
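To make the modelling step concrete, a minimal sketch of a regularized (lasso-penalized) Cox model of PFS on binary CNA indicators, using the lifelines package; the column names, effect sizes, and data are illustrative toys, not the study's cohort:

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 200
# hypothetical binary CNA indicators (names illustrative, not the study's features)
df = pd.DataFrame({
    "chr1p_loss": rng.integers(0, 2, n),
    "chr1q_gain": rng.integers(0, 2, n),
    "chr22q_loss": rng.integers(0, 2, n),
})
hazard = np.exp(0.8 * df["chr1p_loss"] + 1.2 * df["chr1q_gain"])
df["pfs_years"] = rng.exponential(8.0 / hazard)   # shorter PFS with risk CNAs
df["progressed"] = rng.random(n) < 0.7            # event indicator

cph = CoxPHFitter(penalizer=0.1, l1_ratio=1.0)    # lasso-style regularization
cph.fit(df, duration_col="pfs_years", event_col="progressed")
cph.print_summary()
```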
Background: We previously developed a DNA methylation-based risk predictor for meningioma, which has been used locally in a prospective fashion. As a follow-up, we validate this model using a large prospective cohort and introduce a streamlined next-generation model compatible with newer methylation arrays. Methods: The performance of our next-generation predictor was compared with our original model and standard-of-care 2021 WHO grade using time-dependent receiver operating characteristic curves. A nomogram was generated by incorporating our methylation predictor with WHO grade and extent of resection. Results: A total of 1347 meningioma cases were utilized in the study, including 469 prospective cases from 3 institutions and a retrospective cohort of 100 WHO grade 2 cases for model validation. Both the original and next-generation models significantly outperformed 2021 WHO grade in predicting postoperative recurrence. Dichotomizing into grade-specific risk subgroups was predictive of outcome within both WHO grades 1 and 2 tumours (log-rank p<0.05). Multivariable Cox regression demonstrated benefit of adjuvant radiotherapy in high-risk cases specifically, reinforcing its informative role in clinical decision making. Conclusions: This next-generation DNA methylation-based meningioma outcome predictor significantly outperforms 2021 WHO grading in predicting time to recurrence. This will help improve prognostication and inform patient selection for RT.
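As an illustration of the model-comparison step, a hedged sketch of time-dependent AUC using scikit-survival's cumulative_dynamic_auc; the risk score and survival data below are synthetic stand-ins, not the actual predictor or cohort:

```python
import numpy as np
from sksurv.metrics import cumulative_dynamic_auc
from sksurv.util import Surv

rng = np.random.default_rng(0)
n = 300
time = rng.exponential(8.0, n) + 0.1           # years to recurrence/censoring (toy)
event = rng.random(n) < 0.6                    # recurrence observed
risk_score = -time + rng.normal(0, 2.0, n)     # toy score: higher = higher risk

y = Surv.from_arrays(event=event, time=time)
eval_times = np.array([2.0, 5.0, 10.0])
auc, mean_auc = cumulative_dynamic_auc(y, y, risk_score, eval_times)
print(auc, mean_auc)                           # time-dependent AUC at 2/5/10 years
```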
Background: Meningiomas exhibit considerable heterogeneity. We previously identified four distinct molecular groups (immunogenic, NF2-wildtype, hypermetabolic, proliferative) which address much of this heterogeneity. Despite their utility, the stochasticity of clustering methods and the requirement for multi-omics data limit the potential for classifying cases in the clinical setting. Methods: Using an international cohort of 1698 meningiomas, we constructed and validated a machine learning-based molecular classifier using DNA methylation alone. Original and newly predicted molecular groups were compared using DNA methylation, RNA sequencing, whole exome sequencing, and clinical outcomes. Results: Group-specific outcomes in the validation cohort were nearly identical to those originally described, with median PFS of 7.4 (4.9-Inf) years in hypermetabolic tumours and 2.5 (2.3-5.3) years in proliferative tumours (not reached in the other groups). Predicted NF2-wildtype cases had no NF2 mutations, and 51.4% had other mutations previously described in this group. RNA pathway analysis revealed upregulation of immune-related pathways in the immunogenic group, metabolic pathways in the hypermetabolic group, and cell-cycle programs in the proliferative group. Bulk deconvolution similarly revealed enrichment of macrophages in immunogenic tumours and of neoplastic cells in hypermetabolic/proliferative tumours. Conclusions: Our DNA methylation-based classifier faithfully recapitulates the biology and outcomes of the original molecular groups, allowing for their widespread clinical implementation.
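A minimal sketch of what a methylation-only molecular-group classifier could look like, here a random forest on CpG beta values with cross-validation; the array sizes, labels, and model choice are assumptions for illustration, not the authors' pipeline:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.random((200, 2000))   # beta values: samples x CpG probes (toy)
y = rng.integers(0, 4, 200)   # 0=immunogenic, 1=NF2-wildtype, 2=hypermetabolic, 3=proliferative

clf = RandomForestClassifier(n_estimators=200, random_state=0)
print(cross_val_score(clf, X, y, cv=5).mean())  # chance-level on random toy data
```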
Multicenter clinical trials are essential for evaluating interventions but often face significant challenges in study design, site coordination, participant recruitment, and regulatory compliance. To address these issues, the National Institutes of Health’s National Center for Advancing Translational Sciences established the Trial Innovation Network (TIN). The TIN offers a scientific consultation process, providing access to clinical trial and disease experts who provide input and recommendations throughout the trial’s duration, at no cost to investigators. This approach aims to improve trial design, accelerate implementation, foster interdisciplinary teamwork, and spur innovations that enhance multicenter trial quality and efficiency. The TIN leverages resources of the Clinical and Translational Science Awards (CTSA) program, complementing local capabilities at the investigator’s institution. The Initial Consultation process focuses on the study’s scientific premise, design, site development, recruitment and retention strategies, funding feasibility, and other support areas. As of 6/1/2024, the TIN has provided 431 Initial Consultations to increase efficiency and accelerate trial implementation by delivering customized support and tailored recommendations. Across a range of clinical trials, the TIN has developed standardized, streamlined, and adaptable processes. We describe these processes, provide operational metrics, and include a set of lessons learned for consideration by other trial support and innovation networks.
To examine the association of posttraumatic headache (PTH) type with postconcussive symptoms (PCS), pain intensity, and fluid cognitive function across recovery after pediatric concussion.
Methods:
This prospective, longitudinal study recruited children (aged 8–16.99 years) within 24 hours of sustaining a concussion or mild orthopedic injury (OI) from two pediatric hospital emergency departments. Based on parent-proxy ratings of pre- and postinjury headache, children were classified as concussion with no PTH (n = 18), new PTH (n = 43), worse PTH (n = 58), or non-worsening chronic PTH (n = 19), and children with OI with no PTH (n = 58). Children and parents rated PCS and children rated pain intensity weekly up to 6 months. Children completed computerized testing of fluid cognition 10 days, 3 months, and 6 months postinjury. Mixed effects models compared groups across time on PCS, pain intensity, and cognition, controlling for preinjury scores and covariates.
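A hedged sketch of the kind of mixed-effects model described, with a group-by-time interaction, a preinjury covariate, and child-level random intercepts via statsmodels; all variable names and data are illustrative:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n_children, n_weeks = 60, 8
d = pd.DataFrame({
    "child_id": np.repeat(np.arange(n_children), n_weeks),
    "week": np.tile(np.arange(n_weeks), n_children),
    "pth_group": np.repeat(rng.integers(0, 5, n_children), n_weeks),
    "preinjury_pcs": np.repeat(rng.normal(10, 3, n_children), n_weeks),
})
# symptoms decline with time; PTH groups (>0) start higher (toy effect)
d["pcs"] = 20 - d["week"] + 2 * (d["pth_group"] > 0) + rng.normal(0, 3, len(d))

m = smf.mixedlm("pcs ~ C(pth_group) * week + preinjury_pcs",
                d, groups=d["child_id"]).fit()
print(m.summary())
```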
Results:
Group differences in PCS decreased over time. Cognitive and somatic PCS were higher in new, chronic, and worse PTH relative to no PTH (up to 8 weeks postinjury; d = 0.34 to 0.87 when significant) and OI (up to 5 weeks postinjury; d = 0.30 to 1.28 when significant). Pain intensity did not differ by group but declined with time postinjury. Fluid cognition was lower across time in chronic PTH versus no PTH (d = −0.76) and OI (d = −0.61) and in new PTH versus no PTH (d = −0.51).
Conclusions:
Onset of PTH was associated with worse PCS up to 8 weeks after pediatric concussion. Chronic PTH and new PTH were associated with moderately poorer fluid cognitive functioning up to 6 months postinjury. Pain declined over time regardless of PTH type.
Why is there no NATO in Asia? Literature on this question is selective and incomplete. This paper develops a new theory with determinate predictions regarding patrons' and clients' alliance design preferences, the alliances that result, the commitments therein, and alliance duration. A subtle but nonetheless persistent form of entrapment problem exists with clients that do not want a war yet fear adversary aggression. Clients' commitment to collective security is a hand-tying costly signal that assures the patron of client resolve to defend the status quo and reduces the probability and costs of entrapment. Patrons will rationally prefer to ally with clients who have already made a collective security commitment. Clients are more likely, the paper shows, to make such commitments when their adversary credibly threatens to militarily occupy at least one of them. This is more likely in land than in sea theatres. When clients fail to realise collective security, patron efforts to impose it on them will fail, resulting in short-lived multilateralism and forcing bilateralism on the patron. The paper uses new archival evidence from Britain and Australia to show how this strategic framework explains variation in alliance design in Europe and Asia.
To understand caregivers’ perceptions about their children’s mealtime social experiences at school and how they believe these social experiences impact their children’s consumption of meals at school (both meals brought from home and school meals).
Design:
Qualitative data were originally collected as part of a larger mixed methods study using an embedded-QUAN dominant research design.
Setting:
Semi-structured interviews were conducted with United States (U.S.) caregivers over Zoom™ in English and Spanish during the 2021–2022 school year. The interview guide contained 14 questions on caregivers’ perceptions about their children’s experiences with school meals.
Participants:
Caregivers of students in elementary, middle and high schools in rural, suburban and urban communities in California (n 46) and Maine (n 20) were interviewed. Most (60·6 %) were caregivers of children who were eligible for free or reduced-price meals.
Results:
Caregivers reported that an important benefit of eating meals at school is their child’s opportunity to socialise with their peers. Caregivers also stated that their child’s favourite aspect of school lunch is socialising with friends. However, some caregivers reported the cafeteria environment caused their children to feel anxious and not eat. Other caregivers reported that their children sometimes skipped lunch and chose to socialise with friends rather than wait in long lunch lines.
Conclusions:
Socialising during school meals is important to both caregivers and students. Policies such as increasing lunch period lengths and holding recess before lunch have been found to promote school meal consumption and could reinforce the positive social aspects of mealtime for students.
Despite the recognised links between food insecurity and parenting, few studies have evaluated the perceived impacts of livelihood or food security interventions on parental practices, intra-household functioning, adolescent behaviour and psychosocial outcomes in HIV-affected households in sub-Saharan Africa.
Aims
The study aimed to understand the perceived effects of food security on parenting practices and how this was experienced by both adolescent girls (aged 13–19 years) and their caregivers in rural Kenya.
Method
We conducted semi-structured, individual interviews with 62 caregiver–adolescent dyads who were participants in the adolescent Shamba Maisha study (NCT03741634), a sub-study of adolescent girls and caregivers with a household member participating in the Shamba Maisha agricultural and finance intervention trial (NCT01548599). Data were analysed following the principles of thematic analysis.
Results
Compared to control households, the Shamba Maisha intervention households had improved food security and strengthened economic security, which, in turn, improved parenting practices. Intervention households described changes in parenting experiences, including decreased parental stress, reduced absenteeism and harsh parenting, and improved caregiver–adolescent relationships. These positive caregiving practices, in turn, contributed to improved mental health and fewer behavioural problems among adolescent girls. Changes in the control households were less noticeable.
Conclusion
These findings demonstrate how an income-generating agricultural intervention may improve food security and positively affect parenting practices, intra-household dynamics and adolescent psychosocial well-being and behaviour. Further research is needed to explore how to harness the social benefits of agricultural interventions to best address the critical intersections among food insecurity, parenting practices and adolescent mental health.
Individuals with major depressive disorder (MDD) can experience reduced motivation and cognitive function, leading to challenges with goal-directed behavior. When selecting goals, people maximize ‘expected value’ by selecting actions that maximize potential reward while minimizing associated costs, including effort ‘costs’ and the opportunity cost of time. In MDD, differential weighting of costs and benefits is a theorized mechanism underlying changes in goal-directed cognition and may contribute to symptom heterogeneity.
Methods
We used the Effort Foraging Task to quantify cognitive and physical effort costs, as well as patch-leaving thresholds in low-effort conditions (reflecting the perceived opportunity cost of time), and investigated their shared versus distinct relationships to clinical features in participants with MDD (N = 52, 43 in-episode) and comparison participants (N = 27).
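A toy sketch of the marginal value theorem logic the task rests on: the patch-leaving threshold is the reward available at exit, and the effort cost shows up as the drop in threshold between travel-effort conditions. The published task estimates these quantities by computational model fitting; this conveys only the intuition:

```python
import numpy as np

rng = np.random.default_rng(2)
# reward available at the moment of leaving a patch, per condition (toy data)
exit_reward_low_effort = rng.normal(6.0, 1.0, 100)
exit_reward_high_effort = rng.normal(4.5, 1.0, 100)

threshold_low = exit_reward_low_effort.mean()    # proxy for opportunity cost of time
threshold_high = exit_reward_high_effort.mean()
effort_cost = threshold_low - threshold_high     # lower exit threshold => larger cost
print(f"thresholds {threshold_low:.2f} vs {threshold_high:.2f}; "
      f"implied effort cost {effort_cost:.2f}")
```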
Results
Contrary to our predictions, none of the decision-making measures differed with MDD diagnosis. However, each of the measures was related to symptom severity, over and above effects of ability (i.e. performance). Greater anxiety symptoms were selectively associated with lower cognitive effort cost (i.e. greater willingness to exert effort). Anhedonia and behavioral apathy were associated with increased physical effort costs. Finally, greater overall depression was related to decreased patch leaving thresholds.
Conclusions
Markers of effort-based decision-making may inform understanding of MDD heterogeneity. Increased willingness to exert cognitive effort may contribute to anxiety symptoms such as worry. Decreased leaving threshold associations with symptom severity are consistent with reward rate-based accounts of reduced vigor in MDD. Future research should address subtypes of depression with or without anxiety, which may relate differentially to cognitive effort decisions.
Epidemiological studies show that, despite its episodic nature, the long-term trajectory of depression can be variable. This study evaluated the heterogeneity of 10-year trajectories of major depressive disorder (MDD)-related service utilization and associated clinical characteristics among US Veterans with a first diagnosis after 9/11.
Methods
Using a cohort design, electronic health record data for 293,265 Operation Enduring Freedom and Iraqi Freedom (OEF/OIF) Veterans were extracted to identify those with MDD between 2001 and 2021 with a full preceding year of clinical data and 10 years following the diagnosis. Across all Veterans Affairs (VA) hospitals, 25,307 Veterans met our inclusion criteria. Demographic and clinical information extracted from medical records was used to predict 10-year depression trajectories, and latent class growth analysis compared the clinical characteristics associated with four depression trajectories.
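Latent class growth analysis is typically run in Mplus or R; as a rough Python stand-in, one can cluster 10-year annual visit-count trajectories with a Gaussian mixture. Everything below is a toy approximation, not the study's method:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(3)
years = np.arange(10)
# two toy trajectory shapes: brief early contact vs persistent contact
brief = np.exp(-years)[None, :] * rng.uniform(5, 10, (500, 1))
persistent = 6.0 + rng.normal(0, 1, (500, 10))
X = np.vstack([brief, persistent])    # annual visit counts, one row per veteran

gm = GaussianMixture(n_components=4, random_state=0).fit(X)
print(np.bincount(gm.predict(X)) / len(X))   # estimated class proportions
```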
Results
Among the study cohort (N = 25,307), 27.7% were characterized by brief contact, 41.7% by later re-entry, 17.6% by persistent contact, and 12.9% by prolonged initial contact for depression-related services. Compared to Veterans with trajectories showing brief contact, those with protracted treatment (persistent or prolonged initial contact) were more likely to be diagnosed with comorbid posttraumatic stress disorder (PTSD) and with MDD that was moderate to severe or recurrent.
Conclusions
Depression is associated with a range of treatment trajectories. The persistent and prolonged initial contact trajectories may have distinct characteristics and uniquely high resource utilization and disability income. We can anticipate that patients with comorbid PTSD may need longer-term care, which has implications for brief models of care.
Interpersonal psychotherapy (IPT) and antidepressant medications are both first-line interventions for adult depression, but their relative efficacy in the long term and on outcome measures other than depressive symptomatology is unknown. Individual participant data (IPD) meta-analyses can provide more precise effect estimates than conventional meta-analyses. This IPD meta-analysis compared the efficacy of IPT and antidepressants on various outcomes at post-treatment and follow-up (PROSPERO: CRD42020219891). A systematic literature search conducted May 1st, 2023 identified randomized trials comparing IPT and antidepressants in acute-phase treatment of adults with depression. Anonymized IPD were requested and analyzed using mixed-effects models. The prespecified primary outcome was post-treatment depression symptom severity. Secondary outcomes were all post-treatment and follow-up measures assessed in at least two studies. IPD were obtained from 9 of 15 studies identified (N = 1536/1948, 78.9%). No significant comparative treatment effects were found on post-treatment measures of depression (d = 0.088, p = 0.103, N = 1530) and social functioning (d = 0.026, p = 0.624, N = 1213). In smaller samples, antidepressants performed slightly better than IPT on post-treatment measures of general psychopathology (d = 0.276, p = 0.023, N = 307) and dysfunctional attitudes (d = 0.249, p = 0.029, N = 231), but not on any other secondary outcomes, nor at follow-up. This IPD meta-analysis is the first to examine the acute and longer-term efficacy of IPT v. antidepressants on a broad range of outcomes. Depression treatment trials should routinely include multiple outcome measures and follow-up assessments.
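A minimal sketch of a one-stage IPD meta-analytic model of the kind described: post-treatment severity regressed on treatment and baseline with random intercepts by study, via statsmodels; data and variable names are synthetic illustrations:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
n = 900
d = pd.DataFrame({
    "study": rng.integers(0, 9, n),       # 9 trials contributing IPD (toy)
    "treatment": rng.integers(0, 2, n),   # 0 = IPT, 1 = antidepressant
    "baseline": rng.normal(25, 5, n),
})
d["post"] = 0.5 * d["baseline"] - 1.0 * d["treatment"] + rng.normal(0, 4, n)

m = smf.mixedlm("post ~ treatment + baseline", d, groups=d["study"]).fit()
print(m.summary())
```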
It is well established that there is a substantial genetic component to eating disorders (EDs). Polygenic risk scores (PRSs) can be used to quantify cumulative genetic risk for a trait at an individual level. Recent studies suggest PRSs for anorexia nervosa (AN) may also predict risk for other disordered eating behaviors, but no study has examined if PRS for AN can predict disordered eating as a global continuous measure. This study aimed to investigate whether PRS for AN predicted overall levels of disordered eating, or specific lifetime disordered eating behaviors, in an Australian adolescent female population.
Methods
PRSs were calculated based on summary statistics from the largest Psychiatric Genomics Consortium AN genome-wide association study to date. Analyses were performed using genome-wide complex trait analysis to test the associations between AN PRS and disordered eating global scores, avoidance of eating, objective bulimic episodes, self-induced vomiting, and driven exercise in a sample of Australian adolescent female twins recruited from the Australian Twin Registry (N = 383).
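Conceptually, a PRS is a weighted sum of risk-allele dosages with weights taken from GWAS summary statistics (GCTA and related tools handle the real pipeline); a toy sketch:

```python
import numpy as np

rng = np.random.default_rng(5)
n_people, n_snps = 383, 10_000
dosages = rng.integers(0, 3, (n_people, n_snps)).astype(float)  # 0/1/2 risk alleles
betas = rng.normal(0, 0.01, n_snps)   # per-SNP weights from AN GWAS summary stats (toy)

prs = dosages @ betas                 # weighted allele sum per person
prs = (prs - prs.mean()) / prs.std()  # standardize before association testing
print(prs[:5])
```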
Results
After applying the false-discovery rate correction, the AN PRS was significantly associated with all disordered eating outcomes.
Conclusions
Findings suggest shared genetic etiology across disordered eating presentations and provide insight into the utility of AN PRS for predicting disordered eating behaviors in the general population. In the future, PRSs for EDs may have clinical utility in early disordered eating risk identification, prevention, and intervention.
The figure of Anthony Comstock may seem like an odd historical relic: a repressed, puritanical, anti-sex reformer from a bygone past. And yet, because his namesake act has been revived as a potential strategy for limiting access to reproductive healthcare, Comstock is no joke. Today, some Americans see the Comstock Act, passed by Congress in 1873, as a pathway to banning abortion and other reproductive care, effectively jettisoning any need for new Supreme Court abortion rulings or congressional legislation. As scholars of the Gilded Age and Progressive Era, we are uniquely situated to intervene in this dialogue and ensure that contemporary conversations are grounded in historical context. We present this forum not as an exhaustive account of the Comstock Act and its architect, but as an opportunity to highlight the context in which this law, which holds so much potential relevance for our present, was created, enacted, enforced, and challenged. We hope this forum will stimulate further scholarly and public conversations around the nation’s long history of regulating reproductive rights and how that history became entangled with other social anxieties.
The aim of this project is to study the extent to which salience alterations influence the severity of psychotic symptoms. Rather than studying them individually, however, we decided to focus on their interplay with two additional variables: observing their effect in a vulnerability phase (adolescence) and in the presence of another well-recognized risk factor (cannabis use).
This design reflects our view that it is fundamental to observe the trajectory of psychotic symptoms over a continuum; rather than adopting a longitudinal approach, however, we structured it as a cross-sectional study comparing patients from two age brackets: adolescence and adulthood.
Objectives
The primary purpose of this study was to assess differences between THC-abusing and non-abusing patients in adolescent and adult cohorts, using the Italian version of the psychometric scale “Aberrant Salience Inventory” (ASI), and the possible correlation with more severe psychotic symptoms. The use of several different psychometric scales and the inclusion of a heterogeneous cohort allowed us to pursue multiple secondary objectives.
Methods
We recruited 192 patients, subsequently divided into six subgroups based on age and department of recruitment (whether adolescent or adult psychiatric or neurologic units - the latter serving as controls). Each individual was administered a set of questionnaires and a socio-demographic survey; the set included: Aberrant Salience Inventory (ASI), Community Assessment of Psychic Experiences (CAPE), Positive and Negative Syndrome Scale (PANSS), Montgomery-Asberg Depression Rating Scale (MADRS), Mania Rating Scale (MRS), Hamilton Anxiety Scale (HAM-A), Association for Methodology and Documentation in Psychiatry (AMDP) and Cannabis Experience Questionnaire (CEQ).
Results
The data analysis showed statistically significant (p<0.05) differences between adolescents and adults with psychotic symptoms on all three PANSS scales and on the MADRS. These two groups were homogeneous for both cannabis use and ASI score. The intra-group comparison (whether adolescent or adult) showed a hierarchical pattern in psychometric scale scores according to diagnostic subgroup: patients with psychotic symptoms showed a higher level of psychopathology on all measures than patients from the psychiatric unit without psychotic symptoms, who in turn scored higher than patients from the neurologic unit.
Conclusions
The results of the present study suggest that when salience alterations occur in adolescents with cannabis exposure, we might observe worsened positive and negative psychotic symptoms; their influence might also be relevant in other domains, especially the depressive and anxiety spectra.
Challenges to communication between families and care providers of paediatric patients in intensive care units (ICU) include variability of communication preferences, mismatched goals of care, and difficulties carrying forward family preferences from provider to provider. Our objectives were to develop and test an assessment tool that queries parents of children requiring cardiac intensive care about their communication preferences and to determine if this tool facilitates patient-centred care and improves families’ ICU experience.
Design:
In this quality improvement initiative, a novel tool was developed, the Parental Communication Assessment (PCA), which asked parents with children hospitalised in the cardiac ICU about their communication preferences. Participants were prospectively randomised to the intervention group, which received the PCA, or to standard care. All participants completed a follow-up survey evaluating satisfaction with communication.
Main Results:
One hundred thirteen participants enrolled and 56 were randomised to the intervention group. Participants who received the PCA preferred detail-oriented communication over big-picture communication. Most parents understood the daily discussions on rounds (64%) and felt comfortable expressing concerns (68%). Eighty-six percent reported the PCA was worthwhile. Parents were generally satisfied with communication. However, a substantial proportion felt unprepared for difficult decisions or setbacks, inadequately included or supported in decision-making, and that they lacked control over their child’s care. There were no significant differences in communication satisfaction between the intervention and control groups.
Conclusions:
Parents with children hospitalised in the paediatric ICU demonstrated diverse communication preferences. Most participants felt overall satisfied with communication, but individualising communication with patients’ families according to their preferences may improve their experience.
The timing of tracheostomy for intensive care unit patients is controversial, with conflicting findings on early versus late tracheostomy.
Methods
Patients undergoing tracheostomy from 2001 through 2012 were identified from the Medical Information Mart for Intensive Care-III database. Early tracheostomy was defined as less than the 25th percentile of time from intensive care unit admission to tracheostomy (time to tracheostomy). Statistical analyses of the effect of tracheostomy timing on intensive care unit length of stay and mortality were conducted.
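A hedged sketch of the timing analysis as described: dichotomizing time to tracheostomy at the 25th percentile and fitting a logistic regression of mortality on timing; the data below are simulated, not MIMIC-III:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(6)
n = 1566
d = pd.DataFrame({"ttt_days": rng.gamma(2.0, 5.0, n)})   # time to tracheostomy
d["early"] = (d["ttt_days"] < d["ttt_days"].quantile(0.25)).astype(int)
d["died"] = (rng.random(n) < 0.08 + 0.003 * d["ttt_days"]).astype(int)

m = smf.logit("died ~ ttt_days", d).fit()
print(np.exp(m.params["ttt_days"]))       # odds ratio per day of delay
print(d.groupby("early")["died"].mean())  # crude mortality by timing group
```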
Results
Of the 1,566 patients who were included, those with early tracheostomy had shorter intensive care unit length of stay (12.55 vs 27.32 days, p < 0.001) and lower mortality (9.0 per cent vs 12.9 per cent, p = 0.039). Multivariate logistic regression analysis found an association between increasing time to tracheostomy and mortality (odds ratio: 1.029, 95 per cent confidence interval 1.007–1.051, p = 0.009).
Conclusion
Our analysis revealed that patients with early tracheostomy were more likely to have shorter intensive care unit lengths of stay and lower mortality. Our data suggest that early tracheostomy should be given strong consideration in appropriately selected patients.
Cognitive training is a non-pharmacological intervention aimed at improving cognitive function across a single or multiple domains. Although the underlying mechanisms of cognitive training and transfer effects are not well-characterized, cognitive training has been thought to facilitate neural plasticity to enhance cognitive performance. Indeed, the Scaffolding Theory of Aging and Cognition (STAC) proposes that cognitive training may enhance the ability to engage in compensatory scaffolding to meet task demands and maintain cognitive performance. We therefore evaluated the effects of cognitive training on working memory performance in older adults without dementia. This study will help begin to elucidate non-pharmacological intervention effects on compensatory scaffolding in older adults.
Participants and Methods:
48 participants were recruited for a Phase III randomized clinical trial (Augmenting Cognitive Training in Older Adults [ACT]; NIH R01AG054077) conducted at the University of Florida and University of Arizona. Participants across sites were randomly assigned to complete cognitive training (n=25) or an education training control condition (n=23). Cognitive training and the education training control condition were each completed during 60 sessions over 12 weeks for 40 hours total. The education training control condition involved viewing educational videos produced by the National Geographic Channel. Cognitive training was completed using the Posit Science Brain HQ training program, which included 8 cognitive training paradigms targeting attention/processing speed and working memory. All participants also completed demographic questionnaires, cognitive testing, and an fMRI 2-back task at baseline and at 12 weeks following cognitive training.
Results:
Repeated measures analysis of covariance (ANCOVA), adjusted for training adherence, transcranial direct current stimulation (tDCS) condition, age, sex, years of education, and Wechsler Test of Adult Reading (WTAR) raw score, revealed a significant 2-back by training group interaction (F[1,40]=6.201, p=.017, η2=.134). Examination of simple main effects revealed baseline differences in 2-back performance (F[1,40]=.568, p=.455, η2=.014). After controlling for baseline performance, training group differences in 2-back performance were no longer statistically significant (F[1,40]=1.382, p=.247, η2=.034).
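The repeated-measures ANCOVA has no single canonical Python implementation; a common stand-in is a mixed model with a time-by-group interaction plus the listed covariates. A toy sketch under that assumption, with most covariates omitted for brevity:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(7)
n = 48
d = pd.DataFrame({
    "subject": np.repeat(np.arange(n), 2),
    "time": np.tile([0, 1], n),                    # baseline vs 12 weeks
    "group": np.repeat(rng.integers(0, 2, n), 2),  # training vs education control
    "age": np.repeat(rng.normal(72, 5, n), 2),     # other covariates omitted
})
d["nback_acc"] = 0.70 + 0.05 * d["time"] * d["group"] + rng.normal(0, 0.05, 2 * n)

m = smf.mixedlm("nback_acc ~ time * group + age", d, groups=d["subject"]).fit()
print(m.summary())                                 # time:group term ~ the interaction
```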
Conclusions:
After adjusting for baseline performance differences, there were no significant training group differences in 2-back performance, suggesting that the randomization was not sufficient to ensure adequate distribution of participants across groups. Results may indicate that cognitive training alone is not sufficient for significant improvement in working memory performance on a near transfer task. Additional improvement may occur with the next phase of this clinical trial, such that tDCS augments the effects of cognitive training and results in enhanced compensatory scaffolding even within this high performing cohort. Limitations of the study include a highly educated sample with higher literacy levels and a small sample size that was not powered for transfer-effects analysis. Future analyses will include evaluation of the combined intervention effects of cognitive training and tDCS on n-back performance in a larger sample of older adults without dementia.
Cognitive training has shown promise for improving cognition in older adults. Aging involves a variety of neuroanatomical changes that may affect response to cognitive training. White matter hyperintensities (WMH) are one common age-related brain change, visible on T2-weighted and Fluid Attenuated Inversion Recovery (FLAIR) MRI. WMH are associated with older age, are suggestive of cerebral small vessel disease, and reflect decreased white matter integrity. Higher WMH load is associated with a reduced threshold for clinical expression of cognitive impairment and dementia. The effects of WMH on response to cognitive training interventions are relatively unknown. The current study assessed (a) proximal cognitive training performance following a 3-month randomized control trial and (b) the contribution of baseline whole-brain WMH load, defined as total lesion volume (TLV), to pre-post proximal training change.
Participants and Methods:
Sixty-two healthy older adults ages 65-84 completed either adaptive cognitive training (CT; n=31) or educational training control (ET; n=31) interventions. Participants assigned to CT completed 20 hours of attention/processing speed training and 20 hours of working memory training delivered through the commercially available Posit Science BrainHQ. ET participants completed 40 hours of educational videos. All participants also underwent sham or active transcranial direct current stimulation (tDCS) as an adjunctive intervention, although not a variable of interest in the current study. Multimodal MRI scans were acquired during the baseline visit. T1- and T2-weighted FLAIR images were processed using the Lesion Segmentation Tool (LST) for SPM12. The Lesion Prediction Algorithm of LST automatically segmented brain tissue and calculated lesion maps. A lesion threshold of 0.30 was applied to calculate TLV. A log transformation was applied to TLV to normalize the distribution of WMH. Repeated-measures analysis of covariance (RM-ANCOVA) assessed pre/post change in proximal composite (Total Training Composite) and sub-composite (Processing Speed Training Composite, Working Memory Training Composite) measures in the CT group compared to their ET counterparts, controlling for age, sex, years of education, and tDCS group. Linear regression assessed the effect of TLV on post-intervention proximal composite and sub-composite measures, controlling for baseline performance, intervention assignment, age, sex, years of education, multisite scanner differences, estimated total intracranial volume, and binarized cardiovascular disease risk.
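A minimal sketch of the TLV step described above: log-transforming lesion volume and regressing post-intervention performance on it with baseline and covariates (the LST segmentation itself runs in SPM12/MATLAB); all data and names are illustrative:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(8)
n = 62
d = pd.DataFrame({
    "tlv_ml": rng.lognormal(0.5, 1.0, n),   # total lesion volume (WMH), in ml (toy)
    "baseline_speed": rng.normal(0, 1, n),
    "group": rng.integers(0, 2, n),         # CT vs ET
    "age": rng.normal(74, 5, n),            # remaining covariates omitted
})
d["log_tlv"] = np.log(d["tlv_ml"])          # normalize the skewed WMH distribution
d["post_speed"] = d["baseline_speed"] - 0.2 * d["log_tlv"] + rng.normal(0, 1, n)

m = smf.ols("post_speed ~ log_tlv + baseline_speed + group + age", d).fit()
print(m.params["log_tlv"], m.pvalues["log_tlv"])
```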
Results:
RM-ANCOVA revealed two-way group × time interactions such that those assigned to cognitive training demonstrated greater improvement on proximal composite (Total Training Composite) and sub-composite (Processing Speed Training Composite, Working Memory Training Composite) measures compared to their ET counterparts. Multiple linear regression showed that higher baseline TLV was associated with lower pre-post change on the Processing Speed Training sub-composite (β = -0.19, p = 0.04) but not on the other composite measures.
Conclusions:
These findings demonstrate the utility of cognitive training for improving post-intervention proximal performance in older adults. Additionally, pre-post change in proximal processing speed training appears to be particularly sensitive to white matter hyperintensity load, whereas working memory training change does not. These data suggest that TLV may serve as an important factor for consideration when planning processing speed-based cognitive training interventions for remediation of cognitive decline in older adults.
Interventions using a cognitive training paradigm called the Useful Field of View (UFOV) task have been shown to be efficacious in slowing cognitive decline. However, no studies have examined the engagement of functional networks during UFOV task completion. The current study aimed to (a) assess whether regions activated during the UFOV fMRI task were functionally connected and related to task performance (henceforth called the UFOV network), (b) compare connectivity of the UFOV network to 7 resting-state functional connectivity networks in predicting proximal (UFOV) and near-transfer (Double Decision) performance, and (c) explore the impact of network segregation among higher-order networks on UFOV performance.
Participants and Methods:
336 healthy older adults (mean age=71.6) completed the UFOV fMRI task in a Siemens 3T scanner. UFOV fMRI accuracy was calculated as the number of correct responses divided by 56 total trials. Double Decision performance was calculated as the average presentation time of correct responses in log ms, with lower scores equating to better processing speed. Structural and functional MRI images were processed using the default pre-processing pipeline within the CONN toolbox. The Artifact Rejection Toolbox was set at a motion threshold of 0.9 mm and participants were excluded if more than 50% of volumes were flagged as outliers. To assess connectivity of regions associated with the UFOV task, we created 10 spherical regions of interest (ROIs) a priori using the WFU PickAtlas in SPM12. These included the bilateral pars triangularis, supplementary motor area, and inferior temporal gyri, as well as the left pars opercularis, left middle occipital gyrus, right precentral gyrus, and right superior parietal lobule. We used a weighted ROI-to-ROI connectivity analysis to model task-based within-network functional connectivity of the UFOV network and its relationship to UFOV accuracy. We then used weighted ROI-to-ROI connectivity analysis to compare the efficacy of the UFOV network versus 7 resting-state networks in predicting UFOV fMRI task performance and Double Decision performance. Finally, we calculated network segregation among higher-order resting-state networks to assess its relationship with UFOV accuracy. All functional connectivity analyses were corrected at a false discovery rate (FDR) threshold of p<0.05.
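At its core, ROI-to-ROI connectivity correlates mean time series across ROIs; CONN's weighted GLM approach differs in detail, so the following is only a schematic sketch on toy series:

```python
import numpy as np

rng = np.random.default_rng(9)
ts = rng.normal(size=(200, 10))   # timepoints x ROIs (toy mean time series)
conn = np.corrcoef(ts.T)          # 10 x 10 ROI-to-ROI correlation matrix

iu = np.triu_indices(10, k=1)     # unique ROI pairs
within_network = np.arctanh(conn[iu]).mean()  # Fisher-z mean within-network connectivity
print(within_network)
```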
Results:
ROI-to-ROI analysis showed significant within-network functional connectivity among the 10 a priori ROIs (the UFOV network) during task completion (all pFDR<.05). After controlling for covariates, greater within-network connectivity of the UFOV network was associated with better UFOV fMRI performance (pFDR=.008). Regarding the 7 resting-state networks, greater within-network connectivity of the cingulo-opercular network (CON; pFDR<.001) and the frontoparietal control network (FPCN; pFDR=.014) was associated with higher accuracy on the UFOV fMRI task. Furthermore, greater within-network connectivity of only the UFOV network was associated with performance on the Double Decision task (pFDR=.034). Finally, we assessed the relationship between higher-order network segregation and UFOV accuracy. After controlling for covariates, no significant relationships between network segregation and UFOV performance remained (all p-uncorrected>0.05).
Conclusions:
To date, this is the first study to assess task-based functional connectivity during completion of the UFOV task. We observed that coherence within 10 a priori ROIs significantly predicted UFOV performance. Additionally, enhanced within-network connectivity of the UFOV network predicted better performance on the Double Decision task, while conventional resting-state networks did not. These findings provide potential targets to optimize efficacy of UFOV interventions.