Vancomycin-resistant enterococci (VRE) can cause serious healthcare-associated infections. Patients can become colonized and infected through contact with healthcare workers, hospital surfaces, equipment, and other patients. We evaluated the utility of broadly applied whole-genome sequencing (WGS) surveillance of vancomycin-resistant Enterococcus faecium (VREfm) for detection of hospital-based transmission.
Design:
Retrospective genomic and epidemiologic analysis of clinical VREfm isolates
Setting:
Brigham and Women’s Hospital, an 800-bed tertiary care center in Boston, MA, USA
Methods:
VREfm was isolated from patient screening and diagnostic specimens. We sequenced the genomes of 156 VREfm isolates, 12 at the request of infection control and 144 as a convenience sample, and used single nucleotide polymorphism (SNP) differences to assess relatedness. For isolate pairs separated by 15 or fewer SNPs by two orthogonal comparison methods, we mapped epidemiologic connections to identify putative transmission clusters.
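To make the clustering step concrete, the sketch below groups isolates into putative clusters from a precomputed pairwise SNP distance table using the ≤15-SNP cutoff described above. The function and isolate names are illustrative; the study additionally required agreement between two orthogonal comparison methods and supporting epidemiologic links before calling a cluster.

```python
# Sketch: grouping isolates into putative transmission clusters from pairwise
# SNP distances (e.g., a table produced by a tool such as snp-dists).
# Pairs within the 15-SNP threshold are linked; connected components of the
# resulting graph are the putative clusters (singletons are omitted).
from collections import defaultdict

SNP_THRESHOLD = 15

def putative_clusters(pairwise_snps):
    """pairwise_snps: dict mapping (isolate_a, isolate_b) -> SNP distance."""
    adj = defaultdict(set)
    for (a, b), dist in pairwise_snps.items():
        if dist <= SNP_THRESHOLD:
            adj[a].add(b)
            adj[b].add(a)
    seen, clusters = set(), []
    for node in adj:
        if node in seen:
            continue
        stack, component = [node], set()
        while stack:
            cur = stack.pop()
            if cur in component:
                continue
            component.add(cur)
            stack.extend(adj[cur] - component)
        seen |= component
        clusters.append(sorted(component))
    return clusters

# Hypothetical distances:
print(putative_clusters({("VRE01", "VRE02"): 4, ("VRE02", "VRE03"): 12, ("VRE04", "VRE05"): 40}))
# -> [['VRE01', 'VRE02', 'VRE03']]
```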
Results:
We found evidence for 16 putative transmission clusters comprising between two and four isolates each and involving 41/156 isolates (26.3%). Our analysis discovered 14 clusters that were missed by traditional surveillance methods and additional members of two clusters that were detected by traditional methods. Patients in four transmission clusters were linked only by exposure to the postanesthesia care unit.
Conclusions:
We show that WGS surveillance for VREfm can support infection control investigations and detect transmission events missed by routine surveillance methods. We identify the postanesthesia care unit as a locus for VREfm transmission, which demonstrates how WGS surveillance could inform targeted interventions to prevent the spread of VREfm.
Posttraumatic stress disorder (PTSD) has been associated with advanced epigenetic age cross-sectionally, but the association between these variables over time is unclear. This study conducted meta-analyses to test whether new-onset PTSD diagnosis and changes in PTSD symptom severity over time were associated with changes in two metrics of epigenetic aging over two time points.
Methods
We conducted meta-analyses of the association between change in PTSD diagnosis and symptom severity and change in epigenetic age acceleration/deceleration (age-adjusted DNA methylation age residuals as per the Horvath and GrimAge metrics) using data from 7 military and civilian cohorts participating in the Psychiatric Genomics Consortium PTSD Epigenetics Workgroup (total N = 1,367).
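As a minimal illustration of the epigenetic aging metric, the sketch below computes age-adjusted DNA methylation age residuals (epigenetic age acceleration) by regressing an epigenetic age estimate (e.g., Horvath or GrimAge) on chronological age; variable names are illustrative, and the Workgroup's actual per-cohort pipeline may differ.

```python
# Sketch: age-adjusted epigenetic age residuals ("age acceleration").
# Residuals from a simple linear regression of DNAm age on chronological age
# are the metric meta-analyzed above.
import numpy as np

def age_acceleration(chron_age, dnam_age):
    """Return residuals of DNAm age regressed on chronological age."""
    chron_age = np.asarray(chron_age, dtype=float)
    dnam_age = np.asarray(dnam_age, dtype=float)
    slope, intercept = np.polyfit(chron_age, dnam_age, deg=1)
    return dnam_age - (intercept + slope * chron_age)

# Positive residuals = epigenetically older than expected for chronological age.
resid_t1 = age_acceleration([30, 45, 60, 52], [33, 44, 67, 55])
print(np.round(resid_t1, 2))
```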
Results
Meta-analysis revealed that the interaction between Time 1 (T1) Horvath age residuals and new-onset PTSD over time was significantly associated with Horvath age residuals at T2 (meta β = 0.16, meta p = 0.02, p-adj = 0.03). The interaction between T1 Horvath age residuals and changes in PTSD symptom severity over time was significantly related to Horvath age residuals at T2 (meta β = 0.24, meta p = 0.05). No associations were observed for GrimAge residuals.
Conclusions
Results indicated that individuals who developed new-onset PTSD or showed increased PTSD symptom severity over time evidenced greater epigenetic age acceleration at follow-up than would be expected based on baseline age acceleration. This suggests that PTSD may accelerate biological aging over time and highlights the need for intervention studies to determine if PTSD treatment has a beneficial effect on the aging methylome.
Objectives/Goals: Early childhood obesity is a major concern for Latin American children in the U.S., with gut barrier dysfunction as a key risk factor. Diet plays a role in gut development, but few studies have focused on Latin American infants. Our objective is to identify culturally relevant introductory foods that promote in vitro gut barrier development and function.

Methods/Study Population: Pooled human milk (2.5 mL) from 6-month postpartum Hispanic mothers was combined with fruit and vegetable baby food products (2.5 g) and subjected to a 3-phase in vitro digestion system that simulates oral, gastric, and intestinal digestion. Digesta products were then anaerobically fermented for 24 hours using human stool inoculum, centrifuged, and filter sterilized. Intestinal epithelial cells (Caco-2, ATCC) were grown to confluence on 0.4 μm polystyrene transwell inserts in DMEM + 10% FBS medium and allowed to differentiate for 21 days. Highly differentiated monolayers were treated in triplicate with a 1:4 dilution of fermenta in medium. The cell experiment was conducted twice. Cell layer integrity was measured using transepithelial electrical resistance (TEER) 24 and 48 hours after treatment.

Results/Anticipated Results: Dietary intake data from the What We Eat in America database indicated that the top 3 fruit and vegetable exposures for infants with Mexican or Hispanic ethnicity were banana, apple, and carrot. Commercial baby food purees of these fruits and vegetables, in addition to baby foods with blueberry and spinach (Natural for Baby, Gerber Products Company), were acquired for digestion and fermentation experiments. Caco-2 cell experiments with these foods are ongoing. We expect that Caco-2 monolayers incubated with fermenta from human milk and fruit or vegetables will have greater TEER values, reflecting increased integrity of the cell layer, compared with those exposed to breast milk alone. We also expect that exposure to fruit and vegetable fermenta will increase gene expression of tight junction proteins compared with exposure to media and human milk.

Discussion/Significance of Impact: Using an in vitro digestion and fermentation system coupled with cell culture studies, we are identifying cellular mechanisms that link individual fruits and vegetables to gut barrier function. This will support translational work focused on mitigating obesity development in vulnerable populations.
The recommended first-line treatment for insomnia is cognitive behavioral therapy for insomnia (CBTi), but access is limited. Telehealth- and internet-delivered CBTi are alternative ways to increase access. To date, these intervention modalities have never been compared within a single study. Further, few studies have examined (a) predictors of response to the different modalities, (b) whether successfully treating insomnia can improve health-related biomarkers, and (c) mechanisms of change in CBTi. This protocol was designed to compare the three CBTi modalities with each other and a waitlist control for adults aged 50–65 years (N = 100). Participants are randomly assigned to one of four study arms: in-person- (n = 30), telehealth- (n = 30), or internet-delivered (n = 30) CBTi, or a 12-week waitlist control (n = 10). Outcomes include self-reported insomnia symptom severity, polysomnography, circadian rhythms of activity and core body temperature, blood- and sweat-based biomarkers, cognitive functioning, and magnetic resonance imaging.
Psychopathology assessed across the lifespan can often be summarized with a few broad dimensions: internalizing, externalizing, and psychosis/thought disorder. Extensive overlap between internalizing and externalizing symptoms has garnered interest in bifactor models comprising a general co-occurring factor and specific internalizing and externalizing factors. We focus on internalizing and externalizing symptoms and compare a bifactor model to a correlated two-factor model of psychopathology at three timepoints in a large adolescent community sample (N = 387; 55% female; 83% Caucasian; M age = 12.1 at wave 1) using self- and parent-reports. Each model was tested within each timepoint with 25–28 validators. The bifactor models demonstrated better fit to the data. Child report showed stronger invariance across time; parent report showed stronger reliability over time. Cross-informant correlations between the factors at each wave indicated that the bifactor model had slightly poorer convergent validity but stronger discriminant validity than the two-factor model. With notable exceptions, this pattern of results replicated across informants and waves. The overlap between internalizing and externalizing pathology is systematically and, sometimes, non-linearly related to risk factors and maladaptive outcomes. Strengths and weaknesses of modeling psychopathology as two or three factors, along with clinical and developmental design implications, are discussed.
Animal foods, especially dairy products, eggs and fish, are the main source of iodine in the UK. However, the use of plant-based alternative products (PBAP) is increasing owing to issues of environmental sustainability. We previously measured the iodine content of milk-alternatives(1) but data are lacking on the iodine content of other plant-based products and there is now a greater number of iodine-fortified products. We aimed to compare: (i) the iodine concentration of fortified and unfortified PBAP and (ii) the iodine concentration of PBAP with their animal-product equivalents, including those not previously measured such as egg and fish alternatives.
The iodine concentration of 50 PBAPs was analysed in March 2022 at LGC using ICP-MS. The products were selected from a market survey of six UK supermarkets in December 2021. Samples of matrix-matched (e.g. soya/oat) fortified and unfortified alternatives to milk (n = 13 and n = 11), yoghurt (n = 2 and n = 7) and cream (n = 1 and n = 5) were selected for analysis, as well as egg- (n = 1) and fish-alternatives (n = 10). We compared the iodine concentration of PBAPs with data on their animal-product equivalents(2).
The iodine concentration of fortified PBAPs was significantly higher than that of unfortified products; the median iodine concentration of fortified vs. unfortified milk alternatives was 321 vs. 0.84 µg/kg (p<0.001) and of fortified vs. unfortified yoghurt alternatives was 212 vs. 3.03 µg/kg (p = 0.04). The fortified cream alternative had a higher iodine concentration than the unfortified alternatives (259 vs. 26.5 µg/kg). The measured iodine concentration of the fortified products differed from the labelled value (some lower, some higher); overall, the measured iodine concentration was significantly higher than that stated on the label (mean difference 49.1 µg/kg; p = 0.018).
Compared to the animal-product equivalents, the iodine concentration of unfortified PBAPs was significantly lower for milk (p<0.001) and yoghurt (p<0.001), while there was no difference for the fortified versions of milk (p = 0.28) and yoghurt (p = 0.09). The egg alternative had an iodine concentration that was just 0.6% of that of chicken eggs (3.38 vs. 560 µg/kg). Three (30%) of the fish alternatives contained kelp/seaweed as ingredients, and the median iodine concentration of these products was (non-significantly) higher than that of those without (126 vs. 75 μg/kg; p = 0.83). However, the iodine content of all fish-alternative products was approximately ten times lower than that of fish (median 99 vs. 995 µg/kg; p<0.001).
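For readers who want to reproduce this kind of comparison, the sketch below applies a nonparametric two-group test to fortified vs. unfortified iodine concentrations. The abstract does not name the exact test used, so the Mann-Whitney U test and the values here are assumptions for illustration only.

```python
# Sketch: comparing iodine concentrations of fortified vs. unfortified
# plant-based milk alternatives with a nonparametric test, consistent with the
# medians reported above. The values are illustrative, not the study data.
from scipy.stats import mannwhitneyu

fortified_ug_per_kg = [321, 280, 350, 410, 295]      # hypothetical values
unfortified_ug_per_kg = [0.84, 1.2, 0.5, 2.1, 0.9]   # hypothetical values

stat, p_value = mannwhitneyu(fortified_ug_per_kg, unfortified_ug_per_kg,
                             alternative="two-sided")
print(f"U = {stat:.1f}, p = {p_value:.4f}")
```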
The majority of PBAPs are not fortified with iodine, but those that are fortified have a significantly higher iodine concentration than unfortified products and are closer to the value of their animal-product equivalents. From an iodine perspective, unfortified plant-based alternatives are not suitable replacements, and consumers should ensure adequate iodine intake from other dietary sources. Manufacturers should consider iodine fortification of a greater number of plant-based alternatives.
Being married may protect late-life cognition. Less is known about living arrangements among unmarried adults and about mechanisms such as brain health (BH) and cognitive reserve (CR) across race and ethnicity or sex/gender. The current study examines (1) associations between marital status, BH, and CR among diverse older adults and (2) whether living arrangement is linked to BH and CR among unmarried adults.
Method:
Cross-sectional data come from the Washington Heights-Inwood Columbia Aging Project (N = 778, 41% Hispanic, 33% non-Hispanic Black, 25% non-Hispanic White; 64% women). Magnetic resonance imaging (MRI) markers of BH included cortical thickness in Alzheimer’s disease signature regions and hippocampal, gray matter, and white matter hyperintensity volumes. CR was residual variance in an episodic memory composite after partialing out MRI markers. Exploratory analyses stratified by race and ethnicity and sex/gender and included potential mediators.
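A minimal sketch of the residual-based CR approach described above follows: memory performance is regressed on the MRI brain-health markers and the residual is taken as CR. Variable names and values are illustrative, not the study's data.

```python
# Sketch: residual-based cognitive reserve (CR), assuming a memory composite
# and a matrix of MRI brain-health markers (cortical thickness, hippocampal /
# gray matter / WMH volumes). CR is the part of memory performance not
# explained by those markers.
import numpy as np

def residual_cr(memory_composite, mri_markers):
    """Residuals of memory regressed on MRI markers (with intercept)."""
    y = np.asarray(memory_composite, dtype=float)
    X = np.column_stack([np.ones(len(y)), np.asarray(mri_markers, dtype=float)])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return y - X @ beta

# Example with 4 participants and 2 hypothetical MRI markers:
cr = residual_cr([0.2, -0.5, 1.1, 0.3],
                 [[2.4, 7.1], [2.2, 6.5], [2.6, 7.8], [2.5, 7.0]])
print(np.round(cr, 3))
```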
Results:
Marital status was associated with CR, but not BH. Those who were previously married (i.e., divorced, widowed, or separated) had lower CR than their married counterparts in the full sample, among White and Hispanic subgroups, and among women. Never-married women also had lower CR than married women. These findings were independent of age, education, physical health, and household income. Among never-married individuals, living with others was negatively linked to BH.
Conclusions:
Marriage may protect late-life cognition via CR. Findings also highlight differential effects across race and ethnicity and sex/gender. Marital status could be considered when assessing the risk of cognitive impairment during routine screenings.
Background: Treatment of generalized myasthenia gravis (gMG) with reduced steroid dosages may minimize steroid-associated adverse events (AEs). Corticosteroid dosage changes were not permitted during the 26-week CHAMPION MG study of ravulizumab in adults with anti-acetylcholine receptor antibody-positive (AChRAb+) gMG. Participants who completed the study could receive ravulizumab in the open-label extension (OLE; NCT03920293), in which corticosteroid adjustments were permitted.

Methods: Patients could receive intravenous ravulizumab (blind induction or bridging dose at Week 26 [OLE start] for those previously receiving placebo or ravulizumab, respectively, then 3000–3600 mg at Week 28 and every 8 weeks thereafter) for ≤4 years.

Results: Among 161 patients (78 ravulizumab, 83 placebo) who entered the OLE and received ravulizumab for ≤164 weeks, 113 received oral or enteral corticosteroids during the OLE; the proportion treated with >10 mg/day corticosteroids decreased from 58% (n=66) at the first OLE dose to 37% (n=42) at the last reported dose (35 [31%] received ≤5 mg/day and 71 [63%] received ≤10 mg/day). Fourteen patients (12%) discontinued corticosteroids. The mean (SD) corticosteroid dosage per patient decreased from 17.5 (11.9) mg/day at the first OLE dose to 11.7 (10.9) mg/day at the last assessment.

Conclusions: Ravulizumab decreased corticosteroid use in patients with AChRAb+ gMG, suggesting a steroid-sparing role for ravulizumab.
To provide a systematic synthesis of primary care practice-based interventions and their effect on participation in population-based cancer screening programs.
Background:
Globally, population-based cancer screening programs (bowel, breast, and cervical) have sub-optimal participation rates. Primary healthcare workers (PHCWs) have an important role in facilitating a patient’s decision to screen; however, barriers exist to their engagement. It remains unclear how to best optimize the role of PHCWs to increase screening participation.
Methods:
A comprehensive search was conducted from January 2010 until November 2023 in the following databases: Medline (OVID), EMBASE, and CINAHL. Data extraction, quality assessment, and synthesis were conducted. Studies were separated by whether they assessed the effect of a single-component or multi-component intervention and study type.
Findings:
Forty-nine studies were identified, of which 36 originated from the USA. Fifteen studies were investigations of single-component interventions, and 34 studies were of multi-component interventions. Interventions with a positive effect on screening participation were predominantly multi-component, and most included combinations of audit and feedback, provider reminders, practice-facilitated assessment and improvement, and patient education across all screening programs. Regarding bowel screening, provision of screening kits at point-of-care was an effective strategy to increase participation. Taking a ‘whole-of-practice approach’ and identifying a ‘practice champion’ were found to be contextual factors of effective interventions.
The findings suggest that complex interventions comprising practitioner-focused and patient-focused components are required to increase cancer screening participation in primary care settings. This study provides novel understanding of which components and contextual factors should be included in primary care practice-based interventions.
Bentonites are readily available clays used in the livestock industry as feed additives to reduce aflatoxin (AF) exposure; however, their potential interaction with nutrients is the main concern limiting their use. The objective of the present study was to determine the safety of a dietary sodium-bentonite (Na-bentonite) supplement as a potential AF adsorbent, using juvenile Sprague Dawley (SD) rats as a research model. Animals were fed either a control diet or a diet containing Na-bentonite at a 0.25% or 2% (w/w) inclusion rate. Growth, serum, and blood biochemical parameters, including selected serum vitamins (A and E) and elements such as calcium (Ca), potassium (K), iron (Fe), and zinc (Zn), were measured. The mineral characteristics and the aflatoxin B1 sorption capacity of Na-bentonite were also determined. By the end of the study, males gained more weight than females in both the control and Na-bentonite groups (p ≤ 0.0001); however, the interaction between treatment and sex was not significant (p = 0.6780). Some significant differences between the control group and bentonite treatments were observed in serum biochemistry and vitamin and mineral measurements; however, parameters fell within reference clinical values reported for SD rats and no evidence of dose dependency was found. Serum Na and Na/K ratios were increased, while K levels were decreased, in males and females from the Na-bentonite groups. Serum Zn levels were decreased only in males from the Na-bentonite treatments. Overall, results showed that inclusion of Na-bentonite at 0.25% and 2% did not cause any observable toxicity in a 3-month rodent study.
Cognitive training is a non-pharmacological intervention aimed at improving cognitive function across a single or multiple domains. Although the underlying mechanisms of cognitive training and transfer effects are not well-characterized, cognitive training has been thought to facilitate neural plasticity to enhance cognitive performance. Indeed, the Scaffolding Theory of Aging and Cognition (STAC) proposes that cognitive training may enhance the ability to engage in compensatory scaffolding to meet task demands and maintain cognitive performance. We therefore evaluated the effects of cognitive training on working memory performance in older adults without dementia. This study will help begin to elucidate non-pharmacological intervention effects on compensatory scaffolding in older adults.
Participants and Methods:
48 participants were recruited for a Phase III randomized clinical trial (Augmenting Cognitive Training in Older Adults [ACT]; NIH R01AG054077) conducted at the University of Florida and University of Arizona. Participants across sites were randomly assigned to complete cognitive training (n=25) or an education training control condition (n=23). Cognitive training and the education training control condition were each completed during 60 sessions over 12 weeks for 40 hours total. The education training control condition involved viewing educational videos produced by the National Geographic Channel. Cognitive training was completed using the Posit Science Brain HQ training program, which included 8 cognitive training paradigms targeting attention/processing speed and working memory. All participants also completed demographic questionnaires, cognitive testing, and an fMRI 2-back task at baseline and at 12 weeks following cognitive training.
Results:
Repeated measures analysis of covariance (ANCOVA), adjusted for training adherence, transcranial direct current stimulation (tDCS) condition, age, sex, years of education, and Wechsler Test of Adult Reading (WTAR) raw score, revealed a significant 2-back × training group interaction (F[1,40]=6.201, p=.017, η²=.134). Examination of simple main effects revealed baseline differences in 2-back performance (F[1,40]=.568, p=.455, η²=.014). After controlling for baseline performance, training group differences in 2-back performance were no longer statistically significant (F[1,40]=1.382, p=.247, η²=.034).
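As an illustration of the covariate-adjusted group comparison reported above, the sketch below fits an ANCOVA-style model of post-training 2-back accuracy on training group and baseline performance. The trial's actual model was a repeated-measures ANCOVA with the additional covariates listed; the data frame and column names here are hypothetical.

```python
# Sketch: ANCOVA-style test of training-group differences in post-training
# 2-back accuracy, adjusting for baseline performance and age.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "group": ["CT", "CT", "ET", "ET", "CT", "ET"],        # cognitive vs. education training
    "nback_baseline": [0.72, 0.65, 0.70, 0.68, 0.61, 0.74],
    "nback_post":     [0.78, 0.70, 0.71, 0.69, 0.66, 0.73],
    "age":            [70, 74, 68, 72, 75, 69],
})

model = smf.ols("nback_post ~ C(group) + nback_baseline + age", data=df).fit()
print(model.summary().tables[1])   # the group coefficient tests the adjusted difference
```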
Conclusions:
After adjusting for baseline performance differences, there were no significant training group differences in 2-back performance, suggesting that randomization was not sufficient to ensure adequate distribution of participants across groups. Results may indicate that cognitive training alone is not sufficient for significant improvement in working memory performance on a near-transfer task. Additional improvement may occur in the next phase of this clinical trial, such that tDCS augments the effects of cognitive training and results in enhanced compensatory scaffolding even within this high-performing cohort. Limitations of the study include a highly educated sample with high literacy levels and a small sample size that was not powered for transfer-effects analyses. Future analyses will evaluate the combined effects of cognitive training and tDCS on n-back performance in a larger sample of older adults without dementia.
Cognitive training has shown promise for improving cognition in older adults. Aging involves a variety of neuroanatomical changes that may affect response to cognitive training. White matter hyperintensities (WMH) are one common age-related brain change, as evidenced by T2-weighted and Fluid Attenuated Inversion Recovery (FLAIR) MRI. WMH are associated with older age, are suggestive of cerebral small vessel disease, and reflect decreased white matter integrity. Higher WMH load is associated with a reduced threshold for clinical expression of cognitive impairment and dementia. The effects of WMH on response to cognitive training interventions are relatively unknown. The current study assessed (a) proximal cognitive training performance following a 3-month randomized controlled trial and (b) the contribution of baseline whole-brain WMH load, defined as total lesion volume (TLV), to pre-post proximal training change.
Participants and Methods:
Sixty-two healthy older adults aged 65–84 completed either adaptive cognitive training (CT; n=31) or educational training control (ET; n=31) interventions. Participants assigned to CT completed 20 hours of attention/processing speed training and 20 hours of working memory training delivered through the commercially available Posit Science BrainHQ program. ET participants completed 40 hours of educational videos. All participants also underwent sham or active transcranial direct current stimulation (tDCS) as an adjunctive intervention, although this was not a variable of interest in the current study. Multimodal MRI scans were acquired during the baseline visit. T1-weighted and T2-weighted FLAIR images were processed using the Lesion Segmentation Tool (LST) for SPM12. The Lesion Prediction Algorithm of LST automatically segmented brain tissue and calculated lesion maps. A lesion threshold of 0.30 was applied to calculate TLV. A log transformation was applied to TLV to normalize the distribution of WMH. Repeated-measures analysis of covariance (RM-ANCOVA) assessed pre/post change in proximal composite (Total Training Composite) and sub-composite (Processing Speed Training Composite, Working Memory Training Composite) measures in the CT group compared to their ET counterparts, controlling for age, sex, years of education, and tDCS group. Linear regression assessed the effect of TLV on post-intervention proximal composite and sub-composite scores, controlling for baseline performance, intervention assignment, age, sex, years of education, multisite scanner differences, estimated total intracranial volume, and binarized cardiovascular disease risk.
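A minimal sketch of the TLV step follows, assuming an LST lesion-probability map saved as NIfTI and read with nibabel; the 0.30 threshold and log transform mirror the description above, while the file path is a placeholder.

```python
# Sketch: total lesion volume (TLV) from a lesion-probability map, thresholded
# at 0.30, followed by the log transform used to normalize the WMH distribution.
import numpy as np
import nibabel as nib

def total_lesion_volume_ml(lesion_map_path, threshold=0.30):
    img = nib.load(lesion_map_path)
    prob = img.get_fdata()
    voxel_vol_mm3 = np.prod(img.header.get_zooms()[:3])   # mm^3 per voxel
    lesion_voxels = np.count_nonzero(prob >= threshold)
    return lesion_voxels * voxel_vol_mm3 / 1000.0          # mm^3 -> mL

# Placeholder usage (path is hypothetical):
# tlv = total_lesion_volume_ml("sub-01_lesion_prob_map.nii.gz")
# log_tlv = np.log(tlv + 1)   # log transform applied before the regression models
```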
Results:
RM-ANCOVA revealed two-way group × time interactions such that those assigned to cognitive training demonstrated greater improvement on proximal composite (Total Training Composite) and sub-composite (Processing Speed Training Composite, Working Memory Training Composite) measures compared to their ET counterparts. Multiple linear regression showed that higher baseline TLV was associated with smaller pre-post change on the Processing Speed Training sub-composite (β = -0.19, p = 0.04) but not on other composite measures.
Conclusions:
These findings demonstrate the utility of cognitive training for improving post-intervention proximal performance in older adults. Additionally, pre-post change in processing speed training appears to be more sensitive to white matter hyperintensity load than change in working memory training. These data suggest that TLV may be an important factor to consider when planning processing speed-based cognitive training interventions for remediation of cognitive decline in older adults.
Interventions using a cognitive training paradigm called the Useful Field of View (UFOV) task have been shown to be efficacious in slowing cognitive decline. However, no studies have examined the engagement of functional networks during UFOV task completion. The current study aimed to (a) assess whether regions activated during the UFOV fMRI task were functionally connected and related to task performance (henceforth called the UFOV network), (b) compare connectivity of the UFOV network with 7 resting-state functional connectivity networks in predicting proximal (UFOV) and near-transfer (Double Decision) performance, and (c) explore the relationship between higher-order network segregation and UFOV performance.
Participants and Methods:
336 healthy older adults (mean age = 71.6) completed the UFOV fMRI task in a Siemens 3T scanner. UFOV fMRI accuracy was calculated as the number of correct responses divided by the 56 total trials. Double Decision performance was calculated as the average presentation time of correct responses in log ms, with lower scores equating to better processing speed. Structural and functional MRI images were processed using the default pre-processing pipeline within the CONN toolbox. The Artifact Rejection Toolbox was set at a motion threshold of 0.9 mm, and participants were excluded if more than 50% of volumes were flagged as outliers. To assess connectivity of regions associated with the UFOV task, we created 10 spherical regions of interest (ROIs) a priori using the WFU PickAtlas in SPM12: the bilateral pars triangularis, supplementary motor area, and inferior temporal gyri, as well as the left pars opercularis, left middle occipital gyrus, right precentral gyrus, and right superior parietal lobule. We used a weighted ROI-to-ROI connectivity analysis to model task-based within-network functional connectivity of the UFOV network and its relationship to UFOV accuracy. We then used weighted ROI-to-ROI connectivity analysis to compare the efficacy of the UFOV network versus 7 resting-state networks in predicting UFOV fMRI task performance and Double Decision performance. Finally, we calculated network segregation among higher-order resting-state networks to assess its relationship with UFOV accuracy. All functional connectivity analyses were corrected at a false discovery rate (FDR) threshold of p<0.05.
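The two behavioral scores defined above can be written out directly; the sketch below assumes log base 10 for the Double Decision metric and uses illustrative inputs.

```python
# Sketch: UFOV fMRI accuracy (correct responses out of 56 trials) and Double
# Decision performance (mean presentation time of correct responses in log ms,
# lower = faster processing). Input structures are illustrative.
import math

def ufov_accuracy(n_correct, n_trials=56):
    return n_correct / n_trials

def double_decision_score(presentation_ms, correct_flags):
    """Mean log10 presentation time (ms) across correct trials only."""
    correct_times = [t for t, ok in zip(presentation_ms, correct_flags) if ok]
    return sum(math.log10(t) for t in correct_times) / len(correct_times)

print(ufov_accuracy(49))                                                 # 0.875
print(round(double_decision_score([120, 400, 250], [True, False, True]), 3))
```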
Results:
ROI-to-ROI analysis showed significant within-network functional connectivity among the 10 a priori ROIs (the UFOV network) during task completion (all pFDR<.05). After controlling for covariates, greater within-network connectivity of the UFOV network was associated with better UFOV fMRI performance (pFDR=.008). Regarding the 7 resting-state networks, greater within-network connectivity of the cingulo-opercular network (CON; pFDR<.001) and frontoparietal control network (FPCN; pFDR=.014) was associated with higher accuracy on the UFOV fMRI task. Furthermore, greater within-network connectivity of only the UFOV network was associated with performance on the Double Decision task (pFDR=.034). Finally, we assessed the relationship between higher-order network segregation and UFOV accuracy; after controlling for covariates, no significant relationships between network segregation and UFOV performance were observed (all p-uncorrected>0.05).
Conclusions:
To date, this is the first study to assess task-based functional connectivity during completion of the UFOV task. We observed that coherence within 10 a priori ROIs significantly predicted UFOV performance. Additionally, enhanced within-network connectivity of the UFOV network predicted better performance on the Double Decision task, while conventional resting-state networks did not. These findings provide potential targets to optimize efficacy of UFOV interventions.
Aging is associated with disruptions in functional connectivity within the default mode (DMN), frontoparietal control (FPCN), and cingulo-opercular (CON) resting-state networks. Greater within-network connectivity predicts better cognitive performance in older adults. Therefore, strengthening network connectivity, through targeted intervention strategies, may help prevent age-related cognitive decline or progression to dementia. Small studies have demonstrated synergistic effects of combining transcranial direct current stimulation (tDCS) and cognitive training (CT) on strengthening network connectivity; however, this association has yet to be rigorously tested on a large scale. The current study leverages longitudinal data from the first-ever Phase III clinical trial for tDCS to examine the efficacy of an adjunctive tDCS and CT intervention on modulating network connectivity in older adults.
Participants and Methods:
This sample included 209 older adults (mean age = 71.6) from the Augmenting Cognitive Training in Older Adults multisite trial. Participants completed 40 hours of CT over 12 weeks, which included 8 attention, processing speed, and working memory tasks. Participants were randomized into active or sham stimulation groups, and tDCS was administered during CT daily for two weeks, then weekly for 10 weeks. For both stimulation groups, two electrodes in saline-soaked 5 × 7 cm² sponges were placed at F3 (cathode) and F4 (anode) using the 10-20 measurement system. The active group received 2 mA of current for 20 minutes. The sham group received 2 mA for 30 seconds, then no current for the remaining 20 minutes.
Participants underwent resting-state fMRI at baseline and post-intervention. The CONN toolbox was used to preprocess imaging data and conduct region of interest (ROI-ROI) connectivity analyses. The Artifact Detection Toolbox, using intermediate settings, identified outlier volumes. Two participants were excluded for having greater than 50% of volumes flagged as outliers. ROI-ROI analyses modeled the interaction between tDCS group (active versus sham) and occasion (baseline versus post-intervention connectivity) for the DMN, FPCN, and CON, controlling for age, sex, education, site, and adherence.
Results:
Compared to sham, the active group demonstrated ROI-ROI increases in functional connectivity within the DMN following intervention (left temporal to right temporal [T(202) = 2.78, pFDR < 0.05] and left temporal to right dorsal medial prefrontal cortex [T(202) = 2.74, pFDR < 0.05]). In contrast, compared to sham, the active group demonstrated ROI-ROI decreases in functional connectivity within the FPCN following intervention (left dorsal prefrontal cortex to left temporal [T(202) = -2.96, pFDR < 0.05] and left dorsal prefrontal cortex to left lateral prefrontal cortex [T(202) = -2.77, pFDR < 0.05]). There were no significant interactions detected for CON regions.
Conclusions:
These findings (a) demonstrate the feasibility of modulating network connectivity using tDCS and CT and (b) provide important information regarding the pattern of connectivity changes occurring at these intervention parameters in older adults. Importantly, the active stimulation group showed increases in connectivity within the DMN (a network particularly vulnerable to aging and implicated in Alzheimer’s disease) but decreases in connectivity between left frontal and temporal FPCN regions. Future analyses from this trial will evaluate the association between these changes in connectivity and cognitive performance post-intervention and at a one-year timepoint.
Non-Hispanic Black older adults experience a disproportionate burden of Alzheimer’s Disease and related dementias (ADRD) risk compared to non-Hispanic White older adults. It is necessary to identify mechanisms that may be contributing to inequities in cognitive aging. Psychosocial stressors that disproportionately affect Black adults (e.g., discrimination) have the potential to impact brain health through stress pathways. The brain’s white matter, which appears to be particularly important for ADRD risk among Black older adults, may be uniquely vulnerable to stress-related physiological dysfunction. To further understand whether and how discrimination can affect ADRD risk, this study aimed to examine associations between multiple forms of racial discrimination and white matter integrity, operationalized through diffusion tensor imaging.
Participants and Methods:
Cross-sectional data were obtained from 190 non-Hispanic Black residents aged 65+ without dementia in northern Manhattan. Racial discrimination was self-reported using the Everyday Discrimination and Major Experiences of Lifetime Discrimination scales. Example items from the Everyday Discrimination Scale include: “You are treated with less respect than other people”; “You are called names or insulted.” Example items from the Major Experiences of Lifetime Discrimination Scale include: “At any time in your life, have you ever been unfairly fired from a job?”; “Have you ever been unfairly denied a bank loan?” Racial discrimination was operationalized as experiences attributed to “race” or “skin color.” White matter integrity was assessed using fractional anisotropy (FA) via diffusion tensor imaging. Multivariable regression models evaluated the unique effects of everyday and major experiences of lifetime racial discrimination on mean FA in the whole brain and specific regions. Initial models controlled for age, sex/gender, intracranial volume, and white matter hyperintensities. Subsequent models additionally controlled for socioeconomic and health factors to consider potential confounders or mediators of the relationship between discrimination and white matter integrity.
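As a sketch of the stepwise covariate strategy described above, the code below fits a base model (demographics, intracranial volume, WMH) and an extended model (adding socioeconomic and health covariates) and compares the discrimination coefficient. All variable names and the simulated data are hypothetical stand-ins, not the study's measures.

```python
# Sketch: base vs. extended regression of regional mean FA on major lifetime
# discrimination, to check whether the coefficient attenuates once potential
# confounders/mediators are added. Data are simulated for illustration.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 190
df = pd.DataFrame({
    "major_discrimination": rng.integers(0, 7, n),
    "age": rng.integers(65, 90, n),
    "sex": rng.integers(0, 2, n),
    "icv": rng.normal(1400, 100, n),
    "wmh_volume": rng.gamma(2.0, 2.0, n),
    "education": rng.integers(6, 21, n),
    "depression": rng.normal(5, 3, n),
    "cardiovascular_disease": rng.integers(0, 2, n),
})
df["mean_fa_cingulum"] = 0.45 - 0.002 * df["major_discrimination"] + rng.normal(0, 0.02, n)

base = smf.ols(
    "mean_fa_cingulum ~ major_discrimination + age + sex + icv + wmh_volume",
    data=df).fit()
extended = smf.ols(
    "mean_fa_cingulum ~ major_discrimination + age + sex + icv + wmh_volume"
    " + education + depression + cardiovascular_disease",
    data=df).fit()

# Attenuation check: does the discrimination coefficient shrink in the extended model?
print(base.params["major_discrimination"], extended.params["major_discrimination"])
```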
Results:
Major experiences of lifetime discrimination were negatively associated with mean FA within the left cingulum cingulate gyrus and the right inferior fronto-occipital fasciculus. These associations persisted when controlling for additional covariates (i.e., education, depression, and cardiovascular diseases). In contrast, major experiences of lifetime discrimination were positively associated with mean FA within the right superior longitudinal fasciculus (temporal part). This association was attenuated when controlling for additional covariates. Everyday racial discrimination was not associated with mean FA in any regions.
Conclusions:
These results extend prior work linking racial discrimination to brain health and provide evidence for both risk and resilience among Black older adults. Major experiences of lifetime racial discrimination, a proxy for institutional racism, may have a stronger effect on white matter integrity than everyday racial discrimination, a proxy for interpersonal racism. Educational opportunities and cardiovascular risk factors may represent mediators between racial discrimination and white matter integrity. White matter integrity within specific brain regions may be a mechanism through which racially patterned social stressors contribute to racial disparities in ADRD. Future research should characterize within-group heterogeneity in order to identify factors that promote resilience among Black older adults.
Nonpathological aging has been linked to decline in both verbal and visuospatial memory abilities in older adults. Disruptions in resting-state functional connectivity within well-characterized, higher-order cognitive brain networks have also been coupled with poorer memory functioning in healthy older adults and in older adults with dementia. However, there is a paucity of research on the association between higher-order functional connectivity and verbal and visuospatial memory performance in the older adult population. The current study examines the association between resting-state functional connectivity within the cingulo-opercular network (CON), frontoparietal control network (FPCN), and default mode network (DMN) and verbal and visuospatial learning and memory in a large sample of healthy older adults. We hypothesized that greater within-network CON and FPCN functional connectivity would be associated with better immediate verbal and visuospatial memory recall. Additionally, we predicted that greater within-network DMN functional connectivity would be associated with better delayed verbal and visuospatial memory recall. This study provides insight into whether within-network CON, FPCN, or DMN functional connectivity is associated with verbal and visuospatial memory abilities in later life.
Participants and Methods:
330 healthy older adults between 65 and 89 years old (mean age = 71.6 ± 5.2) were recruited at the University of Florida (n = 222) and the University of Arizona (n = 108). Participants underwent resting-state fMRI and completed verbal memory (Hopkins Verbal Learning Test - Revised [HVLT-R]) and visuospatial memory (Brief Visuospatial Memory Test - Revised [BVMT-R]) measures. Immediate (total) and delayed recall scores on the HVLT-R and BVMT-R were calculated using each test manual’s scoring criteria. Learning ratios on the HVLT-R and BVMT-R were quantified by dividing the number of stimuli (verbal or visuospatial) learned between the first and third trials by the number of stimuli not recalled after the first learning trial. CONN Toolbox was used to extract average within-network connectivity values for CON, FPCN, and DMN. Hierarchical regressions were conducted, controlling for sex, race, ethnicity, years of education, number of invalid scans, and scanner site.
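The learning ratio defined above reduces to a one-line formula; the sketch below implements it, with maximum item counts and example scores treated as illustrative (e.g., 12 words on the HVLT-R).

```python
# Sketch: learning ratio = items gained from trial 1 to trial 3, divided by the
# items still available to learn after trial 1.
def learning_ratio(trial1_recall, trial3_recall, max_items):
    not_yet_recalled = max_items - trial1_recall
    if not_yet_recalled == 0:
        return float("nan")   # perfect trial-1 recall leaves nothing to learn
    return (trial3_recall - trial1_recall) / not_yet_recalled

print(learning_ratio(trial1_recall=5, trial3_recall=10, max_items=12))  # ~0.714
```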
Results:
Greater CON connectivity was significantly associated with better HVLT-R immediate (total) recall (β = 0.16, p = 0.01), HVLT-R learning ratio (β = 0.16, p = 0.01), BVMT-R immediate (total) recall (β = 0.14, p = 0.02), and BVMT-R delayed recall performance (β = 0.15, p = 0.01). Greater FPCN connectivity was associated with better BVMT-R learning ratio (β = 0.13, p = 0.04). HVLT-R delayed recall performance was not associated with connectivity in any network, and DMN connectivity was not significantly related to any measure.
Conclusions:
Connectivity within the CON demonstrated a robust relationship with multiple components of memory function across both verbal and visuospatial domains. In contrast, the FPCN only evidenced a relationship with visuospatial learning, and the DMN was not significantly associated with any memory measure. These data suggest that the CON may be a valuable target in longitudinal studies of age-related memory change, as well as a possible target in future non-invasive interventions to attenuate memory decline in older adults.
Randomised controlled trials (RCTs) of psilocybin have reported large antidepressant effects in adults with major depressive disorder and treatment-resistant depression (TRD). Given psilocybin's psychedelic effects, all published studies have included psychological support. These effects depend on serotonin 2A (5-HT2A) receptor activation, which can be blocked by 5-HT2A receptor antagonists like ketanserin or risperidone. In an animal model of depression, ketanserin followed by psilocybin had similar symptomatic effects as psilocybin alone.
Aims
To conduct a proof-of-concept RCT to (a) establish feasibility and tolerability of combining psilocybin and risperidone in adults with TRD, (b) show that this combination blocks the psychedelic effects of psilocybin and (c) provide pilot data on the antidepressant effect of this combination (compared with psilocybin alone).
Method
In a 4-week, three-arm, ‘double dummy’ trial, 60 adults with TRD will be randomised to psilocybin 25 mg plus risperidone 1 mg, psilocybin 25 mg plus placebo, or placebo plus risperidone 1 mg. All participants will receive 12 h of manualised psychotherapy. Measures of feasibility will include recruitment and retention rates; tolerability and safety will be assessed by rates of drop-out attributed to adverse events and rates of serious adverse events. The 5-Dimensional Altered States of Consciousness Rating Scale will be a secondary outcome measure.
Results
This trial will advance the understanding of psilocybin's mechanism of antidepressant action.
Conclusions
This line of research could increase acceptability and access to psilocybin as a novel treatment for TRD without the need for a psychedelic experience and continuous monitoring.
Frontal ablation, the combination of submarine melting and iceberg calving, changes the geometry of a glacier's terminus, influencing glacier dynamics, the fate of upwelling plumes and the distribution of submarine meltwater input into the ocean. Directly observing frontal ablation and terminus morphology below the waterline is difficult, however, limiting our understanding of these coupled ice–ocean processes. To investigate the evolution of a tidewater glacier's submarine terminus, we combine 3-D multibeam point clouds of the subsurface ice face at LeConte Glacier, Alaska, with concurrent observations of environmental conditions during three field campaigns between 2016 and 2018. We observe terminus morphology that was predominantly overcut (52% in August 2016, 63% in May 2017 and 74% in September 2018), accompanied by high multibeam sonar-derived melt rates (4.84 m d⁻¹ in 2016, 1.13 m d⁻¹ in 2017 and 1.85 m d⁻¹ in 2018). We find that periods of high subglacial discharge lead to localized undercutting at discharge outlets, but adjacent to these outlets the terminus maintains a significantly overcut geometry, with an ice ramp that protruded 75 m into the fjord in 2017 and 125 m in 2018. Our data challenge the assumption that tidewater glacier termini are largely undercut during periods of high submarine melting.
We examined possible sex and age differences in the proportion of experienced Coronavirus Disease 2019 (COVID-19) symptoms in adults who were unaware of a previous infection and in their uninfected counterparts, as estimated by serostatus prior to vaccination at the end of 2020 (Wuhan strain). A cross-sectional community-based study was conducted using a convenience sample of 10 001 adult inhabitants of a southern Dutch province heavily affected by COVID-19. Participants donated a blood sample to indicate past infection by serostatus (positive/negative). Experienced symptoms were assessed by questionnaire before the availability of the serological test result. Only participants without confirmed SARS-CoV-2 infection were included (n = 9715, age range 18–90 years). The seroprevalence was comparable between men (17.3%) and women (18.0%), and between participants aged 18–60 years (17.3%) and those aged 60 years and older (18.6%). We found sex and age differences in the proportion of experienced symptoms by serostatus in this large cohort of unaware (untested) seropositive participants compared with seronegative reference participants. Irritability differed by serostatus only in men (independent of age), while stomach ache, nausea and dizziness differed by serostatus only in women aged 60 years and older. In addition, the proportions experiencing pain when breathing and headache differed by serostatus only in men aged 18–60 years. Our study highlights the importance of taking possible sex and age differences into account with respect to acute and long-term COVID-19 outcomes. Identifying symptom profiles for sex and age subgroups can contribute to timely identification of infection, which gains importance as governments move away from mass testing.