This note seeks to bring awareness to the wide variety of archival documents available for research in urban history in Kumase, Ghana’s second city and capital of the historic Asante Kingdom. We draw mainly on our experiences researching the history of Jackson Park, one of colonial Kumase’s earliest public parks.
Accurate diagnosis of bipolar disorder (BPD) is difficult in clinical practice, with an average delay between symptom onset and diagnosis of about 7 years. A depressive episode often precedes the first manic episode, making it difficult to distinguish BPD from unipolar major depressive disorder (MDD).
Aims
We use genome-wide association analyses (GWAS) to identify differential genetic factors and to develop predictors based on polygenic risk scores (PRS) that may aid early differential diagnosis.
Method
Based on individual genotypes from case–control cohorts of BPD and MDD shared through the Psychiatric Genomics Consortium, we compile case–case–control cohorts, applying a careful quality control procedure. In a resulting cohort of 51 149 individuals (15 532 BPD patients, 12 920 MDD patients and 22 697 controls), we perform a variety of GWAS and PRS analyses.
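The scoring step behind a PRS can be illustrated with a minimal sketch: a PRS is a weighted sum of an individual's risk-allele counts, with weights taken from a discovery GWAS. The variant IDs and effect sizes below are hypothetical placeholders, not values from this study; real pipelines use dedicated tools (e.g., PLINK's allelic scoring) and apply clumping and p-value thresholding first.

```python
# Minimal polygenic risk score (PRS) sketch. A PRS is the sum of an
# individual's risk-allele counts (0, 1, or 2 per variant) weighted by
# per-variant effect sizes estimated in a discovery GWAS. The variant
# IDs and weights below are hypothetical illustrations, not study values.

def polygenic_risk_score(genotypes, weights):
    """genotypes: variant_id -> allele count; weights: variant_id -> effect size."""
    return sum(w * genotypes.get(v, 0) for v, w in weights.items())

weights = {"rs0001": 0.12, "rs0002": -0.05, "rs0003": 0.08}  # hypothetical
person = {"rs0001": 2, "rs0002": 1, "rs0003": 0}             # hypothetical
print(round(polygenic_risk_score(person, weights), 3))  # 0.19
```

Scores computed this way can then be compared between diagnostic groups, for example asking whether a case–case-trained PRS separates BPD-D cases from MDD cases.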
Results
Although our GWAS is not well powered to identify genome-wide significant loci, we find significant chip heritability and demonstrate the ability of the resulting PRS to distinguish BPD from MDD, including BPD cases with depressive onset (BPD-D). We replicate our PRS findings in an independent Danish cohort (iPSYCH 2015, N = 25 966). We observe strong genetic correlation between our case–case GWAS and that of case–control BPD.
Conclusions
We find that MDD and BPD, including BPD-D, are genetically distinct. Our findings support that controls, MDD and BPD patients primarily lie on a continuum of genetic risk. Future studies with larger and richer samples will likely yield a better understanding of these findings and enable the development of better genetic predictors distinguishing BPD and, importantly, BPD-D from MDD.
Recent theories have implicated inflammatory biology in the development of psychopathology and maladaptive behaviors in adolescence, including suicidal thoughts and behaviors (STB). Examining specific biological markers related to inflammation is thus warranted to better understand risk for STB in adolescents, for whom suicide is a leading cause of death.
Method:
Participants were 211 adolescent females (ages 9–14 years; Mage = 11.8 years, SD = 1.8 years) at increased risk for STB. This study examined the prospective association between basal levels of inflammatory gene expression (average of 15 proinflammatory mRNA transcripts) and subsequent risk for suicidal ideation and suicidal behavior over a 12-month follow-up period.
Results:
Controlling for past levels of STB, greater proinflammatory gene expression was associated with prospective risk for STB in these youth. Similar effects were observed for CD14 mRNA level, a marker of monocyte abundance within the blood sample. Sensitivity analyses controlling for other relevant covariates, including history of trauma, depressive symptoms, and STB prior to data collection, yielded similar patterns of results.
Conclusions:
Upregulated inflammatory signaling in the immune system is prospectively associated with STB among at-risk adolescent females, even after controlling for history of trauma, depressive symptoms, and STB prior to data collection. Additional research is needed to identify the sources of inflammatory up-regulation in adolescents (e.g., stress psychobiology, physiological development, microbial exposures) and strategies for mitigating such effects to reduce STB.
To understand healthcare workers’ (HCWs) beliefs and practices toward blood culture (BCx) use.
Design:
Cross-sectional electronic survey and semi-structured interviews.
Setting:
Academic hospitals in the United States.
Participants:
HCWs involved in BCx ordering and collection in adult intensive care units (ICU) and wards.
Methods:
We administered an anonymous electronic survey to HCWs and conducted semi-structured interviews with unit staff and quality improvement (QI) leaders in these institutions to understand their perspectives regarding BCx stewardship between February and November 2023.
Results:
Of 314 HCWs who responded to the survey, most (67.4%) were physicians and were involved in BCx ordering (82.3%). Most survey respondents reported that clinicians had a low threshold to culture patients for fever (84.4%) and agreed they could safely reduce the number of BCx obtained in their units (65%). However, only half of them believed BCx was overused. Although most made BCx decisions as a team (74.1%), a minority reported these team discussions occurred daily (42.4%). A third of respondents reported not usually collecting the correct volume per BCx bottle, half were unaware of the improved sensitivity of 2 BCx sets, and most were unsure of the nationally recommended BCx contamination threshold (87.5%). Knowledge regarding the utility of BCx for common infections was limited.
Conclusions:
HCWs’ understanding of best collection practices and yield of BCx was limited.
Affective responses to the menstrual cycle vary widely. Some individuals experience severe symptoms like those with premenstrual dysphoric disorder, while others have minimal changes. The reasons for these differences are unclear, but prior studies suggest stressor exposure may play a role. However, research in at-risk psychiatric samples is lacking.
Methods
In a large clinical sample, we conducted a prospective study of how lifetime stressors relate to degree of affective change across the cycle. 114 outpatients with past-month suicidal ideation (SI) provided daily ratings (n = 6187) of negative affect and SI across 1–3 menstrual cycles. Participants completed the Stress and Adversity Inventory (STRAIN), which measures different stressor exposures (e.g. interpersonal loss, physical danger) throughout the life course, including before and after menarche. Multilevel polynomial growth models tested the relationship between menstrual cycle time and symptoms, moderated by stressor exposure.
Results
Greater lifetime stressor exposure predicted a more pronounced perimenstrual increase in active SI, along with marginally significant similar patterns for negative affect and passive SI. Additionally, pre-menarche stressors significantly increased the cyclicity of active SI compared to post-menarche stressors. Exposure to more interpersonal loss stressors predicted greater perimenstrual symptom change of negative affect, passive SI and active SI. Exploratory item-level analyses showed that lifetime stressors moderated a more severe perimenstrual symptom trajectory for mood swings, anger/irritability, rejection sensitivity, and interpersonal conflict.
Conclusion
These findings suggest that greater lifetime stressor exposure may lead to heightened emotional reactivity to ovarian hormone fluctuations, elevating the risk of psychopathology.
Campylobacter spp. are leading bacterial gastroenteritis pathogens. Infections are largely underreported, and the burden of outbreaks may be underestimated. Current strategies of testing as few as one isolate per sample can affect attribution of cases to epidemiologically important sources with high Campylobacter diversity, such as chicken meat. Multiple culture method combinations were utilized to recover and sequence Campylobacter from 45 retail chicken samples purchased across Norwich, UK, selecting up to 48 isolates per sample. Simulations based on resampling were used to assess the impact of Campylobacter sequence type (ST) diversity on outbreak detection. Campylobacter was recovered from 39 samples (87%), although only one sample was positive through all broth, temperature, and plate combinations. Three species were identified (Campylobacter jejuni, Campylobacter coli, and Campylobacter lari), and 33% of samples contained two species. Positive samples contained 1–8 STs. Simulation revealed that up to 87 isolates per sample would be required to detect 95% of the observed ST diversity, and 26 isolates would be required for the average probability of detecting a random theoretical outbreak ST to reach 95%. An optimized culture approach and selecting multiple isolates per sample are essential for more complete Campylobacter recovery to support outbreak investigation and source attribution.
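The resampling logic behind the isolate-number estimates can be sketched as follows, using a hypothetical skewed ST mixture rather than the study's observed data: repeatedly draw n isolates with replacement from a sample's ST pool and find the smallest n at which 95% of draws capture the target share of the sample's ST diversity.

```python
import random

# Sketch of the resampling idea: given one sample's mixture of sequence
# types (STs), estimate how many isolates must be drawn at random so that
# 95% of resampling trials capture at least `target` of the ST diversity.
# The ST pool below is a hypothetical skewed mixture, not study data.

def isolates_needed(st_pool, target=0.95, trials=500, seed=1):
    rng = random.Random(seed)
    n_types = len(set(st_pool))
    for n in range(1, 10 * len(st_pool)):
        ok = sum(
            len(set(rng.choices(st_pool, k=n))) / n_types >= target
            for _ in range(trials)
        )
        if ok / trials >= 0.95:
            return n
    return None

# Hypothetical sample: 8 STs, heavily skewed toward two dominant types.
pool = (["ST1"] * 40 + ["ST2"] * 20 + ["ST3"] * 10 + ["ST4"] * 10
        + ["ST5"] * 8 + ["ST6"] * 6 + ["ST7"] * 4 + ["ST8"] * 2)
print(isolates_needed(pool))
```

With 8 STs, reaching 95% of the diversity requires observing all of them, so the rarest types dominate the answer; this is why skewed within-sample diversity pushes the required isolate count far above routine one-isolate-per-sample testing.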
The goal of this chapter is to present a systematic approach to the assessment and treatment of aggression and violence arising from psychosis and to review evidence-based pharmacological interventions for aggression and violence arising from impulsivity in the context of traumatic brain injury or neurocognitive disorder. In turn, we consider an algorithmic approach to the assessment and treatment of psychotically driven aggression and violence, the approach to treatment-resistance in schizophrenia spectrum disorders, data-supported treatment of aggression and violence related to traumatic brain injury, and, finally, data-supported pharmacological treatment of aggression and violence in the context of major neurocognitive disorder.
Inhibitory control plays an important role in children’s cognitive and socioemotional development, including their psychopathology. It has been established that contextual factors such as socioeconomic status (SES) and parents’ psychopathology are associated with children’s inhibitory control. However, the relations between the neural correlates of inhibitory control and contextual factors have been rarely examined in longitudinal studies. In the present study, we used both event-related potential (ERP) components and time-frequency measures of inhibitory control to evaluate the neural pathways between contextual factors, including prenatal SES and maternal psychopathology, and children’s behavioral and emotional problems in a large sample of children (N = 560; 51.75% females; Mage = 7.13 years; Rangeage = 4–11 years). Results showed that theta power, which was positively predicted by prenatal SES and was negatively related to children’s externalizing problems, mediated the longitudinal and negative relation between them. ERP amplitudes and latencies did not mediate the longitudinal association between prenatal risk factors (i.e., prenatal SES and maternal psychopathology) and children’s internalizing and externalizing problems. Our findings increase our understanding of the neural pathways linking early risk factors to children’s psychopathology.
OBJECTIVES/GOALS: Antibiotic treatment sets the stage for intestinal domination by Candida albicans, which is necessary for the development of invasive disease, but the resources driving this bloom remain poorly defined. We sought to determine these factors in order to design novel prophylaxis strategies for reducing gastrointestinal (GI) colonization. METHODS/STUDY POPULATION: We initially developed a generalizable framework, termed metabolic footprinting, to determine the metabolites C. albicans preferentially uses in the mouse GI tract. After identifying the metabolites C. albicans utilizes, we used in vitro growth assays in the presence and absence of oxygen to validate our metabolomics findings. We next determined whether a probiotic E. coli that utilizes oxygen would reduce C. albicans colonization compared to a mutant E. coli that could not respire oxygen. Finding that oxygen was a necessary resource, we utilized germ-free mice to determine whether Clostridia spp. known to reduce GI oxygen would prevent C. albicans colonization. Lastly, we sought to see if 5-aminosalicylic acid (5-ASA) could prevent C. albicans colonization. RESULTS/ANTICIPATED RESULTS: We found that C. albicans preferentially utilizes simple carbohydrates including fructo-oligosaccharides (e.g., 1-kestose), disaccharides (e.g., β-gentiobiose), and sugar alcohols (e.g., sorbitol) and is able to grow in vitro on minimal media supplemented with any of these nutrients. However, in the hypoxic environment found in the “healthy” colon, C. albicans cannot utilize these nutrients. We next found that pre-colonization with a probiotic E. coli in a mouse model significantly reduced C. albicans colonization, but the mutant E. coli had no effect on colonization. We next showed that Clostridia supplementation restored GI hypoxia and reduced C. albicans colonization. Remarkably, we found that 5-ASA significantly reduced GI colonization of C. albicans. DISCUSSION/SIGNIFICANCE: We have shown that C. albicans requires oxygen to colonize the GI tract. Importantly, we found that 5-ASA can prevent an antibiotic-mediated bloom of C. albicans by restoring GI hypoxia, which warrants additional studies to determine whether 5-ASA can be used as an adjunctive prophylactic treatment in high-risk patients.
There are numerous challenges pertaining to epilepsy care across Ontario, including Epilepsy Monitoring Unit (EMU) bed pressures, surgical access and community supports. We sampled the current clinical, community and operational state of Ontario epilepsy centres and community epilepsy agencies post COVID-19 pandemic. A 44-item survey was distributed to all 11 district and regional adult and paediatric Ontario epilepsy centres. Qualitative responses were collected from community epilepsy agencies. Results revealed ongoing gaps in epilepsy care across Ontario, with EMU bed pressures and labour shortages being limiting factors. A clinical network advising the Ontario Ministry of Health will improve access to epilepsy care.
Cognitive training is a non-pharmacological intervention aimed at improving cognitive function across a single or multiple domains. Although the underlying mechanisms of cognitive training and transfer effects are not well-characterized, cognitive training has been thought to facilitate neural plasticity to enhance cognitive performance. Indeed, the Scaffolding Theory of Aging and Cognition (STAC) proposes that cognitive training may enhance the ability to engage in compensatory scaffolding to meet task demands and maintain cognitive performance. We therefore evaluated the effects of cognitive training on working memory performance in older adults without dementia. This study will help begin to elucidate non-pharmacological intervention effects on compensatory scaffolding in older adults.
Participants and Methods:
48 participants were recruited for a Phase III randomized clinical trial (Augmenting Cognitive Training in Older Adults [ACT]; NIH R01AG054077) conducted at the University of Florida and University of Arizona. Participants across sites were randomly assigned to complete cognitive training (n=25) or an education training control condition (n=23). Cognitive training and the education training control condition were each completed during 60 sessions over 12 weeks for 40 hours total. The education training control condition involved viewing educational videos produced by the National Geographic Channel. Cognitive training was completed using the Posit Science Brain HQ training program, which included 8 cognitive training paradigms targeting attention/processing speed and working memory. All participants also completed demographic questionnaires, cognitive testing, and an fMRI 2-back task at baseline and at 12-weeks following cognitive training.
Results:
Repeated measures analysis of covariance (ANCOVA), adjusted for training adherence, transcranial direct current stimulation (tDCS) condition, age, sex, years of education, and Wechsler Test of Adult Reading (WTAR) raw score, revealed a significant 2-back by training group interaction (F[1,40]=6.201, p=.017, η2=.134). Examination of simple main effects revealed baseline differences in 2-back performance (F[1,40]=.568, p=.455, η2=.014). After controlling for baseline performance, training group differences in 2-back performance were no longer statistically significant (F[1,40]=1.382, p=.247, η2=.034).
Conclusions:
After adjusting for baseline performance differences, there were no significant training group differences in 2-back performance, suggesting that the randomization was not sufficient to ensure adequate distribution of participants across groups. Results may indicate that cognitive training alone is not sufficient for significant improvement in working memory performance on a near transfer task. Additional improvement may occur with the next phase of this clinical trial, such that tDCS augments the effects of cognitive training and results in enhanced compensatory scaffolding even within this high-performing cohort. Limitations of the study include a highly educated sample with high literacy levels and a small sample size that was not powered for transfer-effects analysis. Future analyses will include evaluation of the combined intervention effects of cognitive training and tDCS on n-back performance in a larger sample of older adults without dementia.
Interventions using a cognitive training paradigm called the Useful Field of View (UFOV) task have been shown to be efficacious in slowing cognitive decline. However, no studies have looked at the engagement of functional networks during UFOV task completion. The current study aimed to (a) assess if regions activated during the UFOV fMRI task were functionally connected and related to task performance (henceforth called the UFOV network), (b) compare connectivity of the UFOV network to 7 resting-state functional connectivity networks in predicting proximal (UFOV) and near-transfer (Double Decision) performance, and (c) explore the impact of network segregation between higher-order networks and UFOV performance.
Participants and Methods:
336 healthy older adults (mean age=71.6) completed the UFOV fMRI task in a Siemens 3T scanner. UFOV fMRI accuracy was calculated as the number of correct responses divided by 56 total trials. Double Decision performance was calculated as the average presentation time of correct responses in log ms, with lower scores equating to better processing speed. Structural and functional MRI images were processed using the default pre-processing pipeline within the CONN toolbox. The Artifact Rejection Toolbox was set at a motion threshold of 0.9 mm and participants were excluded if more than 50% of volumes were flagged as outliers. To assess connectivity of regions associated with the UFOV task, we created 10 spherical regions of interest (ROIs) a priori using the WFU PickAtlas in SPM12. These included the bilateral pars triangularis, supplementary motor area, and inferior temporal gyri, as well as the left pars opercularis, left middle occipital gyrus, right precentral gyrus and right superior parietal lobule. We used a weighted ROI-to-ROI connectivity analysis to model task-based within-network functional connectivity of the UFOV network, and its relationship to UFOV accuracy. We then used weighted ROI-to-ROI connectivity analysis to compare the efficacy of the UFOV network versus 7 resting-state networks in predicting UFOV fMRI task performance and Double Decision performance. Finally, we calculated network segregation among higher order resting state networks to assess its relationship with UFOV accuracy. All functional connectivity analyses were corrected using a false discovery rate (FDR) threshold of p<0.05.
Results:
ROI-to-ROI analysis showed significant within-network functional connectivity among the 10 a priori ROIs (UFOV network) during task completion (all pFDR<.05). After controlling for covariates, greater within-network connectivity of the UFOV network was associated with better UFOV fMRI performance (pFDR=.008). Regarding the 7 resting-state networks, greater within-network connectivity of the CON (pFDR<.001) and FPCN (pFDR=.014) was associated with higher accuracy on the UFOV fMRI task. Furthermore, greater within-network connectivity of only the UFOV network was associated with performance on the Double Decision task (pFDR=.034). Finally, we assessed the relationship between higher-order network segregation and UFOV accuracy. After controlling for covariates, no significant relationships between network segregation and UFOV performance remained (all p-uncorrected>0.05).
Conclusions:
To date, this is the first study to assess task-based functional connectivity during completion of the UFOV task. We observed that coherence within 10 a priori ROIs significantly predicted UFOV performance. Additionally, enhanced within-network connectivity of the UFOV network predicted better performance on the Double Decision task, while conventional resting-state networks did not. These findings provide potential targets to optimize efficacy of UFOV interventions.
Neurocognitive deficits commonly occur following pediatric stroke and can impact many neuropsychological domains. Despite awareness of these deleterious effects, neurocognitive outcome after pediatric stroke, especially hemorrhagic stroke, is understudied. This clinical study aimed to elucidate the impact of eight factors identified in the scientific literature as possible predictors of neurocognitive outcome following pediatric stroke: age at stroke, stroke type (i.e., ischemic vs. hemorrhagic), lesion size, lesion location (i.e., brain region, structures impacted, and laterality), time since stroke, neurologic severity, seizures post-stroke, and socioeconomic status.
Participants and Methods:
Ninety-two patients, ages six to 25 and with a history of pediatric stroke, chose to participate in the study and were administered standardized neuropsychological tests assessing verbal reasoning, abstract reasoning, working memory, processing speed, attention, learning ability, long-term memory, and visuomotor integration. A standardized parent questionnaire provided an estimate of executive functioning. Statistical analyses included spline regressions to examine the impact of age at stroke and lesion size, further divided by stroke type; a series of one-way analysis of variance to examine differences in variables with three levels; Welch’s t-tests to examine dichotomous variables; and simple linear regressions for continuous variables.
Results:
Lesion size, stroke type, age at stroke, and socioeconomic status were identified as predictors of neurocognitive outcome in our sample. Large lesions were associated with worse neurocognitive outcomes compared to small to medium lesions across neurocognitive domains. Exploratory spline regressions suggested that ischemic stroke was associated with worse neurocognitive outcomes than hemorrhagic stroke. Based on patterns shown in graphs, age at stroke appeared to have an impact on outcome depending on the neurocognitive domain and stroke type, with U-shaped trends suggesting worse outcome across most domains when stroke occurred at approximately 5 to 10 years of age. Socioeconomic status positively predicted outcomes across most neurocognitive domains. Participants with seizures had more severe executive functioning impairments than youth without seizures. Youth with combined cortical-subcortical lesions scored lower on abstract reasoning than youth with cortical and youth with subcortical lesions, and lower on attention than youth with cortical lesions. Neurologic severity predicted scores on abstract reasoning, attention, processing speed, and visuomotor integration, depending on stroke type. There was no evidence of differences on outcome measures based on time since stroke, lesion laterality, or lesion region defined as supra- versus infratentorial.
Conclusions:
The current study contributed to the scientific literature by identifying lesion size, stroke type, age at stroke, and socioeconomic status as predictors of neurocognitive outcome following pediatric stroke. Future research should examine other possible predictors of neurocognitive outcome that remain unexplored. Multisite collaborations would provide larger sample sizes and allow teams to build models with better statistical power and more predictors. Enhancing understanding of neurocognitive outcomes following pediatric stroke is a first step towards improving appraisals of prognosis.
Findings are clinically applicable as they provide professionals with information that can help assess individual expected patterns of recovery and thus refer patients to appropriate support services.
Aging is associated with disruptions in functional connectivity within the default mode (DMN), frontoparietal control (FPCN), and cingulo-opercular (CON) resting-state networks. Greater within-network connectivity predicts better cognitive performance in older adults. Therefore, strengthening network connectivity, through targeted intervention strategies, may help prevent age-related cognitive decline or progression to dementia. Small studies have demonstrated synergistic effects of combining transcranial direct current stimulation (tDCS) and cognitive training (CT) on strengthening network connectivity; however, this association has yet to be rigorously tested on a large scale. The current study leverages longitudinal data from the first-ever Phase III clinical trial for tDCS to examine the efficacy of an adjunctive tDCS and CT intervention on modulating network connectivity in older adults.
Participants and Methods:
This sample included 209 older adults (mean age = 71.6) from the Augmenting Cognitive Training in Older Adults multisite trial. Participants completed 40 hours of CT over 12 weeks, which included 8 attention, processing speed, and working memory tasks. Participants were randomized into active or sham stimulation groups, and tDCS was administered during CT daily for two weeks then weekly for 10 weeks. For both stimulation groups, two electrodes in saline-soaked 5×7 cm² sponges were placed at F3 (cathode) and F4 (anode) using the 10-20 measurement system. The active group received 2 mA of current for 20 minutes. The sham group received 2 mA for 30 seconds, then no current for the remaining 20 minutes.
Participants underwent resting-state fMRI at baseline and post-intervention. CONN toolbox was used to preprocess imaging data and conduct region of interest (ROI-ROI) connectivity analyses. The Artifact Detection Toolbox, using intermediate settings, identified outlier volumes. Two participants were excluded for having greater than 50% of volumes flagged as outliers. ROI-ROI analyses modeled the interaction between tDCS group (active versus sham) and occasion (baseline connectivity versus postintervention connectivity) for the DMN, FPCN, and CON controlling for age, sex, education, site, and adherence.
Results:
Compared to sham, the active group demonstrated ROI-ROI increases in functional connectivity within the DMN following intervention (left temporal to right temporal [T(202) = 2.78, pFDR < 0.05] and left temporal to right dorsal medial prefrontal cortex [T(202) = 2.74, pFDR < 0.05]). In contrast, compared to sham, the active group demonstrated ROI-ROI decreases in functional connectivity within the FPCN following intervention (left dorsal prefrontal cortex to left temporal [T(202) = -2.96, pFDR < 0.05] and left dorsal prefrontal cortex to left lateral prefrontal cortex [T(202) = -2.77, pFDR < 0.05]). There were no significant interactions detected for CON regions.
Conclusions:
These findings (a) demonstrate the feasibility of modulating network connectivity using tDCS and CT and (b) provide important information regarding the pattern of connectivity changes occurring at these intervention parameters in older adults. Importantly, the active stimulation group showed increases in connectivity within the DMN (a network particularly vulnerable to aging and implicated in Alzheimer’s disease) but decreases in connectivity between left frontal and temporal FPCN regions. Future analyses from this trial will evaluate the association between these changes in connectivity and cognitive performance post-intervention and at a one-year timepoint.
Anterior temporal lobectomy is a common surgical approach for medication-resistant temporal lobe epilepsy (TLE). Prior studies have shown inconsistent findings regarding the utility of presurgical intracarotid sodium amobarbital testing (IAT; also known as Wada test) and neuroimaging in predicting postoperative seizure control. In the present study, we evaluated the predictive utility of IAT, as well as structural magnetic resonance imaging (MRI) and positron emission tomography (PET), on long-term (3-years) seizure outcome following surgery for TLE.
Participants and Methods:
Patients consisted of 107 adults (mean age=38.6, SD=12.2; mean education=13.3 years, SD=2.0; female=47.7%; White=100%) with TLE (mean epilepsy duration =23.0 years, SD=15.7; left TLE surgery=50.5%). We examined whether demographic, clinical (side of resection, resection type [selective vs. non-selective], hemisphere of language dominance, epilepsy duration), and presurgical studies (normal vs. abnormal MRI, normal vs. abnormal PET, correctly lateralizing vs. incorrectly lateralizing IAT) were associated with absolute (cross-sectional) seizure outcome (i.e., freedom vs. recurrence) with a series of chi-squared and t-tests. Additionally, we determined whether presurgical evaluations predicted time to seizure recurrence (longitudinal outcome) over a three-year period with univariate Cox regression models, and we compared survival curves with Mantel-Cox (log rank) tests.
Results:
Demographic and clinical variables (including type [selective vs. whole lobectomy] and side of resection) were not associated with seizure outcome. No associations were found among the presurgical variables. Presurgical MRI was not associated with cross-sectional (OR=1.5, p=.557, 95% CI=0.4-5.7) or longitudinal (HR=1.2, p=.641, 95% CI=0.4-3.9) seizure outcome. Normal PET scan (OR=4.8, p=.045, 95% CI=1.0-24.3) and IAT incorrectly lateralizing to the seizure focus (OR=3.9, p=.018, 95% CI=1.2-12.9) were associated with higher odds of seizure recurrence. Furthermore, normal PET scan (HR=3.6, p=.028, 95% CI=1.0-13.5) and incorrectly lateralized IAT (HR=2.8, p=.012, 95% CI=1.2-7.0) were presurgical predictors of earlier seizure recurrence within three years of TLE surgery. Log rank tests indicated that survival functions differed significantly between patients with normal vs. abnormal PET and between those with incorrectly vs. correctly lateralizing IAT, with seizure relapse occurring on average five and seven months earlier, respectively.
Conclusions:
Presurgical normal PET scan and incorrectly lateralizing IAT were associated with increased risk of post-surgical seizure recurrence and shorter time-to-seizure relapse.
Nonpathological aging has been linked to decline in both verbal and visuospatial memory abilities in older adults. Disruptions in resting-state functional connectivity within well-characterized, higher-order cognitive brain networks have also been coupled with poorer memory functioning in healthy older adults and in older adults with dementia. However, there is a paucity of research on the association between higher-order functional connectivity and verbal and visuospatial memory performance in the older adult population. The current study examines the association between resting-state functional connectivity within the cingulo-opercular network (CON), frontoparietal control network (FPCN), and default mode network (DMN) and verbal and visuospatial learning and memory in a large sample of healthy older adults. We hypothesized that greater within-network CON and FPCN functional connectivity would be associated with better immediate verbal and visuospatial memory recall. Additionally, we predicted that within-network DMN functional connectivity would be associated with improvements in delayed verbal and visuospatial memory recall. This study helps to glean insight into whether within-network CON, FPCN, or DMN functional connectivity is associated with verbal and visuospatial memory abilities in later life.
Participants and Methods:
330 healthy older adults between 65 and 89 years old (mean age = 71.6 ± 5.2) were recruited at the University of Florida (n = 222) and the University of Arizona (n = 108). Participants underwent resting-state fMRI and completed verbal memory (Hopkins Verbal Learning Test - Revised [HVLT-R]) and visuospatial memory (Brief Visuospatial Memory Test - Revised [BVMT-R]) measures. Immediate (total) and delayed recall scores on the HVLT-R and BVMT-R were calculated using each test manual’s scoring criteria. Learning ratios on the HVLT-R and BVMT-R were quantified by dividing the number of stimuli (verbal or visuospatial) learned between the first and third trials by the number of stimuli not recalled after the first learning trial. CONN Toolbox was used to extract average within-network connectivity values for CON, FPCN, and DMN. Hierarchical regressions were conducted, controlling for sex, race, ethnicity, years of education, number of invalid scans, and scanner site.
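The learning-ratio definition above divides the stimuli gained between trials 1 and 3 by the stimuli still available to learn after trial 1. A minimal sketch of that calculation (the 12-item maximum is an illustrative assumption, not taken from the test manuals cited here):

```python
def learning_ratio(trial1, trial3, max_score):
    """Learning ratio: stimuli gained between trials 1 and 3,
    divided by stimuli not recalled after the first learning trial."""
    remaining = max_score - trial1
    if remaining == 0:
        return float("nan")  # ceiling performance on trial 1; ratio undefined
    return (trial3 - trial1) / remaining

# Illustrative example: 5 items on trial 1, 10 on trial 3,
# out of an assumed 12-item list
print(learning_ratio(trial1=5, trial3=10, max_score=12))  # 5/7, about 0.714
```

Normalizing by the room left to improve, rather than by the raw gain, keeps the ratio comparable between participants who start high and those who start low on trial 1.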
Results:
Greater CON connectivity was significantly associated with better HVLT-R immediate (total) recall (β = 0.16, p = 0.01), HVLT-R learning ratio (β = 0.16, p = 0.01), BVMT-R immediate (total) recall (β = 0.14, p = 0.02), and BVMT-R delayed recall performance (β = 0.15, p = 0.01). Greater FPCN connectivity was associated with better BVMT-R learning ratio (β = 0.13, p = 0.04). HVLT-R delayed recall performance was not associated with connectivity in any network, and DMN connectivity was not significantly related to any measure.
Conclusions:
Connectivity within CON demonstrated a robust relationship with multiple components of memory function across both verbal and visuospatial domains. In contrast, FPCN evidenced a relationship only with visuospatial learning, and DMN was not significantly associated with any memory measure. These data suggest that CON may be a valuable target in longitudinal studies of age-related memory changes, as well as a possible target for future non-invasive interventions to attenuate memory decline in older adults.
The Australian SKA Pathfinder (ASKAP) has surveyed the sky at multiple frequencies as part of the Rapid ASKAP Continuum Survey (RACS). The first two RACS observing epochs, at 887.5 MHz (RACS-low) and 1 367.5 MHz (RACS-mid), have been released (McConnell et al. 2020, PASA, 37, e048; Duchesne et al. 2023, PASA, 40, e034). A catalogue of radio sources from RACS-low has also been released, covering the sky south of declination $+30^{\circ}$ (Hale et al. 2021, PASA, 38, e058). With this paper, we describe and release the first set of catalogues from RACS-mid, covering the sky below declination $+49^{\circ}$. The catalogues are created in a similar manner to the RACS-low catalogue, and we discuss this process and highlight additional changes. The general-purpose primary catalogue covering 36 200 deg$^2$ features a variable angular resolution to maximise sensitivity and sky coverage across the catalogued area, with a median angular resolution of $11.2^{\prime\prime} \times 9.3^{\prime\prime}$. The primary catalogue comprises 3 105 668 radio sources, including those in the Galactic Plane (2 861 923 excluding Galactic latitudes of $|b|<5^{\circ}$), and we estimate the catalogue to be 95% complete for sources above 2 mJy. Alongside the primary catalogue, we also provide two auxiliary catalogues. The first is a fixed-resolution, 25-arcsec catalogue approximately matching the sky coverage of the RACS-low catalogue. This 25-arcsec catalogue is constructed identically to the primary catalogue, except that images are convolved to a common, less-sensitive 25-arcsec angular resolution. The second auxiliary catalogue is designed for time-domain science and is the concatenation of source lists from the original RACS-mid images, with no additional convolution, mosaicking, or de-duplication of source entries, to avoid losing time-variable signals. All three RACS-mid catalogues, and all RACS data products, are available through the CSIRO ASKAP Science Data Archive (https://research.csiro.au/casda/).
Our objective was to evaluate the psychometric properties of the culturally adapted NIH Toolbox African Languages® when used in Swahili and Dholuo-speaking children in western Kenya.
Method:
Swahili-speaking participants were recruited from Eldoret and Dholuo-speaking participants from Ajigo; all were <14 years of age and enrolled in primary school. Participants completed a demographics questionnaire and five fluid cognition tests of the NIH Toolbox® African Languages program, including Flanker, Dimensional Change Card Sort (DCCS), Picture Sequence Memory, Pattern Comparison, and List Sorting tests. Statistical analyses examined aspects of reliability, including internal consistency (in both languages) and test–retest reliability (in Dholuo only).
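Internal consistency of the kind examined here is conventionally summarized with Cronbach's alpha. As a minimal, hedged sketch of that statistic (a generic stdlib implementation, not the analysis pipeline the study actually used):

```python
def cronbach_alpha(items):
    """Cronbach's alpha for internal consistency.
    `items` is a list of item-score lists, one inner list per item,
    all of equal length (one score per participant)."""
    k = len(items)
    n = len(items[0])

    def var(xs):  # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    sum_item_vars = sum(var(item) for item in items)
    totals = [sum(item[i] for item in items) for i in range(n)]
    # Alpha rises as item variances shrink relative to total-score variance
    return (k / (k - 1)) * (1 - sum_item_vars / var(totals))

# Two items, four participants, illustrative scores only
print(cronbach_alpha([[1, 2, 3, 4], [2, 3, 5, 4]]))
```

Values of roughly 0.7 or higher are the usual threshold for research use, which is the standard the abstract's "acceptable psychometric properties" phrasing implies.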
Results:
Participants included 479 children (n = 239 Swahili-speaking; n = 240 Dholuo-speaking; mean age = 10.5, SD = 2.3). Generally, the tests had acceptable psychometric properties for research use within Swahili- and Dholuo-speaking populations. Issues related to shape identification and to prioritizing accuracy over speed limited the utility of the DCCS for many participants, with approximately 25% of children unable to match based on shape. These cultural differences affected the outcomes of reliability testing in the Dholuo-speaking cohort, where both accuracy and speed improved across all five tests at retest.
Conclusions:
There is preliminary evidence that the NIH Toolbox® African Languages offers a potentially valid assessment of development and performance on tests of fluid cognition in Swahili and Dholuo in research settings. With piloting underway across other diverse settings, future research should gather additional evidence on the clinical utility and acceptability of these tests, specifically by establishing normative data across Kenyan regions and further evaluating their psychometric properties.
Remitted psychotic depression (MDDPsy) has heterogeneous outcomes. The study's aims were to identify subgroups of persons with remitted MDDPsy who have distinct trajectories of depression severity during continuation treatment and to detect predictors of membership in the worsening trajectory.
Method
One hundred and twenty-six persons aged 18–85 years participated in a 36-week randomized placebo-controlled trial (RCT) that examined the clinical effects of continuing olanzapine once an episode of MDDPsy had remitted with sertraline plus olanzapine. Latent class mixed modeling was used to identify subgroups of participants with distinct trajectories of depression severity during the RCT. Machine learning was used to predict membership in the trajectories based on participants' pre-trajectory characteristics.
Results
Seventy-one (56.3%) participants belonged to a subgroup with a stable trajectory of depression scores and 55 (43.7%) belonged to a subgroup with a worsening trajectory. A random forest model with high prediction accuracy (AUC of 0.812) found that the strongest predictors of membership in the worsening subgroup were residual depression symptoms at onset of remission, followed by anxiety score at RCT baseline and age of onset of the first lifetime depressive episode. In a logistic regression model that examined depression score at onset of remission as the only predictor variable, the AUC (0.778) was close to that of the machine learning model.
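The AUC values quoted above (0.812 vs. 0.778) summarize how well each model's predicted risk separates the worsening from the stable trajectory. As a minimal sketch, an AUC can be computed directly from predicted scores and labels as the probability that a randomly chosen positive case scores higher than a randomly chosen negative case (the scores below are hypothetical, not model outputs from the study):

```python
def auc(scores, labels):
    """AUC via the Mann-Whitney formulation: the probability that a
    randomly chosen positive (label 1) outscores a randomly chosen
    negative (label 0), counting ties as half a win."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical predicted risks: 1 = worsening trajectory, 0 = stable
print(auc([0.9, 0.7, 0.6, 0.3], [1, 1, 0, 0]))  # perfect separation: 1.0
```

An AUC near 0.8 for a single predictor, as reported for depression score at onset of remission, is why the abstract concludes that the simpler logistic model performs nearly as well as the random forest.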
Conclusions
Residual depression at onset of remission predicts membership in the worsening trajectory of remitted MDDPsy with high accuracy. Research is needed to determine how best to optimize the outcome of remitted MDDPsy with residual symptoms.