Background: Efgartigimod, a human immunoglobulin (Ig)G1 antibody Fc fragment, blocks the neonatal Fc receptor, reducing IgGs involved in chronic inflammatory demyelinating polyneuropathy (CIDP). The multi-stage, double-blinded, placebo-controlled ADHERE (NCT04281472) and open-label extension ADHERE+ (NCT04280718) trials (interim analysis cutoff: February 16, 2024) assessed efgartigimod PH20 SC in participants with CIDP. Methods: Participants with active CIDP received open-label, weekly efgartigimod PH20 SC 1000 mg during a ≤12-week run-in (stage-A). Responders were randomized (1:1) to efgartigimod or placebo for ≤48 weeks (stage-B). Participants with clinical deterioration in stage-B or who completed ADHERE entered ADHERE+. Week 36 changes from run-in baseline (CFB) in adjusted Inflammatory Neuropathy Cause and Treatment (aINCAT), Inflammatory Rasch-built Overall Disability Scale (I-RODS), and grip strength scores were evaluated. Results: Of 322 stage-A participants, 221 were randomized and treated in stage-B, and 99% entered ADHERE+. Mean CFB (SE) in aINCAT, I-RODS, and grip strength scores were -1.2 (0.15), 8.8 (1.46), and 17.5 (2.02), respectively, at ADHERE+ Week 36 (N=150). Half the participants with clinical deterioration during ADHERE stage-B restabilized on efgartigimod from ADHERE+ Week 4. Conclusions: Interim results from ADHERE+ indicate long-term effectiveness of efgartigimod PH20 SC in clinical outcomes in participants with CIDP.
Background: Efgartigimod, a human immunoglobulin (Ig)G1 antibody Fc fragment, blocks the neonatal Fc receptor, reducing IgGs involved in chronic inflammatory demyelinating polyneuropathy (CIDP), a rare, progressive, immune-mediated disease that can lead to irreversible disability. The multi-stage, double-blinded, placebo-controlled ADHERE (NCT04281472) trial assessed efgartigimod PH20 SC in participants with CIDP. Methods: Participants with active CIDP received open-label, weekly efgartigimod PH20 SC 1000 mg during a ≤12-week run-in (stage-A). Responders were randomized (1:1) to weekly efgartigimod or placebo for ≤48 weeks (stage-B). This post hoc analysis evaluated changes in the Inflammatory Rasch-built Overall Disability Scale (I-RODS) total score and individual items from run-in baseline (study enrollment) to the last stage-B assessment. Results: Of 322 participants who entered stage-A, 221 were randomized and treated in stage-B, and 191/221 had data for run-in baseline and post–stage-B timepoints. Mean (SE) I-RODS change at the last stage-B assessment vs run-in baseline was 5.7 (1.88) in participants randomized to efgartigimod and -4.9 (1.82) in those randomized to placebo. Improvements of ≥4 points in I-RODS score occurred in 37/97 (38.1%) participants randomized to efgartigimod and 24/92 (26.1%) randomized to placebo. Efgartigimod-treated participants improved ≥1 point in I-RODS items of clinical interest. Conclusions: Participants who received efgartigimod in stage-B experienced improvements in I-RODS score from study enrollment to the last stage-B assessment.
The First Large Absorption Survey in H I (FLASH) is a large-area radio survey for neutral hydrogen in and around galaxies in the intermediate redshift range $0.4\lt z\lt1.0$, using the 21-cm H I absorption line as a probe of cold neutral gas. The survey uses the ASKAP radio telescope and will cover 24,000 deg$^2$ of sky over the next five years. FLASH breaks new ground in two ways: it is the first large H I absorption survey to be carried out without any optical preselection of targets, and we use an automated Bayesian line-finding tool to search through large datasets and assign a statistical significance to potential line detections. Two Pilot Surveys, covering around 3000 deg$^2$ of sky, were carried out in 2019–22 to test and verify the strategy for the full FLASH survey. The processed data products from these Pilot Surveys (spectral-line cubes, continuum images, and catalogues) are public and available online. In this paper, we describe the FLASH spectral-line and continuum data products and discuss the quality of the H I spectra and the completeness of our automated line search. Finally, we present a set of 30 new H I absorption lines that were robustly detected in the Pilot Surveys, almost doubling the number of known H I absorption systems at $0.4\lt z\lt1$. The detected lines span a wide range in H I optical depth, including three lines with a peak optical depth $\tau\gt1$, and appear to be a mixture of intervening and associated systems. Interestingly, around two-thirds of the lines found in this untargeted sample are detected against sources with a peaked-spectrum radio continuum, which are only a minor (5–20%) fraction of the overall radio-source population. The detection rate for H I absorption lines in the Pilot Surveys (0.3 to 0.5 lines per 40 deg$^2$ ASKAP field) is a factor of two below the expected value. One possible reason for this is the presence of a range of spectral-line artefacts in the Pilot Survey data that have now been mitigated and are not expected to recur in the full FLASH survey. A future paper in this series will discuss the host galaxies of the H I absorption systems identified here.
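The peak optical depths quoted above follow from the standard relation between the absorbed and continuum flux densities, τ_peak = −ln(1 − ΔS/(f·S_cont)), with f the covering factor. A minimal sketch of that conversion (illustrative values only; this is not the FLASH pipeline):

```python
import numpy as np

def peak_optical_depth(delta_s, s_cont, covering_factor=1.0):
    """Peak 21-cm optical depth from an absorption line.

    delta_s         : absorbed flux density at the line peak
    s_cont          : background continuum flux density (same units)
    covering_factor : fraction of the continuum covered by the absorber
    """
    return -np.log(1.0 - delta_s / (covering_factor * s_cont))

# A line absorbing 70% of a fully covered continuum is optically thick:
print(peak_optical_depth(delta_s=0.7, s_cont=1.0))  # ~1.20, i.e. tau > 1
```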
The stars of the Milky Way carry the chemical history of our Galaxy in their atmospheres as they journey through its vast expanse. Like barcodes, we can extract the chemical fingerprints of stars from high-resolution spectroscopy. The fourth data release (DR4) of the Galactic Archaeology with HERMES (GALAH) Survey, based on a decade of observations, provides the chemical abundances of up to 32 elements for 917 588 stars that also have exquisite astrometric data from the Gaia satellite. For the first time, these elements include life-essential nitrogen to complement carbon and oxygen, as well as more measurements of rare-earth elements critical to modern-life electronics, offering unparalleled insights into the chemical composition of the Milky Way. For this release, we use neural networks to simultaneously fit stellar parameters and abundances across the whole wavelength range, leveraging synthetic grids computed with Spectroscopy Made Easy. These grids account for atomic line formation in non-local thermodynamic equilibrium for 14 elements. In a two-iteration process, we first fit stellar labels to all 1 085 520 spectra, then co-add repeated observations and refine these labels using astrometric data from Gaia and 2MASS photometry, improving the accuracy and precision of stellar parameters and abundances. Our validation thoroughly assesses the reliability of spectroscopic measurements and highlights key caveats. GALAH DR4 represents yet another milestone in Galactic archaeology, combining detailed chemical compositions from multiple nucleosynthetic channels with kinematic information and age estimates. The resulting dataset, covering nearly a million stars, opens new avenues for understanding not only the chemical and dynamical history of the Milky Way but also the broader questions of the origin of elements and the evolution of planets, stars, and galaxies.
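The label-fitting approach described (neural networks trained on synthetic grids, then optimized against observed spectra) can be caricatured in a few lines. The sketch below is a toy emulator on a fake two-label grid, using scikit-learn and scipy in place of GALAH's actual networks and Spectroscopy Made Easy grids; every number in it is invented:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from scipy.optimize import least_squares

rng = np.random.default_rng(0)

# Toy synthetic grid: labels (Teff, logg) -> normalized flux on 50 pixels.
labels = rng.uniform([4000.0, 1.0], [7000.0, 5.0], size=(500, 2))
wave = np.linspace(0.0, 1.0, 50)
flux = (1.0
        - 0.5 * np.exp(-((wave - labels[:, :1] / 14000.0) ** 2) / 0.002)
        - 0.1 * labels[:, 1:2] / 5.0)

# Train the emulator on standardized labels.
x = (labels - labels.mean(0)) / labels.std(0)
emulator = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=5000,
                        random_state=0).fit(x, flux)

# Fit an "observed" spectrum by optimizing the labels through the emulator.
observed = flux[0] + rng.normal(0.0, 0.005, 50)
fit = least_squares(lambda p: emulator.predict(p[None, :])[0] - observed,
                    x0=np.zeros(2))
print(fit.x * labels.std(0) + labels.mean(0))  # recovered (Teff, logg)
```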
Objectives:
To examine whether the currently taught undergraduate psychiatry syllabus at an Irish university relates to what doctors in psychiatry consider to be clinically relevant and important.
Methods:
Doctors of different clinical grades were invited to rate their views on 216 items on a 10-point Likert scale ranging from ‘0 = not relevant’ to ‘10 = very relevant’. Participants were invited to comment on topics that should be excluded or included in a new syllabus. Thematic analysis was conducted on this free-text to identify particular themes.
Results:
The doctors surveyed rated that knowledge of diagnostic criteria was important for medical students. This knowledge attained high scores across all disorders, with particularly high scores for major depressive disorder (mean = 9.64 (SD = 0.86)), schizophrenia (mean = 9.55 (SD = 0.95)) and attention deficit hyperactivity disorder (ADHD; mean = 9.26 (SD = 1.40)). Lower scores were noted for less frequently utilised management strategies such as transcranial magnetic stimulation (mean = 4.97 (SD = 2.60)), for awareness of the difference in criteria between use disorder and dependence on psychoactive substances (mean = 5.56 (SD = 2.26)), and for some theories pertaining to psychotherapy (e.g. Freud’s drive theory (mean = 4.59 (SD = 2.42))).
Conclusions:
This study highlights the importance of an undergraduate programme that is broad-based, practical and relevant to students’ future medical practice. An emphasis on diagnosis and management of major psychiatric disorders, and knowledge of the interface between mental health services, other medical specialities and support services, was also deemed important.
Estimating the population size of shy and elusive species is challenging but necessary to inform appropriate conservation actions for threatened or declining species. Using camera-trap surveys conducted during 2017–2021, we estimated and compared African clawless otter Aonyx capensis population densities and activity times in six conserved areas in southern Africa. We used two different models to estimate densities: random encounter models and camera-trap distance sampling. Our results highlight a general pattern of higher estimated densities and narrower confidence intervals using random encounter models compared to camera-trap distance sampling. We found substantial variation in densities between study areas, with random encounter model estimates ranging between 0.9 and 4.2 otters/km². Our camera-trap distance sampling estimates supported the relative density estimates obtained from random encounter models but were generally lower and more variable, ranging from 0.8 to 4.0 otters/km². We found significant differences in otter activity patterns, with populations either being nocturnal, mostly nocturnal or cathemeral. As all study areas experience little human disturbance, our results suggest that there are large natural variations in otter densities and activity patterns between regions. When densities are converted to metrics that are comparable to previous studies, our estimates suggest that African clawless otter population numbers are generally lower than previously reported. This highlights a need for broader spatial coverage of otter population assessments and future studies to assess potential environmental drivers of spatial, and potentially temporal, variation in population numbers and activity patterns.
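The random encounter model converts a camera trapping rate into a density without individual recognition, via the Rowcliffe et al. (2008) formula D = (y/t) · π / (v r (2 + θ)). A minimal sketch with hypothetical otter-scale inputs (the study's actual parameter values are not given here):

```python
import math

def rem_density(detections, camera_days, day_range_km,
                detection_radius_km, detection_angle_rad):
    """Random encounter model density (animals/km^2), after
    Rowcliffe et al. (2008): D = (y/t) * pi / (v * r * (2 + theta))."""
    trap_rate = detections / camera_days  # y / t
    return trap_rate * math.pi / (
        day_range_km * detection_radius_km * (2.0 + detection_angle_rad))

# e.g. 150 detections over 2000 camera-days, 5 km/day travel,
# 10 m detection radius, ~0.7 rad detection angle:
print(rem_density(150, 2000, 5.0, 0.010, 0.7))  # ~1.7 otters/km^2
```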
Accurate diagnosis of bipolar disorder (BPD) is difficult in clinical practice, with an average delay between symptom onset and diagnosis of about 7 years. A depressive episode often precedes the first manic episode, making it difficult to distinguish BPD from unipolar major depressive disorder (MDD).
Aims
We use genome-wide association analyses (GWAS) to identify differential genetic factors and to develop predictors based on polygenic risk scores (PRS) that may aid early differential diagnosis.
Method
Based on individual genotypes from case–control cohorts of BPD and MDD shared through the Psychiatric Genomics Consortium, we compile case–case–control cohorts, applying a careful quality control procedure. In a resulting cohort of 51 149 individuals (15 532 BPD patients, 12 920 MDD patients and 22 697 controls), we perform a variety of GWAS and PRS analyses.
Results
Although our GWAS is not well powered to identify genome-wide significant loci, we find significant chip heritability and demonstrate the ability of the resulting PRS to distinguish BPD from MDD, including BPD cases with depressive onset (BPD-D). We replicate our PRS findings in an independent Danish cohort (iPSYCH 2015, N = 25 966). We observe strong genetic correlation between our case–case GWAS and that of case–control BPD.
Conclusions
We find that MDD and BPD, including BPD-D, are genetically distinct. Our findings support the view that controls, MDD patients and BPD patients lie primarily on a continuum of genetic risk. Future studies with larger and richer samples will likely yield a better understanding of these findings and enable the development of better genetic predictors distinguishing BPD and, importantly, BPD-D from MDD.
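The PRS used throughout these analyses is, at its core, a beta-weighted sum of effect-allele dosages. A minimal sketch of the scoring step on toy data (real pipelines add LD clumping or shrinkage, p-value thresholding, and ancestry adjustment):

```python
import numpy as np

def polygenic_risk_score(dosages, betas):
    """PRS as the weighted sum of effect-allele dosages.

    dosages : (n_individuals, n_snps) array of 0/1/2 allele counts
    betas   : (n_snps,) effect sizes from the discovery GWAS
    """
    return np.asarray(dosages) @ np.asarray(betas)

rng = np.random.default_rng(1)
dosages = rng.integers(0, 3, size=(5, 100))   # 5 people, 100 SNPs
betas = rng.normal(0.0, 0.05, 100)            # toy effect sizes
prs = polygenic_risk_score(dosages, betas)
print((prs - prs.mean()) / prs.std())         # standardize before modeling
```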
Two algorithms based on a latent class model are presented for discovering hierarchical relations that exist among a set of K dichotomous items. The two algorithms, stepwise forward selection and backward elimination, incorporate statistical criteria for selecting (or deleting) 0-1 response pattern vectors to form the subset of the total possible 2^K vectors that uniquely describe the hierarchy. The performances of the algorithms are compared, using computer-constructed data, with those of three competing deterministic approaches based on ordering theory and the calculation of Phi/Phi-max coefficients. The discovery algorithms are also demonstrated on real data sets investigated in the literature.
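For small K, the subset of the 2^K response patterns consistent with a candidate hierarchy can be enumerated directly; the sketch below shows the combinatorial core (the algorithms in the paper add statistical selection criteria on top of this):

```python
from itertools import product

def hierarchy_consistent_patterns(k, prerequisites):
    """Enumerate 0-1 response patterns consistent with an item hierarchy:
    passing item j requires passing every prerequisite of j.
    prerequisites maps item index -> set of prerequisite item indices."""
    keep = []
    for pattern in product((0, 1), repeat=k):
        ok = all(pattern[pre] == 1
                 for j, pres in prerequisites.items() if pattern[j] == 1
                 for pre in pres)
        if ok:
            keep.append(pattern)
    return keep

# A linear hierarchy 0 -> 1 -> 2 admits only K+1 = 4 of the 2^3 = 8 patterns:
print(hierarchy_consistent_patterns(3, {1: {0}, 2: {0, 1}}))
```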
The association between cannabis and psychosis is established, but the role of underlying genetics is unclear. We used data from the EU-GEI case-control study and UK Biobank to examine the independent and combined effect of heavy cannabis use and schizophrenia polygenic risk score (PRS) on risk for psychosis.
Methods
Genome-wide association study summary statistics from the Psychiatric Genomics Consortium and the Genomic Psychiatry Cohort were used to calculate schizophrenia and cannabis use disorder (CUD) PRS for 1098 participants from the EU-GEI study and 143,600 from the UK Biobank. Both datasets had information on cannabis use.
Results
In both samples, schizophrenia PRS and cannabis use independently increased risk of psychosis. Schizophrenia PRS was not associated with patterns of cannabis use in the EU-GEI cases or controls or UK Biobank cases. It was associated with lifetime and daily cannabis use among UK Biobank participants without psychosis, but the effect was substantially reduced when CUD PRS was included in the model. In the EU-GEI sample, regular users of high-potency cannabis had the highest odds of being a case independently of schizophrenia PRS (OR for daily use of high-potency cannabis, adjusted for PRS: 5.09; 95% CI 3.08–8.43; p = 3.21 × 10⁻¹⁰). We found no evidence of interaction between schizophrenia PRS and patterns of cannabis use.
Conclusions
Regular use of high-potency cannabis remains a strong predictor of psychotic disorder independently of schizophrenia PRS, which does not seem to be associated with heavy cannabis use. These are important findings at a time of increasing use and potency of cannabis worldwide.
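Testing for an interaction between PRS and cannabis exposure amounts to a logistic model with a product term. A hedged sketch on simulated data (statsmodels; variable names and effect sizes are invented, not the study's):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 2000
df = pd.DataFrame({
    "prs_z": rng.normal(size=n),                       # standardized SCZ PRS
    "daily_high_potency": rng.integers(0, 2, size=n),  # 0/1 exposure
})
logit = -2.0 + 0.4 * df.prs_z + 1.6 * df.daily_high_potency
df["case"] = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))

# Main effects plus the PRS-by-cannabis interaction term.
fit = smf.logit("case ~ prs_z * daily_high_potency", data=df).fit(disp=0)
print(np.exp(fit.params))                              # odds ratios
print(fit.pvalues["prs_z:daily_high_potency"])         # interaction test
```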
Diagnostic criteria for major depressive disorder allow for heterogeneous symptom profiles, but genetic analysis of major depressive symptoms has the potential to identify clinical and etiological subtypes. There are several challenges to integrating symptom data from genetically informative cohorts, such as sample size differences between clinical and community cohorts and various patterns of missing data.
Methods
We conducted genome-wide association studies of major depressive symptoms in three cohorts that were enriched for participants with a diagnosis of depression (Psychiatric Genomics Consortium, Australian Genetics of Depression Study, Generation Scotland) and three community cohorts that were not recruited on the basis of diagnosis (Avon Longitudinal Study of Parents and Children, Estonian Biobank, and UK Biobank). We fit a series of confirmatory factor models with factors that accounted for how symptom data were sampled and then compared alternative models with different symptom factors.
Results
The best fitting model had a distinct factor for Appetite/Weight symptoms and an additional measurement factor that accounted for the skip-structure in community cohorts (use of Depression and Anhedonia as gating symptoms).
Conclusion
The results show the importance of assessing the directionality of symptoms (such as hypersomnia versus insomnia) and of accounting for study and measurement design when meta-analyzing genetic association data.
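Comparing alternative symptom-factor structures is a confirmatory factor analysis exercise. The toy below contrasts a one-factor model with a model that has a separate Appetite/Weight factor, using the semopy package with its lavaan-style syntax (an assumption on my part; the authors fit their models to genetic covariance structure across cohorts, not to raw item data like this):

```python
import numpy as np
import pandas as pd
import semopy  # SEM package with lavaan-style model syntax

rng = np.random.default_rng(3)
n = 1000
mood = rng.normal(size=n)
appetite = 0.5 * mood + rng.normal(size=n)  # correlated latent factors
df = pd.DataFrame({
    "anhedonia":     0.8 * mood + rng.normal(scale=0.6, size=n),
    "depressed":     0.9 * mood + rng.normal(scale=0.5, size=n),
    "fatigue":       0.7 * mood + rng.normal(scale=0.7, size=n),
    "appetite_up":   0.8 * appetite + rng.normal(scale=0.6, size=n),
    "appetite_down": 0.7 * appetite + rng.normal(scale=0.7, size=n),
})

one_factor = "dep =~ anhedonia + depressed + fatigue + appetite_up + appetite_down"
two_factor = """dep =~ anhedonia + depressed + fatigue
                app =~ appetite_up + appetite_down"""

for desc in (one_factor, two_factor):
    model = semopy.Model(desc)
    model.fit(df)
    print(semopy.calc_stats(model)[["CFI", "RMSEA", "AIC"]])
```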
Mössbauer spectra of 9 glauconite samples from Upper Cretaceous and Lower Tertiary strata in the South Island of New Zealand contain a broad shoulder due to low intensity absorption continuous between 1.0 and 2.5 mm/sec when the absorber is at room temperature; the shoulder is absent, and sharp peaks are apparent in spectra taken with the absorber at 80 K. The data suggest that electron transfer occurs between adjacent Fe3+ and Fe2+ ions at room temperature. The low temperature spectra indicate that all Fe in the glauconites is in octahedral coordination. Fe3+ and Fe2+ ions occur in both cis and trans sites; Fe3+ shows a strong preference for cis sites whereas Fe2+ shows an even stronger preference for trans sites.
The partially variable oxidation state of Fe in glauconite is interpreted in terms of a geochemical model for glauconitization of a degraded or incomplete progenitor phyllosilicate. The model involves exchange of Fe2+ for other cations which temporarily stabilize the progenitor, followed by Fe2+-Fe3+ charge transfer reactions. Each reaction results from the system's tendency towards equilibrium. The model is supported by the observation that artificially leached glauconite increases both its Fe3+ and its Fe2+ content when placed in a solution containing Fe2+ as the only Fe ion present.
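Site populations of the kind reported (cis vs trans occupancies) are typically derived by fitting quadrupole doublets to the velocity spectrum. A toy Lorentzian-doublet fit with scipy on synthetic data (real Mössbauer spectra are absorption dips with several overlapping doublets; this sketch fits a single emission-style doublet for simplicity):

```python
import numpy as np
from scipy.optimize import curve_fit

def lorentzian(v, center, fwhm, amp):
    return amp * (fwhm / 2.0) ** 2 / ((v - center) ** 2 + (fwhm / 2.0) ** 2)

def doublet(v, center, splitting, fwhm, amp):
    """Quadrupole doublet: two equal Lorentzians separated by `splitting`."""
    return (lorentzian(v, center - splitting / 2.0, fwhm, amp)
            + lorentzian(v, center + splitting / 2.0, fwhm, amp))

# Synthetic doublet, velocities in mm/sec.
v = np.linspace(-4.0, 4.0, 400)
y = (doublet(v, 0.35, 0.45, 0.35, 1.0)
     + np.random.default_rng(4).normal(0.0, 0.02, v.size))

popt, _ = curve_fit(doublet, v, y, p0=[0.3, 0.5, 0.3, 0.9])
print(dict(zip(["centre", "splitting", "fwhm", "amp"], popt)))
```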
Background: Efgartigimod, a human immunoglobulin G (IgG)1 antibody Fc fragment, blocks the neonatal Fc receptor, decreasing IgG recycling and reducing pathogenic IgG autoantibody levels. ADHERE assessed the efficacy and safety of efgartigimod PH20 subcutaneous (SC; co-formulated with recombinant human hyaluronidase PH20) in chronic inflammatory demyelinating polyneuropathy (CIDP). Methods: ADHERE enrolled participants with CIDP (treatment naive or on standard treatments withdrawn during a run-in period) and consisted of an open-label Stage A (efgartigimod PH20 SC once weekly [QW]) and a randomized (1:1) Stage B (efgartigimod or placebo QW). Primary outcomes were clinical improvement (assessed with aINCAT, I-RODS, or mean grip strength; Stage A) and time to first aINCAT score deterioration (relapse; Stage B). Secondary outcomes included the incidence of treatment-emergent adverse events (TEAEs). Results: In total, 322 participants entered Stage A; 214 (66.5%) were considered responders and were randomized and treated in Stage B. Efgartigimod significantly reduced the risk of relapse versus placebo (HR: 0.394; 95% CI: 0.25–0.61; p=0.000039). Reduced risk of relapse occurred in participants receiving corticosteroids, intravenous or SC immunoglobulin, or no treatment before study entry. Most TEAEs were mild to moderate; 3 deaths occurred, none related to efgartigimod. Conclusions: Participants treated with efgartigimod PH20 SC maintained a clinical response and remained relapse-free longer than those treated with placebo.
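The Stage B endpoint, time to first aINCAT deterioration, is a survival outcome, and a hazard ratio like the one above would come from a model such as Cox proportional hazards. A sketch on simulated relapse times using the lifelines package (all numbers invented; this is not the trial's analysis code):

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(5)
n = 220
treated = rng.integers(0, 2, size=n)
# Exponential relapse times; treatment lowers the hazard (true HR < 1).
time = rng.exponential(scale=np.where(treated == 1, 40.0, 16.0))
df = pd.DataFrame({
    "weeks": np.minimum(time, 48.0),          # administrative censoring
    "relapse": (time < 48.0).astype(int),
    "efgartigimod": treated,
})

cph = CoxPHFitter()
cph.fit(df, duration_col="weeks", event_col="relapse")
cph.print_summary()   # exp(coef) is the hazard ratio for relapse
```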
Marine litter poses a complex challenge in Indonesia, necessitating a well-informed and coordinated strategy for effective mitigation. This study investigates the seasonality of plastic concentrations around Sulawesi Island in central Indonesia during monsoon-driven wet and dry seasons. By using open data and methodologies including the HYCOM and Parcels models, we simulated the dispersal of plastic waste over 3 months during both the southwest and northeast monsoons. Our research extended beyond data analysis, as we actively engaged with local communities, researchers and policymakers through a range of outreach initiatives, including the development of a web application to visualize model results. Our findings underscore the substantial influence of monsoon-driven currents on surface plastic concentrations, highlighting the seasonal variation in the risk to different regional seas. This study adds to the evidence provided by coarser resolution regional ocean modelling studies, emphasizing that seasonality is a key driver of plastic pollution within the Indonesian archipelago. Inclusive international collaboration and a community-oriented approach were integral to our project, and we recommend that future initiatives similarly engage researchers, local communities and decision-makers in marine litter modelling results. This study aims to support the application of model results in solutions to the marine litter problem.
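Parcels, named in the methods, is an open-source Lagrangian particle-tracking framework: virtual particles are advected through gridded current fields (HYCOM output, in the study). A self-contained toy with a synthetic steady flow standing in for HYCOM data (grid, release points, and run length are all placeholders):

```python
import numpy as np
from datetime import timedelta
from parcels import FieldSet, ParticleSet, ScipyParticle, AdvectionRK4

# Synthetic rotating surface current on a small flat grid (metres, m/s);
# a real run would instead build the FieldSet from HYCOM netCDF files.
lon = np.linspace(0.0, 1e5, 50)
lat = np.linspace(0.0, 1e5, 50)
LON, LAT = np.meshgrid(lon, lat)
U = 0.2 - 0.1 * (LAT - 5e4) / 5e4
V = 0.1 * (LON - 5e4) / 5e4
fieldset = FieldSet.from_data({"U": U, "V": V},
                              {"lon": lon, "lat": lat}, mesh="flat")

# Release three virtual plastic particles and advect for ~1 month.
pset = ParticleSet(fieldset=fieldset, pclass=ScipyParticle,
                   lon=[2e4, 5e4, 8e4], lat=[5e4, 5e4, 5e4])
pset.execute(AdvectionRK4, runtime=timedelta(days=30),
             dt=timedelta(minutes=20))
for p in pset:
    print(p.lon, p.lat)   # final particle positions
```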
European chub Leuciscus cephalus collected from five localities in the lowland and subalpine regions of Austria were analysed for oestrogenic effects of endocrine-disrupting chemicals and the presence of the plerocercoid of the tapeworm Ligula intestinalis. Of 1494 chub analysed, only seven (six males, one female) were found to be infected with single, but large, plerocercoids up to 15 cm in length. Ligula-infected fish showed comparatively immature gonads, as demonstrated by the gonadosomatic index and gamete developmental stages. Plasma levels of the egg precursor protein vitellogenin also showed concentrations below the detection limit. The present results indicate that, in chub exposed to exogenous oestrogenic compounds, infection with L. intestinalis can reduce gonadal maturation and produce false oestrogen-positive diagnoses in male fish. For plasma vitellogenin levels, L. intestinalis infections can result in false oestrogen-negative diagnoses in male and female fish.
Objective:
To evaluate the comparative epidemiology of hospital-onset bloodstream infection (HOBSI) and central line-associated bloodstream infection (CLABSI).
Design and Setting:
Retrospective observational study of HOBSI and CLABSI across a three-hospital healthcare system from 01/01/2017 to 12/31/2021
Methods:
HOBSIs were identified as any non-commensal positive blood culture event on or after hospital day 3. CLABSIs were identified based on National Healthcare Safety Network (NHSN) criteria. We performed a time-series analysis to assess comparative temporal trends in HOBSI and CLABSI incidence. Using univariable and multivariable regression analyses, we compared demographics, risk factors, and outcomes between non-CLABSI HOBSI and CLABSI, as HOBSI and CLABSI are not mutually exclusive entities.
Results:
HOBSI incidence increased over the study period (IRR 1.006 HOBSI/1,000 patient days; 95% CI 1.001–1.012; P = .03), while no change in CLABSI incidence was observed (IRR .997 CLABSIs/1,000 central line days, 95% CI .992–1.002, P = .22). Differing demographic, microbiologic, and risk factor profiles were observed between CLABSIs and non-CLABSI HOBSIs. Multivariable analysis found lower odds of mortality among patients with CLABSIs when adjusted for covariates that approximate severity of illness (OR .27; 95% CI .11–.64; P < .01).
Conclusions:
HOBSI incidence increased over the study period without a concurrent increase in CLABSI in our study population. Furthermore, risk factor and outcome profiles varied between CLABSI and non-CLABSI HOBSI, which suggests that these metrics differ in important ways worth considering if HOBSI is adopted as a quality metric.
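Incidence trends reported as IRRs are commonly estimated with Poisson regression using a log person-time offset, so that the exponentiated slope is the rate ratio per unit time. A sketch on simulated monthly counts (statsmodels; the study's exact model specification is not given here):

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(6)
months = np.arange(60)                           # a 5-year monthly series
patient_days = rng.integers(9000, 11000, 60)
rate = np.exp(np.log(3e-4) + 0.0005 * months)    # slowly rising HOBSI rate
df = pd.DataFrame({
    "month": months,
    "events": rng.poisson(rate * patient_days),
    "patient_days": patient_days,
})

# Poisson GLM with a log(person-time) offset; exp(slope) is the IRR/month.
fit = smf.glm("events ~ month", data=df, family=sm.families.Poisson(),
              offset=np.log(df.patient_days)).fit()
print(np.exp(fit.params["month"]), fit.pvalues["month"])
```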
Objective:
To evaluate the economic costs of relaxing the University of Virginia Hospital’s present “3-negative” policy, which continues methicillin-resistant Staphylococcus aureus (MRSA) contact precautions until patients receive 3 consecutive negative test results, to either 2 or 1 negative result.
Design:
Cost-effective analysis.
Settings:
The University of Virginia Hospital.
Patients:
The study included data from 41,216 patients from 2015 to 2019.
Methods:
We developed a model for MRSA transmission in the University of Virginia Hospital, accounting for both environmental contamination and interactions between patients and providers, which were derived from electronic health record (EHR) data. The model was fit to MRSA incidence over the study period under the current 3-negative clearance policy. A counterfactual simulation was used to estimate outcomes and costs for 2- and 1-negative policies compared with the current 3-negative policy.
Results:
Our findings suggest that 2-negative and 1-negative policies would have led to 6 (95% CI, −30 to 44; P < .001) and 17 (95% CI, −23 to 59; −10.1% to 25.8%; P < .001) more MRSA cases, respectively, at the hospital over the study period. Overall, the 1-negative policy had statistically significantly lower annual costs ($628,452; 95% CI, $513,592–$752,148; P < .001; in 2023 inflation-adjusted US dollars) than the 2-negative policy ($687,946; 95% CI, $562,522–$812,662) and the 3-negative policy ($702,823; 95% CI, $577,277–$846,605).
Conclusions:
A single negative MRSA nares PCR test may provide sufficient evidence to discontinue MRSA contact precautions, and it may be the most cost-effective option.
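The policy comparison can be caricatured as a trade-off between extra days of contact precautions and occasionally clearing a true carrier early. The sketch below is purely illustrative, with placeholder parameters, and is far simpler than the EHR-calibrated transmission model the authors describe:

```python
import numpy as np

rng = np.random.default_rng(7)

def policy_cost(n_tests_required, n_patients=40_000, carriage=0.07,
                test_sensitivity=0.90, isolation_days_per_test=1.0,
                isolation_cost_per_day=30.0, cost_per_extra_case=15_000.0,
                transmissions_per_false_clear=0.1):
    """Toy expected cost for patients awaiting MRSA clearance. Fewer
    required negatives shorten costly precautions but falsely clear more
    true carriers. Every parameter value is an invented placeholder."""
    carriers = rng.random(n_patients) < carriage
    # A carrier escapes detection if all required tests are false negatives.
    p_false_clear = (1.0 - test_sensitivity) ** n_tests_required
    false_clears = carriers & (rng.random(n_patients) < p_false_clear)
    isolation = (n_patients * n_tests_required
                 * isolation_days_per_test * isolation_cost_per_day)
    infections = (false_clears.sum()
                  * transmissions_per_false_clear * cost_per_extra_case)
    return isolation + infections

for k in (1, 2, 3):
    print(f"{k}-negative policy: ${policy_cost(k):,.0f}")
```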
First published as a special issue of 'Policy and Politics', this book presents original critical reflections on the value of design approaches and how they relate to the classical idea of public administration as a design science.
Executive functions (EFs) are considered to be both unitary and diverse functions with common conceptualizations consisting of inhibitory control, working memory, and cognitive flexibility. Current research indicates that these abilities develop along different timelines and that working memory and inhibitory control may be foundational for cognitive flexibility, or the ability to shift attention between tasks or operations. Very few interventions target cognitive flexibility despite its importance for academic or occupational tasks, social skills, problem-solving, and goal-directed behavior in general, and the ability is commonly impaired in individuals with neurodevelopmental disorders (NDDs) such as autism spectrum disorder, attention deficit hyperactivity disorder, and learning disorders. The current study investigated a tablet-based cognitive flexibility intervention, Dino Island (DI), that combines a game-based, process-specific intervention with compensatory metacognitive strategies as delivered by classroom aides within a school setting.
Participants and Methods:
20 children between ages 6-12 years (x̄ = 10.83 years) with NDDs and identified executive function deficits, and their assigned classroom aides (i.e., “interventionists”), were randomly assigned to either DI or an educational game control condition. Interventionists completed a 2-4 hour online training course and a brief, remote Q&A session with the research team, which provided key information for delivering the intervention such as game-play and metacognitive/behavioral strategy instruction. Fidelity checks were conducted weekly. Interventionists were instructed to deliver 14-16 hours of intervention during the school day over 6-8 weeks, divided into 3-4 weekly sessions of 30-60 minutes each. Baseline and post-intervention assessments consisted of cognitive measures of cognitive flexibility (Minnesota Executive Function Scale), working memory (Wechsler Intelligence Scale for Children, 4th Edn., Integrated, Spatial Span) and parent-completed EF rating scales (Behavior Rating Inventory of Executive Function).
Results:
Sample sizes were smaller than expected due to COVID-19 related disruptions within schools, so nonparametric analyses were conducted to explore trends in the data. Results of the Mann-Whitney U test indicated that participants within the DI condition made greater gains in cognitive flexibility, with a trend towards significance (p = 0.115). After dummy coding for positive change, results also indicated that gains in spatial working memory differed by condition (p = 0.127). Similarly, gains in task monitoring trended towards a significant difference by condition.
Conclusions:
DI, a novel EF intervention, may be beneficial to cognitive flexibility, working memory, and monitoring skills in youth with EF deficits. Though COVID-19 caused many absences and upheavals within the participating schools, it is promising to see differences in outcomes with such a small sample. This poster will expand upon the current results as well as future directions for the DI intervention.
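The Mann-Whitney U comparison used above is available directly in scipy; a minimal sketch with hypothetical gain scores (the study's raw data are not shown here):

```python
import numpy as np
from scipy.stats import mannwhitneyu

# Hypothetical pre-to-post gain scores on a cognitive flexibility measure.
di_gains = np.array([2, 1, 3, 0, 2, 1, 4, 2, 1, 3])        # Dino Island
control_gains = np.array([1, 0, 1, 2, 0, 1, 0, 2, 1, 0])   # active control

# One-sided nonparametric test suited to small, non-normal samples.
u, p = mannwhitneyu(di_gains, control_gains, alternative="greater")
print(u, p)
```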
The Latinx population is rapidly aging and growing in the US and is at increased risk for stroke and dementia. We examined whether bilingualism confers cognitive resilience following stroke in a community-based sample of Mexican American (MA) older adults.
Participants and Methods:
Participants included predominantly urban, non-immigrant MAs aged 65+ from the Brain Attack Surveillance in Corpus Christi-Cognitive study. Participants were recruited using a two-stage area probability sample with door-to-door recruitment until the onset of the COVID-19 pandemic; sampling and recruitment were then completed via telephone. Cognition was assessed with the Montreal Cognitive Assessment (MoCA; 30-item in-person, 22-item via telephone) in English or Spanish. Bilingualism was assessed via a questionnaire and degree of bilingualism was calculated (range 0%-100% bilingual). Stroke history was collected via self-report. We harmonized the 22-item to the 30-item MoCA using published equipercentile equating. We conducted a series of regressions with the harmonized MoCA score as the dependent variable, stroke history and degree of bilingualism as independent variables, and age, sex/gender, education, assessment language, assessment mode (in-person vs. phone), and self-reported vascular risk factors (hypertension, diabetes, heart disease) as covariates. We included a stroke history by bilingualism interaction to examine whether bilingualism modifies the association between stroke history and MoCA performance.
Results:
Participants included 841 MA older adults (59% women; age M(SE) = 73.5(0.2); 44% less than high school education). Most (77%) of the sample completed the MoCA in English. 93 of 841 participants reported a history of stroke. In an unadjusted model, degree of bilingualism (b = 3.41, p < .0001) and stroke history (b = -1.98, p = .003) were associated with MoCA performance. In a fully adjusted model, stroke history (b = -1.79, p = .0007) but not bilingualism (b = 0.78, p = .21) was associated with MoCA performance. When an interaction term was added to the fully adjusted model, the interaction between stroke history and bilingualism was not significant (b = -0.47, p = .78).
Conclusions:
Degree of bilingualism does not modify the association between stroke history and MoCA performance in Mexican American older adults. These results should be replicated in samples with validated strokes, with more comprehensive bilingualism and cognitive assessments, and in other bilingual populations.
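The effect-modification test described corresponds to a product term in the regression. A sketch with simulated data loosely mirroring the covariates above (statsmodels; all coefficients invented):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(8)
n = 841
df = pd.DataFrame({
    "bilingual": rng.uniform(0.0, 1.0, n),   # degree of bilingualism, 0-1
    "stroke": rng.integers(0, 2, n),         # self-reported stroke history
    "age": rng.normal(73.5, 6.0, n),
    "education": rng.integers(4, 17, n),
})
df["moca"] = (24.0 - 1.8 * df.stroke + 0.8 * df.bilingual
              - 0.05 * (df.age - 73.5) + 0.3 * (df.education - 10)
              + rng.normal(0.0, 3.0, n))

# The stroke-by-bilingualism product term tests effect modification.
fit = smf.ols("moca ~ stroke * bilingual + age + education", data=df).fit()
print(fit.params["stroke:bilingual"], fit.pvalues["stroke:bilingual"])
```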