Functional cognitive disorder is an increasingly recognised subtype of functional neurological disorder for which treatment options are currently limited. We have developed a brief online group acceptance and commitment therapy (ACT)-based intervention.
Aims
To assess the feasibility of conducting a randomised controlled trial of this intervention versus treatment as usual (TAU).
Method
The study was a parallel-group, single-blind randomised controlled trial, with participants recruited from cognitive neurology, neuropsychiatry and memory clinics in London. Participants were randomised into two groups: ACT + TAU or TAU alone. Feasibility was assessed on the basis of recruitment and retention rates, the acceptability of the intervention, and signal of efficacy on the primary outcome measure, the Acceptance and Action Questionnaire II (AAQ-II), although the study was not powered to demonstrate this statistically. Outcome measures were collected at baseline and at 2, 4 and 6 months post-intervention, including assessments of quality of life, memory, anxiety, depression and healthcare use.
Results
We randomised 44 participants, with a participation rate of 51.1% (95% CI 40.8–61.5%); 36% of referred participants declined involvement. Retention was high: 81.8% of ACT participants attended at least four sessions, and 64.3% reported being ‘satisfied’ or ‘very satisfied’, compared with 0% in the TAU group. Psychological flexibility as measured using the AAQ-II showed a trend towards modest improvement in the ACT group at 6 months. Other measures (quality of life, mood, memory satisfaction) also demonstrated small to modest positive trends.
Conclusions
It has proven feasible to conduct a randomised controlled trial of ACT versus TAU.
With a few exceptions, the problem of linking item response model parameters from different item calibrations has been conceptualized as an instance of the problem of test equating scores on different test forms. This paper argues, however, that the use of item response models does not require any test score equating. Instead, it involves the necessity of parameter linking due to a fundamental problem inherent in the formal nature of these models—their general lack of identifiability. More specifically, item response model parameters need to be linked to adjust for the different effects of the identifiability restrictions used in separate item calibrations. Our main theorems characterize the formal nature of these linking functions for monotone, continuous response models, derive their specific shapes for different parameterizations of the 3PL model, and show how to identify them from the parameter values of the common items or persons in different linking designs.
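The lack of identifiability described here can be made concrete with the three-parameter logistic (3PL) model. The following worked equations are a standard illustration from the IRT linking literature, not reproduced from the paper: an affine rescaling of the latent trait leaves every response probability unchanged, so parameters from separate calibrations are determined only up to the linking constants (A, B).

```latex
% 3PL response probability for an item with discrimination a,
% difficulty b, and guessing parameter c:
P(\theta; a, b, c) = c + \frac{1 - c}{1 + e^{-a(\theta - b)}}

% Affine rescaling of the latent scale and the compensating
% item-parameter transformation:
\theta^* = A\theta + B, \qquad
a^* = \frac{a}{A}, \qquad b^* = Ab + B, \qquad c^* = c

% Invariance:
a^*(\theta^* - b^*) = \frac{a}{A}\bigl(A\theta + B - Ab - B\bigr) = a(\theta - b),
% hence P(\theta^*; a^*, b^*, c^*) = P(\theta; a, b, c).
```

Estimating (A, B) from the common items or persons shared across calibrations is precisely the linking problem the paper formalizes.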
Ultra-processed foods (UPF), defined using the Nova classification system, are associated with increased chronic disease risk. More recently, evidence suggests the UPF subgroup of whole-grain breads and cereals is in fact linked with reduced chronic disease risk. This study aimed to explore associations of cardiometabolic risk measures with Nova UPF intake versus intake when foods with ≥ 25 or ≥ 50 % whole grains are excluded from the UPF definition. We considered dietary data from the Australian National Nutrition and Physical Activity Survey 2011–2012. Impacts on associations of UPF intake (quintiles) and cardiometabolic risk measures were analysed using regression models. The median proportion of UPF intake from high whole-grain foods was zero for all quintiles. Participants in the highest Nova UPF intake quintile had significantly higher weight (78·1 kg (0·6)), BMI (27·2 kg/m2 (0·2)), waist circumference (92·7 cm (0·5)) and waist-to-height ratio (0·55 (0·003)) compared with the lowest quintile (P < 0·05). Associations were the same when foods with ≥ 25 and ≥ 50 % whole grains were excluded. Adjusted R-squared values remained similar across all approaches for all outcomes. In Australia, high whole-grain foods considered UPF may not significantly contribute to deleterious cardiometabolic risk associations. Until conclusive evidence on Nova UPF is available, prioritisation should be given to the nutrient density of high whole-grain foods and their potential contribution to improving whole-grain intakes and healthful dietary patterns in Australia.
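A quintile-based analysis of the kind described can be sketched as follows. This is a hedged illustration on synthetic data, not the study's code: the variable names (`upf_all`, `upf_excl`, `bmi`) and effect sizes are invented, and the point is only to show how an outcome is compared across intake quintiles under two UPF definitions.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
n = 500
# Synthetic per-person data: energy share from UPF, split into a small
# high-whole-grain component and the remainder (values are illustrative).
upf_other = rng.uniform(0, 60, n)
upf_wholegrain = rng.uniform(0, 5, n)          # small, as the survey found
bmi = 24 + 0.05 * upf_other + rng.normal(0, 2, n)

df = pd.DataFrame({"bmi": bmi,
                   "upf_all": upf_other + upf_wholegrain,  # full Nova definition
                   "upf_excl": upf_other})                 # whole grains excluded

# Assign intake quintiles (1 = lowest, 5 = highest) under each definition.
for col in ("upf_all", "upf_excl"):
    df[col + "_q"] = pd.qcut(df[col], 5, labels=False) + 1

# Compare the highest with the lowest quintile under each definition.
for col in ("upf_all_q", "upf_excl_q"):
    fit = smf.ols(f"bmi ~ C({col})", df).fit()
    print(col, round(fit.params[f"C({col})[T.5]"], 2), round(fit.rsquared_adj, 3))
```

Because the high whole-grain component is small, the two definitions assign nearly identical quintiles, so the Q5-vs-Q1 estimates and adjusted R-squared values barely change, the same pattern of stability the abstract reports.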
We examine whether the “privileged coordinates” of a geometric space encode its “amount of structure.” In doing so, we compare this coordinate approach with the more familiar automorphism approach to comparing amounts of structure. We first show that on a natural understanding of the former, it faces one of the same well-known problems as the latter. We then capture a precise sense in which the two approaches are closely related to one another, and we conclude by discussing whether they might still prove useful in cases of philosophical interest, despite their shortcomings.
Identifying persons with HIV (PWH) at increased risk for Alzheimer’s disease (AD) is complicated because memory deficits are common in HIV-associated neurocognitive disorders (HAND) and a defining feature of amnestic mild cognitive impairment (aMCI; a precursor to AD). Recognition memory deficits may be useful in differentiating these etiologies. Therefore, neuroimaging correlates of different memory deficits (i.e., recall, recognition) and their longitudinal trajectories in PWH were examined.
Design:
We examined 92 PWH from the CHARTER Program, ages 45–68, without severe comorbid conditions, who received baseline structural MRI and baseline and longitudinal neuropsychological testing. Linear and logistic regression examined neuroanatomical correlates (i.e., cortical thickness and volumes of regions associated with HAND and/or AD) of memory performance at baseline and multilevel modeling examined neuroanatomical correlates of memory decline (average follow-up = 6.5 years).
Results:
At baseline, thinner pars opercularis cortex was associated with impaired recognition (p = 0.012; p = 0.060 after correcting for multiple comparisons). Worse delayed recall was associated with thinner pars opercularis (p = 0.001) and thinner rostral middle frontal cortex (p = 0.006) cross-sectionally, even after correcting for multiple comparisons. Delayed recall and recognition were not associated with medial temporal lobe (MTL), basal ganglia, or other prefrontal structures. Recognition impairment was variable over time, and there was little decline in delayed recall. Baseline MTL and prefrontal structures were not associated with delayed recall.
Conclusions:
Episodic memory was associated with prefrontal structures, and MTL and prefrontal structures did not predict memory decline. There was relative stability in memory over time. Findings suggest that episodic memory is more related to frontal structures, rather than encroaching AD pathology, in middle-aged PWH. Additional research should clarify if recognition is useful clinically to differentiate aMCI and HAND.
Whole-grain intake is associated with reduced risk of non-communicable diseases. Greater understanding of major food sources of whole grains globally, and how intake has been quantified, is essential to informing accurate strategies aiming to increase consumption and reduce non-communicable disease risk. Therefore, the aim of this review was to identify the primary food sources of whole-grain intake globally and explore how they are quantified and reported within literature, and their recommendation within respective national dietary guidelines. A structured scoping review of published articles and grey literature used a predefined search strategy across electronic databases. Data were extracted and summarised based on identified outcomes (e.g. primary sources of whole-grain intake and quantification methods). Dietary intake values were noted where available. Thirteen records across twenty-four countries identified bread and bread rolls, and ready-to-eat cereals as primary sources of whole-grain intake in Australia, New Zealand, Europe, the UK and Northern America. Elsewhere, sources vary and for large parts of the world (e.g. Africa and Asia), intake data are limited or non-existent. Quantification of whole grain also varied across countries, with some applying different whole-grain food definitions, resulting in a whole-grain intake based on only consumption of select ‘whole-grain’ foods. National dietary guidelines were consistent in promoting whole-grain intake and providing examples of country-specific whole-grain foods. Consistency in whole-grain calculation methods is needed to support accurate and comparative research informing current intake evidence and promotional efforts. National dietary guidelines are consistent in promoting whole-grain intake; however, there is variability in recommendations.
Climate change is causing Himalayan glaciers to shrink rapidly and natural hazards to increase, while downstream exposure is growing. Glacier shrinkage promotes the formation of glacial lakes, which can suddenly drain and produce glacier lake outburst floods (GLOFs). Bhutan is one of the most vulnerable countries globally to these hazards. Here we use remotely sensed imagery to quantify changes in supraglacial water storage on Tshojo Glacier, Bhutan, where previous supraglacial pond drainage events have necessitated downstream evacuation. Results showed a doubling of both total ponded area (104 529 m2 to 213 943 m2) and its standard deviation (64 808 m2 to 158 550 m2) between the periods 1987–2003 and 2007–2020, which was predominantly driven by increases in the areas of the biggest ponds. These ponds drained regularly and have occupied the same location since at least 1967. Tshojo Glacier has remained in the first stage of proglacial lake development for 53 years, which we attribute to its moderate slopes and ice velocities. Numerical modelling shows that pond outbursts can reach between ~6 and 47 km downstream, impacting the remote settlement of Lunana. Our results highlight the need to better quantify variability in supraglacial water storage and its potential to generate GLOFs as the climate warms.
The New Jersey Kids Study (NJKS) is a transdisciplinary statewide initiative to understand influences on child health, development, and disease. We conducted a mixed-methods study of project planning teams to investigate team effectiveness and relationships between team dynamics and quality of deliverables.
Methods:
Ten theme-based working groups (WGs) (e.g., Neurodevelopment, Nutrition) informed protocol development and submitted final reports. WG members (n = 79, 75%) completed questionnaires including de-identified demographic and professional information and a modified TeamSTEPPS Team Assessment Questionnaire (TAQ). Reviewers independently evaluated final reports using a standardized tool. We analyzed questionnaire results and final report assessments using linear regression and performed constant comparative qualitative analysis to identify central themes.
Results:
WG-level factors associated with greater team effectiveness included proportion of full professors (β = 31.24, 95% CI 27.65–34.82), team size (β = 0.81, 95% CI 0.70–0.92), and percent dedicated research effort (β = 0.11, 95% CI 0.09–0.13); age distribution (β = −2.67, 95% CI –3.00 to –2.38) and diversity of school affiliations (β = –33.32, 95% CI –36.84 to –29.80) were inversely associated with team effectiveness. No factors were associated with final report assessments. Perceptions of overall initiative leadership were associated with expressed enthusiasm for future NJKS participation. Qualitative analyses of final reports yielded four themes related to team science practices: organization and process, collaboration, task delegation, and decision-making patterns.
Conclusions:
We identified several correlates of team effectiveness in a team science initiative's early planning phase. Extra effort may be needed to bridge differences in team members' backgrounds to enhance the effectiveness of diverse teams. This work also highlights leadership as an important component in future investigator engagement.
The Watersports Inclusion Games is a free annual weekend event, where young people with a range of physical and intellectual disabilities and their families/siblings participate in various inclusive watersports activities.
Objectives
This study aims to assess the psychological benefits of watersports for young people with various physical and intellectual disabilities and investigate the extent of the impact of the COVID-19 pandemic on their access to watersports.
Methods
Following a literature review, a survey containing both quantitative and qualitative aspects was constructed using SurveyMonkey and circulated to the parents/guardians of participants three times following the event. The survey was completed anonymously on an opt-in basis, and 28 responses that met our criteria for analysis were collected. Qualitative data from free-text responses were grouped under themes, and quantitative data were analysed using SPSS.
Results
Despite 64% (n=18) of respondents indicating that their disability increased their vulnerability to COVID-19 in some capacity, the effect of the pandemic on accessibility was not statistically significant. This could be due to the small response number, or the everyday limitations participants faced prior to the pandemic. 92% (n=25) of participants indicated that there was great inclusion in the watersports activities and that they were “very beneficial” regarding the possibility of the whole family’s participation (p=0.005). The survey also found a statistically significant association between the event’s activities being considered both “accessible” and “very beneficial” in terms of boosting self-confidence, with 57.1% of responses indicating agreement (p=0.016).
Conclusions
Full-family participation and accessibility of activities were key facilitators to the enjoyment and benefit of participants. Programmes should be established that allow able-bodied siblings and young people with disabilities to participate in the same activities.
Frequently used physical therapy (PT) equipment is difficult to disinfect due to equipment material and shape. The efficacy of standard disinfection of PT equipment is poorly understood.
Methods:
We completed a 2-phase prospective microbiological analysis of fomites used in PT at our hospital from September 2022 to October 2023. For both phases, study fomites were obtained after usage and split into symmetrical halves for sampling. In phase 1, sides were sampled following standard disinfection. In phase 2, sides were randomized 1:1 to intervention or control. Samples were obtained before and after the intervention, a disinfection cabinet using Ultraviolet C (UV-C) and 6% nebulized hydrogen peroxide. We defined antimicrobial-resistant clinically important pathogens (AMR CIPs) as methicillin-resistant Staphylococcus aureus (MRSA), vancomycin-resistant Enterococcus (VRE), and multidrug-resistant (MDR) Gram-negatives, and non-AMR CIPs as methicillin-sensitive Staphylococcus aureus (MSSA), vancomycin-sensitive Enterococcus (VSE), and Gram-negatives. Three assessments were made: (1) contamination following standard disinfection (phase 1); (2) contamination postintervention compared to no disinfection (phase 2); and (3) contamination following standard disinfection compared to postintervention (phase 1 vs phase 2 intervention).
Results:
The median total colony-forming units (CFU) from 122 study fomite samples was 1,348 (IQR 398–2,365). At the sample level, 52 (43%) and 15 (12%) of samples harbored any clinically important pathogens (CIPs) or AMR CIPs, respectively. The median CFU was 0 (IQR 0–55) in the intervention group and 977 (IQR 409–2,547) in the control group (P < .00001).
Conclusion:
Following standard disinfection, PT equipment remained heavily contaminated, including with AMR and non-AMR CIPs. Following the intervention, PT equipment was less contaminated and harbored no AMR CIPs compared to control sides, supporting the efficacy of the intervention on difficult-to-disinfect PT fomites.
This paper presents a detailed chronological study of the previously undisturbed burial ground of Choburak-I of the Bulan-Koby Culture in the Northern Altai using a program of comprehensive dating, including AMS 14C dating of human and animal remains (26 14C dates from 12 kurgans in total), and archaeological dating of the associated artifacts. This completely excavated cemetery contained numerous grave goods and various organic remains (anthropological and archaeozoological) critical for understanding the social and chronological dynamics of this culture during the Rouran period in Altai (second half of the 4th–first half of the 6th century CE). The results of archaeological dating, supported by the largest set of AMS 14C dates for the Bulan-Koby Culture, and further aided by Bayesian analysis, demonstrate the likely continuous existence of the necropolis within the period of 310–400 cal CE, which broadly corresponds to the beginning of the Rouran period in the history of Altai, with a maximum duration of 66 years. The presented results make it possible to consider the necropolis of Choburak-I as a chronologically defining monument of the Rouran period of Northern Altai and permit a new level of relative and absolute chronological reconstructions for archaeological sites of this region and adjacent territories at the turn of late antiquity and the early Middle Ages.
We evaluated sampling and detection methods for fungal contamination on healthcare surface materials, comparing the efficacy of foam sponges, flocked swabs, and Replicate Organism Detection And Counting (RODAC) plates alongside culture-based quantification and quantitative polymerase chain reaction (qPCR). Findings indicate that sponge sampling and qPCR detection performed best, suggesting a foundation for future studies aiming to improve surveillance practices for fungi.
Dietary therapies have revolutionised treatment for irritable bowel syndrome (IBS). However, response rates to the diet with the highest evidence of efficacy (the low FODMAP diet) remain at 50-75%, suggesting other potential drivers of symptom onset. A low food chemical elimination-rechallenge diet targeting bioactive food chemicals (including salicylates, amines, glutamate and other additives) is commonly applied in Australia in patients exhibiting both gastrointestinal and extra-intestinal symptoms. One key food chemical, salicylate, has been shown to elicit symptoms in IBS patients with aspirin-sensitivity(1), and 77% of IBS patients have reported amine-rich foods trigger symptoms(2). However, data supporting the full low food chemical diet are scant, and safety concerns exist due to its restrictive nature potentially causing nutritional deficiencies and disordered eating. This cross-sectional survey aimed to evaluate the frequency of co-existing extra-intestinal symptoms, as well as explore patient perceptions and use of the low chemical diet in those with IBS and healthy controls. Participants with IBS (IBS-Severity Scoring System (IBS-SSS) >75), and healthy controls (not meeting Rome IV and IBS-SSS ≤75), were recruited via online advertisement. Validated questionnaires were used to assess gastrointestinal symptoms (IBS-SSS), extra-intestinal symptoms (extended PHQ-12), nutrient intake (Comprehensive Nutritional Assessment Tool) and food additive intake (IBD-Food additive questionnaire). Additional questionnaires assessed use of dietary therapies with specific focus on food chemicals. Data were analysed using independent samples t-test and chi-square test. 204 IBS participants (Total IBS-SSS, 277 ± 79) and 22 healthy controls (36 ± 28, p<0.01) completed the study.
IBS participants were more likely to report extra-intestinal symptoms including headaches (p<0.01), migraines (p = 0.03), fatigue (p<0.01), difficulty sleeping (p = 0.03), rhinitis (p = 0.02), urticaria (p = 0.04) and mood disturbance (p<0.01). IBS participants were more likely to report at least one food chemical as a trigger for gastrointestinal (38% vs 13%, p = 0.03) and/or extra-intestinal (30% vs 9%, p = 0.04) symptoms. In the IBS group, the most common suspected dietary triggers for gastrointestinal symptoms were salicylates (19%) followed by MSG (17%) and artificial colours (14%); while for extra-intestinal symptoms, MSG (15%) was most common, followed by amines (14%), and sulphites (12%). There was no significant difference in consumption of ultra-processed, additive containing foods. Twenty-one (10%) IBS participants were following a low chemical diet, with dietary advice provided by a dietitian (n = 13), general practitioner (n = 6), gastroenterologist (n = 6), naturopath (n = 3), family/friend (n = 4) and/or the diet was self-initiated (n = 7). Fourteen of the 21 (67%) reported following both a low food chemical and low FODMAP diet. Patients with IBS are more likely to report extra-intestinal symptoms compared to healthy controls. Despite limited evidence, a low food chemical diet is utilised to manage both gastrointestinal and extra-intestinal symptoms. Of concern, many respondents following a low food chemical diet reported also following a low FODMAP diet, which may have implications for nutritional adequacy.
Addressing aggressive behavior in adolescence is a key step toward preventing violence and associated social and economic costs in adulthood. This study examined the secondary effects of the personality-targeted substance use preventive program Preventure on aggressive behavior from ages 13 to 20.
Methods
In total, 339 young people from nine independent schools (M age = 13.03 years, s.d. = 0.47, range = 12–15) who rated highly on one of the four personality traits associated with increased substance use and other emotional/behavioral symptoms (i.e. impulsivity, anxiety sensitivity, sensation seeking, and negative thinking) were included in the analyses (n = 145 in Preventure, n = 194 in control). Self-report assessments were administered at baseline and follow-up (6 months, 1, 2, 3, 5.5, and 7 years). Overall aggression and subtypes of aggressive behaviors (proactive, reactive) were examined using multilevel mixed-effects analysis accounting for school-level clustering.
Results
Across the 7-year follow-up period, the average yearly reduction in the frequency of aggressive behaviors (b = −0.42; 95% confidence interval [CI] −0.64 to −0.20; p < 0.001), reactive aggression (b = −0.22; 95% CI −0.35 to −0.10; p = 0.001), and proactive aggression (b = −0.14; 95% CI −0.23 to −0.05; p = 0.002) was greater for the Preventure group compared to the control group.
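A school-clustered mixed model of this general shape can be sketched with synthetic data. This is not the study's analysis code: the group assignment, effect sizes, and sample sizes are invented, and only a random intercept for school is shown, but the arm-by-time interaction plays the same role as the reported yearly reduction b.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
rows = []
for school in range(9):                      # nine school clusters
    arm = school % 2                         # illustrative assignment (0 = control)
    u = rng.normal(0, 0.5)                   # school-level random intercept
    for person in range(20):
        for t in (0, 0.5, 1, 2, 3, 5.5, 7):  # years since baseline
            y = 4 - 0.1 * t - 0.4 * arm * t + u + rng.normal(0, 1)
            rows.append((school, arm, t, y))
df = pd.DataFrame(rows, columns=["school", "arm", "years", "aggression"])

# Multilevel model with a random intercept per school; the years:arm
# coefficient estimates the extra yearly change in the intervention arm.
fit = smf.mixedlm("aggression ~ years * arm", df, groups=df["school"]).fit()
print(round(fit.params["years:arm"], 2))
```

With a simulated true interaction of −0.4, the fitted `years:arm` coefficient recovers a negative yearly difference analogous to the trial's reported effect.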
Conclusions
The study suggests a brief personality-targeted intervention may have long-term impacts on aggression among young people; however, this interpretation is limited by imbalance of sex ratios between study groups.
Aquatic ecosystems - lakes, ponds and streams - are hotspots of biodiversity in the cold and arid environment of Continental Antarctica. Environmental change is expected to increasingly alter Antarctic aquatic ecosystems and modify the physical characteristics and interactions within the habitats that they support. Here, we describe physical and biological features of the peripheral ‘moat’ of a closed-basin Antarctic lake. These moats mediate connectivity amongst streams, lake and soils. We highlight the cyclical moat transition from a frozen winter state to an active open-water summer system, through refreeze as winter returns. Summer melting begins at the lakebed, initially creating an ice-constrained lens of liquid water in November, which swiftly progresses upwards, creating open water in December. Conversely, freezing progresses slowly from the water surface downwards, with water at 1 m bottom depth remaining liquid until May. Moats support productive, diverse benthic communities that are taxonomically distinct from those under the adjacent permanent lake ice. We show how ion ratios suggest that summer exchange occurs amongst moats, streams, soils and sub-ice lake water, perhaps facilitated by within-moat density-driven convection. Moats occupy a small but dynamic area of lake habitat, are disproportionately affected by recent lake-level rises and may thus be particularly vulnerable to hydrological change.
The origins and timing of inpatient room sink contamination with carbapenem-resistant organisms (CROs) are poorly understood.
Methods:
We performed a prospective observational study to describe the timing, rate, and frequency of CRO contamination of in-room handwashing sinks in 2 intensive care units (ICU) in a newly constructed hospital bed tower. Study units, A and B, were opened to patient care in succession. The patients in unit A were moved to a new unit in the same bed tower, unit B. Each unit was similarly designed with 26 rooms and in-room sinks. Microbiological samples were taken every 4 weeks from 3 locations from each study sink: the top of the bowl, the drain cover, and the p-trap. The primary outcome was sink conversion events (SCEs), defined as CRO contamination of a sink in which CRO had not previously been detected.
Results:
Sink samples were obtained 22 times from September 2020 to June 2022, giving 1,638 total environmental cultures. In total, 2,814 patients were admitted to study units while sink sampling occurred. We observed 35 SCEs (73%) overall; 9 sinks (41%) in unit A became contaminated with CRO by month 10, and all 26 sinks became contaminated in unit B by month 7. Overall, 299 CRO isolates were recovered; the most common species were Enterobacter cloacae and Pseudomonas aeruginosa.
Conclusion:
CRO contamination of sinks in 2 newly constructed ICUs was rapid and cumulative. Our findings support in-room sinks as reservoirs of CRO and emphasize the need for prevention strategies to mitigate contamination of hands and surfaces from CRO-colonized sinks.
Various water-based heater-cooler devices (HCDs) have been implicated in nontuberculous mycobacteria outbreaks. Ongoing rigorous surveillance for healthcare-associated M. abscessus (HA-Mab) put in place following a prior institutional outbreak of M. abscessus alerted investigators to a cluster of 3 extrapulmonary M. abscessus infections among patients who had undergone cardiothoracic surgery.
Methods:
Investigators convened a multidisciplinary team and launched a comprehensive investigation to identify potential sources of M. abscessus in the healthcare setting. Adherence to tap water avoidance protocols during patient care and HCD cleaning, disinfection, and maintenance practices were reviewed. Relevant environmental samples were obtained. Patient and environmental M. abscessus isolates were compared using multilocus-sequence typing and pulsed-field gel electrophoresis. Smoke testing was performed to evaluate the potential for aerosol generation and dispersion during HCD use. The entire HCD fleet was replaced to mitigate continued transmission.
Results:
Clinical presentations of case patients and epidemiologic data supported intraoperative acquisition. M. abscessus was isolated from HCDs used on patients, and molecular comparison with patient isolates demonstrated clonality. Smoke testing simulated aerosolization of M. abscessus from HCDs during device operation. Since the HCD fleet was replaced, no additional extrapulmonary HA-Mab infections due to the unique clone identified in this cluster have been detected.
Conclusions:
Despite adhering to HCD cleaning and disinfection strategies beyond manufacturer instructions for use, HCDs became colonized with and ultimately transmitted M. abscessus to 3 patients. Design modifications to better contain aerosols or filter exhaust during device operation are needed to prevent NTM transmission events from water-based HCDs.
Many people with HIV (PWH) are at risk for age-related neurodegenerative disorders such as Alzheimer’s disease (AD). Studies on the association between cognition, neuroimaging outcomes, and the Apolipoprotein E4 (APOE4) genotype, which is associated with greater risk of AD, have yielded mixed results in PWH; however, many of these studies have examined a wide age range of PWH and have not examined APOE by race interactions that are observed in HIV-negative older adults. Thus, we examined how APOE status relates to cognition and medial temporal lobe (MTL) structures (implicated in AD pathogenesis) in mid- to older-aged PWH. In exploratory analyses, we also examined race (African American (AA)/Black and non-Hispanic (NH) White) by APOE status interactions on cognition and MTL structures.
Participants and Methods:
The analysis included 88 PWH between the ages of 45 and 68 (mean age=51±5.9 years; 86% male; 51% AA/Black, 38% NH-White, 9% Hispanic/Latinx, 2% other) from the CNS HIV Antiretroviral Therapy Effects Research multi-site study. Participants underwent APOE genotyping, neuropsychological testing, and structural MRI; APOE groups were defined as APOE4+ (at least one APOE4 allele) and APOE4- (no APOE4 alleles). Eighty-nine percent of participants were on antiretroviral therapy, 74% had undetectable plasma HIV RNA (<50 copies/ml), and 25% were APOE4+ (32% AA/Black/15% NH-White). Neuropsychological testing assessed seven domains, and demographically-corrected T-scores were calculated. FreeSurfer 7.1.1 was used to measure MTL structures (hippocampal volume, entorhinal cortex thickness, and parahippocampal thickness) and the effect of scanner was regressed out prior to analyses. Multivariable linear regressions tested the association between APOE status and cognitive and imaging outcomes. Models examining cognition covaried for comorbid conditions and HIV disease characteristics related to global cognition (i.e., AIDS status, lifetime methamphetamine use disorder). Models examining the MTL covaried for age, sex, and relevant imaging covariates (i.e., intracranial volume or mean cortical thickness).
Results:
APOE4+ carriers had worse learning (β=-0.27, p=.01) and delayed recall (β=-0.25, p=.02) compared to the APOE4- group, but APOE status was not significantly associated with any other domain (ps>.24). APOE4+ status was also associated with thinner entorhinal cortex (β=-0.24, p=.02). APOE status was not significantly associated with hippocampal volume (β=-0.08, p=.32) or parahippocampal thickness (β=-0.18, p=.08). Lastly, race interacted with APOE status such that the negative association between APOE4+ status and cognition was stronger in NH-White PWH as compared to AA/Black PWH in learning, delayed recall, and verbal fluency (ps<.05). There were no APOE by race interactions for any MTL structures (ps>.10).
Conclusions:
Findings suggest that APOE4 carrier status is associated with worse episodic memory and thinner entorhinal cortex in mid- to older-aged PWH. While APOE4+ groups were small, we found that APOE4 carrier status had a larger association with cognition in NH-White PWH as compared to AA/Black PWH, consistent with studies demonstrating an attenuated effect of APOE4 in AA/Black HIV-negative older adults. These findings further highlight the importance of recruiting diverse samples and suggest exploring other genetic markers (e.g., ABCA7) that may be more predictive of AD in some races to better understand AD risk in diverse groups of PWH.
Blood-culture overutilization is associated with increased cost and excessive antimicrobial use. We implemented an intervention in the adult intensive care unit (ICU), combining education based on the DISTRIBUTE algorithm and restriction to infectious diseases and ICU providers. Our intervention led to reduced blood-culture utilization without affecting safety metrics.