The Early Minimally Invasive Removal of Intracerebral Hemorrhage (ENRICH) trial demonstrated that minimally invasive surgery to treat spontaneous lobar intracerebral hemorrhage (ICH) improved functional outcomes. We aimed to explore current management trends for spontaneous lobar ICH in Canada to assess practice patterns and determine whether further randomized controlled trials are needed to clarify the role of surgical intervention.
Methods:
Neurologists, neurosurgeons, physiatrists and trainees in these specialties were invited to complete a 16-question survey exploring three areas: (1) current management for spontaneous lobar ICH at their institution, (2) perceived influence of ENRICH on their practice and (3) perceived need for additional clinical trial data. Standard descriptive statistics were used to report categorical variables. The χ2 test was used to compare responses across specialties and career stages.
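For illustration only, a χ2 test of independence of this kind can be run on a specialty-by-response contingency table with scipy; the counts below are hypothetical, not survey data.

```python
# Illustrative only: hypothetical counts, not data from this survey.
# Rows = specialty, columns = response category (e.g., expects more surgery: yes / no / unsure).
import numpy as np
from scipy.stats import chi2_contingency

observed = np.array([
    [12, 18, 5],   # neurology (hypothetical)
    [20, 10, 4],   # neurosurgery (hypothetical)
    [ 8, 14, 10],  # physiatry and trainees (hypothetical)
])

chi2, p, dof, expected = chi2_contingency(observed)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.3f}")
```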
Results:
The survey was sent to 433 physicians, and 101 (23.3%) responded. Sixty-eight percent of participants reported that prior to publication of the ENRICH trial, spontaneous lobar ICH was primarily managed conservatively, with surgery reserved for life-threatening situations. Forty-three percent of participants did not foresee a significant increase in surgical intervention at their institution. Of neurosurgical respondents, 33% remained hesitant to offer surgical intervention beyond lifesaving operations. Only 5% reported routinely using specifically designed technologies to evacuate ICH. Seventy percent reported that another randomized controlled trial comparing nonsurgical to surgical management for spontaneous lobar ICH is needed.
Conclusions:
There is significant practice variability in the management of spontaneous lobar ICH across Canadian institutions, stressing the need for additional clinical trial data to determine the role of surgical intervention.
Most people with mental illness in low- and middle-income countries (LMICs) do not receive biomedical treatment, though many seek care from traditional healers and faith healers. We conducted a qualitative study in Buyende District, Uganda, using framework analysis. Data collection included interviews with 24 traditional healers, 20 faith healers, and 23 biomedical providers, plus 4 focus group discussions. Interviews explored treatment approaches, provider relationships, and collaboration potential until theoretical saturation was reached. Three main themes emerged: (1) Biomedical providers’ perspectives on traditional and faith healers; (2) Traditional and faith healers’ views on biomedical providers; and (3) Collaboration opportunities and barriers. Biomedical providers viewed faith healers positively but saw traditional healers as potentially harmful. Traditional and faith healers valued biomedical approaches while feeling variably accepted. Interest in collaboration existed across groups but was complicated by power dynamics, economic concerns, and differing conceptualizations of mental illness. Traditional healers and faith healers routinely referred patients to biomedical providers, though reciprocal referrals were rare. The study reveals distinct dynamics among providers in rural Uganda, with historical colonial influences continuing to shape relationships, and highlights the need for integrated, contextually appropriate mental healthcare systems.
We sought to characterize US pediatric antimicrobial stewardship programs (ASPs), including their hospital demographics, staffing, funded full-time equivalents (FTEs) by hospital size, and relative emphasis on recommended stewardship strategies. We examined the self-reported characteristics of ASP personnel with regard to discipline, race, ethnicity, gender identity, and years of experience in antimicrobial stewardship.
Design:
Descriptive two-part survey.
Setting:
Pediatric ASPs at hospitals participating in Sharing Antimicrobial Reports for Pediatric Stewardship (SHARPS), a pediatric quality improvement collaborative of >70 children’s hospitals.
Participants:
Survey distributed to 82 US pediatric ASPs, excluding hospitals without pediatric ASPs. Part I completed by ASP leader (physician or pharmacist). Part II distributed to ASP team members.
Methods:
Part I addressed hospital demographics, ASP funding, and program choices related to the CDC’s 2019 Core Elements of Hospital Antibiotic Stewardship Programs. Part II asked participants to anonymously self-identify their race, ethnicity, gender identity, training, and duration of ASP experience. Descriptive statistics were performed.
Results:
Sixty-two ASPs responded: 61 (98%) had a formal ASP, 40 (65%) were from freestanding children’s hospitals, and 40 (65%) were co-led by an ASP physician and pharmacist. Sixty (97%) reported dedicated inpatient physician FTE and 57 (92%) dedicated inpatient pharmacist FTE. Most programs (35 [58%]) reported inadequate staffing support. The 125 ASP professionals who completed Part II predominantly self-reported as White (89 [71%]), with fewer self-reporting as Asian (9 [15%]) or Black (4 [3%]).
Conclusion:
US pediatric ASPs have achieved substantial progress in meeting the CDC Core Elements, but many report insufficient resources. We identified underrepresentation in the ASP workforce.
Diagnostic stewardship of urine cultures from patients with indwelling urinary catheters may improve diagnostic specificity and clinical relevance of the test, but risk of patient harm is uncertain.
Methods:
We retrospectively evaluated the impact of a computerized clinical decision support tool to promote institutional appropriateness criteria (neutropenia, kidney transplant, recent urologic surgery, or radiologic evidence of urinary tract obstruction) for urine cultures from patients with an indwelling urinary catheter. The primary outcome was a change in catheter-associated urinary tract infection (CAUTI) rate from baseline (34 mo) to intervention period (30 mo, including a 2-mo wash-in period). We analyzed patient-level outcomes and adverse events.
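As a minimal sketch of how a CAUTI rate per 1,000 catheter-days and a baseline-versus-intervention rate ratio can be computed: the event counts and catheter-days below are invented (chosen only so the example rates are of a similar magnitude to those reported), not the study's surveillance data.

```python
# Illustrative only: event counts and catheter-days are invented, not study data.
import math

def cauti_rate(events, catheter_days):
    """CAUTI rate per 1,000 catheter-days."""
    return 1000.0 * events / catheter_days

# Hypothetical surveillance totals
baseline_events, baseline_days = 30, 25000
intervention_events, intervention_days = 18, 24000

r0 = cauti_rate(baseline_events, baseline_days)
r1 = cauti_rate(intervention_events, intervention_days)

# Rate ratio with an approximate 95% CI on the log scale
rr = (intervention_events / intervention_days) / (baseline_events / baseline_days)
se_log_rr = math.sqrt(1 / intervention_events + 1 / baseline_events)
lo, hi = (math.exp(math.log(rr) + z * se_log_rr) for z in (-1.96, 1.96))

print(f"baseline {r0:.2f}, intervention {r1:.2f} per 1,000 catheter-days")
print(f"rate ratio {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```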
Results:
The adjusted CAUTI rate decreased from 1.203 to 0.75 per 1,000 catheter-days (P = 0.52). Of 598 patients triggering decision support, 284 (47.5%) urine cultures were collected in agreement with institutional criteria and 314 (52.5%) were averted. Of the 314 patients whose urine cultures were averted, 2 had a subsequent urine culture within 7 days that resulted in a change in antimicrobial therapy and 2 had a diagnosis of bacteremia with a suspected urinary source, but there were no delays in effective treatment.
Conclusion:
A diagnostic stewardship intervention was associated with an approximately 50% decrease in urine culture testing for inpatients with a urinary catheter. However, the overall CAUTI rate did not decrease significantly. Adverse outcomes were rare and minor among patients who had a urine culture averted. Diagnostic stewardship may be safe and effective as part of a multimodal program to reduce unnecessary urine cultures among patients with indwelling urinary catheters.
People with intellectual disability are more likely to experience mental health difficulties, and their treatment responses may differ from those in the general population. This book, written by leading clinical practitioners from around the world, provides comprehensive guidance on prescribing for people with intellectual disability, as well as general information on their clinical care. The guidelines have been conceived and developed by clinicians working in intellectual disability services. Combining the latest evidence and expert opinion, they provide a consensus approach to prescribing as part of a holistic package of care, and include numerous case examples and scenarios. Now in its fourth edition, this update reflects the changes in prescribing practice; it places emphasis on clinical scenarios and case examples and includes input from service users and their families. This is a practical guide for busy clinicians, and a valuable reference for all primary and secondary healthcare professionals.
Good social connections are proposed to positively influence the course of cognitive decline by stimulating cognitive reserve and buffering harmful stress-related health effects. Prior meta-analytic research has uncovered links between social connections and the risk of poor health outcomes such as mild cognitive impairment, dementia, and mortality. These studies have primarily used aggregate data from North America and Europe with limited markers of social connections. Further research is required to explore these associations longitudinally across a wider range of social connection markers in a global setting.
Research Objective:
We examined the associations between social connection structure, function, and quality and the risk of our primary outcomes (mild cognitive impairment, dementia, and mortality).
Method:
Individual participant-level data were obtained from 13 longitudinal studies of ageing from across the globe. We conducted survival analysis using Cox regression models and combined estimates from each study using two-stage meta-analysis. We examined three social constructs: connection structure (living situation, relationship status, interactions with friends/family, community group engagement), function (social support, having a confidante) and quality (relationship satisfaction, loneliness) in relation to the risks of three primary outcomes (mild cognitive impairment, dementia, and mortality). Partially adjusted models included age, sex, and education; fully adjusted models additionally included diabetes, hypertension, smoking, cardiovascular risk, and depression.
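A minimal sketch of the two-stage approach described above, assuming the lifelines package, per-study data frames with hypothetical column names (time, dementia, married, age, sex, education; the exposure coded as numeric 0/1), and simple inverse-variance fixed-effect pooling of log hazard ratios; a random-effects pooling step would follow the same pattern.

```python
# Illustrative sketch: column names and data are hypothetical.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

def study_log_hr(df: pd.DataFrame) -> tuple[float, float]:
    """Stage 1: fit a Cox model within one study; return (log HR, SE) for the exposure."""
    cph = CoxPHFitter()
    cph.fit(df, duration_col="time", event_col="dementia",
            formula="married + age + sex + education")
    return cph.params_["married"], cph.standard_errors_["married"]

def pool_fixed_effect(estimates):
    """Stage 2: inverse-variance fixed-effect pooling of per-study log HRs."""
    logs = np.array([b for b, _ in estimates])
    ses = np.array([s for _, s in estimates])
    w = 1.0 / ses**2
    pooled = np.sum(w * logs) / np.sum(w)
    pooled_se = np.sqrt(1.0 / np.sum(w))
    return np.exp(pooled), np.exp(pooled - 1.96 * pooled_se), np.exp(pooled + 1.96 * pooled_se)

# studies = [df_study1, df_study2, ...]  # one cleaned data frame per cohort
# hr, lo, hi = pool_fixed_effect([study_log_hr(df) for df in studies])
```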
Preliminary results of the ongoing study:
In our fully adjusted models we observed: a lower risk of mild cognitive impairment was associated with being married/in a relationship (vs. being single), weekly community group engagement (vs. no engagement), weekly family/friend interactions (vs. not interacting), and never feeling lonely (vs. often feeling lonely); a lower risk of dementia was associated with monthly/weekly family/friend interactions and having a confidante (vs. no confidante); a lower risk of mortality was associated with living with others (vs. living alone), yearly/monthly/weekly community group engagement, and having a confidante.
Conclusion:
Good social connection structure, function, and quality are associated with reduced risk of incident MCI, dementia, and mortality. Our results provide actionable evidence that social connections are required for healthy ageing.
Understanding the factors contributing to optimal cognitive function throughout the aging process is essential to better understand successful cognitive aging. Processing speed is an age-sensitive cognitive domain that usually declines early in the aging process; however, this cognitive skill is essential for other cognitive tasks and everyday functioning. Evaluating brain network interactions in cognitively healthy older adults can help us understand how variations in brain characteristics affect cognitive functioning. Functional connections among groups of brain areas give insight into the brain’s organization, and the cognitive effects of aging may relate to this large-scale organization. To follow up on our prior work, we sought to replicate our findings regarding network segregation’s relationship with processing speed. To address possible influences of node location or network membership, we replicated the analysis across 4 different node sets.
Participants and Methods:
Data were acquired as part of a multi-center study of cognitively normal individuals aged 85+, the McKnight Brain Aging Registry (MBAR). For this analysis, we included 146 community-dwelling, cognitively unimpaired older adults, ages 85-99, who had undergone structural and BOLD resting-state MRI scans and a battery of neuropsychological tests. Exploratory factor analysis identified the processing speed factor of interest. We preprocessed BOLD scans using fmriprep, Ciftify, and XCPEngine algorithms. We used 4 different connectivity-based parcellations: (1) MBAR data used to define nodes and the Power (2011) atlas used to determine node network membership, (2) younger adults’ data used to define nodes (Chan 2014) and the Power (2011) atlas used to determine node network membership, (3) older adults’ data from a different study (Han 2018) used to define nodes and the Power (2011) atlas used to determine node network membership, and (4) MBAR data used to define nodes and MBAR data-based community detection used to determine node network membership.
Segregation (the balance of within-network and between-network connections) was measured within the association system and three well-characterized networks: the Default Mode Network (DMN), Cingulo-Opercular Network (CON), and Fronto-Parietal Network (FPN). Correlations between processing speed and the association system and each network were computed for all 4 node sets.
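Segregation of this kind is commonly computed as mean within-network connectivity minus mean between-network connectivity, divided by mean within-network connectivity (as in Chan et al., 2014); the sketch below assumes that definition and uses a hypothetical correlation matrix and node-to-network labels.

```python
# Illustrative sketch: assumes segregation = (within - between) / within,
# computed on a nodes x nodes functional correlation matrix.
import numpy as np

def system_segregation(conn: np.ndarray, labels: np.ndarray) -> float:
    """Mean within-network minus mean between-network connectivity,
    divided by mean within-network connectivity (diagonal excluded)."""
    n = conn.shape[0]
    same = labels[:, None] == labels[None, :]
    off_diag = ~np.eye(n, dtype=bool)
    within = conn[same & off_diag].mean()
    between = conn[~same].mean()
    return (within - between) / within

# Hypothetical example: 6 nodes assigned to DMN, FPN, CON
rng = np.random.default_rng(0)
conn = np.corrcoef(rng.normal(size=(6, 200)))   # stand-in for a BOLD-derived matrix
labels = np.array(["DMN", "DMN", "FPN", "FPN", "CON", "CON"])
print(round(system_segregation(conn, labels), 3))
```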
Results:
We replicated prior work: segregation of the cortical association system, as well as of the FPN and DMN, had a consistent relationship with processing speed across all node sets (association system range of correlations: r = .294 to .342; FPN: r = .254 to .272; DMN: r = .263 to .273). Additionally, compared to parcellations created with older adults’ data, the parcellation based on younger individuals showed attenuated and less robust findings (association system r = .263, FPN r = .255, DMN r = .263).
Conclusions:
This study shows that network segregation in the oldest-old brain is closely linked with processing speed and that this relationship is replicable across different node sets created from varied datasets. This work adds to the growing body of knowledge about age-related dedifferentiation by demonstrating the replicability and consistency of the finding that an essential cognitive skill, processing speed, is associated with differentiated functional networks even in very old individuals experiencing successful cognitive aging.
Excellence is that quality that drives continuously improving outcomes for patients. Excellence must be measurable. We set out to measure excellence in forensic mental health services according to four levels of organisation and complexity (basic, standard, progressive and excellent) across seven domains: values and rights; clinical organisation; consistency; timescale; specialisation; routine outcome measures; research and development.
Aims
To validate the psychometric properties of a measurement scale to test which objective features of forensic services might relate to excellence: for example, university linkages, service size and integrated patient pathways across levels of therapeutic security.
Method
A survey instrument was devised by a modified Delphi process. Forensic leads, either clinical or academic, in 48 forensic services across 5 jurisdictions completed the questionnaire.
Results
Regression analysis found that the number of security levels, linked patient pathways, number of in-patient teams and joint university appointments predicted total excellence score.
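A minimal sketch of a multiple regression of this form, with hypothetical variable names standing in for the survey items and synthetic data so the example runs; it is not the study’s model specification.

```python
# Illustrative sketch: variable names and data are hypothetical stand-ins for the survey items.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
services = pd.DataFrame({
    "n_security_levels": rng.integers(1, 4, size=48),
    "linked_pathways": rng.integers(0, 2, size=48),
    "n_inpatient_teams": rng.integers(1, 10, size=48),
    "joint_university_appointments": rng.integers(0, 6, size=48),
})
# Fabricated outcome purely so the example runs; not study data.
services["total_excellence"] = (
    5 * services["n_security_levels"] + 3 * services["linked_pathways"]
    + services["n_inpatient_teams"] + 2 * services["joint_university_appointments"]
    + rng.normal(0, 2, size=48)
)

model = smf.ols(
    "total_excellence ~ n_security_levels + linked_pathways "
    "+ n_inpatient_teams + joint_university_appointments",
    data=services,
).fit()
print(model.params)
```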
Conclusions
Larger services organised according to stratified therapeutic security and with strong university and research links scored higher on this measure of excellence. A weakness is that these were self-ratings. Reliability could be improved with peer review and with objective measures such as quality and quantity of research output. For the future, studies are needed of the determinants of other objective measures of better outcomes for patients, including shorter lengths of stay, reduced recidivism and readmission, and improved physical and mental health and quality of life.
Recent arguments claim that behavioral science has focused – to its detriment – on the individual over the system when construing behavioral interventions. In this commentary, we argue that tackling economic inequality using both framings in tandem is invaluable. By studying individuals who have overcome inequality, “positive deviants,” and the system limitations they navigate, we offer potentially greater policy solutions.
Approximately 80 million people live with chronic hepatitis B virus (HBV) infection in the WHO Africa Region. The natural history of HBV infection in this population is poorly characterised, and may differ from patterns observed elsewhere due to differences in prevailing genotypes, environmental exposures, co-infections, and host genetics. Existing research is largely drawn from small, single-centre cohorts, with limited follow-up time. The Hepatitis B in Africa Collaborative Network (HEPSANET) was established in 2022 to harmonise the process of ongoing data collection, analysis, and dissemination from 13 collaborating HBV cohorts in eight African countries. Research priorities for the next 5 years were agreed upon through a modified Delphi survey prior to baseline data analysis being conducted. Baseline data on 4,173 participants with chronic HBV mono-infection were collected, of whom 38.3% were women and the median age was 34 years (interquartile range 28–42). In total, 81.3% of cases were identified through testing of asymptomatic individuals. HBeAg-positivity was seen in 9.6% of participants. Follow-up of HEPSANET participants will generate evidence to improve the diagnosis and management of HBV in this region.
To evaluate the construct validity of the NIH Toolbox Cognitive Battery (NIH TB-CB) in the healthy oldest-old (85+ years old).
Method:
Our sample from the McKnight Brain Aging Registry consists of 179 individuals, 85 to 99 years of age, screened for memory, neurological, and psychiatric disorders. Using previous research methods on a sample of adults aged 85+, we conducted confirmatory factor analyses on models of the NIH TB-CB and same-domain standard neuropsychological measures. We hypothesized the five-factor model (Reading, Vocabulary, Memory, Working Memory, and Executive/Speed) would have the best fit, consistent with younger populations. We assessed convergent and discriminant validity. We also evaluated demographic and computer-use predictors of NIH TB-CB composite scores.
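A minimal sketch of a confirmatory factor analysis of this kind, assuming the semopy package; the factor structure, indicator names, and data are hypothetical stand-ins rather than the actual NIH TB-CB and neuropsychological variables.

```python
# Illustrative sketch: factor structure, indicator names, and data are hypothetical
# stand-ins, not the actual NIH TB-CB / neuropsychological variables.
import numpy as np
import pandas as pd
import semopy

rng = np.random.default_rng(2)
n = 179
latent = rng.normal(size=(n, 2))              # two toy latent factors
data = pd.DataFrame({
    "tb_picture_seq":  latent[:, 0] + rng.normal(0, 0.5, n),
    "np_list_recall":  latent[:, 0] + rng.normal(0, 0.5, n),
    "tb_list_sorting": latent[:, 1] + rng.normal(0, 0.5, n),
    "np_digit_span":   latent[:, 1] + rng.normal(0, 0.5, n),
})

desc = """
Memory        =~ tb_picture_seq + np_list_recall
WorkingMemory =~ tb_list_sorting + np_digit_span
Memory ~~ WorkingMemory
"""
model = semopy.Model(desc)
model.fit(data)
print(semopy.calc_stats(model).T)   # fit indices such as CFI and RMSEA
print(model.inspect())              # loadings and factor covariances
```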
Results:
Findings suggest the six-factor model (Vocabulary, Reading, Memory, Working Memory, Executive, and Speed) had a better fit than alternative models. NIH TB-CB tests had good convergent and discriminant validity, though tests in the executive functioning domain had high inter-correlations with other cognitive domains. Computer use was strongly associated with higher NIH TB-CB overall and fluid cognition composite scores.
Conclusion:
The NIH TB-CB is a valid assessment for oldest-old samples, with relatively weak validity in the domain of executive functioning. The impact of computer use on composite scores could be due to the executive demands of learning to use a tablet. Strong relationships of executive function with other cognitive domains could be due to cognitive dedifferentiation. Overall, the NIH TB-CB could be useful for testing cognition in the oldest-old and the impact of aging on cognition in older populations.
Dietary fibre modulates gastrointestinal (GI) health and function, providing laxation, shifting microbiota, and altering bile acid (BA) metabolism. Fruit juice production removes the polyphenol- and fibre-rich pomace fraction. The effects of orange and apple pomaces on GI outcomes were investigated in healthy, free-living adults. Healthy adults were enrolled in two double-blinded, crossover trials and randomised by baseline bowel movement (BM) frequency. In the first trial, subjects (n 91) received orange juice (OJ, 0 g fibre/d) or OJ + orange pomace (OJ + P, 10 g fibre/d) for 4 weeks, separated by a 3-week washout. Similarly, in the second trial, subjects (n 90) received apple juice (AJ, 0 g fibre/d) or AJ + apple pomace (AJ + P, 10 g fibre/d). Bowel habit diaries, GI tolerance surveys and 3-d diet records were collected throughout. Fresh faecal samples were collected from a participant subset for microbiota and BA analyses in each study. Neither pomace intervention influenced BM frequency. At Week 4, OJ + P tended to increase (P = 0·066) GI symptom occurrence compared with OJ, while AJ + P tended (P = 0·089) to increase flatulence compared with AJ. Faecalibacterium (P = 0·038) and Negativibacillus (P = 0·043) were differentially abundant between pre- and post-intervention in the apple trial but were no longer significant after false discovery rate (FDR) correction. Baseline fibre intake was independently associated with several microbial genera in both trials. Orange or apple pomace supplementation was insufficient to elicit changes in bowel habits, microbiota diversity or BA profiles of free-living adults with healthy baseline BM frequency. Future studies should consider baseline BM frequency and habitual fibre intake.
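A minimal sketch of the Benjamini-Hochberg FDR correction mentioned above, using statsmodels; the two leading P values echo those reported for Faecalibacterium and Negativibacillus, while the remaining genera and values are invented for illustration.

```python
# Illustrative only: the last three genera and all companion P values are invented.
from statsmodels.stats.multitest import multipletests

genera = ["Faecalibacterium", "Negativibacillus", "Blautia", "Roseburia", "Dorea"]
raw_p = [0.038, 0.043, 0.30, 0.55, 0.72]

reject, p_adj, _, _ = multipletests(raw_p, alpha=0.05, method="fdr_bh")
for g, p, q, r in zip(genera, raw_p, p_adj, reject):
    print(f"{g:18s} raw P = {p:.3f}  FDR-adjusted P = {q:.3f}  significant: {r}")
```

With this invented set of companion tests, neither of the two leading P values survives correction, which mirrors the pattern described above (though the trial's actual correction was applied over its own, larger set of genera).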