The transition from breastmilk to solid foods (weaning) is a critical stage in infant development and plays a decisive role in the maturation of the complex microbial community inhabiting the human colon. Diet is a major factor shaping the colonic microbiota, which ferments nutrients reaching the colon unabsorbed by the host to produce a variety of microbial metabolites influencing host physiology(1). Therefore, making adequate dietary choices during weaning can positively modulate the colonic microbiota, ultimately contributing to health in infancy and later life(2). However, our understanding of how complementary foods impact the colonic microbiota of weaning infants is limited. To address this knowledge gap, we employed a metagenome-scale modelling approach to simulate the impact of complementary foods, either combined with breastmilk or with breastmilk and other foods, on the production of organic acids by colonic microbes of weaning infants(3). Complementary foods and combinations of foods with the greatest impact on the in silico microbial production of organic acids were identified. These foods and food combinations were further tested in vitro, individually or in combination with infant formula. Fifty-three food samples were digested using a protocol adapted from INFOGEST to mimic infant digestion and then fermented with faecal inoculum from 6 New Zealand infants (5-11 months old). After 24 h of fermentation, the production of organic acids was measured by gas chromatography. Differences in organic acid production between samples were determined using the Tukey Honestly Significant Difference test to account for multiple comparisons. The microbial composition was characterised by amplicon sequencing of the V3-V4 regions of the bacterial 16S rRNA gene. Taxonomy was assigned using the DADA2 pipeline and the SILVA database (version 138.1). Bioinformatic and statistical analyses were conducted using the R packages phyloseq and ANCOM-BC2, with the Holm-Bonferroni adjustment for multiple comparisons in differential abundance testing. Blackcurrant and raspberries increased the production of acetate and propionate (Tukey’s test, p<0.05) and the relative abundance of the genus Parabacteroides (Dunnett’s test, adjusted p<0.05) compared to other foods. Raspberries also increased the abundance of the genus Eubacterium (Dunnett’s test, adjusted p<0.05). When combined with infant formula, black beans stood out for increasing the production of butyrate (Tukey’s test, p<0.05) and the relative abundance of the genus Clostridium (Dunnett’s test, adjusted p<0.05). In conclusion, this study provides new evidence on how complementary foods, both individually and in combination with other dietary compounds, influence the colonic microbiota of weaning infants in vitro. Insights generated by this research can help design future clinical trials, ultimately enhancing our understanding of the relationship between human nutrition and colonic microbiota composition and function in post-weaning life.
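As a rough illustration of the pairwise testing step described above, the following Python sketch runs a Tukey HSD comparison on simulated acetate values. The study's analyses were conducted in R; the food names, group means, and column names below are hypothetical placeholders, not study data.

```python
# Minimal sketch of a Tukey HSD comparison across food groups.
# All values are simulated; they are not the study's measurements.
import numpy as np
import pandas as pd
from statsmodels.stats.multicomp import pairwise_tukeyhsd

rng = np.random.default_rng(0)
# Simulated acetate production (mM) after 24 h fermentation, 6 replicates each.
data = pd.DataFrame({
    "food": np.repeat(["blackcurrant", "raspberry", "control"], 6),
    "acetate_mM": np.concatenate([
        rng.normal(42, 3, 6),   # hypothetical group means and spread
        rng.normal(40, 3, 6),
        rng.normal(30, 3, 6),
    ]),
})

# Tukey HSD tests all pairwise differences at a family-wise alpha of 0.05.
result = pairwise_tukeyhsd(data["acetate_mM"], data["food"], alpha=0.05)
print(result.summary())
```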
Micronutrient deficiencies (MND) are a significant global health issue, particularly affecting children’s growth and cognitive potential and predisposing women of reproductive age (WRA) to adverse health outcomes(1). Over half of global MND cases occur in Sub-Saharan Africa (SSA), with 80% of women estimated to be deficient in at least one of three micronutrients(2). Large-scale food fortification is a cost-effective strategy recommended for combating widespread MND and has been effectively implemented in many developed countries(3). In developing regions such as SSA, socio-economic barriers and a fragmented food processing industry hinder effective implementation of food fortification(4). As a result, countries with fortification programmes face significant challenges, including low coverage of fortified food in the population and poor compliance with fortification standards by food producers(5). The contribution of food fortification to nutrient intakes of WRA in SSA has yet to be fully assessed. This study sought to evaluate mandatory food fortification programmes in SSA and estimate the contribution of fortified food consumption to micronutrient intakes and requirements of WRA. We utilised multi-national fortification data from the Global Fortification Data Exchange, which includes data on country fortification standards and the estimated level of compliance with fortification requirements. Data on the supply and consumption of fortifiable food were also included from the FAO. We calculated the potential nutrient intake from fortified food consumption for each nutrient using country fortification standards and food availability. We adjusted the estimated intake for each nutrient by multiplying it by the estimated compliance percentage. We also assessed what proportion of women’s requirements for the essential micronutrients folate, iron, iodine, vitamin A, and zinc are met through fortified food consumption, using RNI values from WHO/FAO for WRA. Between 2019 and 2021, we estimated that mandatory fortification of wheat and maize flour, oil, and salt in SSA contributes a daily median of 138 µg DFE of folic acid, 217 µg of iodine, 43 µg RAE of vitamin A, and 2.1 mg of iron and 2.0 mg of zinc to the intakes of WRA. These intakes represent 12.8% (0.0-49.2) of iron, 27.5% (0.0-83.2) of zinc, 55.0% (0.0-245.0) of folate, 8.8% (0.0-37.2) of vitamin A, and 228.2% (98.2-358.6) of iodine requirements, taking into consideration the lower bioavailability of iron and zinc from the cereal-based diets of SSA populations. In reality, compliance with fortification requirements in SSA is low, estimated at a median of 22% (0.0-83.4) for maize flour, 44% (0.0-72.0) for vegetable oil, and 83% (0.0-100.0) for wheat flour fortification, and is a major factor limiting the overall contribution of fortification to micronutrient intakes. Inadequate regulatory monitoring to ensure compliance with fortification requirements in SSA has resulted in lower-quality fortified foods, limiting women’s potential to achieve adequate micronutrient intake through fortified food consumption.
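To make the intake arithmetic concrete, here is a minimal Python sketch of the calculation described above: fortification standard multiplied by food availability, scaled by estimated compliance, and compared against the WHO/FAO RNI. The consumption figure and fortification standard below are hypothetical placeholders, not the study's inputs.

```python
# Illustrative fortification-intake calculation; all numbers are placeholders.

def fortified_intake(food_g_per_day: float,
                     standard_per_kg: float,
                     compliance: float) -> float:
    """Nutrient intake from a fortified food.

    food_g_per_day: estimated consumption of the fortifiable food (g/day)
    standard_per_kg: nutrient added per kg of food under the national standard
    compliance: estimated fraction of the food supply meeting the standard
    """
    return food_g_per_day / 1000.0 * standard_per_kg * compliance

# Example: wheat flour fortified with folic acid at 2.0 mg/kg (hypothetical),
# 150 g/day consumption, 83% compliance.
folic_acid_mg = fortified_intake(150, 2.0, 0.83)

# Folic acid in fortified food is converted to dietary folate equivalents
# (1 ug folic acid ~= 1.7 ug DFE) before comparison with the RNI.
intake_ug_dfe = folic_acid_mg * 1000 * 1.7
rni_folate_ug_dfe = 400  # WHO/FAO RNI for women of reproductive age

print(f"{intake_ug_dfe:.0f} ug DFE/day = "
      f"{intake_ug_dfe / rni_folate_ug_dfe:.0%} of the RNI")
```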
Background: Our prior six-year review (n=2165) revealed that 24% of patients undergoing posterior decompression surgeries (laminectomy or discectomy) sought emergency department (ED) care within three months post-surgery. We established an integrated Spine Assessment Clinic (SAC) to enhance patient outcomes and minimize unnecessary ED visits through pre-operative education, targeted QI interventions, and early post-operative follow-up. Methods: We reviewed 13 months of posterior decompression data (n=205) following SAC implementation. These patients received individualized, comprehensive pre-operative education and follow-up phone calls within 7 days post-surgery. ED visits within 90 days post-surgery were tracked using provincial databases and compared to our pre-SAC implementation data. Results: Of 205 patients, 24 (11.6%) accounted for 34 ED visits within 90 days post-op, a significant reduction in the proportion of patients visiting the ED from 24% to 11.6%, and a decrease in overall ED utilization from 42.1% to 16.6% (when accounting for multiple visits by the same patient). Early interventions, including wound monitoring, outpatient bloodwork, and prescription adjustments for pain management, helped mitigate ED visits. Patient satisfaction surveys (n=62) indicated 92% were “highly satisfied” and 100% would recommend the SAC. Conclusions: The SAC reduced ED visits after posterior decompression surgery by over 50% through pre-operative education, focused QI initiatives, and an individualized, proactive approach.
The Child Opportunity Index is a composite of 29 indicators of social determinants of health linked to the United States Census. Disparities in the treatment of Wolff–Parkinson–White syndrome have not been reported. We hypothesise that lower Child Opportunity Index levels are associated with greater disease burden (antiarrhythmic use, ablation success, and Wolff–Parkinson–White recurrence) and with differences in ablation utilisation.
Methods:
A retrospective, single-centre study was performed with Wolff–Parkinson–White patients who received care from January 2021 to July 2023. After excluding patients younger than 5 years and those with haemodynamically significant CHD, 267 patients were included (45% high, 30% moderate, and 25% low Child Opportunity Index). Multi-level logistic and log-linear regressions were performed to assess the relationship between Child Opportunity Index levels and outcomes.
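As a simplified illustration of the outcome modelling, the sketch below fits a single-level logistic regression of emergency department use on Child Opportunity Index category in Python; the study fit multi-level models, and the variable and file names here are hypothetical.

```python
# Simplified sketch of the outcome model; the multi-level structure is
# collapsed to a single-level logistic regression for brevity.
import pandas as pd
import statsmodels.formula.api as smf

# Assumed columns (hypothetical file):
#   ed_visit: 1 if >=1 emergency department visit, else 0
#   coi_level: "low" / "moderate" / "high" Child Opportunity Index category
df = pd.read_csv("wpw_cohort.csv")

model = smf.logit(
    "ed_visit ~ C(coi_level, Treatment(reference='high'))", data=df
).fit()
print(model.summary())
# exp(coef) for the low-vs-high term gives the odds ratio; the abstract
# reports ~2.8 for one or more ED visits in the low group.
```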
Results:
Patients in the low Child Opportunity Index group were more likely to be Black (p < 0.0001) and to have public insurance (p = 0.0006); however, there were no significant differences in ablation utilisation (p = 0.44) or time from diagnosis to ablation (p = 0.37) between groups. There was an inverse relationship between Child Opportunity Index level and emergency department use (p = 0.007). The low group had 2.8 times greater odds of having one or more emergency department visits compared to the high group (p = 0.004).
Conclusion:
The Child Opportunity Index was not associated with ablation utilisation, but it showed an inverse relationship with emergency department use. These findings suggest that while social determinants of health, as measured by the Child Opportunity Index, may influence emergency department utilisation, they do not appear to impact the overall management and procedural timing of Wolff–Parkinson–White treatment.
Posttraumatic stress disorder (PTSD) has been associated with advanced epigenetic age cross-sectionally, but the association between these variables over time is unclear. This study conducted meta-analyses to test whether new-onset PTSD diagnosis and changes in PTSD symptom severity over time were associated with changes in two metrics of epigenetic aging over two time points.
Methods
We conducted meta-analyses of the association between change in PTSD diagnosis and symptom severity and change in epigenetic age acceleration/deceleration (age-adjusted DNA methylation age residuals as per the Horvath and GrimAge metrics) using data from 7 military and civilian cohorts participating in the Psychiatric Genomics Consortium PTSD Epigenetics Workgroup (total N = 1,367).
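For readers unfamiliar with the metric, the sketch below shows how age-adjusted DNA methylation age residuals of this kind are typically computed: the epigenetic age estimate is regressed on chronological age and the residuals are retained as the acceleration measure. All values are hypothetical.

```python
# Sketch of "age-adjusted DNA methylation age residuals": regress DNAm age
# (e.g., a Horvath or GrimAge estimate) on chronological age and keep the
# residuals as the acceleration metric. Values are illustrative only.
import numpy as np
import statsmodels.api as sm

chron_age = np.array([35.0, 42.0, 51.0, 29.0, 60.0])   # hypothetical
dnam_age  = np.array([38.2, 41.0, 56.5, 30.1, 58.9])   # hypothetical

X = sm.add_constant(chron_age)
fit = sm.OLS(dnam_age, X).fit()

# Positive residual = epigenetic age acceleration; negative = deceleration.
age_residuals = fit.resid
print(age_residuals)
```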
Results
Meta-analysis revealed that the interaction between Time 1 (T1) Horvath age residuals and new-onset PTSD over time was significantly associated with Horvath age residuals at T2 (meta β = 0.16, meta p = 0.02, p-adj = 0.03). The interaction between T1 Horvath age residuals and changes in PTSD symptom severity over time was significantly related to Horvath age residuals at T2 (meta β = 0.24, meta p = 0.05). No associations were observed for GrimAge residuals.
Conclusions
Results indicated that individuals who developed new-onset PTSD or showed increased PTSD symptom severity over time evidenced greater epigenetic age acceleration at follow-up than would be expected based on baseline age acceleration. This suggests that PTSD may accelerate biological aging over time and highlights the need for intervention studies to determine if PTSD treatment has a beneficial effect on the aging methylome.
Accurate diagnosis of bipolar disorder (BPD) is difficult in clinical practice, with an average delay between symptom onset and diagnosis of about 7 years. A depressive episode often precedes the first manic episode, making it difficult to distinguish BPD from unipolar major depressive disorder (MDD).
Aims
We use genome-wide association analyses (GWAS) to identify differential genetic factors and to develop predictors based on polygenic risk scores (PRS) that may aid early differential diagnosis.
Method
Based on individual genotypes from case–control cohorts of BPD and MDD shared through the Psychiatric Genomics Consortium, we compile case–case–control cohorts, applying a careful quality control procedure. In a resulting cohort of 51 149 individuals (15 532 BPD patients, 12 920 MDD patients and 22 697 controls), we perform a variety of GWAS and PRS analyses.
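As background to the PRS analyses, a polygenic risk score is a weighted sum of risk-allele dosages, with per-variant effect sizes from a discovery GWAS as weights. The minimal sketch below illustrates the computation with hypothetical dosages and effect sizes; real pipelines add steps such as LD clumping and p-value thresholding before scoring.

```python
# Minimal polygenic risk score: dosages weighted by GWAS effect sizes.
# Both arrays are hypothetical illustrations, not study data.
import numpy as np

# dosages: individuals x variants matrix of risk-allele counts (0, 1, 2)
dosages = np.array([
    [0, 1, 2, 1],
    [2, 0, 1, 1],
    [1, 1, 0, 2],
])
# betas: per-variant effect sizes from the discovery GWAS (e.g., BPD vs MDD)
betas = np.array([0.12, -0.05, 0.08, 0.03])

prs = dosages @ betas  # one score per individual
# Scores are typically standardized before use in downstream models.
prs_z = (prs - prs.mean()) / prs.std()
print(prs_z)
```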
Results
Although our GWAS is not well powered to identify genome-wide significant loci, we find significant chip heritability and demonstrate the ability of the resulting PRS to distinguish BPD from MDD, including BPD cases with depressive onset (BPD-D). We replicate our PRS findings in an independent Danish cohort (iPSYCH 2015, N = 25 966). We observe strong genetic correlation between our case–case GWAS and that of case–control BPD.
Conclusions
We find that MDD and BPD, including BPD-D, are genetically distinct. Our findings suggest that controls, MDD patients, and BPD patients lie primarily on a continuum of genetic risk. Future studies with larger and richer samples will likely yield a better understanding of these findings and enable the development of better genetic predictors distinguishing BPD and, importantly, BPD-D from MDD.
Migraine and post-traumatic stress disorder (PTSD) are both twice as common in women as in men. Cross-sectional studies have shown associations between migraine and several psychiatric conditions, including PTSD. PTSD is disproportionately common among patients in headache clinics, and individuals with migraine and PTSD report greater disability from migraines and more frequent medication use. To further clarify the nature of the relationship between PTSD and migraine, we conducted bidirectional analyses of the association between (1) migraine and incident PTSD and (2) PTSD and incident migraine.
Methods
We used longitudinal data from 1989 to 2020 for the 33,327 Nurses’ Health Study II participants who responded to the 2018 stress questionnaire. We used log-binomial models to estimate the relative risk of developing PTSD among women with migraine and the relative risk of developing migraine among individuals with PTSD, trauma-exposed individuals without PTSD, and individuals unexposed to trauma, adjusting for race, education, marital status, high blood pressure, high cholesterol, alcohol intake, smoking, and body mass index.
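The sketch below illustrates a log-binomial model of the kind described: a generalized linear model with a binomial family and log link, whose exponentiated coefficients are relative risks. The column and file names are hypothetical.

```python
# Sketch of a log-binomial model for relative risk. "nhs2_subset.csv" and
# all column names are hypothetical placeholders.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("nhs2_subset.csv")

model = smf.glm(
    "incident_ptsd ~ migraine + race + education + marital_status"
    " + high_bp + high_chol + alcohol + smoking + bmi",
    data=df,
    # Binomial family with a log (not logit) link: coefficients are log-RRs.
    family=sm.families.Binomial(link=sm.families.links.Log()),
).fit()

# Exponentiate the migraine coefficient to obtain the adjusted relative risk.
print(np.exp(model.params["migraine"]))
```

One practical note: log-binomial fits can fail to converge, and a Poisson model with robust standard errors is a common fallback for estimating relative risks.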
Results
Overall, 48% of respondents reported ever experiencing migraine, 82% reported experiencing trauma and 9% met the Diagnostic and Statistical Manual of Mental Disorders-5 criteria for PTSD. Of those reporting migraine and trauma, 67% reported trauma before migraine onset, 2% reported trauma and migraine onset in the same year and 31% reported trauma after migraine onset. We found that migraine was associated with incident PTSD (adjusted relative risk [RR]: 1.26, 95% confidence interval [CI]: 1.14–1.39). PTSD, but not trauma without PTSD, was associated with incident migraine (adjusted RR: 1.20, 95% CI: 1.14–1.27). Findings were consistently stronger in both directions among those experiencing migraine with aura.
Conclusions
Our study provides further evidence that migraine and PTSD are strongly comorbid; we found associations of similar magnitude between migraine and incident PTSD and between PTSD and incident migraine.
Leader exemplification involves implicit and explicit claims of high moral values made by a leader. We employed a 2 × 3 experimental design with samples of 265 students in Study 1 and 142 working adults in Study 2 to examine the effects of leader exemplification (exemplification versus no exemplification) and ethical conduct (self-serving, self-sacrificial, and self-other focus) on perceived leader authenticity, trust in leader, and organizational advocacy. In Study 1, we found that exemplification produced elevated levels of perceived authenticity, trust, and advocacy in the form of employment and investment recommendations. We also showed that leader ethical conduct moderated this effect, as ratings were highest following a leader’s self-sacrificial conduct, lowest for self-serving conduct, and moderate for conduct reflecting self-other concerns. In Study 2, we replicated these findings for perceived authenticity and trust, but not organizational advocacy, which yielded mixed results. The leadership implications and future research directions are discussed.
With the advent of COVID-19, adaptation became the norm. Research data-collection methods likewise required adaptation, prompting the adoption of virtual platforms as first-line data-collection tools to comply with COVID-19 restrictions. This chapter presents an autoethnographic account of virtual qualitative data collection. A PhD candidate shares her experience of conducting individual and focus group interviews virtually in a developing nation. A discussion of the narrative and recommendations for virtual qualitative data collection are provided.
The introduction of digital approaches is perhaps the most significant change to the way that healthcare research is conducted since computers first came into use. This introductory chapter sets the tone for the rest of the book. The book is divided into two parts: 1. digital platforms, and 2. approaches to healthcare research that are either uniquely digital or are adaptations of existing approaches to the online context. Within each of these parts, a collection of chapters by distinguished and rising authors presents digital platforms and techniques and considers these as applied to a wide range of healthcare studies. This introduction considers the broad area that the book addresses and is divided into the same two sections. The unique aspects of digital research approaches are highlighted and emphasised, and the reader is prepared for the chapters that follow.
During 2016–2022, Medicare Part D beneficiaries filled 8,674,460 clotrimazole-betamethasone dipropionate prescriptions. Annual rates were stable (30.9 prescriptions/1,000 beneficiary-years in 2022, enough for one in every 33 beneficiaries). Diagnostic testing was infrequent, particularly among internal medicine, family medicine, and general practitioners, suggesting potential opportunities to improve diagnostic and prescribing practices.
In December 2018, an outbreak of Salmonella Enteritidis infections was identified in Canada by whole-genome sequencing (WGS). An investigation was initiated to identify the source of the illnesses, which proved challenging and complex. Microbiological hypothesis generation methods included comparisons of Salmonella isolate sequence data to historical domestic outbreaks and international repositories. Epidemiological hypothesis generation methods included routine case interviews, open-ended centralized re-interviewing, thematic analysis of open-ended interview data, collection of purchase records, a grocery store site visit, analytic comparison to healthy control groups, and case–case analyses. Food safety hypothesis testing methods included food sample collection and analysis, and traceback investigations. Overall, 83 cases were identified across seven provinces, with onset dates from 6 November 2018 to 7 May 2019. Case ages ranged from 1 to 88 years; 60% (50/83) were female; 39% (22/56) were hospitalized; and three deaths were reported. Brand X profiteroles and eclairs imported from Thailand were identified as the source of the outbreak, and eggs from an unregistered facility were hypothesized as the likely cause of contamination. This study aims to describe the outbreak investigation and highlight the multiple hypothesis generation methods that were employed to identify the source.
An investigation into an outbreak of Salmonella Newport infections in Canada was initiated in July 2020. Cases were identified across several provinces through whole-genome sequencing (WGS). Exposure data were gathered through case interviews. Traceback investigations were conducted using receipts, invoices, import documentation, and menus. A total of 515 cases were identified in seven provinces, related by 0–6 whole-genome multi-locus sequence typing (wgMLST) allele differences. The median age of cases was 40 years (range 1–100), 54% were female, 19% were hospitalized, and three deaths were reported. Forty-eight location-specific case sub-clusters were identified in restaurants, grocery stores, and congregate living facilities. Of the 414 cases with exposure information available, 71% (295) reported eating onions in the week prior to becoming ill, and 80% of those who reported eating onions reported red onions specifically. The traceback investigation identified red onions from Grower A in California, USA, as the likely source of the outbreak, and the first of many food recall warnings was issued on 30 July 2020. Salmonella was not detected in any tested food or environmental samples. This paper summarizes the collaborative efforts undertaken to investigate and control the largest Salmonella outbreak in Canada in over 20 years.
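As an illustration of the wgMLST relatedness measure used to define the cluster, the sketch below counts the loci at which two hypothetical allele profiles differ, ignoring loci with missing calls.

```python
# Sketch of the wgMLST distance: the number of loci at which two allele
# profiles differ, with missing calls (None) excluded. Profiles are
# hypothetical; real profiles span thousands of loci.
from typing import Optional

def allele_differences(a: list[Optional[int]], b: list[Optional[int]]) -> int:
    """Count loci where both profiles have a call and the alleles differ."""
    return sum(
        1 for x, y in zip(a, b)
        if x is not None and y is not None and x != y
    )

case_1 = [1, 4, 2, 7, None, 3]
case_2 = [1, 5, 2, 7, 9, 3]
print(allele_differences(case_1, case_2))  # -> 1; within the 0-6 range
```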
This editorial summarises the clinical relevance of ‘chronopsychiatry’, defined as the interface between circadian science and mental health science. Chronopsychiatry represents a move towards time-variable perspectives on neurobiology and symptoms, with a greater emphasis on chronotherapeutic interventions.
Rift propagation, rather than basal melt, drives the destabilization and disintegration of the Thwaites Eastern Ice Shelf. Since 2016, rifts have episodically advanced throughout the central ice-shelf area, with rapid propagation events occurring during austral spring. The ice shelf's speed has increased by ~70% during this period, transitioning from a rate of 1.65 m d⁻¹ in 2019 to 2.85 m d⁻¹ by early 2023 in the central area. The increase in longitudinal strain rates near the grounding zone has led to full-thickness rifts and melange-filled gaps since 2020. A recent sea-ice breakout has accelerated retreat at the western calving front, effectively separating the ice shelf from what remained of its northwestern pinning point. Meanwhile, a distributed set of phase-sensitive radar measurements indicates that the basal melting rate is generally small, likely due to widespread robust ocean stratification beneath the ice–ocean interface that suppresses basal melt despite the presence of substantial oceanic heat at depth. These observations, in combination with damage modeling, show that, while ocean forcing is responsible for triggering the current West Antarctic ice retreat, the Thwaites Eastern Ice Shelf is experiencing dynamic feedbacks over decadal timescales that are driving ice-shelf disintegration, now independent of basal melt.
Background: Canadian Emergency Departments (EDs) are overburdened. Understanding why postoperative patients attend the ED allows for targeted interventions, thereby reducing demand. We sought to identify “bounce back” patterns to inform subsequent QI initiatives. Methods: Using provincial ED datasets (EDIS, STAR, Meditech), we identified patients presenting to the ED within 90 days of spine surgery between April 1, 2016 and March 31, 2022. By Canadian Classification of Health Interventions codes, laminectomies (1SC80) and discectomies (1SE87) demonstrated the highest ED visit rates. Comprehensive chart reviews were conducted to identify surgical and medical reasons for presentation within this timeframe. Results: Of a cohort of 2165 post-decompression patients, 42.1% presented to the ED (n=912), with 62.8% of these visits directly related to surgery. Primary reasons included wound care (31.6%), pain management (31.6%), and bladder issues (retention or UTI, 11.0%). Simple wound evaluation constituted 49.7% of wound-related visits, with surgical site infection (37.6%) and dehiscence (6.6%) accounting for the remainder. Pain-related presentations resulted in discharge with additional medications in 72.3% of cases and hospital admission in 27.7%. New or worsening neurologic deficits were reported in 8.9% of ED visits. Conclusions: These findings illuminate crucial aspects of postoperative care and ED utilization patterns. Prioritizing patient education, pain management, and wound care could help alleviate the national ED crisis.
The early and sensitive detection of microbial contamination of kaolinite slurries is needed for timely treatment to prevent spoilage. The sensitivity, reproducibility, and time required by current methods, such as the dip-slide method, do not meet this challenge. A more sensitive, reproducible, and efficient method is required. The objective of the present study was to develop and validate such a method. The new method is based on the measured growth kinetics of indigenous kaolinite-slurry microorganisms. The microorganisms from kaolinite slurries with different contamination levels were eluted and quantified as colony-forming units (CFUs). Known quantities of E. coli (ATCC 11775) were inoculated into sterilized kaolinite slurries to relate kaolinite-slurry CFUs to true microbial concentrations. The inoculated slurries were subsequently incubated, re-extracted, and microbial concentrations quantified. The ratio of the known inoculated E. coli concentration to the measured concentration was expressed as the recovery efficiency coefficient. Indigenous microbial communities were serially diluted, incubated, and the growth kinetics measured and related to CFUs. Using the new method, greater optical densities (OD) and visible microbial growth were measured for greater dilutions of kaolinite slurries with large microbial-cell concentrations. Growth conditions were optimized to maximize the correlation between contamination level, microbial growth kinetics, and OD value. A Standard Bacterial Unit (SBU) scale with five levels of microbial contamination was designed for kaolinite slurries using the experimental results. The SBU scale was validated using a blind test of 50 unknown slurry samples with various contamination levels provided by the Imerys Company. The validation tests revealed that the new method using the SBU scale was more time efficient, sensitive, and reproducible than the dip-slide method.
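To make the recovery-efficiency coefficient concrete, the sketch below applies the definition given above, the ratio of the known inoculated concentration to the measured concentration, using hypothetical CFU values.

```python
# Sketch of the recovery-efficiency calculation: the ratio of the known
# inoculated concentration to the concentration measured after elution.
# All CFU values are hypothetical placeholders.

def recovery_efficiency(inoculated_cfu_per_ml: float,
                        measured_cfu_per_ml: float) -> float:
    return inoculated_cfu_per_ml / measured_cfu_per_ml

# Example: 1.0e6 CFU/mL inoculated, 4.0e5 CFU/mL recovered from the slurry.
coef = recovery_efficiency(1.0e6, 4.0e5)

# Multiplying a measured concentration by this coefficient corrects for
# losses during elution from the kaolinite matrix.
true_estimate = coef * 2.5e5
print(coef, true_estimate)
```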