Depression, a leading cause of global disability, arises from a multifaceted combination of genetic and environmental components. This study explores the relationship between major depressive disorder (MDD) polygenic scores (PGS), characteristics and symptoms of depression, and community-shared socioeconomic factors derived from postal code data in a cohort of 12,646 individuals from the Australian Genetics of Depression Study (AGDS). Our findings reveal that people living in areas with relatively higher socioeconomic advantage and education/occupation scores are more likely to report experiencing fewer depressive symptoms during their worst depressive period, as well as fewer lifetime episodes. Additionally, participants who reported depression onset later in life tend to currently reside in wealthier areas. Interestingly, no significant interaction between genetic and socioeconomic factors was observed, suggesting their independent contribution to depression outcomes. This research underscores the importance of integrating socioeconomic factors into psychiatric evaluation and care, and points to the critical role of public policy in addressing mental health disparities driven by socioeconomic factors. Future research should aim to further elucidate the causal relationships within these associations and explore the potential for integrated genetic and socioeconomic approaches in mental health interventions.
The stars of the Milky Way carry the chemical history of our Galaxy in their atmospheres as they journey through its vast expanse. Like barcodes, we can extract the chemical fingerprints of stars from high-resolution spectroscopy. The fourth data release (DR4) of the Galactic Archaeology with HERMES (GALAH) Survey, based on a decade of observations, provides the chemical abundances of up to 32 elements for 917 588 stars that also have exquisite astrometric data from the Gaia satellite. For the first time, these elements include life-essential nitrogen, complementing carbon and oxygen, as well as additional measurements of the rare-earth elements critical to modern electronics, offering unparalleled insights into the chemical composition of the Milky Way. For this release, we use neural networks to simultaneously fit stellar parameters and abundances across the whole wavelength range, leveraging synthetic grids computed with Spectroscopy Made Easy. These grids account for atomic line formation in non-local thermodynamic equilibrium for 14 elements. In a two-iteration process, we first fit stellar labels to all 1 085 520 spectra, then co-add repeated observations and refine these labels using astrometric data from Gaia and 2MASS photometry, improving the accuracy and precision of stellar parameters and abundances. Our validation thoroughly assesses the reliability of spectroscopic measurements and highlights key caveats. GALAH DR4 represents yet another milestone in Galactic archaeology, combining detailed chemical compositions from multiple nucleosynthetic channels with kinematic information and age estimates. The resulting dataset, covering nearly a million stars, opens new avenues for understanding not only the chemical and dynamical history of the Milky Way but also the broader questions of the origin of elements and the evolution of planets, stars, and galaxies.
Functional cognitive disorder is an increasingly recognised subtype of functional neurological disorder for which treatment options are currently limited. We have developed a brief online group acceptance and commitment therapy (ACT)-based intervention.
Aims
To assess the feasibility of conducting a randomised controlled trial of this intervention versus treatment as usual (TAU).
Method
The study was a parallel-group, single-blind randomised controlled trial, with participants recruited from cognitive neurology, neuropsychiatry and memory clinics in London. Participants were randomised into two groups: ACT + TAU or TAU alone. Feasibility was assessed on the basis of recruitment and retention rates, the acceptability of the intervention, and signal of efficacy on the primary outcome measure (Acceptance and Action Questionnaire II (AAQ-II)) score, although the study was not powered to demonstrate this statistically. Outcome measures were collected at baseline and at 2, 4 and 6 months post-intervention, including assessments of quality of life, memory, anxiety, depression and healthcare use.
Results
We randomised 44 participants, with a participation rate of 51.1% (95% CI 40.8–61.5%); 36% of referred participants declined involvement. Retention was high: 81.8% of ACT participants attended at least four sessions, and 64.3% reported being ‘satisfied’ or ‘very satisfied’, compared with 0% in the TAU group. Psychological flexibility as measured using the AAQ-II showed a trend towards modest improvement in the ACT group at 6 months. Other measures (quality of life, mood, memory satisfaction) also demonstrated small to modest positive trends.
Conclusions
It has proven feasible to conduct a randomised controlled trial of ACT versus TAU.
The Lyman alpha (Ly$\alpha$) forest in the spectra of $z\gt5$ quasars provides a powerful probe of the late stages of the epoch of reionisation (EoR). With the recent advent of exquisite datasets such as XQR-30, many models have struggled to reproduce the observed large-scale fluctuations in the Ly$\alpha$ opacity. Here we introduce a Bayesian analysis framework that forward-models large-scale lightcones of intergalactic medium (IGM) properties and accounts for unresolved sub-structure in the Ly$\alpha$ opacity by calibrating to higher-resolution hydrodynamic simulations. Our models directly connect physically intuitive galaxy properties with the corresponding IGM evolution, without having to tune ‘effective’ parameters or calibrate out the mean transmission. The forest data, in combination with UV luminosity functions and the CMB optical depth, are able to constrain global IGM properties at percent-level precision in our fiducial model. Unlike many other works, we recover the forest observations without invoking a rapid drop in the ionising emissivity from $z\sim7$ to 5.5, which we attribute to our sub-grid model for recombinations. In this fiducial model, reionisation ends at $z=5.44\pm0.02$ and the EoR mid-point is at $z=7.7\pm0.1$. The ionising escape fraction increases towards faint galaxies, showing a mild redshift evolution at fixed UV magnitude, $M_\textrm{UV}$. Half of the ionising photons are provided by galaxies fainter than $M_\textrm{UV} \sim -12$, well below direct detection limits of optical/NIR instruments including $\textit{JWST}$. We also show results from an alternative galaxy model that does not allow for a redshift evolution in the ionising escape fraction. Despite being decisively disfavoured by the Bayesian evidence, the posterior of this model is in qualitative agreement with that from our fiducial model.
We caution, however, that our conclusions regarding the early stages of the EoR and which sources reionised the Universe are more model-dependent.
In many areas of The Gambia, West Africa, population crowding in a degraded environment has forced close interactions of diurnal primate species with humans. We assessed intestinal parasitic infection prevalence and diversity in 4 diurnal non-human primate (NHP) species, Chlorocebus sabaeus, Erythrocebus patas, Papio papio and Piliocolobus badius, across 13 sampling sites. The effect of human activity, determined by the human activity index, and NHP group size on parasite richness was assessed using a generalized linear mixed model (GLMM). The most common protozoa identified were Entamoeba coli (30%) and Iodamoeba buetschlii (25%). The most common helminths were Strongyloides fuelleborni (11%), Oesophagostomum spp. (9%) and Trichuris trichiura (9%). Two of the six (6%) Cyclospora spp. infections detected were sequenced as Cyclospora cercopitheci (both in C. sabaeus). The more arboreal P. badius trended towards a lower prevalence of intestinal parasites, although this was not statistically significant (χ², P = 0.105). Neither human activity nor group size had a significant effect on parasite richness for P. badius (P = 0.161 and P = 0.603) or P. papio (P = 0.817 and P = 0.607, respectively). There were insufficient observations to fit a GLMM to E. patas or C. sabaeus. We report the richness and diversity of intestinal parasites in 4 diurnal NHP species in The Gambia, West Africa. Despite desertification and habitat loss, our results indicate that the prevalence and diversity of intestinal parasites in Gambian NHPs are seemingly unaffected by human activity. Further investigation with a larger dataset is required to better elucidate these findings.
Research is only beginning to shape our understanding of eating disorders as metabolic-psychiatric illnesses. How eating disorders (EDs) are classified is essential to future research for understanding the etiology of these severe illnesses and both developing and tailoring effective treatments. The gold standard for classification for research and diagnostic purposes has primarily been and continues to be the Diagnostic and Statistical Manual of Mental Disorders (DSM-5). With the reconceptualization of EDs comes the new challenge of considering how EDs are classified to reflect clinical reality, prognosis and lived experience. In this article, we explore the DSM-5 method of categorical classification and how it may not accurately represent the fluidity with which EDs present. We discuss alternative methods of conceptualizing EDs, and their relevance and implications for genetic research.
Novel ultrasound neuromodulation techniques allow therapeutic brain stimulation with unprecedented precision and non-invasive targeting of deep brain areas. Transcranial pulse stimulation (TPS), a multifrequency sonication technique, is approved for the clinical treatment of Alzheimer’s disease (AD). Here, we present the largest real-world retrospective analysis of ultrasound neuromodulation therapy in dementia (AD, vascular, mixed) and mild cognitive impairment (MCI).
Methods
The consecutive sample involved 58 patients already receiving state-of-the-art treatment in an open-label, uncontrolled, retrospective study. TPS therapy typically comprises 10 sessions (range 8–12) with individualized MRI-based target areas defined according to brain pathology and individual pathophysiology. We compared the CERAD-Plus neuropsychological test battery results before and after treatment, with the CERAD Corrected Total Score (CTS) as the primary outcome. Furthermore, we analyzed side effects reported by patients during the treatment period.
Results
CERAD Corrected Total Score (CTS) significantly improved (p = .017, d = .32) after treatment (Baseline: M = 56.56, SD = 18.56; Post-treatment: M = 58.65, SD = 19.44). The top-responder group (top quartile) improved by 9.8 points. Fewer than one-third of all patients reported any sensation during treatment. Fatigue and transient headaches were the most common, with no severe adverse events.
Conclusions
The findings implicate TPS as a novel and safe add-on therapy for patients with dementia or MCI with the potential to further improve current state-of-the-art treatment results. Despite the individual benefits, further randomized, sham-controlled, longitudinal clinical trials are needed to differentiate the effects of verum and placebo.
Increasing daylight exposure might be a simple way to improve mental health. However, little is known about daylight-symptom associations in depressive disorders.
Methods
In a subset of the Australian Genetics of Depression Study (N = 13,480; 75% female), we explored associations between self-reported number of hours spent in daylight on a typical workday and free day and seven symptom dimensions: depressive (overall, somatic, psychological); hypo-manic-like; psychotic-like; insomnia; and daytime sleepiness. Polygenic scores for major depressive disorder (MDD); bipolar disorder (BD); and schizophrenia (SCZ) were calculated. Models were adjusted for age, sex, shift work status, employment status, season, and educational attainment. Exploratory analyses examined age-stratified associations (18–24 years; 25–34 years; 35–64 years; 65 and older). Bonferroni-corrected associations (p < 0.004) are discussed.
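The Bonferroni threshold quoted above (p < 0.004) is consistent with a simple correction of α = 0.05 across 14 tests — our assumption of 7 symptom dimensions × 2 exposure contexts (workday, free day); the exact test family is not stated in the abstract.

```python
# Bonferroni correction consistent with the quoted p < 0.004 threshold.
# n_tests = 14 is our assumption (7 symptom dimensions x 2 day types).
alpha = 0.05
n_tests = 7 * 2
threshold = alpha / n_tests
print(round(threshold, 4))  # 0.0036, i.e. just under the quoted 0.004
```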
Results
Adults with depression reported spending a median of one hour in daylight on workdays and three hours on free days. More daylight exposure on workdays and free days was associated with lower depressive (overall, psychological, somatic) and insomnia symptoms (ps < 0.001), but higher hypo-manic-like symptoms (ps < 0.002). Genetic loadings for MDD and SCZ were associated with less daylight exposure in unadjusted correlational analyses, although effect sizes were not meaningful. Exploratory analyses revealed age-related heterogeneity. Among 18–24-year-olds, no symptom dimensions were associated with daylight. By contrast, in the older age groups, more daylight exposure was associated with lower insomnia symptoms (p < 0.003; except for 25–34-year-olds on free days, p = 0.019), and with lower depressive symptoms on free days and, depending on the age group, to some extent on workdays.
Conclusions
Exploration of the causal status of daylight in depression is warranted.
United Nations (UN) peace missions are meant to foster peace. However, slim progress has been observed in the states in which they have recently been deployed. Even if peace missions are effective in alleviating the suffering of the population on many fronts, the puzzle remains: To what extent are UN peace missions powerful instruments of peaceful change, given the persistent political coups and ensuing protests and violence in the receiving countries despite their presence? In this chapter, a constructivist, multilevel analytical approach is used to discuss how peace missions factor into peaceful change. A three-fold argument is made: while UN peace missions are powerful instruments of peaceful change at the international level and mitigate crises on a regional basis, they do little to prevent or alleviate the continuation of violence at the national level. Using the examples of Sudan, Mali, the Central African Republic, the Democratic Republic of Congo, and Cyprus, this chapter examines how UN peace missions are instruments of change whose peaceful effect depends on whether one takes into account national, regional, or international factors.
Accurate diagnosis of bipolar disorder (BPD) is difficult in clinical practice, with an average delay between symptom onset and diagnosis of about 7 years. A depressive episode often precedes the first manic episode, making it difficult to distinguish BPD from unipolar major depressive disorder (MDD).
Aims
We use genome-wide association analyses (GWAS) to identify differential genetic factors and to develop predictors based on polygenic risk scores (PRS) that may aid early differential diagnosis.
Method
Based on individual genotypes from case–control cohorts of BPD and MDD shared through the Psychiatric Genomics Consortium, we compile case–case–control cohorts, applying a careful quality control procedure. In a resulting cohort of 51 149 individuals (15 532 BPD patients, 12 920 MDD patients and 22 697 controls), we perform a variety of GWAS and PRS analyses.
Results
Although our GWAS is not well powered to identify genome-wide significant loci, we find significant chip heritability and demonstrate the ability of the resulting PRS to distinguish BPD from MDD, including BPD cases with depressive onset (BPD-D). We replicate our PRS findings in an independent Danish cohort (iPSYCH 2015, N = 25 966). We observe strong genetic correlation between our case–case GWAS and that of case–control BPD.
Conclusions
We find that MDD and BPD, including BPD-D, are genetically distinct. Our findings support the view that controls, MDD patients and BPD patients primarily lie on a continuum of genetic risk. Future studies with larger and richer samples will likely yield a better understanding of these findings and enable the development of better genetic predictors distinguishing BPD and, importantly, BPD-D from MDD.
Foliar-applied postemergence applications of glufosinate are often made to glufosinate-resistant crops to provide nonselective weed control without significant crop injury. Rainfall, air temperature, solar radiation, and relative humidity near the time of application have been reported to affect glufosinate efficacy. However, previous research may not have captured the full range of weather variability to which glufosinate may be exposed before or following application. Additionally, climate models suggest more extreme weather will become the norm, further expanding the weather range to which glufosinate can be exposed. The objective of this research was to quantify the probability of successful weed control (efficacy ≥85%) with glufosinate applied to some key weed species across a broad range of weather conditions. A database of >10,000 North American herbicide evaluation trials was used in this study. The database was filtered to include treatments with a single postemergence application of glufosinate applied to waterhemp [Amaranthus tuberculatus (Moq.) Sauer], morningglory species (Ipomoea spp.), and/or giant foxtail (Setaria faberi Herrm.) <15 cm in height. These species were chosen because they are well represented in the database and listed as common and troublesome weed species in both corn (Zea mays L.) and soybean [Glycine max (L.) Merr.] (Van Wychen 2020, 2022). Individual random forest models were created for each species. Low rainfall (≤20 mm) over the 5 d before glufosinate application was detrimental to the probability of successful control of A. tuberculatus and S. faberi. Lower relative humidity (≤70%) and solar radiation (≤23 MJ m−2 d−1) on the day of application reduced the probability of successful weed control in most cases. Additionally, the probability of successful control decreased for all species when the average air temperature over the first 5 d after application was ≤25 C.
As climate continues to change and become more variable, the risk of unacceptable control of several common species with glufosinate is likely to increase.
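The random forest approach described above can be sketched with scikit-learn. Everything here is illustrative: the features mirror the weather covariates named in the abstract, but the data, the decision rule, and the scenario values are synthetic, not the study's database.

```python
# Sketch of a random forest classifying whether an application achieves
# >= 85% weed control from weather covariates. Data and the underlying
# "true" rule are synthetic and only loosely mirror the reported
# effects (dry, cool conditions reduce the chance of success).
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(42)
n = 500
X = np.column_stack([
    rng.uniform(0, 60, n),    # rainfall over 5 d before application (mm)
    rng.uniform(40, 100, n),  # relative humidity on day of application (%)
    rng.uniform(5, 35, n),    # solar radiation (MJ m^-2 d^-1)
    rng.uniform(10, 35, n),   # mean air temp over 5 d after application (C)
])
# Synthetic labels: success requires adequate rainfall and warm weather.
y = ((X[:, 0] > 20) & (X[:, 3] > 25)).astype(int)

rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
# Predicted probability of successful control for a wet, warm scenario:
p = rf.predict_proba([[40, 85, 25, 28]])[0, 1]
print(p)
```

In the study itself such models were fit per species (and, in the companion herbicide work, per species-by-herbicide combination) to real trial outcomes rather than a synthetic rule.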
Recent advancements in Earth system science have been marked by the exponential increase in the availability of diverse, multivariate datasets characterised by moderate to high spatio-temporal resolutions. Earth System Data Cubes (ESDCs) have emerged as one suitable solution for transforming this flood of data into a simple yet robust data structure. ESDCs achieve this by organising data into an analysis-ready format aligned with a spatio-temporal grid, facilitating user-friendly analysis and diminishing the need for extensive technical data processing knowledge. Despite these significant benefits, the completion of the entire ESDC life cycle remains a challenging task. Obstacles are not only of a technical nature but also relate to domain-specific problems in Earth system research. There exist barriers to realising the full potential of data collections in light of novel cloud-based technologies, particularly in curating data tailored for specific application domains. These include transforming data to conform to a spatio-temporal grid with minimum distortions and managing complexities such as spatio-temporal autocorrelation issues. Addressing these challenges is pivotal for the effective application of Artificial Intelligence (AI) approaches. Furthermore, adhering to open science principles for data dissemination, reproducibility, visualisation, and reuse is crucial for fostering sustainable research. Overcoming these challenges offers a substantial opportunity to advance data-driven Earth system research, unlocking the full potential of an integrated, multidimensional view of Earth system processes. This is particularly true when such research is coupled with innovative research paradigms and technological progress.
Despite societal perceptions of older adults as vulnerable, literature on resilience suggests that exposure to adversity and resources gained with life experience contribute to adaptation. One way to explore the nature of resilience is to document assets supporting adaptation. Interviews were conducted with older adults living in Canada at two time points during the COVID-19 pandemic, September 2020–May 2021 (T1) and January–August 2022 (T2). Reflexive thematic analysis was completed to report on what older adults identified as assets and how they understood the value of those assets for resilience. Participants indicated that the potential value of their contributions went largely untapped at the level of the community but supported individual and household adaptation. In line with calls for an all-of-society approach to reduce disaster risk and support resilience, creating a culture of inclusivity that recognizes the potential contributions of older adults should be paired with opportunities for action.
In responding to a Chemical, Biological, Radiological, and Nuclear explosive (CBRNe) disaster, clinical leaders have important decision-making responsibilities which include implementing hospital disaster protocols or incident command systems, managing staffing, and allocating resources. Despite emergency care clinical leaders’ integral role, there is minimal literature regarding the strategies they may use during CBRNe disasters. The aim of this study was to explore emergency care clinical leaders’ strategies related to managing patients following a CBRNe disaster.
Methods
Focus groups were conducted across 5 tertiary hospitals and 1 rural hospital in Queensland, Australia. Thirty-six hospital clinical leaders from the 6 study sites, all crucial to hospital disaster response, participated in 6 focus groups undertaken between February and May 2021 that explored strategies and decision making to optimize patient care following a CBRNe disaster.
Results
Analysis revealed that the use of rehearsals, adopting new models of care, enacting current surge management processes, and applying organizational lessons were facilitating strategies. Barriers to management were also identified, including resource constraints and sites operating over capacity.
Conclusions
Enhanced education and training of clinical leaders, flexible models of care, and existing established processes and tested frameworks could strengthen a hospital’s response when managing patients following a CBRNe disaster.
Genetic vulnerability to mental disorders has been associated with coronavirus disease-19 (COVID-19) outcomes. We explored whether polygenic risk scores (PRSs) for several mental disorders predicted poorer clinical and psychological COVID-19 outcomes in people with pre-existing depression.
Methods
Data from three assessments of the Australian Genetics of Depression Study (N = 4405; 52.2 years ± 14.9; 76.2% females) were analyzed. Outcomes included COVID-19 clinical outcomes (severe acute respiratory syndrome coronavirus 2 [SARS-CoV-2] infection and long COVID, noting the low incidence of COVID-19 cases in Australia at that time) and COVID-19 psychological outcomes (COVID-related stress and COVID-19 burnout). Predictors included PRS for depression, bipolar disorder, schizophrenia, and anxiety. The associations between these PRSs and the outcomes were assessed with adjusted linear/logistic/multinomial regressions. Mediation (N = 4338) and moderation (N = 3326) analyses were performed to explore the potential influence of anxiety symptoms and resilience on the identified associations between the PRSs and COVID-19 psychological outcomes.
Results
None of the selected PRS predicted SARS-CoV-2 infection or long COVID. In contrast, the depression PRS predicted higher levels of COVID-19 burnout. Anxiety symptoms fully mediated the association between the depression PRS and COVID-19 burnout. Resilience did not moderate this association.
Conclusions
A higher genetic risk for depression predicted higher COVID-19 burnout and this association was fully mediated by anxiety symptoms. Interventions targeting anxiety symptoms may be effective in mitigating the psychological effects of a pandemic among people with depression.
Foliar-applied postemergence herbicides are a critical component of corn (Zea mays L.) and soybean [Glycine max (L.) Merr.] weed management programs in North America. Rainfall and air temperature around the time of application may affect the efficacy of herbicides applied postemergence in corn or soybean production fields. However, previous research utilized a limited number of site-years and may not capture the range of rainfall and air temperatures that these herbicides are exposed to throughout North America. The objective of this research was to model the probability of achieving successful weed control (≥85%) with commonly applied postemergence herbicides across a broad range of environments. A large database of more than 10,000 individual herbicide evaluation field trials conducted throughout North America was used in this study. The database was filtered to include only trials with a single postemergence application of fomesafen, glyphosate, mesotrione, or fomesafen + glyphosate. Waterhemp [Amaranthus tuberculatus (Moq.) Sauer], morningglory species (Ipomoea spp.), and giant foxtail (Setaria faberi Herrm.) were the weeds of focus. Separate random forest models were created for each weed species by herbicide combination. The probability of successful weed control deteriorated when the average air temperature within the first 10 d after application was <19 or >25 C for most of the herbicide by weed species models. Additionally, drier conditions before postemergence herbicide application reduced the probability of successful control for several of the herbicide by weed species models. As air temperatures increase and rainfall becomes more variable, weed control with many of the commonly used postemergence herbicides is likely to become less reliable.
It is well established that there is a substantial genetic component to eating disorders (EDs). Polygenic risk scores (PRSs) can be used to quantify cumulative genetic risk for a trait at an individual level. Recent studies suggest PRSs for anorexia nervosa (AN) may also predict risk for other disordered eating behaviors, but no study has examined if PRS for AN can predict disordered eating as a global continuous measure. This study aimed to investigate whether PRS for AN predicted overall levels of disordered eating, or specific lifetime disordered eating behaviors, in an Australian adolescent female population.
Methods
PRSs were calculated based on summary statistics from the largest Psychiatric Genomics Consortium AN genome-wide association study to date. Analyses were performed using genome-wide complex trait analysis to test the associations between AN PRS and disordered eating global scores, avoidance of eating, objective bulimic episodes, self-induced vomiting, and driven exercise in a sample of Australian adolescent female twins recruited from the Australian Twin Registry (N = 383).
Results
After applying the false-discovery rate correction, the AN PRS was significantly associated with all disordered eating outcomes.
Conclusions
Findings suggest shared genetic etiology across disordered eating presentations and provide insight into the utility of AN PRS for predicting disordered eating behaviors in the general population. In the future, PRSs for EDs may have clinical utility in early disordered eating risk identification, prevention, and intervention.
Ice rises hold valuable records revealing the ice dynamics and climatic history of Antarctic coastal areas from the Last Glacial Maximum to today. This history is often reconstructed from isochrone radar stratigraphy and simulations focusing on Raymond arch evolution beneath the divides. However, this relies on complex ice-flow models where many parameters are unconstrained by observations. Our study explores quad-polarimetric, phase-coherent radar data to enhance understanding near ice divides and domes, using Hammarryggen Ice Rise (HIR) as a case study. Analysing a 5 km profile intersecting the dome, we derive vertical strain rates and ice-fabric properties. These align with ice core data near the summit, increasing confidence in tracing signatures from the dome to the flanks. The Raymond effect is evident, correlating with surface strain rates and radar stratigraphy. Stability is inferred over millennia for the saddle connecting HIR to the mainland, but dome ice-fabric appears relatively young compared to 2D model predictions. In a broader context, quad-polarimetric measurements provide valuable insights into ice-flow models, particularly for anisotropic rheology. Including quad-polarimetric data advances our ability to reconstruct past ice flow dynamics and climatic history in ice rises.
To compare rates of Clostridioides difficile infection (CDI) recurrence following an initial occurrence treated with tapered enteral vancomycin versus standard vancomycin.
Design:
Retrospective cohort study.
Setting:
Community health system.
Patients:
Adults ≥18 years of age hospitalized with a positive C. difficile polymerase chain reaction or toxin enzyme immunoassay result who were prescribed either a standard 10–14-day course of enteral vancomycin four times daily or a 12-week tapered vancomycin regimen.
Methods:
Retrospective propensity score pair-matched cohort study. Groups were matched based on age (<65 or ≥65 years) and receipt of non-C. difficile antibiotics during hospitalization or within 6 months post-discharge. Recurrence rates were analyzed via logistic regression conditioned on matched pairs and reported as conditional odds ratios. The primary outcome was the recurrence rate with standard versus tapered vancomycin for treatment of initial CDI.
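The matched-pair analysis described above — logistic regression conditioned on matched pairs — corresponds to a conditional logit model. A minimal sketch with simulated pairs, using `statsmodels`' `ConditionalLogit` (variable names and the simulated effect size are ours, not the study's data):

```python
# Conditional logistic regression on simulated matched pairs:
# one tapered and one standard patient per pair, with recurrence
# less likely under the taper (true log-OR is set to -1.5 here).
import numpy as np
from statsmodels.discrete.conditional_models import ConditionalLogit

rng = np.random.default_rng(7)
n_pairs = 75
pair_id = np.repeat(np.arange(n_pairs), 2)   # matched-pair identifier
taper = np.tile([1, 0], n_pairs)             # 1 = tapered, 0 = standard
logit = -1.0 - 1.5 * taper                   # simulated recurrence odds
recur = rng.binomial(1, 1 / (1 + np.exp(-logit)))

res = ConditionalLogit(recur, taper[:, None], groups=pair_id).fit()
print(np.exp(res.params[0]))  # conditional odds ratio for the taper
```

Only outcome-discordant pairs contribute information in this model, which is why matched studies of this size can still yield fairly wide confidence intervals.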
Results:
The CDI recurrence rate at 6 months was 5.3% (4/75) in the taper cohort versus 28% (21/75) in the standard vancomycin cohort. The median time to CDI recurrence was 115 days versus 20 days in the taper and standard vancomycin cohorts, respectively. When adjusted for matching, patients in the taper arm were less likely to experience CDI recurrence at 6 months when compared to standard vancomycin (cOR = 0.19, 95% CI 0.07–0.56, p < 0.002).
Conclusions:
Larger prospective trials are needed to elucidate the clinical utility of tapered oral vancomycin as a treatment option to achieve sustained clinical cure in first occurrences of CDI.