Indicating and depicting are widely understood to be fundamental, meaningful components of everyday spoken language discourse: a speaker's arms and hands are free to indicate and depict because they do not articulate words. In contrast, a signer's arms and hands do articulate signs. For this reason, linguists studying sign languages have overwhelmingly concluded that signers do not indicate and depict as a part of signed articulations. This book demonstrates that signers do, however, indicate - by incorporating non-lexical gestures into their articulations of individual signs. Fully illustrated throughout, it also shows that signers create depictions in numerous ways through conceptualizations, in which the hands, other parts of the body, and parts of the space ahead of the signer depict things. By establishing that indicating and depicting are also fundamental, meaningful aspects of sign language discourse, this book is essential reading for researchers and students of sign linguistics and gesture studies.
Background: Nipocalimab is a human IgG1 monoclonal antibody targeting FcRn that selectively reduces IgG levels without impairing antigen presentation or T- and B-cell functions. This study describes the effect of nipocalimab on vaccine response. Methods: This open-label, parallel, interventional study randomized participants 1:1 to receive intravenous nipocalimab at 30 mg/kg at Week 0 and 15 mg/kg at Weeks 2 and 4 (active) or no drug (control). On Day 3, participants received Tdap and PPSV®23 vaccinations and were followed through Week 16. Results: Twenty-nine participants completed the study and are included (active, n=15; control, n=14). The proportion of participants with a positive anti-tetanus IgG response was comparable between groups at Weeks 2 and 16 but lower at Week 4 in the nipocalimab group (3/15 [20%] vs control 7/14 [50%]; P=0.089). All participants maintained anti-tetanus IgG above the protective threshold (0.16 IU/mL) through Week 16. Although anti-pneumococcal-capsular-polysaccharide (PCP) IgG levels were lower during nipocalimab treatment, the percent increase from baseline at Weeks 2 and 16 was comparable between groups. Post-vaccination, anti-PCP IgG remained above 50 mg/L and showed a 2-fold increase from baseline throughout the study in both groups. Nipocalimab co-administration with vaccines was safe and well tolerated. Conclusions: These findings suggest that nipocalimab does not impair the development of an adequate IgG response to T-cell-dependent/independent vaccines and that nipocalimab-treated patients can follow recommended vaccination schedules.
More autonomous humanitarian international nongovernmental organizations (INGOs) have greater capacity to determine who receives aid among conflict- and crisis-affected populations than their donor-following counterparts. The latter are more likely to become instruments of states seeking geostrategic influence in places like Syria and Ukraine. Drawing on more than 120 interviews with INGO and donor agency workers, 10 months of political ethnography among INGOs working with refugees in Lebanon and Jordan after the war in Syria, and content analysis of organizational documents, this article investigates the ways that INGOs secure autonomy from donors. In a theory-building exercise, it introduces the concept of negotiation experience to explain why some INGOs develop skills and strategies that allow them to resist donor demands. It also identifies some of the tactics used by experienced negotiators to do so. The findings have implications for who controls and is accountable for humanitarian policy and practice, as well as the abilities of state donors to influence humanitarian behavior. They call into question expectations that INGOs “scramble” for funds under conditions of funding scarcity.
Recent changes to US research funding are having far-reaching consequences that imperil the integrity of science and the provision of care to vulnerable populations. Resisting these changes, the BJPsych Portfolio reaffirms its commitment to publishing mental science and advancing psychiatric knowledge that improves the mental health of one and all.
Despite advances in antiretroviral treatment (ART), human immunodeficiency virus (HIV) can detrimentally affect everyday functioning. Neurocognitive impairment (NCI) and current depression are common in people with HIV (PWH) and can contribute to poor functional outcomes, but potential synergies between the two conditions are less understood. Thus, the present study aimed to compare the independent and combined effects of NCI and depression on everyday functioning in PWH. We predicted worse functional outcomes with comorbid NCI and depression than either condition alone.
Methods:
PWH enrolled at the UCSD HIV Neurobehavioral Research Program were assessed for neuropsychological performance, depression severity (≤minimal, mild, moderate, or severe; Beck Depression Inventory-II), and self-reported everyday functioning.
Results:
Participants were 1,973 PWH (79% male; 66% racial/ethnic minority; age: M = 48.6; education: M = 13.0; 66% AIDS; 82% on ART; 42% with NCI; 35% BDI-II > 13). ANCOVA models found effects of NCI and depression symptom severity on all functional outcomes (ps < .0001). With NCI and depression severity included in the same model, both remained significant (ps < .0001), although the effects of each were attenuated; the combined model also yielded better fit (i.e., lower AIC values) than models with only NCI or only depression.
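The model comparison described above relies on the Akaike information criterion (AIC = 2k − 2 lnL, lower is better). A minimal sketch of that comparison follows; the log-likelihoods and parameter counts are hypothetical placeholders, not values from the study:

```python
def aic(log_likelihood, n_params):
    """Akaike information criterion: AIC = 2k - 2*lnL; lower indicates better fit."""
    return 2 * n_params - 2 * log_likelihood

# Hypothetical log-likelihoods and parameter counts for three models
# predicting an everyday-functioning outcome (illustrative, not study values).
models = {
    "NCI only": aic(-5200.0, 3),
    "depression only": aic(-5180.0, 3),
    "NCI + depression": aic(-5100.0, 5),
}
best_fit = min(models, key=models.get)  # model with the lowest AIC
```

With these placeholder values, the combined model wins the comparison despite its extra parameters, mirroring the pattern the abstract reports.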
Conclusions:
Consistent with prior literature, NCI and depression had independent effects on everyday functioning in PWH. There was also evidence for combined effects of NCI and depression, such that their comorbidity had a greater impact on functioning than either alone. Our results have implications for informing future interventions to target common, comorbid NCI and depressed mood in PWH and thus reduce HIV-related health disparities.
Introducing new herbicides requires a comprehensive understanding of how crops respond to various herbicide-related factors. Fluridone was registered for use in rice production in 2023, but research on rice tolerance to this herbicide is lacking. Hence, field research aimed to 1) evaluate the effect of fluridone application timing on rice tolerance and 2) assess rice response to fluridone in a mixture with standard rice herbicides applied to 3-leaf rice. Both experiments were conducted in a delay-flooded dry-seeded system using a randomized complete block design, with four replications. Treatments in the first experiment included a nontreated control and 10 application timings, ranging from 20 d preplant to postflood. The second experiment had a two-factor factorial structure, with factor A being the presence/absence of fluridone, and factor B being herbicide partners, including bispyribac-sodium, fenoxaprop, penoxsulam, propanil, quinclorac, quizalofop, and saflufenacil. In the first experiment, the maximum injury in 2022 was 28%, caused by the preemergence treatment. In 2023, fluridone applied preemergence caused the greatest injury (42%) 2 wk after flood establishment, declining to 37% in late season (13 d before rice reached 50% heading). Yield reductions of 21% occurred with the delayed preemergence treatment in 2022 and 42% with the preemergence treatment in 2023. Mixing fluridone with standard herbicides increased rice injury by no more than eight percentage points compared with the herbicides applied alone. Additionally, no adverse effects on rice groundcover or grain yield resulted from fluridone in the mixture. These results indicate a need to avoid fluridone applications near planting because of negative impacts on rice. Furthermore, fluridone can be mixed with commonly used rice herbicides, offering minimal risk to rice.
Medicare claims are frequently used to study Clostridioides difficile infection (CDI) epidemiology. However, they lack specimen collection and diagnosis dates to assign location of onset. Algorithms to classify CDI onset location using claims data have been published, but the degree of misclassification is unknown.
Methods:
We linked patients with laboratory-confirmed CDI reported to four Emerging Infections Program (EIP) sites from 2016–2021 to Medicare beneficiaries with fee-for-service Part A/B coverage. We calculated sensitivity of ICD-10-CM codes in claims within ±28 days of EIP specimen collection. CDI was categorized as hospital, long-term care facility, or community-onset using three different Medicare claims-based algorithms based on claim type, ICD-10-CM code position, duration of hospitalization, and ICD-10-CM diagnosis code presence-on-admission indicators. We assessed concordance of EIP case classifications, based on chart review and specimen collection date, with claims case classifications using Cohen’s kappa statistic.
Results:
Of 12,671 CDI cases eligible for linkage, 9,032 (71%) were linked to a single, unique Medicare beneficiary. Compared to EIP, sensitivity of CDI ICD-10-CM codes was 81%; codes were more likely to be present for hospitalized patients (93.0%) than those who were not (56.2%). Concordance between EIP and Medicare claims algorithms ranged from 68% to 75%, depending on the algorithm used (κ = 0.56–0.66).
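The concordance statistic reported above, Cohen's kappa, measures agreement between two classifications beyond chance. A generic sketch with made-up onset-location labels (not EIP data) shows the calculation:

```python
from collections import Counter

def cohens_kappa(pairs):
    """Cohen's kappa for paired categorical labels from two classifiers:
    (observed agreement - chance agreement) / (1 - chance agreement)."""
    n = len(pairs)
    observed = sum(1 for a, b in pairs if a == b) / n
    counts_a = Counter(a for a, _ in pairs)
    counts_b = Counter(b for _, b in pairs)
    expected = sum(counts_a[c] * counts_b.get(c, 0) for c in counts_a) / (n * n)
    return (observed - expected) / (1 - expected)

# Toy labels: chart-review classification vs. a claims-based algorithm.
pairs = ([("hospital", "hospital")] * 40 + [("community", "community")] * 40
         + [("hospital", "community")] * 10 + [("community", "hospital")] * 10)
kappa = cohens_kappa(pairs)  # observed 0.80, chance 0.50 -> kappa 0.60
```

A kappa of 0.60 from this toy data sits at the low end of the 0.56–0.66 range the study reports, conventionally read as moderate agreement.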
Conclusion:
ICD-10-CM codes in Medicare claims data had high sensitivity compared to laboratory-confirmed CDI reported to EIP. Claims-based epidemiologic classification algorithms had moderate concordance with EIP classification of onset location. Misclassification of CDI onset location using Medicare algorithms may bias findings of claims-based CDI studies.
Boneseed [Chrysanthemoides monilifera subsp. monilifera (L.) Norl.; syn. Osteospermum moniliferum subsp. moniliferum L.] is a perennial shrub native to the southwestern and southern coasts of South Africa. It was introduced to Australia in about 1852 and now represents a significant threat to natural ecosystems. Despite C. monilifera subsp. monilifera being listed as a Weed of National Significance, momentum on improving its management has dissipated at a national level, beginning in 2008 (when a national research initiative finished) and increasingly after 2013 (when funding for national coordination ceased). A recent synthesis of past management for C. monilifera subsp. monilifera and recommendations for guiding future priorities has rekindled interest in Western Australia. To complement this synthesis and to identify improvements for program efficiency and effectiveness, we reviewed research and management findings on this weed with a focus on the past two decades. We collated information across the ecology and biology of C. monilifera subsp. monilifera, and the near relative, bitou bush [Chrysanthemoides monilifera subsp. rotunda (DC.) J.C. Manning & Goldblatt; syn. Osteospermum moniliferum subsp. rotundatum (DC.)], as well as useful insight from C. monilifera subsp. monilifera management programs applied elsewhere. As part of this review, we assessed the classical biological control work that has been done on C. monilifera subsp. monilifera, focusing on likely explanations for why, despite nine agents and a naturalized fungus, biological control is not an effective management tool. Our synthesis suggests that for the limited populations with low-abundance plants in Western Australia, eradication from the state remains a realistic target. This objective, however, needs to build on the collated baseline of past management efforts and deploy a carefully planned management program over the coming two decades. 
Systematic surveillance using the latest techniques, combined with manual or herbicide removal and controlled burns where possible, remain the most suitable methods to deploy. The long-lived soil seedbank requires detailed monitoring following initial plant removals and long-term funding to ensure the sustained effort required to deliver the goal of eradication of C. monilifera subsp. monilifera in Western Australia.
Duchenne muscular dystrophy is a devastating neuromuscular disorder characterized by the loss of dystrophin, inevitably leading to cardiomyopathy. Despite publications on prophylaxis and treatment with cardiac medications to mitigate cardiomyopathy progression, gaps remain in the specifics of medication initiation and optimization.
Method:
This document is an expert opinion statement, addressing a critical gap in cardiac care for Duchenne muscular dystrophy. It provides thorough recommendations for the initiation and titration of cardiac medications based on disease progression and patient response. Recommendations are derived from the expertise of the Advance Cardiac Therapies Improving Outcomes Network and are informed by established guidelines from the American Heart Association, American College of Cardiology, and Duchenne Muscular Dystrophy Care Considerations. These expert-derived recommendations aim to navigate the complexities of Duchenne muscular dystrophy-related cardiac care.
Results:
Comprehensive recommendations for initiation, titration, and optimization of critical cardiac medications are provided to address Duchenne muscular dystrophy-associated cardiomyopathy.
Discussion:
The management of Duchenne muscular dystrophy requires a multidisciplinary approach. However, the diversity of healthcare providers involved in Duchenne muscular dystrophy can result in variations in cardiac care, complicating treatment standardization and patient outcomes. The aim of this report is to provide a roadmap for managing Duchenne muscular dystrophy-associated cardiomyopathy, by elucidating timing and dosage nuances crucial for optimal therapeutic efficacy, ultimately improving cardiac outcomes, and improving the quality of life for individuals with Duchenne muscular dystrophy.
Conclusion:
This document seeks to establish a standardized framework for cardiac care in Duchenne muscular dystrophy, aiming to improve cardiac prognosis.
Validate a public health model identifying patients at high risk for carbapenem-resistant Enterobacterales (CRE) on admission and evaluate performance across a healthcare network.
Design:
Retrospective case-control studies
Participants:
Adults hospitalized with a clinical CRE culture within 3 days of admission (cases) and those hospitalized without a CRE culture (controls).
Methods:
Using public health data from Atlanta, GA (1/1/2016–9/1/2019), we validated a CRE prediction model created in Chicago. We then closely replicated this model using clinical data from a healthcare network in Atlanta (1/1/2015–12/31/2021) (“Public Health Model”) and optimized performance by adding variables from the healthcare system (“Healthcare System Model”). We frequency-matched cases and controls based on year and facility. We evaluated model performance in validation datasets using area under the curve (AUC).
Results:
Using public health data, we matched 181 cases to 764,408 controls, and the Chicago model performed well (AUC 0.85). Using clinical data, we matched 91 cases to 384,013 controls. The Public Health Model included age, a prior infection diagnosis, and the number and mean length of acute care hospitalization (ACH) stays in the prior year. The final Healthcare System Model added Elixhauser score, antibiotic days of therapy in the prior year, diabetes, and intensive care unit admission in the prior year, and removed the number of prior ACH. The AUC increased from 0.68 to 0.73.
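The AUC values above measure discrimination: the probability that a randomly chosen case receives a higher predicted risk than a randomly chosen control. A minimal rank-based sketch, using illustrative scores rather than study data:

```python
def auc(case_scores, control_scores):
    """AUC via the Mann-Whitney construction: the fraction of case/control
    pairs where the case scores higher, counting ties as half."""
    wins = 0.0
    for c in case_scores:
        for k in control_scores:
            if c > k:
                wins += 1.0
            elif c == k:
                wins += 0.5
    return wins / (len(case_scores) * len(control_scores))

# Illustrative predicted CRE risks for cases and controls (not study data).
cases = [0.90, 0.70, 0.40]
controls = [0.50, 0.30, 0.20, 0.10]
score = auc(cases, controls)  # 11 of 12 pairs favor the case -> 11/12
```

An AUC of 0.5 corresponds to chance-level ranking; the study's 0.73–0.85 indicates the models rank most cases above most controls.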
Conclusions:
A CRE risk prediction model using prior healthcare exposures performed well in a geographically distinct area and in an academic healthcare network. Adding variables from healthcare networks improved model performance.
In this first report of endoparasites from endemic land mammals of the Galápagos Islands, we describe a new species of cestode of the genus Raillietina (Cyclophyllidea: Davaineidae) from a species of Nesoryzomys and summarize the extent of helminth parasitism in both oryzomyine endemics and introduced species of Rattus. To date, no helminth parasites have been reported from rodents of the Galápagos, and little work has been done describing and synthesizing Galápagos parasite diversity. In historical times, several species of autochthonous rodents have occupied the islands, including Nesoryzomys narboroughi Heller 1904, N. fernandinae Hutterer and Hirsch 1979, N. swarthi Orr, 1938, and Aegialomys galapagoensis (Waterhouse, 1839). Colonization of the islands by humans brought 3 known species of synanthropic rodents (Rattus rattus, R. norvegicus, and Mus musculus), which are suspected to have caused the extinction of at least 3 other oryzomyines in historical times.
With wide-field phased array feed technology, the Australian Square Kilometre Array Pathfinder (ASKAP) is ideally suited to search for seemingly rare radio transient sources that are difficult to discover with previous-generation narrow-field telescopes. The Commensal Real-time ASKAP Fast Transient (CRAFT) Survey Science Project has developed instrumentation to continuously search for fast radio transients (duration $\lesssim$ 1 s) with ASKAP, with a particular focus on finding and localising fast radio bursts (FRBs). Since 2018, the CRAFT survey has been searching for FRBs and other fast transients by incoherently adding the intensities received by individual ASKAP antennas, and then correcting for the impact of frequency dispersion on these short-duration signals in the resultant incoherent sum (ICS) in real time. This low-latency detection enables the triggering of voltage buffers, which facilitates the localisation of the transient source and the study of spectro-polarimetric properties at high time resolution. Here we report the sample of 43 FRBs discovered in this CRAFT/ICS survey to date. This includes 22 FRBs that had not previously been reported: 16 FRBs localised by ASKAP to $\lesssim 1$ arcsec and 6 FRBs localised to $\sim 10$ arcmin. Of the new arcsecond-localised FRBs, we have identified and characterised host galaxies (and measured redshifts) for 11. The median of all 30 measured host redshifts from the survey to date is $z=0.23$. We summarise results from the searches, in particular those contributing to our understanding of the burst progenitors and emission mechanisms, and on the use of bursts as probes of intervening media. We conclude by foreshadowing future FRB surveys with ASKAP using a coherent detection system that is currently being commissioned. This will increase the burst detection rate by a factor of approximately ten and also increase the distance to which ASKAP can localise FRBs.
The southern African shrub boneseed [Chrysanthemoides monilifera subsp. monilifera (L.) Norl.] is a perennial shrub that is a significant threat to natural ecosystems and is listed as a Weed of National Significance in Australia. In Western Australia (WA) it has spread across peri-urban and natural environments. We assembled a single standardized database containing more than 2,050 presence records for individual plants and 135 absence records at a local population level. We further refined the populations into 89 sites that require different management trajectories due to topography and capacity of land managers to implement control. Forty-nine of these sites were in urban regions and 40 sites were in regional WA. We split these 89 sites into three near-term management goals: watch (12), extirpate (68), and contain (9). The 12 watch sites are those where all available evidence suggests that there have been no new inputs into the seedbank for 15 yr. The 68 sites marked for extirpation are those where delimitation is already achieved or easily achievable, where there have been minimal seed inputs into the soil seedbank in recent years due to consistent surveillance and control, and where surveys for new plants are likely to be efficient to conduct. Finally, for nine sites in urban regions around Perth, we recommend containment in the near term with a longer-term goal to achieve delimitation and extirpation. To achieve the objective of state-level eradication, a coordinated and sustained campaign involving three components—delimitation of all sites, prevention of further inputs into the soil seedbank, and systematic field surveys to remove plants—must commence without delay. While resourcing requirements for delimitation and overall program management are not possible to estimate, our prior experience suggests that it will take at least 1,900 h of on-ground surveying by experienced personnel to achieve extirpation of C. monilifera subsp. monilifera in WA.
Examine the relationship between patients’ race and prescriber antibiotic choice while accounting for differences in underlying illness and infection severity.
Design:
Retrospective cohort analysis.
Setting:
Acute care facilities within an academic healthcare system.
Patients:
Adult inpatients from January 2019 through June 2022 discharged from the Hospital Medicine Service with an ICD-10 Code for Pneumonia.
Methods:
We describe variability in days of therapy of antimicrobials with activity against Pseudomonas aeruginosa (anti-Pseudomonas agents) or against methicillin-resistant Staphylococcus aureus (anti-MRSA agents), by patient race and ethnicity. We estimated the likelihood of receipt of any anti-Pseudomonas agent by race and modeled the effect of race on the rate of use, adjusting for age, severity, and indication.
Results:
5,820 patients with 6,700 encounters were included. After adjusting for broad indication, severity, underlying illness, and age, use of anti-Pseudomonas agents was less likely among non-Hispanic Black patients than other race groups, although this effect was limited to younger patients (adjusted odds ratio [aOR] 0.45; 95% confidence interval [CI] 0.29, 0.70) and not older ones (aOR 0.98; 95% CI 0.85, 1.13); use of anti-MRSA agents was similar between groups. Among patients receiving any anti-Pseudomonas agent, Black patients received them for a relatively lower proportion of their inpatient stay (incidence rate ratio 0.91; 95% CI 0.87, 0.96).
Conclusions:
We found a difference in use of anti-Pseudomonas agents between non-Hispanic Black patients and other patients that could not be easily explained by indication or underlying illness, suggesting that unmeasured factors may play a role in treatment decisions.
Identifying persons with HIV (PWH) at increased risk for Alzheimer’s disease (AD) is complicated because memory deficits are common in HIV-associated neurocognitive disorders (HAND) and a defining feature of amnestic mild cognitive impairment (aMCI; a precursor to AD). Recognition memory deficits may be useful in differentiating these etiologies. Therefore, neuroimaging correlates of different memory deficits (i.e., recall, recognition) and their longitudinal trajectories in PWH were examined.
Design:
We examined 92 PWH from the CHARTER Program, ages 45–68, without severe comorbid conditions, who received baseline structural MRI and baseline and longitudinal neuropsychological testing. Linear and logistic regression examined neuroanatomical correlates (i.e., cortical thickness and volumes of regions associated with HAND and/or AD) of memory performance at baseline and multilevel modeling examined neuroanatomical correlates of memory decline (average follow-up = 6.5 years).
Results:
At baseline, thinner pars opercularis cortex was associated with impaired recognition (p = 0.012; p = 0.060 after correcting for multiple comparisons). Worse delayed recall was associated with thinner pars opercularis (p = 0.001) and thinner rostral middle frontal cortex (p = 0.006) cross sectionally even after correcting for multiple comparisons. Delayed recall and recognition were not associated with medial temporal lobe (MTL), basal ganglia, or other prefrontal structures. Recognition impairment was variable over time, and there was little decline in delayed recall. Baseline MTL and prefrontal structures were not associated with delayed recall.
Conclusions:
Episodic memory was associated with prefrontal structures, and MTL and prefrontal structures did not predict memory decline. There was relative stability in memory over time. Findings suggest that episodic memory is more related to frontal structures, rather than encroaching AD pathology, in middle-aged PWH. Additional research should clarify if recognition is useful clinically to differentiate aMCI and HAND.
Annual bluegrass (Poa annua L.) populations in turfgrass have evolved resistance to several herbicides in the United States, but there has been no confirmed resistance from an agricultural field. Recently, glyphosate failed to control a P. annua population found in a field in a soybean [Glycine max (L.) Merr.] and rice (Oryza sativa L.) rotation in Poinsett County, AR. The present study focused on determining the sensitivity of a putatively resistant accession (R1) to glyphosate compared with two susceptible accessions (S1 and S2). The experiments included a dose–response study, 5-enolpyruvylshikimate-3-phosphate synthase (EPSPS) gene copy number and expression analysis, and assessment of mutations in EPSPS. Based on the dose–response analysis, R1 required 1,038 g ae ha−1 of glyphosate to cause 50% biomass reduction, whereas S1 and S2 only required 148.2 and 145.5 g ae ha−1, respectively. The resistance index (RI) was approximately 7-fold relative to the susceptible accessions. Real-time polymerase chain reaction data revealed at least a 15-fold increase in the EPSPS copy number in R1, along with a higher gene expression. No mutations in EPSPS were found. Gene duplication was identified as the main mechanism conferring resistance in R1. The research presented here reports the first incidence of glyphosate resistance in P. annua from an agronomic field crop situation in the United States.
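The resistance index reported above is the ratio of the resistant accession's GR50 (the dose causing 50% biomass reduction) to the mean susceptible GR50. Using the GR50 values stated in the abstract, the arithmetic is:

```python
def resistance_index(gr50_resistant, gr50_susceptibles):
    """Resistance index: resistant GR50 divided by the mean susceptible GR50."""
    mean_susceptible = sum(gr50_susceptibles) / len(gr50_susceptibles)
    return gr50_resistant / mean_susceptible

# GR50 values (g ae/ha of glyphosate) for R1 and the two susceptible
# accessions S1 and S2, taken from the abstract.
ri = resistance_index(1038.0, [148.2, 145.5])  # approximately 7-fold
```

The result, roughly 7.1, matches the approximately 7-fold resistance index the study reports.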
To examine the relationship between race and ethnicity and central line-associated bloodstream infections (CLABSI) while accounting for inherent differences in CLABSI risk related to central venous catheter (CVC) type.
Design:
Retrospective cohort analysis.
Setting:
Acute care facilities within an academic healthcare system.
Patients:
Adult inpatients from January 2012 through December 2017 with CVC present for ≥2 contiguous days.
Methods:
We describe variability in demographics, comorbidities, CVC type/configuration, and CLABSI rate by patient’s race and ethnicity. We estimated the unadjusted risk of CLABSI for each demographic and clinical characteristic and then modelled the effect of race on time to CLABSI, adjusting for total parenteral nutrition use and CVC type. We also performed exploratory analysis replacing race and ethnicity with social vulnerability index (SVI) metrics.
Results:
32,925 patients with 57,642 CVC episodes met inclusion criteria, most of which (51,348, 89%) were among non-Hispanic White or non-Hispanic Black patients. CVC types differed between race/ethnicity groups. However, after adjusting for CVC type, configuration, and indication in an adjusted Cox regression, the risk of CLABSI among non-Hispanic Black patients did not significantly differ from that of non-Hispanic White patients (adjusted hazard ratio [aHR] 1.19; 95% confidence interval [CI]: 0.94, 1.51). The odds of having a CLABSI in the most vulnerable SVI subset did not differ from those in the less vulnerable subset (odds ratio [OR] 0.95; 95% CI: 0.75–1.2).
Conclusions:
We did not find a difference in CLABSI risk between non-Hispanic White and non-Hispanic Black patients when adjusting for CLABSI risk inherent in type and configuration of CVC.
Among inpatients, peer-comparison of prescribing metrics is challenging due to variation in patient-mix and prescribing by multiple providers daily. We established risk-adjusted provider-specific antibiotic prescribing metrics to allow peer-comparisons among hospitalists.
Methods:
Using clinical and billing data from inpatient encounters discharged from the Hospital Medicine Service from January 2020 through June 2021 at four acute care hospitals, we calculated bimonthly (every two months) days of therapy (DOT) for antibiotics attributed to specific providers based on patient billing dates. Ten patient-mix characteristics, including demographics, infectious disease diagnoses, and noninfectious comorbidities, were considered as potential predictors of antibiotic prescribing. Using linear mixed models, we identified risk-adjusted models predicting the prescribing of three antibiotic groups: broad-spectrum hospital-onset (BSHO), broad-spectrum community-acquired (BSCA), and anti-methicillin-resistant Staphylococcus aureus (anti-MRSA) antibiotics. Provider-specific observed-to-expected ratios (OERs) were calculated to describe provider-level antibiotic prescribing trends over time.
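An observed-to-expected ratio of the kind described above compares a provider's actual DOT to the DOT predicted by the risk-adjusted model for that provider's patient mix. A minimal sketch with hypothetical provider names and values (not study data), flagging OERs above the 1.25 threshold used in the results:

```python
def provider_oers(observed_dot, expected_dot, threshold=1.25):
    """Per-provider observed-to-expected ratios of antibiotic days of therapy,
    flagging providers whose OER exceeds the review threshold."""
    return {provider: (obs / expected_dot[provider],
                       obs / expected_dot[provider] > threshold)
            for provider, obs in observed_dot.items()}

# Hypothetical bimonthly DOT totals per provider (illustrative only):
# observed comes from billing-attributed prescriptions, expected from the model.
observed = {"provider_A": 120.0, "provider_B": 90.0}
expected = {"provider_A": 80.0, "provider_B": 100.0}
ratios = provider_oers(observed, expected)
# provider_A: OER 1.5 (flagged); provider_B: OER 0.9 (not flagged)
```

An OER near 1 means prescribing in line with the model's expectation for that patient mix; values well above 1 single out providers for stewardship review.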
Results:
Predictors of antibiotic prescribing varied for the three antibiotic groups across the four hospitals; commonly selected predictors included sepsis, COVID-19, pneumonia, urinary tract infection, malignancy, and age >65 years. OERs varied within each hospital, with medians of approximately 1 and 75th percentiles of approximately 1.25. The median OER demonstrated a downward trend for the anti-MRSA group at two hospitals but remained relatively stable elsewhere. Instances of heightened antibiotic prescribing (OER >1.25) were identified in approximately 25% of the observed time points across all four hospitals.
Conclusion:
Our findings indicate that provider-specific benchmarking among inpatient providers is achievable and could be a valuable tool for inpatient stewardship efforts.
The incubation period for Clostridioides difficile infection (CDI) is generally considered to be less than 1 week, but some recent studies suggest that prolonged carriage prior to disease onset may be common.
Objective:
To estimate the incubation period for patients developing CDI after initial negative cultures.
Methods:
In 3 tertiary care medical centers, we conducted a cohort study to identify hospitalized patients and long-term care facility residents with negative initial cultures for C. difficile followed by a diagnosis of CDI with or without prior detection of carriage. Cases were classified as healthcare facility-onset, community-onset, healthcare facility-associated, or community-associated and were further classified as probable, possible, or unlikely CDI. A parametric accelerated failure time model was used to estimate the distribution of the incubation period.
Results:
Of 4,179 patients with negative enrollment cultures and no prior CDI diagnosis within 56 days, 107 (2.6%) were diagnosed as having CDI, including 19 (17.8%) with and 88 (82.2%) without prior detection of carriage. When the data were censored to include only participants with negative cultures collected within 14 days, the estimated median incubation period was 6 days, with 25% and 75% of estimated incubation periods occurring within 3 and 12 days, respectively. The estimated incubation period did not differ significantly for patients classified as probable, possible, or unlikely CDI.
Conclusion:
Our findings are consistent with previous studies suggesting that the incubation period for CDI is typically less than 1 week, and less than 2 weeks in most cases.