Objective:
To understand healthcare workers’ (HCWs) beliefs and practices toward blood culture (BCx) use.
Design:
Cross-sectional electronic survey and semi-structured interviews.
Setting:
Academic hospitals in the United States.
Participants:
HCWs involved in BCx ordering and collection in adult intensive care units (ICU) and wards.
Methods:
We administered an anonymous electronic survey to HCWs and conducted semi-structured interviews with unit staff and quality improvement (QI) leaders in these institutions to understand their perspectives regarding BCx stewardship between February and November 2023.
Results:
Of 314 HCWs who responded to the survey, most were physicians (67.4%) and were involved in BCx ordering (82.3%). Most survey respondents reported that clinicians had a low threshold to culture patients for fever (84.4%) and agreed they could safely reduce the number of BCx obtained in their units (65%). However, only half believed BCx were overused. Although most made BCx decisions as a team (74.1%), fewer than half reported that these team discussions occurred daily (42.4%). A third of respondents reported not usually collecting the correct volume per BCx bottle, half were unaware of the improved sensitivity of 2 BCx sets, and most were unsure of the nationally recommended BCx contamination threshold (87.5%). Knowledge regarding the utility of BCx for common infections was limited.
Conclusions:
HCWs’ understanding of best collection practices and yield of BCx was limited.
Over a 2-year period, we identified Transmission from Room Environment Events (TREE) across the Johns Hopkins Health System, where the subsequent room occupant developed the same organism with the same antimicrobial susceptibilities as the patient who had previously occupied that room. Overall, the TREE rate was 50/100,000 inpatient days.
Background: External comparisons of antimicrobial use (AU) may be more informative if adjusted for encounter characteristics. Optimal methods to define input variables for encounter-level risk-adjustment models of AU are not established. Methods: This retrospective analysis of electronic health record data included 50 US hospitals in 2020-2021. We used NHSN definitions for days of therapy (DOT) for all antibacterial agents, including adult and pediatric encounters with at least 1 day present in inpatient locations. We assessed 4 methods to define input variables: 1) diagnosis-related group (DRG) categories by Yu et al., 2) adjudicated Elixhauser comorbidity categories by Goodman et al., 3) all Clinical Classification Software Refined (CCSR) diagnosis and procedure categories, and 4) adjudicated CCSR categories in which codes not appropriate for AU risk-adjustment were excluded by expert consensus, a process requiring review of 867 codes over 4 months. Data were split randomly, stratified by bed size, as follows: 1) a training dataset including two-thirds of encounters among two-thirds of hospitals; 2) an internal testing set including one-third of encounters within the training hospitals; and 3) an external testing set including the remaining one-third of hospitals. We used a gradient-boosted machine (GBM) tree-based model and a two-staged approach to first identify encounters with zero DOT, then estimate DOT among those with >0.5 probability of receiving antibiotics. Accuracy was assessed using mean absolute error (MAE) in the testing datasets. Correlation plots compared model estimates and observed DOT among testing datasets. The top 20 most influential variables were defined using modeled variable importance. Results: Our datasets included 629,445 training, 314,971 internal testing, and 419,109 external testing encounters. Demographic data included 41% male, 59% non-Hispanic White, 25% non-Hispanic Black, 9% Hispanic, and 5% pediatric encounters. DRG was missing in 29% of encounters.
MAE was lower in pediatrics as compared to adults, and lowest for models incorporating CCSR inputs (Figure 1). Performance in internal and external testing was similar, though Goodman/Elixhauser variable strategies were less accurate in external testing and underestimated long DOT outliers (Figure 2). Agnostic and adjudicated CCSR model estimates were highly correlated; their influential variables lists were similar (Figure 3). Conclusion: Larger numbers of CCSR diagnosis and procedure inputs improved risk-adjustment model accuracy compared with prior strategies. Variable importance and accuracy were similar for agnostic and adjudicated approaches. However, maintaining adjudications by experts would require significant time and potentially introduce personal bias. If findings are confirmed, the need for expert adjudication of input variables should be reconsidered.
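The two-staged estimation approach described above can be sketched as follows. This is a minimal illustration with synthetic data and scikit-learn; the feature set, sample size, and parameter values are hypothetical stand-ins, not the study's actual inputs.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier, GradientBoostingRegressor
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(0)

# Synthetic encounters: binary indicators standing in for CCSR diagnosis and
# procedure categories (hypothetical; the study used far more inputs).
n = 2000
X = rng.integers(0, 2, size=(n, 10)).astype(float)
p_any = 1 / (1 + np.exp(-(X[:, 0] + X[:, 1] - 1)))        # P(any antibiotic)
any_abx = rng.random(n) < p_any
dot = np.where(any_abx, rng.poisson(3 + 4 * X[:, 2]), 0)  # days of therapy

# Stage 1: GBM classifier separating zero-DOT encounters from the rest.
clf = GradientBoostingClassifier(random_state=0).fit(X, any_abx)

# Stage 2: GBM regressor for DOT, trained on encounters with DOT > 0.
reg = GradientBoostingRegressor(random_state=0).fit(X[dot > 0], dot[dot > 0])

# Combined estimate: 0 unless stage 1 assigns >0.5 probability of antibiotics.
pred = np.zeros(n)
mask = clf.predict_proba(X)[:, 1] > 0.5
pred[mask] = reg.predict(X[mask])

mae = mean_absolute_error(dot, pred)
print(f"MAE: {mae:.2f}")
```

Splitting the problem this way keeps the many zero-DOT encounters from dominating the regression stage, which only has to fit the conditional distribution of DOT among likely antibiotic recipients.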
Disclosure: Elizabeth Dodds Ashley: Advisor- HealthTrackRx. David J Weber: Consultant on vaccines: Pfizer; DSMB chair: GSK; Consultant on disinfection: BD, GAMA, PDI, Germitec
Central-line–associated bloodstream infection (CLABSI) surveillance in home infusion therapy is necessary to track efforts to reduce infections, but a standardized, validated, and feasible definition is lacking. We tested the validity of a home-infusion CLABSI surveillance definition and the feasibility and acceptability of its implementation.
Design:
Mixed-methods study including validation of CLABSI cases and semistructured interviews with staff applying these approaches.
Setting:
This study was conducted in 5 large home-infusion agencies in a CLABSI prevention collaborative across 14 states and the District of Columbia.
Methods:
From May 2021 to May 2022, agencies implemented a home-infusion CLABSI surveillance definition, using 3 approaches to secondary bloodstream infections (BSIs): National Healthcare Safety Network (NHSN) criteria, modified NHSN criteria (applying only the 4 most common NHSN-defined secondary BSIs), and all home-infusion–onset bacteremia (HiOB). Data on all positive blood cultures were sent to an infection preventionist for validation. Surveillance staff underwent semistructured interviews focused on their perceptions of the definition 1 and 3–4 months after implementation.
Results:
Interrater reliability was κ = 0.65 for the modified NHSN criteria, κ = 0.68 for the NHSN criteria, and κ = 0.72 for the HiOB criteria. For the NHSN criteria, the agency-determined rate was 0.21 per 1,000 central-line (CL) days, and the validator-determined rate was 0.20 per 1,000 CL days. Overall, implementing a standardized definition was viewed as a positive change that would be generalizable and feasible, though time-consuming and labor-intensive.
Conclusions:
The home-infusion CLABSI surveillance definition was valid and feasible to implement.
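Interrater agreement of the kind reported above is typically quantified with Cohen's kappa, which discounts the agreement expected by chance. A minimal sketch with made-up agency and validator determinations (illustrative only, not the study's data):

```python
from sklearn.metrics import cohen_kappa_score

# Hypothetical case determinations: 1 = meets the CLABSI surveillance
# definition, 0 = does not (illustrative only, not the study's data).
agency    = [1, 0, 1, 1, 0, 0, 1, 0, 1, 0]
validator = [1, 0, 1, 0, 0, 0, 1, 0, 1, 1]

# Kappa compares observed agreement (8/10 here) with chance agreement
# (0.5 here, from the marginal rates): (0.8 - 0.5) / (1 - 0.5) = 0.6.
kappa = cohen_kappa_score(agency, validator)
print(round(kappa, 2))  # 0.6
```

Values in the 0.61 to 0.80 range are conventionally read as substantial agreement, which is where the study's three surveillance approaches landed.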
We compared the activity of 8 novel β-lactam and tetracycline-derivative antibiotics against a cohort of clinical carbapenem-resistant Enterobacterales (CRE) isolates and investigated the incremental susceptibility benefit of the addition of an aminoglycoside, fluoroquinolone, or polymyxin to the β-lactam agents to assist with empiric antibiotic decision making.
Methods:
A collection of consecutive CRE clinical isolates from unique patients at 3 US hospitals (2016–2021) was assembled. Broth microdilution was performed to obtain antimicrobial susceptibility testing results. Mechanisms of carbapenem resistance were investigated through short-read and long-read whole-genome sequencing.
Results:
Of the 603 CRE isolates, 276 (46%) were carbapenemase producing and 327 (54%) were non–carbapenemase producing. The organisms most frequently identified were Klebsiella pneumoniae (38%), Enterobacter cloacae complex (26%), and Escherichia coli (16%). We obtained the following percent susceptibility to novel β-lactam agents: ceftazidime-avibactam (95%), meropenem-vaborbactam (92%), imipenem-relebactam (84%), and cefiderocol (92%). Aminoglycosides and the polymyxins provided greater incremental coverage as second agents than fluoroquinolones. Amikacin and plazomicin exhibited the greatest additive value. Ceftazidime-avibactam, meropenem-vaborbactam, and cefiderocol were active against 94% of the 220 KPC-producing isolates. Cefiderocol was active against 83% of the 29 NDM-producing isolates. Ceftazidime-avibactam had 100% activity against the 9 OXA-48-like–producing isolates. Tigecycline had the highest activity among the tetracyclines against KPC-, NDM-, or OXA-48-like–producing isolates.
Conclusion:
Selection among novel agents requires a nuanced understanding of the molecular epidemiology of CRE. This work provides insights into the comparative activity of novel agents and the additive value of a second antibiotic for empiric antibiotic decision making.
Targeted screening for carbapenem-resistant organisms (CROs), including carbapenem-resistant Enterobacteriaceae (CRE) and carbapenemase-producing organisms (CPOs), remains limited; recent data suggest that existing policies miss many carriers.
Objective:
To measure the prevalence of CRO and CPO perirectal colonization at hospital unit admission and to use machine learning methods to predict the probability of CRO and/or CPO carriage.
Methods:
We performed an observational cohort study of all patients admitted to the medical intensive care unit (MICU) or solid organ transplant (SOT) unit at The Johns Hopkins Hospital between July 1, 2016 and July 1, 2017. Admission perirectal swabs were screened for CROs and CPOs. More than 125 variables capturing preadmission clinical and demographic characteristics were collected from the electronic medical record (EMR) system. We developed models to predict colonization probabilities using decision tree learning.
Results:
Evaluating 2,878 admission swabs from 2,165 patients, we found that 7.5% and 1.3% of swabs were CRO and CPO positive, respectively. Organism and carbapenemase diversity among CPO isolates was high. Despite including many characteristics commonly associated with CRO/CPO carriage or infection, overall, decision tree models poorly predicted CRO and CPO colonization (C statistics, 0.57 and 0.58, respectively). In subgroup analyses, however, models did accurately identify patients with recent CRO-positive cultures who use proton-pump inhibitors as having a high likelihood of CRO colonization.
Conclusions:
In this inpatient population, CRO carriage was infrequent but was higher than previously published estimates. Despite including many variables associated with CRO/CPO carriage, models poorly predicted colonization status, likely due to significant host and organism heterogeneity.
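The modeling step described above, a decision tree classifier evaluated by its C statistic (area under the ROC curve), can be sketched on synthetic data. Everything below is hypothetical: the features stand in for the >125 EMR variables, and the weak, heterogeneous signal is chosen to mimic the poor discrimination the study observed.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)

# Synthetic admissions: binary preadmission features (hypothetical stand-ins
# for the EMR variables) and a rare colonization outcome with weak signal.
n = 3000
X = rng.integers(0, 2, size=(n, 8)).astype(float)
p = 0.05 + 0.03 * X[:, 0] * X[:, 1]   # colonization risk, mostly noise
y = rng.random(n) < p

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0, stratify=y)
tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X_tr, y_tr)

# The C statistic is the area under the ROC curve of predicted probabilities;
# 0.5 is chance, and values near 0.57-0.58 (as in the study) are poor.
c_stat = roc_auc_score(y_te, tree.predict_proba(X_te)[:, 1])
print(f"C statistic: {c_stat:.2f}")
```

A shallow tree like this also yields interpretable subgroup rules, which is how the study could still flag high-risk subgroups (e.g., recent CRO-positive culture plus proton-pump inhibitor use) despite poor overall discrimination.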
Using samples collected for VRE surveillance, we evaluated unit admission prevalence of carbapenem-resistant Enterobacteriaceae (CRE) perirectal colonization and whether CRE carriers (unknown to staff) were on contact precautions for other indications. CRE colonization at unit admission was infrequent (3.9%). Most CRE carriers were not on contact precautions, representing a reservoir for healthcare-associated CRE transmission.
The longstanding association between the major histocompatibility complex (MHC) locus and schizophrenia (SZ) risk has recently been accounted for, partially, by structural variation at the complement component 4 (C4) gene. This structural variation generates varying levels of C4 RNA expression, and genetic information from the MHC region can now be used to predict C4 RNA expression in the brain. Increased predicted C4A RNA expression is associated with the risk of SZ, and C4 is reported to influence synaptic pruning in animal models.
Methods
Based on our previous studies associating MHC SZ risk variants with poorer memory performance, we tested whether increased predicted C4A RNA expression was associated with reduced memory function in a large (n = 1238) dataset of psychosis cases and healthy participants, and with altered task-dependent cortical activation in a subset of these samples.
Results
We observed that increased predicted C4A RNA expression predicted poorer performance on measures of memory recall (p = 0.016, corrected). Furthermore, in healthy participants, we found that increased predicted C4A RNA expression was associated with a pattern of reduced cortical activity in middle temporal cortex during a measure of visual processing (p < 0.05, corrected).
Conclusions
These data suggest that the effects of C4 on cognition were observable at both a cortical and behavioural level, and may represent one mechanism by which illness risk is mediated. As such, deficits in learning and memory may represent a therapeutic target for new molecular developments aimed at altering C4’s developmental role.
Antibiotic resistance is a major threat to public health. Resistance is largely driven by antibiotic usage, which in many cases is unnecessary and can be improved. The impact of decreasing overall antibiotic usage on resistance is unknown and difficult to assess using standard study designs. The objective of this study was to explore the potential impact of reducing antibiotic usage on the transmission of multidrug-resistant organisms (MDROs).
DESIGN
We used agent-based modeling to simulate interactions between patients and healthcare workers (HCWs), with model inputs informed by the literature. We modeled the effect of antibiotic usage as (1) a microbiome effect, in which antibiotic usage decreases competing bacteria and increases the MDRO transmission probability between patients and HCWs, and (2) a mutation effect, in which a proportion of patients who receive antibiotics subsequently develop an MDRO via genetic mutation.
SETTING
Intensive care unit
INTERVENTIONS
Absolute reduction in overall antibiotic usage by experimental values of 10% and 25%
RESULTS
Reducing antibiotic usage absolutely by 10% (from 75% to 65%) and 25% (from 75% to 50%) reduced acquisition rates of high-prevalence MDROs by 11.2% (P<.001) and 28.3% (P<.001), respectively. We observed similar effect sizes for low-prevalence MDROs.
CONCLUSIONS
In a critical care setting, where up to 50% of antibiotic courses may be inappropriate, even a moderate reduction in antibiotic usage can reduce MDRO transmission.
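The two modeled effects can be sketched as a toy agent-based simulation. All parameter values below (transmission probability, hand-hygiene rate, mutation probability, turnover) are illustrative placeholders, not the literature-informed inputs the study used.

```python
import random

def simulate(abx_usage, days=365, n_patients=20, n_hcw=5,
             base_transmission=0.02, microbiome_mult=2.0,
             mutation_prob=0.001, hand_hygiene=0.8,
             admission_prev=0.05, turnover=0.1, seed=0):
    """Toy agent-based ICU model; all parameters are illustrative."""
    rng = random.Random(seed)
    colonized = [rng.random() < admission_prev for _ in range(n_patients)]
    acquisitions = 0
    for _ in range(days):
        on_abx = [rng.random() < abx_usage for _ in range(n_patients)]
        for _ in range(n_hcw):
            contaminated = False
            for p in range(n_patients):
                if colonized[p]:
                    contaminated = True                # HCW picks up the MDRO
                elif contaminated:
                    # Microbiome effect: antibiotics raise acquisition risk.
                    p_trans = base_transmission * (
                        microbiome_mult if on_abx[p] else 1.0)
                    if rng.random() < p_trans:
                        colonized[p] = True
                        acquisitions += 1
                if rng.random() < hand_hygiene:        # hand hygiene clears HCW
                    contaminated = False
        for p in range(n_patients):
            # Mutation effect: de novo resistance under antibiotic pressure.
            if on_abx[p] and not colonized[p] and rng.random() < mutation_prob:
                colonized[p] = True
                acquisitions += 1
            if rng.random() < turnover:                # discharge, new admission
                colonized[p] = rng.random() < admission_prev

    return acquisitions

# Compare yearly acquisitions under 75% vs. 50% overall antibiotic usage.
print(simulate(0.75), simulate(0.50))
```

Lowering `abx_usage` shrinks both the microbiome and mutation pathways, which is the mechanism behind the acquisition-rate reductions reported above; the study's calibrated model quantified that effect across prevalence scenarios.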
Introduction: Hospitalization due to ambulatory care sensitive conditions (ACSC) is a proxy measure for access to primary care. Emergency medical services (EMS) are increasingly called when primary care cannot be accessed. A novel paramedic-nurse EMS Mobile Care Team (MCT) was implemented in an under-serviced community. The MCT responds in a non-transport unit to bookings from EMS, emergency and primary care and to low-acuity 911 calls in a defined geographic region. Our objective was to compare the prevalence of ACSC in ground ambulance (GA) responses before and after the introduction of the MCT. Methods: A cross-sectional analysis of GA and MCT patients with ACSC (determined by chief complaint, clinical impression, treatment protocol and medical history) one year pre- and one year post-MCT implementation was conducted for the period Oct. 1, 2012 to Sept. 30, 2014. Demographics were described. Predictors of ACSC were identified via logistic regression. Prevalence was compared with chi-squared analysis. Results: There were 975 GA calls pre-MCT and 1208 GA and 95 MCT calls post-MCT. The prevalence of ACSC in GA patients pre- and post-MCT was similar: n=122 (12.5%) vs. n=185 (15.3%); p=0.06. ACSC in patients seen by EMS (GA plus MCT) increased in the post-period: 122 (12.5%) vs. 204 (15.7%); p=0.04. Pre vs. post, GA calls differed by sex (p=0.007) but not age (65.38 ± 15.12 vs. 62.51 ± 20.48; p=0.16). Post-MCT, the prevalence of specific ACSC increased for GA: hypertension (p<0.001) and congestive heart failure (p=0.04). MCT patients with ACSC were less likely to have a primary care provider than GA patients (90.2% and 87.6% vs. 63.2%; p=0.003, p=0.004). Conclusion: The prevalence of ACSC did not decrease for GA with the introduction of the MCT, but ACSC in the overall patient population served by EMS increased. It is possible that more patients with ACSC call or are referred to EMS because of the new MCT service.
Given that MCT patients were less likely to have a primary care provider this may represent an increase in access to care, or a shift away from other emergency/episodic care. These associations must be further studied to inform the ideal utility of adding such services to EMS and healthcare systems.
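The overall prevalence comparison can be reproduced from the counts reported above (122 of 975 pre-MCT vs. 204 of 1303 post-MCT). A sketch using SciPy's chi-squared test; the original analysis details, such as the continuity correction, may differ.

```python
from scipy.stats import chi2_contingency

# 2x2 table of ACSC vs. non-ACSC counts, pre- vs. post-MCT,
# for all EMS patients (GA plus MCT), as reported in the abstract.
pre_acsc, pre_total = 122, 975
post_acsc, post_total = 204, 1208 + 95
table = [[pre_acsc, pre_total - pre_acsc],
         [post_acsc, post_total - post_acsc]]

chi2, p, dof, expected = chi2_contingency(table)
print(f"p = {p:.3f}")  # approximately 0.04, matching the reported p-value
```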
To identify Choosing Wisely items for the American Board of Internal Medicine Foundation.
METHODS
The Society for Healthcare Epidemiology of America (SHEA) elicited potential items from a hospital epidemiology listserv, SHEA committee members, and a SHEA–Infectious Diseases Society of America compendium with SHEA Research Network members ranking items by Delphi method voting. The SHEA Guidelines Committee reviewed the top 10 items for appropriateness for Choosing Wisely. Five final recommendations were approved via individual member vote by committees and the SHEA Board.
RESULTS
Ninety-six items were proposed by 87 listserv members and 99 SHEA committee members. The top 40 items were ranked by 24 committee members and 64 of 226 SHEA Research Network members. The 5 final recommendations follow: 1. Don’t continue antibiotics beyond 72 hours in hospitalized patients unless the patient has clear evidence of infection. 2. Avoid invasive devices (including central venous catheters, endotracheal tubes, and urinary catheters) and, if required, use them no longer than necessary. They pose a major risk for infections. 3. Don’t perform urinalysis, urine culture, blood culture, or Clostridium difficile testing unless patients have signs or symptoms of infection. Tests can be falsely positive, leading to overdiagnosis and overtreatment. 4. Don’t use antibiotics in patients with recent C. difficile infection without convincing evidence of need. Antibiotics pose a high risk of C. difficile recurrence. 5. Don’t continue surgical prophylactic antibiotics after the patient has left the operating room. Five runner-up recommendations are included.
CONCLUSIONS
These 5 SHEA Choosing Wisely and 5 runner-up items limit medical overuse.
Little is understood about the evolution of structural and functional brain changes during the course of uncontrolled focal status epilepticus in humans.
Methods:
We serially evaluated and treated a nine-year-old girl with refractory focal status epilepticus. Long-term EEG monitoring, MRI, MRA, SPECT, intraoperative visualization of affected cortex, and neuropathological examination of a biopsy specimen were conducted over a three-year period. Imaging changes were correlated with simultaneous treatment and EEG findings.
Results:
The EEG monitoring showed almost continuous spike discharges emanating initially from the right frontocentral area. These EEG abnormalities were intermittently suppressed by treatment with anesthetics. Over time, additional brain areas developed epileptiform EEG abnormalities. Serial MRI studies demonstrated an evolution of changes from normal, through increased regional T2 signal, to generalized atrophy. An MRA demonstrated dilatation of the middle cerebral artery stem on the right compared to the left, with a broad distribution of flow-related enhancement. An 18FDG-PET scan showed a dramatically abnormal metabolic profile in the same right frontocentral areas, which modulated in response to treatment during the course of the illness. A right frontotemporal craniotomy revealed a markedly hyperemic cortical focus including vascular shunting. A sample of resected cortex showed severe gliosis and neuronal death.
Conclusions:
The co-registration of structural and functional imaging and its correlation with operative and pathological findings in this case illustrates the relentless progression of regional and generalized abnormalities in intractable focal status epilepticus that were only transiently modified by exhaustive therapeutic interventions. Increased flow through large vessels appeared to be shunted and did not translate into increased microvascular perfusion.
Combination antibiograms can be used to evaluate organism cross-resistance among multiple antibiotics. Because combination therapy is generally favored for the treatment of carbapenemase-producing Enterobacteriaceae (CPE), combination antibiograms provide valuable information about which combinations of antibiotics achieve the highest likelihood of adequate antibiotic coverage against CPE.
Infect Control Hosp Epidemiol 2015;36(12):1458–1460
Antimicrobial stewardship programs are increasingly recognized as critical in optimizing the use of antimicrobials. Consequently, more physicians, pharmacists, and other healthcare providers are developing and implementing such programs in a variety of healthcare settings. The purpose of this guidance document is to outline the knowledge and skills that are needed to lead an antimicrobial stewardship program. It was developed by antimicrobial stewardship experts from organizations that are engaged in advancing the field of antimicrobial stewardship.
Infect Control Hosp Epidemiol 2014;35(12):1444–1451
We describe two cases of infant botulism due to Clostridium butyricum producing botulinum type E neurotoxin (BoNT/E) and a previously unreported environmental source. The infants presented at age 11 days with poor feeding and lethargy, hypotonia, dilated pupils and absent reflexes. Faecal samples were positive for C. butyricum BoNT/E. The infants recovered after treatment including botulism immune globulin intravenous (BIG-IV). C. butyricum BoNT/E was isolated from water from tanks housing pet ‘yellow-bellied’ terrapins (Trachemys scripta scripta): in case A the terrapins were in the infant's home; in case B a relative fed the terrapin prior to holding and feeding the infant when both visited another relative. C. butyricum isolates from the infants and the respective terrapin tank waters were indistinguishable by molecular typing. Review of a case of C. butyricum BoNT/E botulism in the UK found that there was a pet terrapin where the infant was living. It is concluded that the C. butyricum-producing BoNT type E in these cases of infant botulism most likely originated from pet terrapins. These findings reinforce public health advice that reptiles, including terrapins, are not suitable pets for children aged <5 years, and highlight the importance of hand washing after handling these pets.
To explore current practices and decision making regarding antimicrobial prescribing among emergency department (ED) clinical providers.
Methods
We conducted a survey of ED providers recruited from 8 sites in 3 cities. Using purposeful sampling, we then recruited 21 providers for in-depth interviews. Additionally, we observed 10 patient-provider interactions at one of the ED sites. SAS 9.3 was used for descriptive and predictive statistics. Interviews were audio recorded, transcribed, and analyzed using a thematic, constructivist approach with consensus coding using NVivo 10.0. Field and interview notes collected during the observational study were aligned with themes identified through individual interviews.
Results
Of 150 survey respondents, 76% agreed or strongly agreed that antibiotics are overused in the ED, while half believed they personally did not overprescribe. Eighty-nine percent used a smartphone or tablet in the ED for antibiotic prescribing decisions. Several significant differences were found between attending and resident physicians. Interview analysis identified 42 codes aggregated into the following themes: (1) resource and environmental factors that affect care; (2) access to and quality of care received outside of the ED consult; (3) patient-provider relationships; (4) clinical inertia; and (5) local knowledge generation. The observational study revealed limited patient understanding of antibiotic use. Providers relied heavily upon diagnostics and provided limited education to patients. Most patients denied a priori expectations of being prescribed antibiotics.
Conclusions
Patient, provider, and healthcare system factors should be considered when designing interventions to improve antimicrobial stewardship in the ED setting.
Infect Control Hosp Epidemiol 2014;35(9):1114-1125
Several studies demonstrating that central line–associated bloodstream infections (CLABSIs) are preventable prompted a national initiative to reduce the incidence of these infections.
Methods.
We conducted a collaborative cohort study to evaluate the impact of the national “On the CUSP: Stop BSI” program on CLABSI rates among participating adult intensive care units (ICUs). The program goal was to achieve a unit-level mean CLABSI rate of less than 1 case per 1,000 catheter-days using standardized definitions from the National Healthcare Safety Network. Multilevel Poisson regression modeling compared infection rates before, during, and up to 18 months after the intervention was implemented.
Results.
A total of 1,071 ICUs from 44 states, the District of Columbia, and Puerto Rico, reporting 27,153 ICU-months and 4,454,324 catheter-days of data, were included in the analysis. The overall mean CLABSI rate significantly decreased from 1.96 cases per 1,000 catheter-days at baseline to 1.15 at 16–18 months after implementation. CLABSI rates decreased during all observation periods compared with baseline, with adjusted incidence rate ratios steadily decreasing to 0.57 (95% confidence interval, 0.50–0.65) at 16–18 months after implementation.
Conclusion.
Coincident with the implementation of the national “On the CUSP: Stop BSI” program was a significant and sustained decrease in CLABSIs among a large and diverse cohort of ICUs, demonstrating an overall 43% decrease and suggesting the majority of ICUs in the United States can achieve additional reductions in CLABSI rates.