Original Articles
Cost-Effectiveness of Competing Treatment Strategies for Clostridium difficile Infection: A Systematic Review
- Phuc Le, Van T. Nghiem, Patricia Dolan Mullen, Abhishek Deshpande
- Published online by Cambridge University Press:
- 21 February 2018, pp. 412-424
BACKGROUND
Clostridium difficile infection (CDI) presents a substantial economic burden and is associated with significant morbidity. While multiple treatment strategies have been evaluated, a cost-effective management strategy remains unclear.
OBJECTIVE
We conducted a systematic review to assess cost-effectiveness analyses of CDI treatment and to summarize key issues for clinicians and policy makers to consider.
METHODS
We searched PubMed and 5 other databases from inception to August 2016. These searches were not limited by study design or language of publication. Two reviewers independently screened the literature, abstracted data, and assessed methodological quality using the Drummond and Jefferson checklist. We extracted data on study characteristics, type of CDI, treatment characteristics, and model structure and inputs.
RESULTS
We included 14 studies, 13 of which were from high-income countries. More than 90% of these studies were deemed moderate-to-high or high quality. Overall, 6 studies used a decision-tree model and 7 studies used a Markov model. Cost of therapy, time horizon, treatment cure rates, and recurrence rates were common influential factors in the study results. For initial CDI, fidaxomicin was a more cost-effective therapy than metronidazole or vancomycin in 2 of 3 studies. For severe initial CDI, 2 of 3 studies found fidaxomicin to be the most cost-effective therapy. For recurrent CDI, fidaxomicin was cost-effective in 3 of 5 studies, while fecal microbiota transplantation (FMT) by colonoscopy was consistently cost-effective in 4 of 4 studies.
CONCLUSIONS
The cost-effectiveness of fidaxomicin compared with other pharmacologic therapies was not definitive for either initial or recurrent CDI. Despite its high cost, FMT by colonoscopy may be a cost-effective therapy for recurrent CDI. A consensus on model design and assumptions is necessary for future comparisons of CDI treatments.
Infect Control Hosp Epidemiol 2018;39:412–424
Original Article
Transmission of resistant Gram-negative bacteria to healthcare personnel gowns and gloves during care of residents in community-based nursing facilities
- Natalia Blanco, J. Kristie Johnson, John D. Sorkin, Alison D. Lydecker, Lauren Levy, Lona Mody, Mary-Claire Roghmann
- Published online by Cambridge University Press:
- 08 October 2018, pp. 1425-1430
Objective
To estimate the risk of transmission of antibiotic-resistant Gram-negative bacteria (RGNB) to gowns and gloves worn by healthcare personnel (HCP) providing care to residents of community-based nursing facilities, and to identify the types of care and resident characteristics associated with transmission.
Design
Prospective observational study.
Settings and participants
Residents and HCP from 13 community-based nursing facilities in Maryland and Michigan.
Methods
Perianal swabs were collected from residents and cultured to detect RGNB. HCP wore gowns and gloves during usual care activities, and at the end of each interaction, these were swabbed in a standardized manner. Transmission of RGNB from a colonized resident to gowns and gloves was estimated. Odds ratios (ORs) of transmission associated with type of care or resident characteristic were calculated.
Results
We enrolled 403 residents and their HCP in this study. Overall, 19% of enrolled residents with a perianal swab (n=399) were colonized with at least 1 RGNB. RGNB transmission to either gloves or gowns occurred during 11% of the 584 interactions. Showering the resident, hygiene or toilet assistance, and wound dressing changes were associated with a high risk of transmission. Glucose monitoring and assistance with feeding or medication were associated with a low risk of transmission. Residents with a pressure ulcer were 3 times more likely to transmit RGNB than residents without one (OR, 3.3; 95% confidence interval [CI], 1.0–11.1).
Conclusions
Gown and glove use in community nursing facilities should be prioritized for residents and care interactions deemed high risk for transmission.
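The odds ratios quoted above come from standard 2×2 contingency-table arithmetic, OR = (a·d)/(b·c), with a Wald confidence interval on the log scale. A minimal sketch, using hypothetical counts rather than the study's actual data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table:
    a = exposed transmitters, b = exposed non-transmitters,
    c = unexposed transmitters, d = unexposed non-transmitters."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts for illustration only (not the study's data)
or_, lo, hi = odds_ratio_ci(8, 12, 20, 100)
```

With these made-up counts the function returns an OR near 3.3 with a wide interval, illustrating how a small number of events (as with pressure ulcers here) yields a wide CI such as the reported 1.0–11.1.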
Active screening and interfacility communication of carbapenem-resistant Enterobacteriaceae (CRE) in a tertiary-care hospital
- Teppei Shimasaki, John Segreti, Alexander Tomich, Julie Kim, Mary K. Hayden, Michael Y. Lin, for the CDC Prevention Epicenters Program
- Published online by Cambridge University Press:
- 19 July 2018, pp. 1058-1062
Background
Hospitals may implement admission screening cultures and review transfer documentation to identify patients colonized with carbapenem-resistant Enterobacteriaceae (CRE) so that isolation precautions can be applied; however, the outcomes and logistical considerations of these practices have not been well described.
Methods
At an academic hospital in Chicago, we retrospectively studied the implementation and outcomes of CRE admission screening from 2013 to 2016 during 2 periods. During period 1, we implemented active CRE rectal culture screening for all adult patients admitted to intensive care units (ICUs) and for those transferred from outside facilities to general wards. During period 2, screening was restricted to adults transferred from outside facilities. For a subset of transferred patients who were previously reported to the health department as CRE positive, we reviewed transfer paperwork for appropriate documentation of CRE.
Results
Overall, 11,757 patients qualified for screening; rectal cultures were performed for 8,569 patients (73%). Rates of CRE screen positivity differed by period, previous facility type (if transferred), and current inpatient location. A higher combined CRE positivity rate was detected in the medical and surgical ICUs among period 2 patients (3.3%) versus all other ward-period comparisons (P<.001). Among 13 transferred patients previously known to be CRE colonized, appropriate CRE transfer documentation was available for only 4 patients (31%).
Conclusions
Active screening for CRE is feasible, and screening patients transferred from outside facilities to the medical or surgical ICU yielded the highest screen positivity rate. Furthermore, CRE carriage was inconsistently documented in transfer paperwork, suggesting that admission screening or enhanced interfacility communication is needed to improve the identification of CRE-colonized patients.
The cost of managing complex surgical site infections following primary hip and knee arthroplasty: A population-based cohort study in Alberta, Canada
- Elissa D. Rennert-May, John Conly, Stephanie Smith, Shannon Puloski, Elizabeth Henderson, Flora Au, Braden Manns
- Published online by Cambridge University Press:
- 10 September 2018, pp. 1183-1188
Objective
Nearly 800,000 primary hip and knee arthroplasty procedures are performed annually in North America. Approximately 1% of these are complicated by a complex surgical site infection (SSI), leading to very high healthcare costs. However, population-based studies to properly estimate the economic burden are lacking. We aimed to address this knowledge gap.
Design
Economic burden study.
Methods
Using administrative health and clinical databases, we created a cohort of all patients in Alberta, Canada, who received a primary hip or knee arthroplasty between April 1, 2012, and March 31, 2015. All patients who developed a complex SSI postoperatively were identified through a provincial infection prevention and control database. A combination of corporate microcosting data and gross costing methods was used to determine total mean 12- and 24-month costs, enabling comparison of costs between infected and noninfected patients.
Results
Mean 12-month total costs were significantly greater in patients who developed a complex SSI than in those who did not (CAD$95,321 [US$68,150] vs CAD$19,893 [US$14,223]; P < .001). The magnitude of the cost difference persisted even after controlling for underlying patient factors. The most commonly identified causative pathogen (38%) was Staphylococcus aureus (95% MSSA).
Conclusions
Complex SSIs following hip and knee arthroplasty lead to high healthcare costs, which are expected to rise as the yearly number of surgeries increases. These cost estimates should inform investigations of the cost-effectiveness of different SSI prevention strategies.
Original Articles
Defining the Role of the Environment in the Emergence and Persistence of vanA Vancomycin-Resistant Enterococcus (VRE) in an Intensive Care Unit: A Molecular Epidemiological Study
- Andie S. Lee, Elizabeth White, Leigh G. Monahan, Slade O. Jensen, Raymond Chan, Sebastiaan J. van Hal
- Published online by Cambridge University Press:
- 03 April 2018, pp. 668-675
OBJECTIVE
To describe the transmission dynamics of the emergence and persistence of vanA vancomycin-resistant enterococcus (VRE) in an intensive care unit (ICU) using whole-genome sequencing of patient and environmental isolates.
DESIGN
Retrospective cohort study.
SETTING
ICU in a tertiary referral center.
PARTICIPANTS
Patients admitted to the ICU over an 11-month period.
METHODS
VanA VRE isolated from patients (n=31) were sequenced using the Illumina MiSeq platform. Environmental samples from bed spaces, equipment, and waste rooms were collected. All vanA VRE-positive environmental samples (n=14) were also sequenced. Data were collected regarding patient ward and bed movements.
RESULTS
The 31 patient vanA VRE isolates were from screening (n=19), urine (n=4), bloodstream (n=3), skin/wound (n=3), and intra-abdominal (n=2) sources. The phylogeny from sequencing data confirmed several VRE clusters, with 1 group accounting for 38 of 45 isolates (84%). Within this cluster, cross-transmission was extensive and complex across the ICU. Directionality indicated that colonized patients contaminated environmental sites, and environmental sources in turn led not only to patient colonization but also to infection. Notably, shared equipment acted as a conduit for transmission between different ICU areas. Infected patients, however, were not linked to further VRE transmission.
CONCLUSIONS
Genomic sequencing confirmed a predominantly clonal outbreak of VRE with complex transmission dynamics. The environmental reservoir, particularly shared equipment, played a key role in ongoing VRE spread. This study provides evidence to support the use of multifaceted strategies, with an emphasis on measures to reduce bacterial burden in the environment, for successful VRE control.
Infect Control Hosp Epidemiol 2018;39:668–675
Urine Culture on Admission Impacts Antibiotic Use and Length of Stay: A Retrospective Cohort Study
- Molly J. Horstman, Andrew M. Spiegelman, Aanand D. Naik, Barbara W. Trautner
- Published online by Cambridge University Press:
- 27 March 2018, pp. 547-554
OBJECTIVE
To examine the impact of urine culture testing on day 1 of admission on inpatient antibiotic use and hospital length of stay (LOS).
DESIGN
We performed a retrospective cohort study using a national dataset from 2009 to 2014.
SETTING
The study used data from 230 hospitals in the United States.
PARTICIPANTS
Admissions for adults 18 years and older were included in this study. Hospitalizations were matched with coarsened exact matching by facility, patient age, gender, Medicare severity-diagnosis related group (MS-DRG), and 3 measures of disease severity.
METHODS
A multilevel Poisson model and a multilevel linear regression model were used to determine the impact of an admission urine culture on inpatient antibiotic use and LOS.
RESULTS
Matching produced a cohort of 88,481 patients (n=41,070 with a culture on day 1, n=47,411 without a culture). A urine culture on admission led to an increase in days of inpatient antibiotic use (incidence rate ratio, 1.26; P<.001) and resulted in an additional 36,607 days of inpatient antibiotic treatment. Urine culture on admission also resulted in a 2.1% increase in LOS (P=.004). The predicted difference in bed days of care between admissions with and without a urine culture amounted to 6,071 additional bed days of care. The impact of urine culture testing varied by admitting diagnosis.
CONCLUSIONS
Patients with a urine culture sent on day 1 of hospital admission receive more days of antibiotics and have a longer hospital stay than patients who do not have a urine culture. Targeted interventions may reduce the potential harms associated with low-yield urine cultures on day 1.
Infect Control Hosp Epidemiol 2018;39:547–554
The Role of Negative Methicillin-Resistant Staphylococcus aureus Nasal Surveillance Swabs in Predicting the Need for Empiric Vancomycin Therapy in Intensive Care Unit Patients
- Darunee Chotiprasitsakul, Pranita D. Tamma, Avinash Gadala, Sara E. Cosgrove
- Published online by Cambridge University Press:
- 28 January 2018, pp. 290-296
OBJECTIVES
The role of methicillin-resistant Staphylococcus aureus (MRSA) nasal surveillance swabs (nasal swabs) in guiding decisions about prescribing vancomycin is unclear. We aimed to determine the likelihood that patients with negative MRSA nasal swabs develop subsequent MRSA infections; to assess avoidable vancomycin days for patients with negative nasal swabs; and to identify risk factors for having a negative nasal swab and developing an MRSA infection during the intensive care unit (ICU) stay.
METHODS
This retrospective cohort study was conducted in 6 ICUs at a tertiary-care hospital from December 2013 through June 2015. The negative predictive value (NPV), defined as the ability of a negative nasal swab to predict no subsequent MRSA infection, was calculated. Days of vancomycin continued or restarted more than 3 days after the collection of the first negative nasal swab were determined. A matched case-control study identified risk factors for having a negative nasal swab and developing MRSA infection.
RESULTS
Of 11,441 patients with MRSA-negative nasal swabs, the rate of subsequent MRSA infection was 0.22%. A negative nasal swab had an NPV of 99.4% (95% confidence interval [CI], 99.1%–99.6%). Vancomycin was continued or started after nasal swab results were available in 1,431 patients, translating to 7,364 vancomycin days. No risk factors associated with MRSA infection were identified.
CONCLUSIONS
In our hospital with a low prevalence of MRSA transmission, a negative MRSA nasal swab was helpful in identifying patients at low risk of MRSA infection in whom empiric vancomycin therapy could be stopped and in whom subsequent initiation of vancomycin therapy during an ICU admission could be avoided.
Infect Control Hosp Epidemiol 2018;39:290–296
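Negative predictive value as defined in the study above is the fraction of negative screens that are followed by no infection: NPV = TN/(TN + FN). A minimal sketch with hypothetical counts (the study's exact true- and false-negative denominators are not reproduced here):

```python
def npv(true_neg, false_neg):
    """NPV = TN / (TN + FN): the probability that a negative
    screen correctly predicts no subsequent infection."""
    return true_neg / (true_neg + false_neg)

# Hypothetical: 9,940 negative swabs with no subsequent infection,
# 60 negative swabs followed by infection
npv_pct = npv(9940, 60) * 100
```

An exact binomial or Wilson interval around such a proportion gives the kind of 95% CI (99.1%–99.6%) reported in the abstract.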
Original Article
Predictors of Antimicrobial Stewardship Program Recommendation Disagreement
- Laura L. Bio, Jenna F. Kruger, Betty P. Lee, Matthew S. Wood, Hayden T. Schwenk
- Published online by Cambridge University Press:
- 30 April 2018, pp. 806-813
OBJECTIVE
To identify predictors of disagreement with antimicrobial stewardship prospective audit and feedback recommendations (PAFR) at a free-standing children’s hospital.
DESIGN
Retrospective cohort study of audits performed during the antimicrobial stewardship program (ASP) from March 30, 2015, to April 17, 2017.
METHODS
The ASP included audits of antimicrobial use and communicated PAFR to the care team, with follow-up on adherence to recommendations. The primary outcome was disagreement with PAFR. Potential predictors of disagreement, including patient-level, antimicrobial, programmatic, and provider-level factors, were assessed using bivariate and multivariate logistic regression models.
RESULTS
In total, 4,727 antimicrobial audits were performed during the study period; 1,323 (28%) resulted in PAFR, and 187 recommendations (15%) were not followed due to disagreement. Providers were more likely to disagree with PAFR when the patient had a gastrointestinal infection (odds ratio [OR], 5.50; 95% confidence interval [CI], 1.99–15.21), febrile neutropenia (OR, 6.14; 95% CI, 2.08–18.12), skin or soft-tissue infections (OR, 6.16; 95% CI, 1.92–19.77), or had been admitted for 31–90 days at the time of the audit (OR, 2.08; 95% CI, 1.36–3.18). The longer the duration since the attending provider had been trained (ie, the more years of experience), the more likely they were to disagree with PAFR recommendations (OR, 1.02; 95% CI, 1.01–1.04).
CONCLUSIONS
Evaluation of our program confirmed patient-level predictors of PAFR disagreement and identified additional programmatic and provider-level factors, including years of attending experience. Stewardship interventions focused on specific diagnoses and antimicrobials are unlikely to result in programmatic success unless these factors are also addressed.
Infect Control Hosp Epidemiol 2018;39:806–813
Original Articles
Quasi-experimental Studies in the Fields of Infection Control and Antibiotic Resistance, Ten Years Later: A Systematic Review
- Rotana Alsaggaf, Lyndsay M. O’Hara, Kristen A. Stafford, Surbhi Leekha, Anthony D. Harris, for the CDC Prevention Epicenters Program
- Published online by Cambridge University Press:
- 08 February 2018, pp. 170-176
OBJECTIVE
A systematic review of quasi-experimental studies in the field of infectious diseases was published in 2005. The aim of this study was to assess improvements in the design and reporting of quasi-experiments 10 years after the initial review. We also aimed to report the statistical methods used to analyze quasi-experimental data.
DESIGN
Systematic review of articles published from January 1, 2013, to December 31, 2014, in 4 major infectious disease journals.
METHODS
Quasi-experimental studies focused on infection control and antibiotic resistance were identified and classified based on 4 criteria: (1) type of quasi-experimental design used, (2) justification of the use of the design, (3) use of correct nomenclature to describe the design, and (4) statistical methods used.
RESULTS
Of 2,600 articles, 173 (7%) featured a quasi-experimental design, compared to 73 of 2,320 articles (3%) in the previous review (P<.01). Moreover, 21 articles (12%) utilized a study design with a control group; 6 (3.5%) justified the use of a quasi-experimental design; and 68 (39%) identified their design using the correct nomenclature. In addition, 2-group statistical tests were used in 75 studies (43%); 58 studies (34%) used standard regression analysis; 18 (10%) used segmented regression analysis; 7 (4%) used standard time-series analysis; 5 (3%) used segmented time-series analysis; and 10 (6%) did not utilize statistical methods for comparisons.
CONCLUSIONS
While some progress occurred over the decade, it is crucial to continue improving the design and reporting of quasi-experimental studies in the fields of infection control and antibiotic resistance to better evaluate the effectiveness of important interventions.
Infect Control Hosp Epidemiol 2018;39:170–176
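Segmented regression, one of the analysis methods tallied above, models an interrupted time series with a baseline level and trend plus a level change and trend change at the intervention point. A minimal sketch on synthetic monthly data (the series and coefficients are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
n, cut = 24, 12                           # 24 monthly rates; intervention at month 12
t = np.arange(n, dtype=float)
level = (t >= cut).astype(float)          # indicator: post-intervention level change
slope = np.where(t >= cut, t - cut, 0.0)  # post-intervention trend change

# Synthetic outcome: rising baseline, then a drop in level after the intervention
y = 10 + 0.2 * t - 3.0 * level - 0.1 * slope + rng.normal(0, 0.1, n)

# Ordinary least squares fit of the four segmented-regression terms
X = np.column_stack([np.ones(n), t, level, slope])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
# beta = [baseline level, baseline trend, level change, trend change]
```

The fitted level-change and trend-change coefficients are what such studies report as the intervention effect, distinguishing it from the pre-existing trend.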
Original Article
Environmental transmission of Clostridioides difficile ribotype 027 at a long-term care facility: an outbreak investigation guided by whole-genome sequencing
- Bradley T. Endres, Kierra M. Dotson, Kelley Poblete, Jacob McPherson, Chris Lancaster, Eugénie Bassères, Ali Memariani, Sandi Arnold, Shawn Tupy, Conner Carlsen, Bonnie Morehead, Sophia Anyatonwu, Christa Cook, Khurshida Begum, M. Jahangir Alam, Kevin W. Garey
- Published online by Cambridge University Press:
- 26 September 2018, pp. 1322-1329
Objective
To describe a Clostridioides difficile infection (CDI) outbreak in a long-term care (LTC) facility in which molecular typing techniques and whole-genome sequencing identified widespread dissemination of a clonal strain in the environment, which was successfully removed after terminal cleaning.
Setting
This study was conducted in a long-term care facility in Texas.
Methods
A recently hospitalized LTC patient was diagnosed with CDI, followed shortly thereafter by 7 subsequent CDI cases. A stool specimen was obtained from each patient for culturing and typing. An environmental point-prevalence study of the facility was conducted before and after terminal cleaning of the facility to assess environmental contamination. Cultured isolates were typed using ribotyping, multilocus variable-number tandem-repeat analysis (MLVA), and whole-genome sequencing.
Results
Stool samples were available for 5 of 8 patients; of these specimens, 4 grew toxigenic C. difficile ribotype 027. Of 50 environmental swab samples collected throughout the facility prior to the facility-wide terminal cleaning, 19 (38%) grew toxigenic C. difficile (most commonly ribotype 027, 79%). The terminal cleaning was effective at reducing C. difficile spores in the environment and at eradicating the ribotype 027 strain (P<.001). By MLVA and whole-genome sequencing, clinical and environmental strains were highly related and, in some cases, identical.
Conclusion
Using molecular typing techniques, we demonstrated reduced environmental contamination with toxigenic C. difficile and the eradication of a ribotype 027 clone. These techniques may help direct infection control efforts and decrease the burden of CDI in the healthcare system.
Original Articles
Risk Factors for Staphylococcus aureus Acquisition in the Neonatal Intensive Care Unit: A Matched Case-Case-Control Study
- Matthew C. Washam, Andrea Ankrum, Beth E. Haberman, Mary Allen Staat, David B. Haslam
- Published online by Cambridge University Press:
- 21 November 2017, pp. 46-52
OBJECTIVE
To determine risk factors independent of length of stay (LOS) for Staphylococcus aureus acquisition in infants admitted to the neonatal intensive care unit (NICU).
DESIGN
Retrospective matched case–case-control study.
SETTING
Quaternary-care referral NICU at a large academic children’s hospital.
METHODS
Infants admitted between January 2014 and March 2016 to a level IV NICU who acquired methicillin-resistant (MRSA) or methicillin-susceptible (MSSA) S. aureus were matched with controls by duration of exposure to determine risk factors for acquisition. A secondary post hoc analysis of the entire cohort of at-risk infants was performed for risk factors identified in the primary analysis to further quantify risk.
RESULTS
In total, 1,751 infants were admitted during the study period, with 199 infants identified as having S. aureus prevalent on admission. There were 246 incident S. aureus acquisitions in the remaining at-risk infant cohort. On matched analysis, infants housed in a single-bed unit had a significantly decreased risk of both MRSA (P=.03) and MSSA (P=.01) acquisition compared with infants housed in multibed pods. Across the entire cohort, pooled S. aureus acquisition was significantly lower in infants housed in single-bed units (hazard ratio, 0.46; confidence interval, 0.34–0.62).
CONCLUSIONS
NICU bed design is significantly associated with S. aureus acquisition in hospitalized infants independent of LOS.
Infect Control Hosp Epidemiol 2018;39:46–52
Original Article
Validation of semiautomated surgical site infection surveillance using electronic screening algorithms in 38 surgery categories
- Sun Young Cho, Doo Ryeon Chung, Jong Rim Choi, Doo Mi Kim, Si-Ho Kim, Kyungmin Huh, Cheol-In Kang, Kyong Ran Peck
- Published online by Cambridge University Press:
- 12 June 2018, pp. 931-935
Objective
To verify the validity of a semiautomated surgical site infection (SSI) surveillance system using electronic screening algorithms in 38 categories of surgery.
Design
A cohort study for validation of a semiautomated SSI surveillance system using screening algorithms.
Setting
A 1,989-bed tertiary-care referral center in Seoul, Republic of Korea.
Methods
A dataset of 40,516 surgical procedures in 38 categories stored in the conventional SSI surveillance registry at the Samsung Medical Center between January 2013 and December 2014 was used as the reference standard. In the semiautomated surveillance system, electronic screening algorithms flagged cases meeting at least 1 of 3 criteria: antibiotic prescription, microbial culture, and infectious disease consultation. Flagged cases were audited by infection preventionists. Analyses of sensitivity, specificity, and positive predictive value (PPV) were conducted for the semiautomated surveillance system, and its effect on reducing the workload for chart review was evaluated.
Results
A total of 575 SSI events (1.42%) were identified by conventional SSI surveillance. The sensitivity of the semiautomated SSI surveillance was 96.7%, and the PPV of the screening algorithms alone was 4.1%. Semiautomated SSI surveillance reduced the chart-review workload of the infection preventionists from 1,283 to 482 person-hours per year (a 62.4% decrease).
Conclusions
Compared to conventional surveillance, semiautomated surveillance using electronic screening algorithms followed by chart review of selected cases can provide high-validity surveillance results and can significantly reduce the workload of infection preventionists.
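The workload reduction reported above is direct arithmetic on the abstract's own numbers: a drop from 1,283 to 482 chart-review person-hours per year.

```python
before, after = 1283, 482  # chart-review person-hours per year (from the abstract)
reduction_pct = (before - after) / before * 100  # relative decrease, in percent
```

This reproduces the stated 62.4% decrease.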
Modified Reporting of Positive Urine Cultures to Reduce Inappropriate Treatment of Asymptomatic Bacteriuria Among Nonpregnant, Noncatheterized Inpatients: A Randomized Controlled Trial
- Peter Daley, David Garcia, Raheel Inayatullah, Carla Penney, Sarah Boyd
- Published online by Cambridge University Press:
- 28 May 2018, pp. 814-819
DESIGN
We conducted a randomized, parallel, unblinded, superiority trial of a laboratory reporting intervention designed to reduce antibiotic treatment of asymptomatic bacteriuria (ASB).
METHODS
Results of positive urine cultures from 110 consecutive inpatients at 2 urban acute-care hospitals were randomized to standard report (control) or modified report (intervention). The standard report included bacterial count, bacterial identification, and antibiotic susceptibility information including drug dosage and cost. The modified report stated: “This POSITIVE urine culture may represent asymptomatic bacteriuria or urinary tract infection. If urinary tract infection is suspected clinically, please call the microbiology laboratory … for identification and susceptibility results.” We used the following exclusion criteria: age <18 years, pregnancy, presence of an indwelling urinary catheter, samples from patients already on antibiotics, neutropenia, or admission to an intensive care unit. The primary efficacy outcome was the proportion of appropriate antibiotic therapy prescribed.
RESULTS
According to our intention-to-treat (ITT) analysis, the proportion of appropriate treatment (urinary tract infection treated plus ASB not treated) was higher in the modified arm than in the standard arm: 44 of 55 (80.0%) versus 29 of 55 (52.7%), respectively (absolute difference, −27.3%; RR, 0.42; P = .002; number needed to report for benefit, 3.7).
CONCLUSIONS
Modified reporting resulted in a significant reduction in inappropriate antibiotic treatment without an increase in adverse events. Safety should be further assessed in a large effectiveness trial before implementation.
TRIAL REGISTRATION
clinicaltrials.gov identifier: NCT02797613
Infect Control Hosp Epidemiol 2018;39:814–819
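The "number needed to report for benefit" quoted in the results is the reciprocal of the absolute difference in appropriate-treatment proportions, computed here from the counts given in the abstract:

```python
modified = 44 / 55  # appropriate treatment, modified-report arm (80.0%)
standard = 29 / 55  # appropriate treatment, standard-report arm (52.7%)
abs_diff = modified - standard  # absolute difference in proportions
                                # (the abstract reports it with the opposite sign)
nnr = 1 / abs_diff              # reports needed for 1 extra appropriate treatment
```

This reproduces the 27.3-point difference and the number needed to report of 3.7.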
Original Articles
Screening for Asymptomatic Clostridium difficile Among Bone Marrow Transplant Patients: A Mixed-Methods Study of Intervention Effectiveness and Feasibility
- Anna K. Barker, Benjamin Krasity, Jackson Musuuza, Nasia Safdar
- Published online by Cambridge University Press:
- 25 January 2018, pp. 177-185
OBJECTIVE
To identify facilitators and barriers to implementation of a Clostridium difficile screening intervention among bone marrow transplant (BMT) patients and to evaluate the clinical effectiveness of the intervention on the rate of hospital-onset C. difficile infection (HO-CDI).
DESIGN
Before-and-after trial.
SETTING
A 505-bed tertiary-care medical center.
PARTICIPANTS
All 5,357 patients admitted to the BMT and general medicine wards from January 2014 to February 2017 were included in the study. Interview participants included 3 physicians, 4 nurses, and 4 administrators.
INTERVENTION
All BMT patients were screened within 48 hours of admission. Colonized patients, as defined by a C. difficile–positive polymerase chain reaction (PCR) stool result, were placed under contact precautions for the duration of their hospital stay.
METHODS
Interview responses were coded according to the Systems Engineering Initiative for Patient Safety conceptual framework. We compared pre- and postintervention HO-CDI rates on BMT and general internal medicine units using time-series analysis.
RESULTS
Stakeholder engagement, at both the individual and organizational levels, facilitated standardization and optimization of intervention protocols. While the screening intervention was generally well received, tools and technology were sources of concern. The mean incidence of HO-CDI decreased on the BMT service postintervention (P<.0001). However, the change in trend postintervention did not differ significantly between BMT and the control wards (P=.93).
CONCLUSIONS
We report the first mixed-methods study to evaluate a C. difficile screening intervention in the BMT population. The positive reception of the intervention by front-line clinical staff, laboratory staff, and administrators is promising for future implementation studies.
Infect Control Hosp Epidemiol 2018;39:177–185
Risk of Surgical Site Infection (SSI) following Colorectal Resection Is Higher in Patients With Disseminated Cancer: An NCCN Member Cohort Study
- Mini Kamboj, Teresa Childers, Jessica Sugalski, Donna Antonelli, Juliane Bingener-Casey, Jamie Cannon, Karie Cluff, Kimberly A. Davis, E. Patchen Dellinger, Sean C. Dowdy, Kim Duncan, Julie Fedderson, Robert Glasgow, Bruce Hall, Marilyn Hirsch, Matthew Hutter, Lisa Kimbro, Boris Kuvshinoff II, Martin Makary, Melanie Morris, Sharon Nehring, Sonia Ramamoorthy, Rebekah Scott, Mindy Sovel, Vivian Strong, Ashley Webster, Elizabeth Wick, Julio Garcia Aguilar, Robert Carlson, Kent Sepkowitz
- Published online by Cambridge University Press:
- 19 March 2018, pp. 555-562
BACKGROUND
Surgical site infections (SSIs) following colorectal surgery (CRS) are among the most common healthcare-associated infections (HAIs). Reduction in colorectal SSI rates is an important goal for surgical quality improvement.
OBJECTIVETo examine rates of SSI in patients with and without cancer and to identify potential predictors of SSI risk following CRS
DESIGNAmerican College of Surgeons National Surgical Quality Improvement Program (ACS NSQIP) data files for 2011–2013 from a sample of 12 National Comprehensive Cancer Network (NCCN) member institutions were combined. Pooled SSI rates for colorectal procedures were calculated and risk was evaluated. The independent importance of potential risk factors was assessed using logistic regression.
SETTINGMulticenter study
PARTICIPANTSOf 22 invited NCCN centers, 11 participated (50%). Colorectal procedures were selected by principal procedure current procedural technology (CPT) code. Cancer was defined by International Classification of Disease, Ninth Revision, Clinical Modification (ICD-9-CM) codes.
MAIN OUTCOMEThe primary outcome of interest was 30-day SSI rate.
RESULTSA total of 652 SSIs (11.06%) were reported among 5,893 CRSs. Risk of SSI was similar for patients with and without cancer. Among CRS patients with underlying cancer, disseminated cancer (SSI rate, 17.5%; odds ratio [OR], 1.66; 95% confidence interval [CI], 1.23–2.26; P=.001), ASA score ≥3 (OR, 1.41; 95% CI, 1.09–1.83; P=.001), chronic obstructive pulmonary disease (COPD; OR, 1.6; 95% CI, 1.06–2.53; P=.02), and longer duration of procedure were associated with development of SSI.
CONCLUSIONS
Patients with disseminated cancer are at a higher risk for developing SSI. ASA score ≥3, COPD, and longer duration of surgery predict SSI risk. Disseminated cancer should be further evaluated by the Centers for Disease Control and Prevention (CDC) in generating risk-adjusted outcomes.
Infect Control Hosp Epidemiol 2018;39:555–562
A Generalizable, Data-Driven Approach to Predict Daily Risk of Clostridium difficile Infection at Two Large Academic Health Centers
- Jeeheh Oh, Maggie Makar, Christopher Fusco, Robert McCaffrey, Krishna Rao, Erin E. Ryan, Laraine Washer, Lauren R. West, Vincent B. Young, John Guttag, David C. Hooper, Erica S. Shenoy, Jenna Wiens
- Published online by Cambridge University Press:
- 26 March 2018, pp. 425-433
OBJECTIVE
An estimated 293,300 healthcare-associated cases of Clostridium difficile infection (CDI) occur annually in the United States. To date, research has focused on developing risk prediction models for CDI that work well across institutions. However, this one-size-fits-all approach ignores important hospital-specific factors. We focus on a generalizable method for building facility-specific models. We demonstrate the applicability of the approach using electronic health records (EHR) from the University of Michigan Hospitals (UM) and the Massachusetts General Hospital (MGH).
METHODS
We utilized EHR data from 191,014 adult admissions to UM and 65,718 adult admissions to MGH. We extracted patient demographics, admission details, patient history, and daily hospitalization details, resulting in 4,836 features from patients at UM and 1,837 from patients at MGH. We used L2 regularized logistic regression to learn the models, and we measured the discriminative performance of the models on held-out data from each hospital.
RESULTS
Using the UM and MGH test data, the models achieved area under the receiver operating characteristic curve (AUROC) values of 0.82 (95% confidence interval [CI], 0.80–0.84) and 0.75 (95% CI, 0.73–0.78), respectively. Some predictive factors were shared between the 2 models, but many of the top predictive factors differed between facilities.
CONCLUSION
A data-driven approach to building models for estimating daily patient risk for CDI was used to build institution-specific models at 2 large hospitals with different patient populations and EHR systems. In contrast to traditional approaches that focus on developing models that apply across hospitals, our generalizable approach yields risk-stratification models tailored to an institution. These hospital-specific models allow for earlier and more accurate identification of high-risk patients and better targeting of infection prevention strategies.
Infect Control Hosp Epidemiol 2018;39:425–433
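The modeling approach described above — L2-regularized logistic regression scored by AUROC on held-out data — can be illustrated with a minimal, self-contained sketch. Everything here (the synthetic single-feature data, the batch gradient-descent fitting loop, the hyperparameter values) is illustrative and not drawn from the study:

```python
import math
import random

def train_logreg_l2(X, y, lam=0.1, lr=0.5, epochs=300):
    """Fit logistic regression with an L2 penalty by batch gradient descent."""
    n, d = len(X), len(X[0])
    w, b = [0.0] * d, 0.0
    for _ in range(epochs):
        gw = [lam * wj for wj in w]  # gradient of the L2 penalty term
        gb = 0.0
        for xi, yi in zip(X, y):
            z = sum(wj * xj for wj, xj in zip(w, xi)) + b
            p = 1.0 / (1.0 + math.exp(-z))  # predicted probability
            for j in range(d):
                gw[j] += (p - yi) * xi[j] / n
            gb += (p - yi) / n
        w = [wj - lr * gj for wj, gj in zip(w, gw)]
        b -= lr * gb
    return w, b

def predict(w, b, X):
    return [1.0 / (1.0 + math.exp(-(sum(wj * xj for wj, xj in zip(w, xi)) + b)))
            for xi in X]

def auroc(y, scores):
    """Probability that a random positive outranks a random negative."""
    pos = [s for s, t in zip(scores, y) if t == 1]
    neg = [s for s, t in zip(scores, y) if t == 0]
    wins = sum((p > q) + 0.5 * (p == q) for p in pos for q in neg)
    return wins / (len(pos) * len(neg))

# Synthetic one-feature data: cases are shifted upward relative to non-cases.
random.seed(0)
y_all = [i % 2 for i in range(200)]
X_all = [[random.gauss(1.5 * yi, 1.0)] for yi in y_all]
w, b = train_logreg_l2(X_all[:150], y_all[:150])
scores = predict(w, b, X_all[150:])
y_test = y_all[150:]
```

The `auroc` function computes the same quantity the paper's 0.82 and 0.75 estimates summarize: the chance that a randomly chosen case receives a higher risk score than a randomly chosen non-case.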
Impact of Discontinuing Contact Precautions for Methicillin-Resistant Staphylococcus aureus and Vancomycin-Resistant Enterococcus: An Interrupted Time Series Analysis
- Gonzalo Bearman, Salma Abbas, Nadia Masroor, Kakotan Sanogo, Ginger Vanhoozer, Kaila Cooper, Michelle Doll, Michael P. Stevens, Michael B. Edmond
- Published online by Cambridge University Press:
- 27 March 2018, pp. 676-682
OBJECTIVE
To investigate the impact of discontinuing contact precautions among patients infected or colonized with methicillin-resistant Staphylococcus aureus (MRSA) or vancomycin-resistant Enterococcus (VRE) on rates of healthcare-associated infection (HAI).
DESIGN
Single-center, quasi-experimental study conducted between 2011 and 2016.
METHODS
We employed an interrupted time series design to evaluate the impact of 7 horizontal infection prevention interventions across intensive care units (ICUs) and hospital wards at an 865-bed urban, academic medical center. These interventions included (1) implementation of a urinary catheter bundle in January 2011, (2) chlorhexidine gluconate (CHG) perineal care outside ICUs in June 2011, (3) hospital-wide CHG bathing outside of ICUs in March 2012, (4) discontinuation of contact precautions for MRSA and VRE in April 2013, (5) assessments and feedback with bare below the elbows (BBE) and contact precautions in August 2014, (6) implementation of an ultraviolet-C disinfection robot in March 2015, and (7) 72-hour automatic urinary catheter discontinuation orders in March 2016. Segmented regression modeling was performed to assess the changes in infection rates attributable to the interventions.
RESULTS
The rate of HAI declined throughout the study period. Infection rates for MRSA and VRE decreased by 1.31 (P=.76) and 6.25 (P=.21) per 100,000 patient days, respectively, and the infection rate decreased by 2.44 per 10,000 patient days (P=.23) for device-associated HAI following discontinuation of contact precautions.
CONCLUSION
The discontinuation of contact precautions for patients infected or colonized with MRSA or VRE, when combined with horizontal infection prevention measures, was not associated with an increased incidence of MRSA and VRE device-associated infections. This approach may represent a safe and cost-effective strategy for managing these patients.
Infect Control Hosp Epidemiol 2018;39:676–682
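Segmented regression of the kind used in interrupted time series analyses like the one above fits four quantities: an intercept, a baseline slope, a level change at the intervention, and a slope change after it. Here is a minimal sketch with made-up monthly rates (not the study's data), solving ordinary least squares directly via the normal equations:

```python
def its_design(times, t0):
    """Rows: intercept, time, level-change indicator, post-intervention slope."""
    return [[1.0, float(t), float(t >= t0), (t - t0) * (t >= t0)] for t in times]

def ols(X, y):
    """Solve the normal equations (X'X) beta = X'y by Gaussian elimination."""
    d = len(X[0])
    A = [[sum(r[i] * r[j] for r in X) for j in range(d)] for i in range(d)]
    b = [sum(r[i] * yi for r, yi in zip(X, y)) for i in range(d)]
    for col in range(d):                      # forward elimination, partial pivoting
        piv = max(range(col, d), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, d):
            f = A[r][col] / A[col][col]
            for c in range(col, d):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    beta = [0.0] * d
    for r in range(d - 1, -1, -1):            # back substitution
        beta[r] = (b[r] - sum(A[r][c] * beta[c] for c in range(r + 1, d))) / A[r][r]
    return beta

# Made-up monthly rates: baseline 10 falling 0.1/month, then a drop of 2
# and a slope change of +0.05 when the intervention starts at month 12.
t0, times = 12, list(range(24))
rates = [10 - 0.1 * t - 2 * (t >= t0) + 0.05 * (t - t0) * (t >= t0) for t in times]
beta = ols(its_design(times, t0), rates)
# beta ≈ [10, -0.1, -2, 0.05]: the fitted level change and slope change
```

In practice a statistical package would also report standard errors and handle autocorrelation; this sketch only shows how the level-change and slope-change terms in the design matrix capture an intervention's effect.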
Original Article
Poor clinical outcomes associated with community-onset urinary tract infections due to extended-spectrum cephalosporin-resistant Enterobacteriaceae
- Judith A. Anesi, Ebbing Lautenbach, Irving Nachamkin, Charles Garrigan, Warren B. Bilker, Jacqueline Omorogbe, Lois Dankwa, Mary K. Wheeler, Pam Tolomeo, Jennifer H. Han, for the CDC Prevention Epicenters Program
- Published online by Cambridge University Press:
- 30 October 2018, pp. 1431-1435
Objective
Resistance to extended-spectrum cephalosporins (ESC) among Enterobacteriaceae (EB) is increasingly prevalent. We sought to determine the clinical outcomes associated with community-onset ESC-resistant (ESC-R) EB urinary tract infections (UTIs) in a US health system.
Design
Retrospective cohort study.
Patients
All patients presenting to the emergency departments (EDs) or outpatient practices with EB UTIs between 2010 and 2013 were included. Exposed patients had ESC-R EB UTIs. Unexposed patients had ESC-susceptible EB UTIs and were matched to exposed subjects 1:1 on study year. Multivariable logistic regression analyses were performed to evaluate the association between ESC-R EB UTI and the outcomes of clinical failure and inappropriate initial antibiotic therapy (IIAT).
Results
A total of 302 patients with community-onset EB UTI were included: 151 exposed and 151 unexposed. On multivariable analyses, UTI due to an ESC-R EB was significantly associated with clinical failure (odds ratio [OR], 7.07; 95% confidence interval [CI], 3.16–15.82; P<.01). Other independent risk factors for clinical failure included infection with Citrobacter spp and need for hemodialysis. UTI due to an ESC-R EB was also significantly associated with IIAT (OR, 4.40; 95% CI, 2.64–7.33; P<.01).
Conclusions
Community-onset UTI due to an ESC-R EB organism is significantly associated with clinical failure, which may be due in part to IIAT. Further studies are needed to determine which patients in the community are at high risk for drug-resistant infection to help inform prompt diagnosis and appropriate antibiotic prescribing for ESC-R EB.
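Odds ratios with 95% confidence intervals, as reported throughout these abstracts, are obtained by exponentiating a logistic-regression coefficient and its Wald interval. A small sketch of the arithmetic, with illustrative numbers that are not the study's estimates:

```python
import math

def odds_ratio_ci(beta, se, z=1.96):
    """Exponentiate a logistic coefficient and its Wald interval."""
    return math.exp(beta), math.exp(beta - z * se), math.exp(beta + z * se)

def two_by_two_or(a, b, c, d):
    """Crude OR from a 2x2 table: exposed (a cases, b non-cases)
    vs unexposed (c cases, d non-cases)."""
    beta = math.log((a * d) / (b * c))
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    return odds_ratio_ci(beta, se)

# Illustrative coefficient: beta = 0.5 with standard error 0.2
or_, lo, hi = odds_ratio_ci(0.5, 0.2)
# OR ≈ 1.65 (95% CI ≈ 1.11–2.44)
```

The multivariable ORs in the study come from a fitted regression model rather than a raw 2x2 table, but the exponentiation step from coefficient to OR is the same.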
Original Articles
Importation, Mitigation, and Genomic Epidemiology of Candida auris at a Large Teaching Hospital
- Emil P. Lesho, Melissa Z. Bronstein, Patrick McGann, Jason Stam, Yoon Kwak, Rosslyn Maybank, Jodi McNamara, Megan Callahan, Jean Campbell, Mary K. Hinkle, Edward E. Walsh
- Published online by Cambridge University Press:
- 06 December 2017, pp. 53-57
OBJECTIVE
Candida auris (CA) is an emerging multidrug-resistant pathogen associated with increased mortality. The environment may play a role, but transmission dynamics remain poorly understood. We sought to limit environmental and patient CA contamination following a sustained unsuspected exposure.
DESIGN
Quasi-experimental observation.
SETTING
A 528-bed teaching hospital.
PATIENTS
The index case patient and 17 collocated ward mates.
INTERVENTION
Immediately after confirmation of CA in the bloodstream and urine of a patient admitted 6 days previously, active surveillance, enhanced transmission-based precautions, environmental cleaning with peracetic acid-hydrogen peroxide and ultraviolet light, and patient relocation were undertaken. Pre-existing agreements and foundational relationships among internal multidisciplinary teams and external partners were leveraged to bolster detection and mitigation efforts and to provide genomic epidemiology.
RESULTS
Candida auris was isolated from 3 of 132 surface samples on days 8, 9, and 15 of ward occupancy, and from no patient samples (0 of 48). Environmental and patient isolates were nearly identical (differing by 4–8 single-nucleotide polymorphisms [SNPs]) and most closely related to the 2013 India CA-6684 strain (~200 SNPs), supporting the epidemiological hypothesis that the source of environmental contamination was the index case patient, who probably acquired the South Asian strain from another New York hospital. All isolates contained a mutation associated with azole resistance (K163R) found in the India 2015 VPCI strain but not in CA-6684. The index patient remained colonized until death. No surfaces were CA-positive 1 month later.
CONCLUSION
Compared to previous descriptions, CA dissemination was minimal. Immediate access to rapid CA diagnostics facilitates early containment strategies and outbreak investigations.
Infect Control Hosp Epidemiol 2018;39:53–57
Original Article
Outpatient antimicrobial stewardship targets for treatment of skin and soft-tissue infections
- Preeti Jaggi, Ling Wang, Sean Gleeson, Melissa Moore-Clingenpeel, Joshua R. Watson
- Published online by Cambridge University Press:
- 02 July 2018, pp. 936-940
Objective
We sought to identify factors associated with long duration and/or non–first-line choice of treatment for pediatric skin and soft-tissue infections (SSTIs).
Design
Retrospective cohort study.
Setting
Ambulatory encounter claims of Medicaid-insured children lacking chronic medical conditions treated for SSTI and/or animal bite injury in Ohio in 2014.
Methods
For all diagnoses, long treatment duration was defined as treatment >7 days. Non–first-line choice of treatment for SSTI included treatment with 2 antimicrobials dispensed on the same calendar day or any treatment not listed in the Infectious Diseases Society of America guidelines. The adjusted odds of (1) long treatment duration and (2) non–first-line choice of treatment were calculated for patient age, prescriber type, and patient county of residence characteristics (ie, rural vs metropolitan area and poverty rate).
Results
Of 10,310 encounters with complete data available, long treatment duration was prescribed in 7,968 (77.3%). The most common duration of treatment prescribed was 10 days. A non–first-line choice was prescribed in 1,030 encounters (10%). Dispensation of 2 antimicrobials on the same calendar day was the most common reason for the non–first-line choice, and of these, trimethoprim-sulfamethoxazole plus a first-generation cephalosporin was the most common regimen. Compared to pediatricians, the adjusted odds ratio of long treatment duration was significantly lower for all other primary care specialties. Conversely, nonpediatricians were more likely to prescribe a non–first-line treatment choice. Patient residence in a high-poverty county increased the odds of both long duration and non–first-line choice of treatment.
Conclusions
Healthcare claims may be utilized to measure opportunities for first-line choice and/or shorter duration of treatment for SSTI.