Original Articles
Postoperative Burden of Hospital-Acquired Clostridium difficile Infection
- Zaid M. Abdelsattar, Greta Krapohl, Layan Alrahmani, Mousumi Banerjee, Robert W. Krell, Sandra L. Wong, Darrell A. Campbell, Jr, David M. Aronoff, Samantha Hendren
- Published online by Cambridge University Press: 05 January 2015, pp. 40-46
OBJECTIVE
Clostridium difficile infection (CDI) is a common hospital-acquired infection. Previous reports on the incidence, risk factors, and impact of CDI on resources in the surgical population are limited. In this context, we study CDI across diverse surgical settings.
METHODS
We prospectively identified patients with laboratory-confirmed postoperative CDI after 40 different general, vascular, or gynecologic surgeries at 52 academic and community hospitals between July 2012 and September 2013. We used multivariable regression models to identify CDI risk factors and to determine the impact of CDI on resource utilization.
RESULTS
Of 35,363 patients, 179 (0.51%) developed postoperative CDI. The highest rates of CDI occurred after lower-extremity amputation (2.6%), followed by bowel resection or repair (0.9%) and gastric or esophageal operations (0.7%). Gynecologic and endocrine operations had the lowest rates (0.1% and 0%, respectively). By multivariable analyses, older age, chronic immunosuppression, hypoalbuminemia (≤3.5 g/dL), and preoperative sepsis were associated with CDI. Use of prophylactic antibiotics was not independently associated with CDI, nor were sex, body mass index (BMI), surgical priority, weight loss, or comorbid conditions. Three procedure groups had higher odds of postoperative CDI: lower-extremity amputations (adjusted odds ratio [aOR], 3.5; P=.03), gastric or esophageal operations (aOR, 2.1; P=.04), and bowel resection or repair (aOR, 2.0; P=.04). Postoperative CDI was independently associated with increased length of stay (mean, 13.7 d vs 4.5 d), emergency department presentations (18.9% vs 9.1%), and readmissions (38.9% vs 7.2%; all P<.001).
CONCLUSIONS
Incidence of postoperative CDI varies by surgical procedure. Postoperative CDI is also associated with higher rates of extended length of stay, emergency room presentations, and readmissions, which places a potentially preventable burden on hospital resources.
Infect Control Hosp Epidemiol 2015;36(1): 40–46
Evolving Epidemiology of Staphylococcus aureus Bacteremia
- Yoona Rhee, Alla Aroutcheva, Bala Hota, Robert A. Weinstein, Kyle J. Popovich
- Published online by Cambridge University Press: 16 September 2015, pp. 1417-1422
BACKGROUND
Methicillin-resistant Staphylococcus aureus (MRSA) infections due to USA300 have become widespread in community and healthcare settings. It is unclear whether risk factors for bloodstream infections (BSIs) differ by strain type.
OBJECTIVE
To examine the epidemiology of S. aureus BSIs, including USA300 and non-USA300 MRSA strains.
DESIGN
Retrospective observational study with molecular analysis.
SETTING
Large urban public hospital.
PATIENTS
Individuals with S. aureus BSIs from January 1, 2007 through December 31, 2013.
METHODS
We used electronic surveillance data to identify cases of S. aureus BSI. Available MRSA isolates were analyzed by pulsed-field gel electrophoresis. Poisson regression was used to evaluate changes in BSI incidence over time. Risk factor data were collected by medical chart review, and logistic regression was used for multivariate analysis of risk factors.
RESULTS
A total of 1,015 cases of S. aureus BSI were identified during the study period; 36% were due to MRSA. The incidence of hospital-onset (HO) MRSA BSIs decreased while that of community-onset (CO) MRSA BSIs remained stable. The rates of CO- and HO-methicillin-susceptible S. aureus infections both decreased over time. More than half of HO-MRSA BSIs were due to the USA300 strain type, and for 4 years the proportion of HO-MRSA BSIs due to USA300 exceeded 60%. On multivariate analysis, current or former drug use was the only epidemiologic risk factor for CO- or HO-MRSA BSIs due to USA300 strains.
CONCLUSIONS
USA300 MRSA is endemic in communities and hospitals, and certain populations (eg, those who use illicit drugs) may benefit from enhanced prevention efforts in the community.
Infect. Control Hosp. Epidemiol. 2015;36(12):1417–1422
Clostridium difficile Infection Among Veterans Health Administration Patients
- Yinong Young-Xu, Jennifer L Kuntz, Dale N. Gerding, Julia Neily, Peter Mills, Erik R. Dubberke, Margaret A. Olsen, Ciarán P. Kelly, Cédric Mahé
- Published online by Cambridge University Press: 05 June 2015, pp. 1038-1045
OBJECTIVE
To report on the prevalence and incidence of Clostridium difficile infection (CDI) from 2009 to 2013 among Veterans Health Administration (VHA) patients.
DESIGN
A retrospective descriptive analysis of data extracted from a large electronic medical record (EMR) database.
SETTING
Data were acquired from VHA healthcare records from 2009 to 2013 that included outpatient clinical visits, long-term care, and hospitalized care as well as pharmacy and laboratory information.
RESULTS
In 2009, there were 10,207 CDI episodes, and in 2013, there were 12,143 CDI episodes, an increase of 19.0%. The overall CDI rate increased by 8.4%, from 193 episodes per 100,000 patient-years in 2009 to 209 episodes per 100,000 patient-years in 2013. Of the CDI episodes identified in 2009, 58% were identified during a hospitalization and 42% in an outpatient setting. In 2013, 44% of the CDI episodes were identified in an outpatient setting.
CONCLUSION
This is one of the largest studies to have utilized timely EMR data to describe the current CDI epidemiology at the VHA. Despite an aging population with a greater burden of comorbidity than the general US population, our data show that VHA CDI rates stabilized between 2011 and 2013 following increases likely attributable to the introduction of the more sensitive nucleic acid amplification tests (NAATs). The findings in this report will help establish an accurate benchmark against which both current and future VA CDI prevention initiatives can be measured.
Infect. Control Hosp. Epidemiol. 2015;36(9):1038–1045
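The percentage changes reported above can be reproduced directly from the episode counts and rates; a minimal arithmetic check in Python (note that recomputing the rate change from the rounded per-100,000 rates gives roughly 8.3%, consistent with the published 8.4% having been derived from unrounded values):

```python
# Reported VHA CDI episode counts and rates (per 100,000 patient-years), 2009 vs 2013
episodes_2009, episodes_2013 = 10_207, 12_143
rate_2009, rate_2013 = 193, 209

# Relative increase in episode count: matches the reported 19.0%
episode_increase_pct = (episodes_2013 - episodes_2009) / episodes_2009 * 100
print(round(episode_increase_pct, 1))  # 19.0

# Relative increase in rate: ~8.3% from the rounded rates, close to the
# reported 8.4% (presumably computed before rounding)
rate_increase_pct = (rate_2013 - rate_2009) / rate_2009 * 100
print(round(rate_increase_pct, 1))
```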
An Economic Analysis of Adherence Engineering to Improve Use of Best Practices During Central Line Maintenance Procedures
- Richard E. Nelson, Aaron W. Angelovic, Scott D. Nelson, Jeremy R. Gleed, Frank A. Drews
- Published online by Cambridge University Press: 16 March 2015, pp. 550-556
OBJECTIVE
Adherence engineering applies human factors principles to examine non-adherence within a specific task and to guide the development of materials or equipment to increase protocol adherence and reduce human error. Central line maintenance (CLM) for intensive care unit (ICU) patients is a task through which error or non-adherence to protocols can cause central line-associated bloodstream infections (CLABSIs). We conducted an economic analysis of an adherence engineering CLM kit designed to improve the CLM task and reduce the risk of CLABSI.
METHODS
We constructed a Markov model to compare the cost-effectiveness of the CLM kit, which contains each of the 27 items necessary for performing the CLM procedure, with that of the standard care procedure for CLM, in which each item for dressing maintenance is gathered separately. We estimated the model using the overall cost of CLABSI ($45,685) as well as the excess length of stay (6.9 excess ICU days, 3.5 excess general ward days).
RESULTS
Assuming the CLM kit reduces the risk of CLABSI by 100% and 50%, this strategy was less costly (cost savings between $306 and $860) and more effective (between 0.05 and 0.13 more quality-adjusted life-years) compared with not using the pre-packaged kit. We identified threshold values for the effectiveness of the kit in reducing CLABSI for which the kit strategy was no longer less costly.
CONCLUSION
An adherence engineering–based intervention to streamline the CLM process can improve patient outcomes and lower costs. Patient safety can be improved by adopting new approaches that are based on human factors principles.
Infect Control Hosp Epidemiol 2015;00(0): 1–7
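The break-even logic behind such a threshold analysis can be sketched as a simple expected-cost comparison. Only the $45,685 CLABSI cost below comes from the abstract; the baseline risk and per-patient kit cost are hypothetical placeholders, so this illustrates the threshold idea rather than reproducing the authors' Markov model:

```python
# Illustrative break-even calculation for a CLABSI prevention kit.
# CLABSI_COST is from the abstract; the other parameters are assumed.
CLABSI_COST = 45_685.0   # cost per CLABSI episode (from the abstract)
BASELINE_RISK = 0.02     # assumed baseline CLABSI probability per patient
KIT_EXTRA_COST = 25.0    # assumed incremental cost of the kit per patient

def net_savings_per_patient(risk_reduction: float) -> float:
    """Expected savings per patient if the kit cuts CLABSI risk by `risk_reduction`."""
    averted_cost = BASELINE_RISK * risk_reduction * CLABSI_COST
    return averted_cost - KIT_EXTRA_COST

def breakeven_risk_reduction() -> float:
    """Smallest relative risk reduction at which the kit pays for itself."""
    return KIT_EXTRA_COST / (BASELINE_RISK * CLABSI_COST)

# With these assumed inputs the kit breaks even at a ~2.7% relative risk
# reduction, so a 50% reduction would be strongly cost-saving.
print(f"break-even risk reduction: {breakeven_risk_reduction():.3f}")
```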
Frequency, Risk Factors, and Outcomes of Vancomycin-Resistant Enterococcus Colonization and Infection in Patients with Newly Diagnosed Acute Leukemia: Different Patterns in Patients with Acute Myelogenous and Acute Lymphoblastic Leukemia
- Clyde D. Ford, Bert K. Lopansri, Souha Haydoura, Greg Snow, Kristin K. Dascomb, Julie Asch, Finn Bo Petersen, John P. Burke
- Published online by Cambridge University Press: 05 January 2015, pp. 47-53
OBJECTIVE
To determine the frequency, risk factors, and outcomes for vancomycin-resistant Enterococcus (VRE) colonization and infection in patients with newly diagnosed acute leukemia.
DESIGN
Retrospective clinical study with VRE molecular strain typing.
SETTING
A regional referral center for acute leukemia.
PATIENTS
Two hundred fourteen consecutive patients with newly diagnosed acute leukemia between 2006 and 2012.
METHODS
All patients had a culture of first stool and weekly surveillance for VRE. Clinical data were abstracted from the Intermountain Healthcare electronic data warehouse. VRE molecular typing was performed utilizing the semi-automated DiversiLab System.
RESULTS
The rate of VRE colonization was directly proportional to length of stay and was higher in patients with acute lymphoblastic leukemia. Risk factors associated with colonization included administration of corticosteroids (P=0.004) and carbapenems (P=0.009). Neither a colonized prior room occupant nor an increased unit colonization pressure affected colonization risk. Colonized patients with acute myelogenous leukemia had an increased risk of VRE bloodstream infection (BSI; P=0.002). Other risk factors for VRE BSI included severe neutropenia (P=0.04) and diarrhea (P=0.008). Fifty-eight percent of BSI isolates were identical or related by molecular typing. Eighty-nine percent of bloodstream isolates were identical or related to stool isolates identified by surveillance cultures. VRE BSI was associated with increased costs (P=0.0003) and possibly mortality.
CONCLUSIONS
VRE colonization has important consequences for patients with acute myelogenous leukemia undergoing induction therapy. For febrile neutropenic patients with acute myelogenous leukemia, use of empirical antibiotic regimens that avoid carbapenems and include VRE coverage may be helpful in decreasing the risks associated with VRE BSI.
Infect Control Hosp Epidemiol 2015;36(1): 47–53
The Perennial Problem of Variability In Adenosine Triphosphate (ATP) Tests for Hygiene Monitoring Within Healthcare Settings
- Greg S. Whiteley, Chris Derry, Trevor Glasbey, Paul Fahey
- Published online by Cambridge University Press: 03 March 2015, pp. 658-663
OBJECTIVE
To investigate the reliability of commercial ATP bioluminometers and to document precision and variability measurements using known and quantitated standard materials.
METHODS
Four commercially branded ATP bioluminometers and their consumables were subjected to a series of controlled studies with quantitated materials in multiple repetitions of dilution series. The individual dilutions were applied directly to ATP swabs. To assess precision and reproducibility, each dilution step was tested in triplicate or quadruplicate, and the relative light unit (RLU) reading from each test point was recorded. Results across the multiple dilution series were normalized using the coefficient of variation.
RESULTS
The results for pure ATP and bacterial ATP from suspensions of Staphylococcus epidermidis and Pseudomonas aeruginosa are presented graphically. The data indicate that precision and reproducibility are poor across all brands tested. Standard deviation was as high as 50% of the mean for all brands, and in the field users are not provided any indication of this level of imprecision.
CONCLUSIONS
The variability of commercial ATP bioluminometers and their consumables is unacceptably high with the current technical configuration. The advantage of speed of response is undermined by instrument imprecision expressed in the numerical RLU scale.
Infect Control Hosp Epidemiol 2015;00(0):1–6
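The coefficient of variation used to normalize the dilution-series results is simply the standard deviation expressed as a fraction of the mean, which makes readings comparable across instruments with different RLU scales. A minimal sketch over hypothetical replicate readings (not the study's data):

```python
from statistics import mean, stdev

def coefficient_of_variation(readings: list[float]) -> float:
    """CV = sample standard deviation / mean; unitless, so RLU scales cancel."""
    return stdev(readings) / mean(readings)

# Hypothetical triplicate RLU readings at one dilution step.
replicates = [1200.0, 800.0, 1000.0]
cv = coefficient_of_variation(replicates)
# A CV of 0.20 means the SD is 20% of the mean; the study reported SDs
# as high as 50% of the mean across the brands tested.
print(f"CV = {cv:.2f}")  # CV = 0.20
```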
A Randomized Clinical Trial Comparing Use of Rapid Molecular Testing for Staphylococcus aureus for Patients With Cutaneous Abscesses in the Emergency Department With Standard of Care
- Larissa S. May, Richard E. Rothman, Loren G. Miller, Gillian Brooks, Mark Zocchi, Catherine Zatorski, Andrea F. Dugas, Chelsea E. Ware, Jeanne A. Jordan
- Published online by Cambridge University Press: 26 August 2015, pp. 1423-1430
OBJECTIVE
To determine whether real-time availability of rapid molecular test results for Staphylococcus aureus would impact emergency department clinicians' antimicrobial selection for adults with cutaneous abscesses.
DESIGN
We performed a prospective, randomized controlled trial comparing a rapid molecular test with standard-of-care culture-based testing. Follow-up telephone calls were made between 2 and 7 days, 1 month, and 3 months after discharge.
SETTING
Two urban, academic emergency departments.
PATIENTS
Patients at least 18 years old presenting with a chief complaint of abscess, cellulitis, or insect bite and receiving incision and drainage were eligible. Seven hundred seventy-eight people were assessed for eligibility, and 252 met eligibility criteria.
METHODS
Clinician antibiotic selection and clinical outcomes were evaluated. Test performance was assessed as an ad hoc outcome.
RESULTS
We enrolled 252 patients, and 126 were randomized to receive the rapid test. Methicillin-susceptible S. aureus–positive patients receiving rapid test results were prescribed beta-lactams more often than controls (absolute difference, 14.5% [95% CI, 1.1%–30.1%]), whereas methicillin-resistant S. aureus–positive patients receiving rapid test results were more often prescribed anti–methicillin-resistant S. aureus antibiotics (absolute difference, 21.5% [95% CI, 10.1%–33.0%]). There were no significant differences between the 2 groups in 1-week or 3-month clinical outcomes.
CONCLUSION
Availability of rapid molecular test results after incision and drainage was associated with more-targeted antibiotic selection.
TRIAL REGISTRATION
clinicaltrials.gov Identifier: NCT01523899
Infect. Control Hosp. Epidemiol. 2015;36(12):1423–1430
Risk Factors for In-Hospital Mortality among a Cohort of Children with Clostridium difficile Infection
- Neika Vendetti, Theoklis Zaoutis, Susan E. Coffin, Julia Shaklee Sammons
- Published online by Cambridge University Press: 02 July 2015, pp. 1183-1189
OBJECTIVE
The incidence of Clostridium difficile infection (CDI) has increased and has been associated with poor outcomes among hospitalized children, including increased risk of death. The purpose of this study was to identify risk factors for all-cause in-hospital mortality among children with CDI.
METHODS
A multicenter cohort of children with CDI, aged 1–18 years, was established among children hospitalized at 41 freestanding children’s hospitals between January 1, 2006 and August 31, 2011. Children with CDI were identified using a validated case-finding tool (ICD-9-CM code for CDI plus C. difficile test charge). Only the first CDI-related hospitalization during the study period was used. Risk factors for all-cause in-hospital mortality within 30 days of C. difficile testing were evaluated using a multivariable logistic regression model.
RESULTS
We identified 7,318 children with CDI during the study period. The median age of this cohort was 6 years [interquartile range (IQR), 2–13]; the mortality rate was 1.5% (n=109); and the median number of days between C. difficile testing and death was 12 (IQR, 7–20). Independent risk factors for death included older age [adjusted odds ratio (95% confidence interval), 2.29 (1.40–3.77)], underlying malignancy [3.57 (2.36–5.40)], cardiovascular disease [2.06 (1.28–3.30)], hematologic/immunologic condition [1.89 (1.05–3.39)], gastric acid suppression [2.70 (1.43–5.08)], and presence of >1 severity of illness marker [3.88 (2.44–6.19)].
CONCLUSION
Patients with select chronic conditions and more severe disease are at increased risk of death. Identifying risk factors for in-hospital mortality can help detect subpopulations of children that may benefit from targeted CDI prevention and treatment strategies.
Infect Control Hosp Epidemiol 2015;36(10):1183–1189
Precautionary Practices of Healthcare Workers Who Disinfect Medical and Dental Devices Using High-Level Disinfectants
- Scott A. Henn, James M. Boiano, Andrea L. Steege
- Published online by Cambridge University Press: 18 December 2014, pp. 180-185
BACKGROUND
High-level disinfectants (HLDs) are used throughout the healthcare industry to chemically disinfect reusable, semicritical medical and dental devices to control and prevent healthcare-associated infections among patient populations. Workers who use HLDs are at risk of exposure to these chemicals, some of which are respiratory and skin irritants and sensitizers.
OBJECTIVE
To evaluate exposure controls used and to better understand impediments to healthcare workers using personal protective equipment while handling HLDs.
DESIGN
Web-based survey.
PARTICIPANTS
A targeted sample of members of professional practice organizations representing nurses, technologists/technicians, dental professionals, respiratory therapists, and others who reported handling HLDs in the previous 7 calendar days. Participating organizations invited either all or a random sample of members via email, which included a hyperlink to the survey.
METHODS
Descriptive analyses were conducted including simple frequencies and prevalences.
RESULTS
A total of 4,657 respondents completed the survey. The HLDs used most often were glutaraldehyde (59%), peracetic acid (16%), and ortho-phthalaldehyde (15%). Examples of work practices or events that could increase exposure risk included failure to wear water-resistant gowns (44%); absence of standard procedures for minimizing exposure (19%); lack of safe handling training (17%); failure to wear protective gloves (9%); and a spill/leak of HLD during handling (5%). Among all respondents, 12% reported skin contact with HLDs, and 33% of these respondents reported that they did not always wear gloves.
CONCLUSION
Findings indicated that precautionary practices were not always used, underscoring the importance of improved employer and worker training and education regarding HLD hazards.
Infect Control Hosp Epidemiol 2014;00(0): 1–6
Central Line-Associated Bloodstream Infections in Non-ICU Inpatient Wards: A 2-Year Analysis
- Yoona Rhee, Michael Heung, Benrong Chen, Carol E. Chenoweth
- Published online by Cambridge University Press: 23 January 2015, pp. 424-430
OBJECTIVE
Little is known about patient-specific factors contributing to central line-associated bloodstream infection (CLABSI) outside of the intensive care unit (ICU). We sought to describe these factors and hypothesized that dialysis patients would comprise a significant proportion of this cohort.
DESIGN
Retrospective observational study from January 2010 to December 2011.
SETTING
An 880-bed tertiary teaching hospital.
PATIENTS
Patients with CLABSI in non-ICU wards.
METHODS
CLABSI patients were identified from existing infection-control databases, and primary chart review was conducted. National Healthcare Safety Network (NHSN) definitions were utilized for CLABSI and pathogen classification. CLABSI rates were calculated per patient-day. Total mortality rates were inclusive of hospice patients.
RESULTS
Over a 2-year period, 104 patients incurred 113 CLABSIs, for an infection rate of 0.35 per 1,000 patient-days. The mean length of hospital stay prior to CLABSI was 16±13.3 days, nearly 3 times the hospital-wide non-ICU length of stay. Only 11 patients (10.6%) received dialysis within 48 hours of CLABSI. However, 67% of patients had a hematologic malignancy, and 91.8% of those admitted with a malignant hematologic diagnosis were neutropenic at the time of CLABSI. Enterococcus spp. was the most common organism recovered, and half of all central venous catheters (CVCs) present were peripherally inserted central catheters (PICC lines). Mortality rates were 18.3% overall and 27.3% among dialysis patients.
CONCLUSIONS
In patients with CLABSIs outside of the ICU, only 10.6% received dialysis prior to infection. However, underlying hematologic malignancy, neutropenia, and PICC lines were highly prevalent in this population.
Infect Control Hosp Epidemiol 2015;00(0):1–7
Epidemiology of Methicillin-Susceptible Staphylococcus aureus in a Neonatology Ward
- Yvonne Achermann, Kati Seidl, Stefan P. Kuster, Nadja Leimer, Nina Durisch, Evelyne Ajdler-Schäffler, Stephan Karrer, Gabriela Senn, Anne Holzmann-Bürgel, Aline Wolfensberger, Antonio Leone, Romaine Arlettaz, Annelies S. Zinkernagel, Hugo Sax
- Published online by Cambridge University Press: 20 August 2015, pp. 1305-1312
OBJECTIVE
In-hospital transmission of methicillin-susceptible Staphylococcus aureus (MSSA) among neonates remains enigmatic. We describe the epidemiology of MSSA colonization and infection in a 30-bed neonatal ward.
DESIGN
Multimodal outbreak investigation
SETTING
A public 800-bed tertiary care university hospital in Switzerland
METHODS
Investigations in 2012–2013, triggered by an MSSA infection cluster, included prospective MSSA infection surveillance, microbiologic screening of neonates and the environment, onsite observations, and a prospective cohort study. MSSA isolates were characterized by pulsed-field gel electrophoresis (PFGE), and selected isolates were examined for multilocus sequence type (MLST) and virulence factors.
RESULTS
Among 726 admissions in 2012, 30 patients (4.1%) suffered from MSSA infections, including 8 (1.1%) with bacteremia. Among 655 admissions in 2013, 13 (2.0%) suffered from MSSA infections, including 2 (0.3%) with bacteremia. Among 177 neonates screened for S. aureus carriage, 77 (44%) tested positive overall. A predominant PFGE-1-ST30 strain was identified in 6 of 30 infected neonates (20%) and 30 of 77 colonized neonates (39%). This persistent clone was pvl-negative and tst-positive and belonged to agr group III. We found no environmental point source. MSSA carriage was associated with central vascular catheter use but not with a particular midwife, nurse, physician, or isolette. Observed healthcare worker behavior may have propagated transmission via hands and fomites. Despite multimodal interventions, clonal transmission and colonization continued, and another clone, PFGE-6-ST5, became predominant.
CONCLUSIONS
Hospital-acquired MSSA clones represent a high proportion of MSSA colonization but not of MSSA infections in neonate inpatients. Although MSSA transmission persisted, infection rates decreased concurrently with the interventions. It remains to be established whether eradication of hospital-acquired MSSA strains would reduce infection rates further.
Infect. Control Hosp. Epidemiol. 2015;36(11):1305–1312
Interindividual Contacts and Carriage of Methicillin-Resistant Staphylococcus aureus: A Nested Case-Control Study
- Thomas Obadia, Lulla Opatowski, Laura Temime, Jean-Louis Herrmann, Éric Fleury, Pierre-Yves Boëlle, Didier Guillemot
- Published online by Cambridge University Press: 20 April 2015, pp. 922-929
BACKGROUND
Reducing the spread of multidrug-resistant bacteria in hospitals remains a challenge. Current methods are screening of patients, isolation, and adherence to hygiene measures among healthcare workers (HCWs). More specific measures could rely on a better characterization of the contacts at risk of dissemination.
OBJECTIVE
To quantify how close-proximity interactions (CPIs) affect Staphylococcus aureus dissemination.
DESIGN
Nested case-control study.
SETTING
French long-term care facility in 2009.
PARTICIPANTS
Patients (n=329) and HCWs (n=261).
METHODS
We recorded CPIs using electronic devices together with S. aureus nasal carriage during 4 months in all participants. Cases consisted of patients showing incident S. aureus colonization and were paired with 8 control patients who did not exhibit incident colonization at the same date. Conditional logistic regression was used to quantify associations between incidence and exposure to demographic, network, and carriage covariables.
RESULTS
The local structure of contacts was informative about methicillin-resistant S. aureus (MRSA) carriage acquisition: CPIs with more HCWs were associated with incident MRSA colonization in patients (odds ratio [OR], 1.10 [95% CI, 1.04–1.17] for 1 more HCW), as were longer CPI durations (1.03 [1.01–1.06] for a 1-hour increase). Joint analysis of carriage and contacts showed increased carriage acquisition in case of CPI with another colonized individual (OR, 1.55 [1.14–2.11] for 1 more HCW). Global network measurements did not capture associations between contacts and carriage.
CONCLUSIONS
Electronically recorded CPIs inform on the risk of MRSA carriage, warranting further study of in-hospital contact networks to design targeted intervention strategies.
Infect. Control Hosp. Epidemiol. 2015;36(8):922–929
One-Week versus 2-Day Ventilator Circuit Change in Neonates with Prolonged Ventilation: Cost-Effectiveness and Impact on Ventilator-Associated Pneumonia
- Shih-Ming Chu, Mei-Chin Yang, Hsiu-Feng Hsiao, Jen-Fu Hsu, Reyin Lien, Ming-Chou Chiang, Ren-Huei Fu, Hsuan-Rong Huang, Kuang-Hung Hsu, Ming-Horng Tsai
- Published online by Cambridge University Press: 22 December 2014, pp. 287-293
Objective
To investigate the impact of 1-week ventilator circuit change on ventilator-associated pneumonia and its cost-effectiveness compared with a 2-day change.
Design
An observational cohort study.
Setting
A tertiary-level neonatal intensive care unit in a university-affiliated teaching hospital in Taiwan.
Patients
All neonates in the neonatal intensive care unit receiving invasive intubation for more than 1 week from July 1, 2011, through December 31, 2013.
Intervention
We investigated the impact of 2 ventilator circuit change regimens, either every 2 days or every 7 days, on ventilator-associated pneumonia in our cohort.
Measurements and Main Results
A total of 361 patients were maintained on mechanical ventilators for 13,981 days. The 2 groups did not differ significantly in any demographic characteristics. The rate of ventilator-associated pneumonia was comparable between the 2-day group and the 7-day group (8.2 vs 9.5 per 1,000 ventilator-days; P=.439). The durations of mechanical ventilation and hospital stay, and the rates of bloodstream infection and mortality, were also comparable between the 2 groups. Switching from a 2-day to a 7-day change policy would save our neonatal intensive care unit a yearly sum of US $29,350 and 525 working hours.
Conclusion
Decreasing the frequency of ventilator circuit changes from every 2 days to once per week is safe and cost-effective in neonates requiring prolonged intubation for more than 1 week.
Infect Control Hosp Epidemiol 2014;00(0): 1–7
Preventing Central Line–Associated Bloodstream Infections: A Qualitative Study of Management Practices
- Ann Scheck McAlearney, Jennifer L. Hefner, Julie Robbins, Michael I. Harrison, Andrew Garman
- Published online by Cambridge University Press: 23 February 2015, pp. 557-563
OBJECTIVE
To identify factors that may explain hospital-level differences in outcomes of programs to prevent central line–associated bloodstream infections (CLABSIs).
DESIGN
Extensive qualitative case study comparing higher- and lower-performing hospitals on the basis of reduction in the rate of central line–associated bloodstream infections. In-depth interviews were transcribed verbatim and analyzed to determine whether emergent themes differentiated higher- from lower-performing hospitals.
SETTING
Eight US hospitals that had participated in the federally funded On the CUSP: Stop BSI initiative.
PARTICIPANTS
One hundred ninety-four interviewees, including administrative leaders, clinical leaders, professional staff, and frontline physicians and nurses.
RESULTS
A main theme that differentiated higher- from lower-performing hospitals was a distinctive framing of the goal of “getting to zero” infections. Although all sites reported this goal, at the higher-performing sites the goal was explicitly stated, widely embraced, and aggressively pursued; in contrast, at the lower-performing hospitals the goal was more of an aspiration and not embraced as part of the strategy to prevent infections. Five additional management practices were nearly exclusively present in the higher-performing hospitals: (1) top-level commitment, (2) physician-nurse alignment, (3) systematic education, (4) meaningful use of data, and (5) rewards and recognition. We present these strategies for prevention of healthcare-associated infection as a management “bundle” with corresponding suggestions for implementation.
CONCLUSIONS
Some of the variance associated with CLABSI prevention program outcomes may relate to specific management practices. Adding a management practice bundle may provide critical guidance to physicians, clinical managers, and hospital leaders as they work to prevent healthcare-associated infections.
Infect Control Hosp Epidemiol 2015;00(0): 1–7
Severity of Disease Estimation and Risk-Adjustment for Comparison of Outcomes in Mechanically Ventilated Patients Using Electronic Routine Care Data
- Maaike S. M. van Mourik, Karel G. M. Moons, MICU Registry, Michael V. Murphy, Marc J. M. Bonten, Michael Klompas
- Published online by Cambridge University Press: 17 April 2015, pp. 807-815
BACKGROUND
Valid comparison between hospitals for benchmarking or pay-for-performance incentives requires accurate correction for underlying disease severity (case-mix). However, existing models are either very simplistic or require extensive manual data collection.
OBJECTIVE
To develop a disease severity prediction model based solely on data routinely available in electronic health records for risk-adjustment in mechanically ventilated patients.
DESIGN
Retrospective cohort study.
PARTICIPANTS
Mechanically ventilated patients from a single tertiary medical center (2006–2012).
METHODS
Predictors were extracted from electronic data repositories (demographic characteristics, laboratory tests, medications, microbiology results, procedure codes, and comorbidities) and assessed for feasibility and generalizability of data collection. Models for in-hospital mortality of increasing complexity were built using logistic regression. Estimated disease severity from these models was linked to rates of ventilator-associated events.
RESULTS
A total of 20,028 patients were initiated on mechanical ventilation, of whom 3,027 died in the hospital. For models of incremental complexity, the area under the receiver operating characteristic curve ranged from 0.83 to 0.88. A simple model including demographic characteristics, type of intensive care unit, time to intubation, blood culture sampling, 8 common laboratory tests, and surgical status achieved an area under the receiver operating characteristic curve of 0.87 (95% CI, 0.86–0.88) with adequate calibration. The estimated disease severity was associated with occurrence of ventilator-associated events.
CONCLUSIONS
Accurate estimation of disease severity in ventilated patients using electronic, routine care data was feasible using simple models. These estimates may be useful for risk-adjustment in ventilated patients. Additional research is necessary to validate and refine these models.
Infect. Control Hosp. Epidemiol. 2015;36(7):807–815
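The mortality models above are evaluated by the area under the receiver operating characteristic curve (AUC). As a minimal sketch of what that metric measures, the AUC equals the probability that a randomly chosen death receives a higher risk score than a randomly chosen survivor (ties counting half); the labels and scores below are hypothetical, not data from the study.

```python
# Rank-based AUC: pairwise comparison of positive (death) vs
# negative (survivor) risk scores; ties contribute 0.5.
def roc_auc(labels, scores):
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical risk scores for 4 survivors (label 0) and 3 deaths (label 1)
labels = [0, 0, 0, 0, 1, 1, 1]
scores = [0.10, 0.20, 0.35, 0.50, 0.40, 0.70, 0.90]
print(round(roc_auc(labels, scores), 3))  # 0.917: 11 of 12 pairs ranked correctly
```

An AUC of 0.87, as reported for the simple model, means that 87% of such death-survivor pairs would be ranked correctly by the estimated severity.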
Low Yield of Methicillin-Resistant Staphylococcus aureus Screening in Hemodialysis Patients: 10 Years’ Experience
- H. M. Gebreselassie, T. Kaspar, S. Droz, J. Marschall
-
- Published online by Cambridge University Press:
- 26 May 2015, pp. 1046-1049
-
OBJECTIVE
To determine the prevalence of methicillin-resistant Staphylococcus aureus (MRSA) nasal colonization in hemodialysis patients and to analyze the cost-effectiveness of our screening approach compared with an alternative strategy.
DESIGN
Screening study and cost-effectiveness analysis.
METHODS
Analysis of twice-yearly MRSA prevalence studies conducted in the hemodialysis unit of a 950-bed tertiary care hospital from January 1, 2004, through December 31, 2013. For this purpose, nasal swab samples were cultured on MRSA screening agar (mannitol-oxacillin biplate).
RESULTS
There were 20 mass screenings during the 10-year study period. We identified 415 patients participating in at least 1 screening, with an average of 4.5 screenings per patient. Of 415 screened patients, 15 (3.6%) were found to be MRSA carriers. The first mass screening in 2004 yielded the highest percentage of MRSA (6/101 [6%]). Only 7 subsequent screenings revealed new MRSA carriers, whereas 4 screenings confirmed previously known carriers, and 8 remained negative. None of the carriers developed MRSA bacteremia during the study period. The total cost of our screening approach (ie, screening and isolation costs) was US $93,930. The total cost of an alternative strategy (ie, no mass screening, with costs limited to isolation of index cases and contact tracing) was estimated to be US $5,382 (difference, US $88,548).
CONCLUSIONS
In an area of low MRSA endemicity (<5%), regular nasal screenings of a high-risk population yielded a low rate of MRSA carriers. Twice-yearly MRSA screening of dialysis patients is unlikely to be cost-effective if MRSA prevalence is low.
Infect. Control Hosp. Epidemiol. 2015;36(9):1046–1049
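The cost comparison in this abstract reduces to simple arithmetic on the two strategy totals. The figures below are the ones reported in the abstract; any further breakdown is not shown in the study summary.

```python
# Cost comparison: universal twice-yearly screening vs an
# index-case-only strategy (isolation + contact tracing), in US$.
screening_strategy = 93_930   # screening + isolation costs, as reported
alternative_strategy = 5_382  # index-case isolation + contact tracing, as reported
difference = screening_strategy - alternative_strategy
print(difference)  # 88548, matching the reported difference
```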
Evaluating State-Specific Antibiotic Resistance Measures Derived from Central Line-Associated Bloodstream Infections, National Healthcare Safety Network, 2011
- Minn M. Soe, Jonathan R. Edwards, Dawn M. Sievert, Philip M. Ricks, Shelley S. Magill, Scott K. Fridkin
-
- Published online by Cambridge University Press:
- 05 January 2015, pp. 54-64
-
DISCLOSURE
The findings and conclusions in this report are those of the authors and do not necessarily represent the official position of the Centers for Disease Control and Prevention or the Agency for Toxic Substances and Diseases Registry.
OBJECTIVE
Describe the impact of standardizing state-specific summary measures of antibiotic resistance that inform regional interventions to reduce transmission of resistant pathogens in healthcare settings.
DESIGN
Analysis of public health surveillance data.
METHODS
Central line–associated bloodstream infection (CLABSI) data from intensive care units (ICUs) of facilities reporting to the National Healthcare Safety Network in 2011 were analyzed. For CLABSI due to methicillin-resistant Staphylococcus aureus (MRSA), extended-spectrum cephalosporin (ESC)-nonsusceptible Klebsiella species, and carbapenem-nonsusceptible Klebsiella species, we computed 3 state-level summary measures of nonsusceptibility: crude percent nonsusceptible, model-based adjusted percent nonsusceptible, and crude infection incidence rate.
RESULTS
Overall, 1,791 facilities reported CLABSIs from ICU patients. Of 1,618 S. aureus CLABSIs with methicillin-susceptibility test results, 791 (48.9%) were due to MRSA. Of 756 Klebsiella CLABSIs with ESC-susceptibility test results, 209 (27.7%) were due to ESC-nonsusceptible Klebsiella, and among 661 Klebsiella CLABSIs with carbapenem susceptibility test results, 70 (10.6%) were due to carbapenem-nonsusceptible Klebsiella. All 3 state-specific measures demonstrated variability in magnitude by state. Adjusted measures, with few exceptions, were not appreciably different from crude values for any phenotype. When linking values of crude and adjusted percent nonsusceptible by state, a state’s absolute rank shifted slightly for MRSA in 5 instances and only once each for ESC-nonsusceptible and carbapenem-nonsusceptible Klebsiella species. Infection incidence measures correlated strongly with both percent nonsusceptibility measures.
CONCLUSIONS
Crude state-level summary measures, based on existing NHSN CLABSI data, may suffice to assess geographic variability in antibiotic resistance. As additional variables related to antibiotic resistance become available, risk-adjusted summary measures are preferable.
Infect Control Hosp Epidemiol 2015;36(1): 54–64
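The crude percent-nonsusceptible measure in this study is simply the proportion of tested isolates that are nonsusceptible. A minimal sketch using the pooled national counts from the abstract (the study's state-level stratification and model-based adjustment are not reproduced here; small rounding differences from the published figures are possible):

```python
# Crude percent nonsusceptible = 100 * nonsusceptible isolates / tested isolates
def pct_nonsusceptible(nonsusceptible, tested):
    return 100.0 * nonsusceptible / tested

# Pooled counts from the abstract
print(round(pct_nonsusceptible(791, 1618), 1))  # MRSA among tested S. aureus CLABSIs
print(round(pct_nonsusceptible(209, 756), 1))   # ESC-nonsusceptible Klebsiella
print(round(pct_nonsusceptible(70, 661), 1))    # carbapenem-nonsusceptible Klebsiella
```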
Transmission of Methicillin-Resistant Staphylococcus aureus (MRSA) to Healthcare Worker Gowns and Gloves During Care of Nursing Home Residents
- Mary-Claire Roghmann, J. Kristie Johnson, John D. Sorkin, Patricia Langenberg, Alison Lydecker, Brian Sorace, Lauren Levy, Lona Mody
-
- Published online by Cambridge University Press:
- 26 May 2015, pp. 1050-1057
-
- Article
- Export citation
-
OBJECTIVE
To estimate the frequency of methicillin-resistant Staphylococcus aureus (MRSA) transmission to gowns and gloves worn by healthcare workers (HCWs) interacting with nursing home residents, to better inform infection prevention policies in this setting.
DESIGN
Observational study.
SETTING
Participants were recruited from 13 community-based nursing homes in Maryland and Michigan.
PARTICIPANTS
Residents and HCWs from these nursing homes.
METHODS
Residents were cultured for MRSA at the anterior nares and perianal or perineal skin. HCWs wore gowns and gloves during usual care activities. At the end of each activity, a research coordinator swabbed the HCW’s gown and gloves.
RESULTS
A total of 403 residents were enrolled; 113 were MRSA colonized. Glove contamination was more frequent than gown contamination (24% vs 14% of 954 interactions; P<.01). Transmission varied greatly by type of care, from 0% to 24% for gowns and from 8% to 37% for gloves. We identified high-risk care activities: dressing, transferring, providing hygiene, changing linens, and toileting the resident (OR>1.0; P<.05). We also identified low-risk care activities: giving medications and performing glucose monitoring (OR<1.0; P<.05). Residents with chronic skin breakdown had significantly higher rates of gown and glove contamination.
CONCLUSIONS
MRSA transmission from MRSA-positive residents to HCW gowns and gloves is substantial; high-contact activities of daily living confer the highest risk. These activities do not involve overt contact with body fluids, skin breakdown, or mucous membranes, which suggests the need to modify current standards of care involving the use of gowns and gloves in the nursing home setting.
Infect. Control Hosp. Epidemiol. 2015;36(9):1050–1057
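The high- and low-risk care activities above are flagged by odds ratios (OR>1.0 or OR<1.0). As a sketch of that effect measure from a 2x2 contamination table, with fully hypothetical counts (the abstract reports only percentages and the OR thresholds, not per-activity tables):

```python
# Odds ratio for contamination from a 2x2 table:
#   a = contaminated, b = clean  (activity of interest)
#   c = contaminated, d = clean  (all other care activities)
def odds_ratio(a, b, c, d):
    return (a / b) / (c / d)

# Hypothetical: 30 contaminated of 100 high-contact interactions
# vs 90 contaminated of 854 other interactions
print(round(odds_ratio(30, 70, 90, 764), 2))
```

An OR above 1.0 (with P<.05) marks an activity, such as dressing or transferring, as carrying higher contamination odds than other care.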
Comparison of a Silver-Coated Needleless Connector and a Standard Needleless Connector for the Prevention of Central Line-Associated Bloodstream Infections
- Jesse T. Jacob, Sheri Chernetsky Tejedor, Mary Dent Reyes, Xin Lu, Kirk A. Easley, William L. Aurand, Gina Garrett, Kimberly Graham, Carolyn Holder, Chad Robichaux, James P. Steinberg
-
- Published online by Cambridge University Press:
- 30 December 2014, pp. 294-301
-
OBJECTIVE
To assess the impact of novel silver-coated needleless connectors (NCs) on central-line–associated bloodstream infection (CLABSI) rates compared with mechanically identical NCs without a silver coating.
DESIGN
Prospective longitudinal observational study.
SETTING
Two 500-bed university hospitals.
PATIENTS
All hospitalized adults from November 2009 to June 2011 with non-hemodialysis central lines.
INTERVENTIONS
Hospital A started with silver-coated NCs and switched to standard NCs in September 2010; hospital B started with standard NCs and switched to silver-coated NCs. The primary outcome was the difference in CLABSI rates by NC type, assessed by multivariate Poisson regression using standard Centers for Disease Control and Prevention surveillance definitions. The secondary outcome was a comparison of organism-specific CLABSI rates by NC type.
RESULTS
Among 15,845 hospital admissions, 140,186 central-line days and 221 CLABSIs were recorded during the study period. In a multivariate model, the CLABSI rate per 1,000 central-line days was lower with silver-coated NCs than with standard NCs (1.21 vs 1.79; incidence rate ratio=0.68 [95% CI: 0.52–0.89]; P=.005). A lower CLABSI rate per 1,000 central-line days for the silver-coated NCs versus the standard NCs was observed with S. aureus (0.11 vs 0.30, P=.02), enterococci (0.10 vs 0.27, P=.03), and Gram-negative organisms (0.28 vs 0.63, P=.003) but not with coagulase-negative staphylococci (0.31 vs 0.36) or Candida spp. (0.42 vs 0.40).
CONCLUSIONS
The use of silver-coated NCs decreased the CLABSI rate by 32%. CLABSI reduction efforts should include measures to minimize contamination of NCs.
Infect Control Hosp Epidemiol 2014;00(0): 1–8
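The effect measure above, an incidence rate ratio (IRR), compares event rates per 1,000 central-line days between connector types. A minimal sketch of that calculation; the per-arm split of events and line days below is hypothetical (only the overall totals of 140,186 line days and 221 CLABSIs appear in the abstract, and the study's estimate came from a multivariate Poisson model, not this crude ratio):

```python
# CLABSI rate per 1,000 central-line days, and a crude incidence rate ratio.
def rate_per_1000(events, line_days):
    return 1000.0 * events / line_days

silver = rate_per_1000(85, 70_093)     # hypothetical silver-coated NC arm
standard = rate_per_1000(125, 70_093)  # hypothetical standard NC arm
print(round(silver, 2), round(standard, 2))
print(round(silver / standard, 2))     # crude IRR: <1.0 favors silver-coated NCs
```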
A Pilot Study of a Computerized Decision Support System to Detect Invasive Fungal Infection in Pediatric Hematology/Oncology Patients
- Adam Bartlett, Emma Goeman, Aditi Vedi, Mona Mostaghim, Toby Trahair, Tracey A. O’Brien, Pamela Palasanthiran, Brendan McMullan
-
- Published online by Cambridge University Press:
- 17 August 2015, pp. 1313-1317
-
OBJECTIVE
Computerized decision support systems (CDSSs) can provide indication-specific antimicrobial recommendations and approvals as part of hospital antimicrobial stewardship (AMS) programs. The aim of this study was to assess the performance of a CDSS for surveillance of invasive fungal infections (IFIs) in an inpatient hematology/oncology cohort.
METHODS
Between November 1, 2012, and October 31, 2013, pediatric hematology/oncology inpatients diagnosed with an IFI were identified through an audit of the CDSS and confirmed by medical record review. The results were compared with hospital diagnostic-related group (DRG) coding for IFI over the same period.
RESULTS
A total of 83 patients were prescribed systemic antifungals according to the CDSS during the 12-month period. The CDSS correctly identified 19 patients with IFI on medical record review, compared with 10 patients identified by DRG coding, of whom 9 were confirmed to have IFI on medical record review.
CONCLUSIONS
The CDSS was superior to diagnostic coding in detecting IFI in an inpatient pediatric hematology/oncology cohort. The functionality of the CDSS lends itself to inpatient infectious diseases surveillance but depends on prescriber adherence.
Infect. Control Hosp. Epidemiol. 2015;36(11):1313–1317
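The comparison of the two surveillance methods rests on the confirmed-case counts reported in the abstract. A minimal sketch of that yield comparison; the positive-predictive-value calculation for DRG coding is an illustration, not a statistic the abstract reports explicitly:

```python
# Detection yield: CDSS audit vs DRG coding, counts from the abstract,
# with confirmation by medical record review as the reference.
cdss_flagged, cdss_confirmed = 83, 19  # antifungal courses flagged by CDSS
drg_flagged, drg_confirmed = 10, 9     # cases flagged by DRG coding

print(cdss_confirmed, drg_confirmed)             # confirmed IFI: 19 (CDSS) vs 9 (DRG)
print(round(100 * drg_confirmed / drg_flagged))  # DRG positive predictive value, %
```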