Original Articles
Cost-Effectiveness Analysis of Active Surveillance Screening for Methicillin-Resistant Staphylococcus aureus in an Academic Hospital Setting
- JaHyun Kang, Paul Mandsager, Andrea K. Biddle, David J. Weber
- Published online by Cambridge University Press: 02 January 2015, pp. 477-486
Objective.
To evaluate the cost-effectiveness of 3 alternative active screening strategies for methicillin-resistant Staphylococcus aureus (MRSA): universal surveillance screening for all hospital admissions, targeted surveillance screening for intensive care unit admissions, and no surveillance screening.
Design.Cost-effectiveness analysis using decision modeling.
Methods.Cost-effectiveness was evaluated from the perspective of an 800-bed academic hospital with 40,000 annual admissions over the time horizon of a hospitalization. All input probabilities, costs, and outcome data were obtained through a comprehensive literature review. Effectiveness outcome was MRSA healthcare-associated infections (HAIs). One-way and probabilistic sensitivity analyses were conducted.
Results.In the base case, targeted surveillance screening was a dominant strategy (ie, was associated with lower costs and resulted in better outcomes) for preventing MRSA HAIs. Universal surveillance screening was associated with an incremental cost-effectiveness ratio of $14,955 per MRSA HAI prevented. In one-way sensitivity analysis, targeted surveillance screening was a dominant strategy across most parameter ranges. Probabilistic sensitivity analysis also demonstrated that targeted surveillance screening was the most cost-effective strategy when willingness to pay to prevent a case of MRSA HAI was less than $71,300.
Conclusion.Targeted active surveillance screening for MRSA is the most cost-effective screening strategy in an academic hospital setting. Additional studies that are based on actual hospital data are needed to validate this model. However, the model supports current recommendations to use active surveillance to detect MRSA.
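The dominance and incremental cost-effectiveness ratio (ICER) logic used in this analysis can be sketched as follows; the per-strategy totals below are hypothetical illustrations, not the study's actual inputs:

```python
def icer(cost_a, cost_b, effect_a, effect_b):
    """Incremental cost-effectiveness ratio of strategy A versus B.

    Effects are measured as MRSA HAIs prevented, so a strategy
    'dominates' when it costs less AND prevents more infections.
    """
    d_cost = cost_a - cost_b
    d_effect = effect_a - effect_b
    if d_cost <= 0 and d_effect > 0:
        return "A dominates B"          # cheaper and more effective
    if d_cost > 0 and d_effect <= 0:
        return "A dominated by B"       # costlier and no more effective
    return d_cost / d_effect            # dollars per extra HAI prevented

# Hypothetical totals (not the study's figures):
print(icer(cost_a=1_200_000, cost_b=1_000_000, effect_a=60, effect_b=40))
# 200000 / 20 = 10000.0 dollars per additional HAI prevented
```

A strategy that is both cheaper and more effective, as targeted screening was in the base case, never yields a ratio at all; it simply dominates.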
Seasonal Influenza Vaccine Compliance among Hospital-Based and Nonhospital-Based Healthcare Workers
- Terri Rebmann, Kathleen S. Wright, John Anthony, Richard C. Knaup, Eleanor B. Peters
- Published online by Cambridge University Press: 02 January 2015, pp. 243-249
Background.
Influenza vaccination among nonhospital healthcare workers (HCWs) is imperative, but only limited data are available on the factors affecting their compliance.
Objective.To examine the factors influencing influenza vaccine compliance among hospital and nonhospital HCWs.
Design and Setting.A vaccine compliance questionnaire was administered to HCWs working in myriad healthcare settings in March-June 2011.
Methods.Online and paper surveys were used to assess compliance with the 2010/2011, 2009/2010, and H1N1 influenza vaccines and to examine factors that predicted the uptake of the 2010/2011 seasonal influenza vaccine.
Results.In all, 3,188 HCWs completed the survey; half of these (n = 1,719) reported no hospital work time. Compliance rates for all 3 vaccines were significantly higher (P< .001) among hospital versus nonhospital HCWs. In logistic regression stratified by hospital versus nonhospital setting, and when controlling for demographics and past behavior, the determinants of vaccination against the 2010/2011 seasonal influenza among nonhospital-based HCWs included having a mandatory vaccination policy, perceived importance, no fear of vaccine adverse effects, free and on-site access, and perceived susceptibility to influenza. Determinants of hospital-based HCW vaccine compliance included having a mandatory vaccination policy, belief that HCWs should be vaccinated every year, occupational health encouragement, perceived importance of vaccination, on-site access, and no fear of vaccine adverse effects. The strongest predictor of compliance for both worker groups was existence of a mandatory vaccination policy.
Conclusions.The reasons for vaccine uptake among nonhospital-based versus hospital-based HCWs differed. Targeted interventions should be aimed at workers in these settings to increase their vaccine compliance, including implementing a mandatory vaccination policy.
Infect Control Hosp Epidemiol 2012;33(3):243-249
Original Article
Evaluation of Potential Environmental Contamination Sources for the Presence of Multidrug-Resistant Bacteria Linked to Wound Infections in Combat Casualties
- Edward F. Keen III, Katrin Mende, Heather C. Yun, Wade K. Aldous, Timothy E. Wallum, Charles H. Guymon, David W. Cole, Helen K. Crouch, Matthew E. Griffith, Bernadette L. Thompson, Joel T. Rose, Clinton K. Murray
- Published online by Cambridge University Press: 02 January 2015, pp. 905-911
Objective.
To determine whether multidrug-resistant (MDR) gram-negative organisms are present in Afghanistan or Iraq soil samples, contaminate standard deployed hospital or modular operating rooms (ORs), or aerosolize during surgical procedures.
Design.Active surveillance.
Setting.US military hospitals in the United States, Afghanistan, and Iraq.
Methods.Soil samples were collected from sites throughout Afghanistan and Iraq and analyzed for presence of MDR bacteria. Environmental sampling of selected newly established modular and deployed OR high-touch surfaces and equipment was performed to determine the presence of bacterial contamination. Gram-negative bacteria aerosolization during OR surgical procedures was determined by microbiological analysis of settle plate growth.
Results.Subsurface soil sample isolates recovered in Afghanistan and Iraq included various pansusceptible members of Enterobacteriaceae, Vibrio species, Pseudomonas species, Acinetobacter lwoffii, and coagulase-negative Staphylococcus (CNS). OR contamination studies in Afghanistan revealed 1 surface contaminated with Micrococcus luteus. Newly established US-based modular ORs and the colocated fixed-facility ORs revealed no gram-negative bacterial contamination prior to the opening of the modular OR and 5 weeks later. Bacterial aerosolization during surgery in a deployed fixed hospital revealed a mean gram-negative bacteria colony count of 12.8 colony-forming units (CFU)/dm2/h (standard deviation [SD], 17.0) during surgeries and 6.5 CFU/dm2/h (SD, 7.5; P = .14) when the OR was not in use.
Conclusion.This study demonstrates no significant gram-negative bacilli colonization of modular and fixed-facility ORs or dirt and no significant aerosolization of these bacilli during surgical procedures. These results lend additional support to the role of nosocomial transmission of MDR pathogens or colonization of the patients themselves prior to injury.
Audit and Feedback to Reduce Broad-Spectrum Antibiotic Use among Intensive Care Unit Patients: A Controlled Interrupted Time Series Analysis
- Marion Elligsen, Sandra A. N. Walker, Ruxandra Pinto, Andrew Simor, Samira Mubareka, Anita Rachlis, Vanessa Allen, Nick Daneman
- Published online by Cambridge University Press: 02 January 2015, pp. 354-361
Objective.
We aimed to rigorously evaluate the impact of prospective audit and feedback on broad-spectrum antimicrobial use among critical care patients.
Design.Prospective, controlled interrupted time series.
Setting.Single tertiary care center with 3 intensive care units.
Patients and Interventions.A formal review of all critical care patients on their third or tenth day of broad-spectrum antibiotic therapy was conducted, and suggestions for antimicrobial optimization were communicated to the critical care team.
Outcomes.The primary outcome was broad-spectrum antibiotic use (days of therapy per 1,000 patient-days); secondary outcomes included overall antibiotic use, gram-negative bacterial susceptibility, nosocomial Clostridium difficile infections, length of stay, and mortality.
Results.The mean monthly broad-spectrum antibiotic use decreased from 644 days of therapy per 1,000 patient-days in the preintervention period to 503 days of therapy per 1,000 patient-days in the postintervention period (P < .0001); time series modeling confirmed an immediate decrease (± standard error) of 119 ± 37.9 days of therapy per 1,000 patient-days (P = .0054). In contrast, no changes were identified in the use of broad-spectrum antibiotics in the control group (nonintervention medical and surgical wards) or in the use of control medications in critical care (stress ulcer prophylaxis). The incidence of nosocomial C. difficile infections decreased from 11 to 6 cases in the study intensive care units, whereas the incidence increased from 87 to 116 cases in the control wards (P = .04). Overall gram-negative susceptibility to meropenem increased in the critical care units. Intensive care unit length of stay and mortality did not change.
Conclusions.Institution of a formal prospective audit and feedback program appears to be a safe and effective means to improve broad-spectrum antimicrobial use in critical care.
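The immediate level change reported in the results above is the kind of estimate a segmented (interrupted time series) regression produces; a minimal numpy sketch on synthetic data (not the study's), where the coefficient on the post-intervention indicator is the level change:

```python
import numpy as np

rng = np.random.default_rng(0)
months = np.arange(24)                    # 12 pre- + 12 post-intervention months
post = (months >= 12).astype(float)       # intervention indicator
# Synthetic antibiotic use: baseline 644 DOT/1,000 patient-days,
# an immediate drop of 120 at the intervention, plus noise.
y = 644 - 120 * post + rng.normal(0, 10, size=24)

# Segmented regression: intercept, underlying time trend, level change
X = np.column_stack([np.ones(24), months, post])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
level_change = coef[2]
print(round(level_change, 1))             # ≈ -120 (the immediate decrease)
```

A full analysis would also include a post-intervention slope term and account for autocorrelation; this sketch shows only the level-change idea.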
Original Articles
Economic Impact of Ventilator-Associated Pneumonia in a Large Matched Cohort
- Marin H. Kollef, Cindy W. Hamilton, Frank R. Ernst
- Published online by Cambridge University Press: 02 January 2015, pp. 250-256
Objective.
To evaluate the economic impact of ventilator-associated pneumonia (VAP) on length of stay and hospital costs.
Design.Retrospective matched cohort study.
Setting.Premier database of hospitals in the United States.
Patients.Eligible patients were admitted to intensive care units (ICUs), received mechanical ventilation for ≥2 calendar-days, and were discharged between October 1, 2008, and December 31, 2009.
Methods.VAP was defined by International Classification of Diseases, Ninth Revision (ICD-9), code 997.31 and ventilation charges for ≥2 calendar-days. We matched patients with VAP to patients without VAP by propensity score on the basis of demographics, administrative data, and severity of illness. Cost was based on provider perspective and procedural cost accounting methods.
Results.Of 88,689 eligible patients, 2,238 (2.5%) had VAP; the incidence rate was 1.27 per 1,000 ventilation-days. In the matched cohort, patients with VAP (n = 2,144) had longer mean durations of mechanical ventilation (21.8 vs 10.3 days), ICU stay (20.5 vs 11.6 days), and hospitalization (32.6 vs 19.5 days; all P< .0001) than patients without VAP (n = 2,144). Mean hospitalization costs were $99,598 for patients with VAP and $59,770 for patients without VAP (P< .0001), resulting in an absolute difference of $39,828. Patients with VAP had a lower in-hospital mortality rate than patients without VAP (482/2,144 [22.5%] vs 630/2,144 [29.4%]; P<.0001).
Conclusions.Our findings suggest that VAP continues to occur as defined by the new specific ICD-9 code and is associated with a statistically significant resource utilization burden, which underscores the need for cost-effective interventions to minimize the occurrence of this complication.
Infect Control Hosp Epidemiol 2012;33(3):250-256
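The headline economics above reduce to simple arithmetic on the abstract's own figures:

```python
# Figures reported in the abstract
vap_cases, eligible = 2_238, 88_689
cost_vap, cost_no_vap = 99_598, 59_770

print(round(100 * vap_cases / eligible, 1))   # 2.5 -> % of eligible patients with VAP
print(cost_vap - cost_no_vap)                 # 39828 -> absolute cost difference ($)
```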
Effectiveness of Selected Surgical Masks in Arresting Vegetative Cells and Endospores When Worn by Simulated Contagious Patients
- Christopher F. Green, Craig S. Davidson, Adelisa L. Panlilio, Paul A. Jensen, Yan Jin, Shawn G. Gibbs, Pasquale V. Scarpino
- Published online by Cambridge University Press: 02 January 2015, pp. 487-494
Objective.
The objective of this study was to quantify the effectiveness of selected surgical masks in arresting vegetative cells and endospores in an experimental model that simulated contagious patients.
Setting.Laboratory.
Methods.Five commercially available surgical masks were tested for their ability to arrest infectious agents. Surgical masks were placed over the nose and mouth of mannequin head forms (Simulaids adult model Brad CPR torso). The mannequins were retrofitted with a nebulizer attached to an automated breathing simulator calibrated to a tidal volume of 500 mL/breath and a breathing rate of 20 breaths/min, for a minute respiratory volume of 10 L/min. Aerosols of endospores or vegetative cells were generated with a modified Microbiological Research Establishment-type 6-jet Collison nebulizer, while air samples were taken with all-glass impinger (AGI-30) samplers downstream of the point source. All experiments were conducted in a horizontal bioaerosol chamber.
Results.Mean arrestance of bioaerosols by the surgical masks ranged from 48% to 68% when the masks were challenged with endospores and from 66% to 76% when they were challenged with vegetative cells. When the arrestance of endospores was evaluated, statistical differences were observed between some pairs, though not all, of the models evaluated. There were no statistically significant differences in arrestance observed between models of surgical masks challenged with vegetative cells.
Conclusions.The arrestance of airborne vegetative cells and endospores by surgical masks worn by simulated contagious patients supports surgical mask use as one of the recommended cough etiquette interventions to limit the transmission of airborne infectious agents.
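Arrestance here is the fraction of the challenge aerosol retained by the mask; assuming the usual definition from paired upstream/downstream counts (the abstract does not spell out the formula), it can be computed as:

```python
def arrestance(upstream_cfu, downstream_cfu):
    """Fraction of aerosolized organisms retained by the mask,
    from paired impinger counts at the source and downstream."""
    return 1 - downstream_cfu / upstream_cfu

# Hypothetical counts: 1,000 CFU challenge, 340 CFU penetrate the mask
print(f"{arrestance(1_000, 340):.0%}")   # 66%
```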
Commentary
Infections Associated with Use of Ultrasound Transmission Gel: Proposed Guidelines to Minimize Risk
- Susan C. Oleszkowicz, Paul Chittick, Victoria Russo, Paula Keller, Matthew Sims, Jeffrey Band
- Published online by Cambridge University Press: 02 January 2015, pp. 1235-1237
Original Article
Successful Implementation of a Window for Routine Antimicrobial Prophylaxis Shorter than That of the World Health Organization Standard
- Heidi Misteli, Andreas F. Widmer, Walter P. Weber, Evelyne Bucher, Marc Dangel, Stefan Reck, Daniel Oertli, Walter R. Marti, Rachel Rosenthal
- Published online by Cambridge University Press: 02 January 2015, pp. 912-916
Objective.
To evaluate the feasibility of implementation of the refined window for routine antimicrobial prophylaxis (RAP) of 30-74 minutes before skin incision compared to the World Health Organization (WHO) standard of 0-60 minutes.
Design.Prospective study on timing of routine antimicrobial prophylaxis in 2 different time periods.
Setting.Tertiary referral university hospital with 30,000 surgical procedures per year.
Methods.In all consecutive vascular, visceral, and trauma procedures, the timing was prospectively recorded during a first time period of 2 years (A; baseline) and a second period of 1 year (B; after intervention). An intensive intervention program was initiated after baseline. The primary outcome parameter was timing; the secondary outcome parameter was surgical site infection (SSI) rate in the subgroup of patients undergoing cholecystectomy/colon resection.
Results.During baseline time period A (3,836 procedures), RAP was administered 30–74 minutes before skin incision in 1,750 (41.0%) procedures; during time period B (1,537 procedures), it was administered in 914 (56.0%; P < .001). The subgroup analysis did not reveal a significant difference in SSI rate.
Conclusions.This bundle of interventions resulted in a statistically significant improvement of timing of RAP even at a shortened window compared to the WHO standard.
Original Articles
Cost-Effectiveness of Preoperative Nasal Mupirocin Treatment in Preventing Surgical Site Infection in Patients Undergoing Total Hip and Knee Arthroplasty: A Cost-Effectiveness Analysis
- Xan F. Courville, Ivan M. Tomek, Kathryn B. Kirkland, Marian Birhle, Stephen R. Kantor, Samuel R. G. Finlayson
- Published online by Cambridge University Press: 02 January 2015, pp. 152-159
Objective.
To perform a cost-effectiveness analysis to evaluate preoperative use of mupirocin in patients with total joint arthroplasty (TJA).
Design.Simple decision tree model.
Setting.Outpatient TJA clinical setting.
Participants.Hypothetical cohort of patients with TJA.
Interventions.A simple decision tree model compared 3 strategies in a hypothetical cohort of patients with TJA: (1) obtaining preoperative screening cultures for all patients, followed by administration of mupirocin to patients with cultures positive for Staphylococcus aureus; (2) providing empirical preoperative treatment with mupirocin for all patients without screening; and (3) providing no preoperative treatment or screening. We assessed the costs and benefits over a 1-year period. Data inputs were obtained from a literature review and from our institution's internal data. Utilities were measured in quality-adjusted life-years, and costs were measured in 2005 US dollars.
Main Outcome Measure.Incremental cost-effectiveness ratio.
Results.The treat-all and screen-and-treat strategies both had lower costs and greater benefits, compared with the no-treatment strategy. Sensitivity analysis revealed that this result is stable even if the cost of mupirocin was over $100 and the cost of SSI ranged between $26,000 and $250,000. Treating all patients remains the best strategy when the prevalence of S. aureus carriers and surgical site infection is varied across plausible values as well as when the prevalence of mupirocin-resistant strains is high.
Conclusions.Empirical treatment with mupirocin ointment or use of a screen-and-treat strategy before TJA is performed is a simple, safe, and cost-effective intervention that can reduce the risk of SSI. S. aureus decolonization with nasal mupirocin for patients undergoing TJA should be considered.
Level of Evidence.Level II, economic and decision analysis.
Infect Control Hosp Epidemiol 2012;33(2):152-159
Original Article
Registration of Blood Exposure Accidents in the Netherlands by a Nationally Operating Call Center
- Peter M. Schneeberger, Annemarie E. Meiberg, Janet Warmelts, Sander C. A. P. Leenders, Paul T. L. van Wijk
- Published online by Cambridge University Press: 02 January 2015, pp. 1017-1023
Objective.
Healthcare providers and other employees, especially those who do not work in a hospital, may not easily find help after the occurrence of a blood exposure accident. In 2006, a national call center was established in the Netherlands to fill this gap.
Methods.All occupational blood exposure accidents reported to the 24-hours-per-day, 7-days-per-week call center from 2007, 2008, and 2009 were analyzed retrospectively for incidence rates, risk assessment, handling, and preventive measures taken.
Results.A total of 2,927 accidents were reported. The highest incidence rates were reported for private clinics and hospitals (68.5 and 54.3 accidents per 1,000 person-years, respectively). Dental practices started reporting incidents frequently after the arrangement of a collective financial agreement with the call center. Employees of ambulance services, midwife practices, and private clinics reported mostly high-risk accidents, whereas penitentiaries frequently reported low-risk accidents. Employees in mental healthcare facilities, private clinics, and midwife practices reported accidents relatively late. The extent of hepatitis B vaccination in mental healthcare facilities, penitentiaries, occupational health services, and cleaning services was low (<70%).
Conclusions.The national call center successfully organized the national registration and handling of blood exposure accidents. The risk of blood exposure accidents could be estimated on the basis of this information for several occupational branches. Targeted preventive measures for healthcare providers and other employees at risk can next be developed.
Infect Control Hosp Epidemiol 2012;33(10):1017-1023
Original Articles
Findings of the International Nosocomial Infection Control Consortium (INICC), Part I: Effectiveness of a Multidimensional Infection Control Approach on Catheter-Associated Urinary Tract Infection Rates in Pediatric Intensive Care Units of 6 Developing Countries
- Victor D. Rosenthal, Bala Ramachandran, Lourdes Dueñas, Carlos Álvarez-Moreno, J. A. Navoa-Ng, Alberto Armas-Ruiz, Gulden Ersoz, Lorena Matta-Cortés, Mandakini Pawar, Ata Nevzat-Yalcin, Marena Rodriguez-Ferrer, Ana Concepción Bran de Casares, Claudia Linares, Victoria D. Villanueva, Roberto Campuzano, Ali Kaya, Luis Fernando Rendon-Campo, Amit Gupta, Ozge Turhan, Nayide Barahona-Guzmán, Lilian de Jesús-Machuca, María Corazon V. Tolentino, Jorge Mena-Brito, Necdet Kuyucu, Yamileth Astudillo, Narinder Saini, Nurgul Gunay, Guillermo Sarmiento-Villa, Eylul Gumus, Alfredo Lagares-Guzmán, Oguz Dursun
- Published online by Cambridge University Press: 02 January 2015, pp. 696-703
Design.
A before-after prospective surveillance study to assess the impact of a multidimensional infection control approach for the reduction of catheter-associated urinary tract infection (CAUTI) rates.
Setting.Pediatric intensive care units (PICUs) of hospital members of the International Nosocomial Infection Control Consortium (INICC) from 10 cities of the following 6 developing countries: Colombia, El Salvador, India, Mexico, Philippines, and Turkey.
Patients.PICU inpatients.
Methods.We performed a prospective active surveillance to determine rates of CAUTI among 3,877 patients hospitalized in 10 PICUs for a total of 27,345 bed-days. The study was divided into a baseline period (phase 1) and an intervention period (phase 2). In phase 1, surveillance was performed without the implementation of the multidimensional approach. In phase 2, we implemented a multidimensional infection control approach that included outcome surveillance, process surveillance, feedback on CAUTI rates, feedback on performance, education, and a bundle of preventive measures. The rates of CAUTI obtained in phase 1 were compared with the rates obtained in phase 2, after interventions were implemented.
Results.During the study period, we recorded 8,513 urinary catheter (UC) days, including 1,513 UC-days in phase 1 and 7,000 UC-days in phase 2. In phase 1, the CAUTI rate was 5.9 cases per 1,000 UC-days, and in phase 2, after implementing the multidimensional infection control approach for CAUTI prevention, the rate of CAUTI decreased to 2.6 cases per 1,000 UC-days (relative risk, 0.43 [95% confidence interval, 0.21–1.0]), indicating a rate reduction of 57%.
Conclusions.Our findings demonstrated that implementing a multidimensional infection control approach is associated with a significant reduction in the CAUTI rate of PICUs in developing countries.
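As a check on the relative risk above (UC-day totals are from the abstract; per-phase case counts are back-calculated from the reported rates and are therefore an inference, not figures stated in the abstract):

```python
# UC-days reported in the abstract
uc_days_1, uc_days_2 = 1_513, 7_000
# Case counts inferred from the reported rates of 5.9 and 2.6 per 1,000 UC-days
cases_1 = round(5.9 * uc_days_1 / 1_000)   # ~9 CAUTIs in phase 1
cases_2 = round(2.6 * uc_days_2 / 1_000)   # ~18 CAUTIs in phase 2

rate_1 = 1_000 * cases_1 / uc_days_1       # cases per 1,000 UC-days, phase 1
rate_2 = 1_000 * cases_2 / uc_days_2       # cases per 1,000 UC-days, phase 2
rr = rate_2 / rate_1
print(round(rr, 2), f"{1 - rr:.0%}")       # 0.43 and a 57% rate reduction
```

The unrounded rates reproduce both the relative risk of 0.43 and the 57% reduction quoted in the results.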
Original Article
Ability of an Antibiogram to Predict Pseudomonas aeruginosa Susceptibility to Targeted Antimicrobials Based on Hospital Day of Isolation
- Deverick J. Anderson, Becky Miller, Ruchit Marfatia, Richard Drew
- Published online by Cambridge University Press: 02 January 2015, pp. 589-593
Objective.
To determine the utility of an antibiogram in predicting the susceptibility of Pseudomonas aeruginosa isolates to targeted antimicrobial agents based on the day of hospitalization the specimen was collected.
Design.Single-center retrospective cohort study.
Setting.A 750-bed tertiary care medical center.
Patients and Methods.Isolates from consecutive patients with at least 1 clinical culture positive for P. aeruginosa from January 1, 2000, to June 30, 2007, were included. A study antibiogram was created by determining the overall percentages of P. aeruginosa isolates susceptible to amikacin, ceftazidime, ciprofloxacin, gentamicin, imipenem-cilastatin, piperacillin-tazobactam, and tobramycin during the study period. Individual logistic regression models were created to determine the day of infection after which the study antibiogram no longer predicted susceptibility to each antibiotic.
Results.A total of 3,393 isolates were included. The antibiogram became unreliable as a predictor of susceptibility to ceftazidime, imipenem-cilastatin, piperacillin-tazobactam, and tobramycin after day 10 and to ciprofloxacin after day 15; it remained reliable longer for gentamicin (day 21) and amikacin (day 28). Time to unreliability of the antibiogram varied for antibiotics based on location of isolation. For example, the time to unreliability of the antibiogram for ceftazidime was 5 days (95% confidence interval [CI], <1–8) in the intensive care unit (ICU) and 12 days (95% CI, 7–21) in non-ICU hospital wards (P = .003).
Conclusions.The ability of the antibiogram to predict susceptibility of P. aeruginosa decreases as duration of hospitalization increases.
Chlorhexidine Gluconate Reduces Transmission of Methicillin-Resistant Staphylococcus aureus USA300 among Marine Recruits
- Timothy J. Whitman, Carey D. Schlett, Greg A. Grandits, Eugene V. Millar, Katrin Mende, Duane R. Hospenthal, Patrick R. Murray, David R. Tribble
- Published online by Cambridge University Press: 02 January 2015, pp. 809-816
Background.
Methicillin-resistant Staphylococcus aureus (MRSA) pulsed-field type (PFT) USA300 causes skin and soft tissue infections in military recruits and invasive disease in hospitals. Chlorhexidine gluconate (CHG) is used to reduce MRSA colonization and infection. The impact of CHG on the molecular epidemiology of MRSA is not known.
Objective.To evaluate the impact of 2% CHG-impregnated cloths on the molecular epidemiology of MRSA colonization.
Design.Cluster-randomized, double-blind, controlled trial.
Setting.Marine Officer Candidate School, Quantico, Virginia, in 2007.
Participants.Military recruits.
Intervention.Thrice-weekly application of CHG-impregnated or control (Comfort Bath; Sage) cloths over the entire body.
Measurements.Baseline and serial (every 2 weeks) nasal and/or axillary swab samples were assessed for MRSA colonization. Molecular analysis was performed with pulsed-field gel electrophoresis.
Results.During training, 77 subjects (4.9%) acquired MRSA, 26 (3.3%) in the CHG group and 51 (6.5%) in the control group (P = .004). When analyzed for PFT, 24 subjects (3.1%) in the control group but only 6 subjects (0.8%) in the CHG group (P = .001) had USA300. Of the 167 colonizing isolates recovered from 77 subjects, 99 were recovered from the control group, including USA300 (40.4%), USA800 (38.4%), USA1000 (12.1%), and USA100 (6.1%), and 68 were recovered from the CHG group, including USA800 (51.5%), USA100 (23.5%), and USA300 (13.2%).
Conclusions.CHG decreased the transmission of MRSA, and more specifically of USA300, among military recruits. In addition, USA300 and USA800 outcompeted other MRSA PFTs at incident colonization. Future studies should evaluate the broad-based use of CHG to decrease transmission of USA300 in hospital settings.
Parenteral to Oral Conversion of Fluoroquinolones: Low-Hanging Fruit for Antimicrobial Stewardship Programs?
- Makoto Jones, Benedikt Huttner, Karl Madaras-Kelly, Kevin Nechodom, Christopher Nielson, Matthew Bidwell Goetz, Melinda M. Neuhauser, Matthew H. Samore, Michael A. Rubin
- Published online by Cambridge University Press: 02 January 2015, pp. 362-367
Objective.
To estimate avoidable intravenous (IV) fluoroquinolone use in Veterans Affairs (VA) hospitals.
Design.A retrospective analysis of bar code medication administration (BCMA) data.
Setting.Acute care wards of 128 VA hospitals throughout the United States.
Methods.Data were analyzed for all medications administered on acute care wards between January 1, 2006, and December 31, 2010. Patient-days receiving therapy were expressed as fluoroquinolone-days (FD) and divided into intravenous (IV; all doses administered intravenously) and oral (PO; at least one dose administered per os) FD. We assumed IV fluoroquinolone use to be potentially avoidable on a given IV FD when there was at least 1 other medication administered via the enteral route.
Results.Over the entire study period, 884,740 IV and 830,572 PO FD were administered. Overall, avoidable IV fluoroquinolone use accounted for 46.8% of all FD and 90.9% of IV FD. Excluding the first 2 days of all IV fluoroquinolone courses and limiting the analysis to the non-ICU setting yielded more conservative estimates of avoidable IV use: 20.9% of all FD and 45.9% of IV FD. Avoidable IV use was more common for levofloxacin and more frequent in the ICU setting. There was a moderate correlation between avoidable IV FD and total systemic antibiotic use (r = 0.32).
Conclusions.Unnecessary IV fluoroquinolone use seems to be common in the VA system, but important variations exist between facilities. Antibiotic stewardship programs could focus on this patient safety issue as a “low-hanging fruit” to increase awareness of appropriate antibiotic use.
Original Articles
A Clinical History of Methicillin-Resistant Staphylococcus aureus Is a Poor Predictor of Preoperative Colonization Status and Postoperative Infections
- Judith Strymish, Westyn Branch-Elliman, Kamal M. F. Itani, Sandra Williams, Kalpana Gupta
- Published online by Cambridge University Press: 02 January 2015, pp. 1113-1117
Objective.
In the absence of established methicillin-resistant Staphylococcus aureus (MRSA) screening programs, many centers use a history of a positive culture or a nasal screen as a surrogate for preoperative MRSA colonization status. We aimed to evaluate the test characteristics of these surrogates.
Design.Retrospective cohort study.
Participants.Veterans Affairs Boston Healthcare System surgical patients with a preoperative nasal MRSA polymerase chain reaction (PCR) screen.
Methods.We assessed the performance of a history of a MRSA-positive culture or a positive nasal MRSA PCR screen during the year prior to surgery for predicting the preoperative nasal PCR screen result. The associations between MRSA history and postoperative outcomes, including MRSA cultures and infections, were also evaluated.
Results.Among 4,238 patients, a positive MRSA culture history had a sensitivity of 19.7% (95% confidence interval [CI], 15.4%–24.8%) and positive predictive value of 57.3% for the preoperative nasal MRSA status. The specificity of MRSA culture history was 99% (95% CI, 98.5%–99.2%). Prior-year nasal MRSA screen results had similar test characteristics. A history of a MRSA-positive culture was associated with an increased risk of postoperative MRSA-positive cultures (risk ratio [RR], 3.54 [95% CI, 1.70–7.37], P< .001) but not of infections (RR, 1.71 [95% CI, 0.58–5.01]), after adjustment for preoperative nasal MRSA status, vancomycin surgical prophylaxis, surgical scrub, and age.
Conclusions.A history of a MRSA-positive culture and a positive nasal PCR screen are poor surrogate markers of preoperative colonization status, missing at least 70% of MRSA-colonized patients. Prior-year history is also not independently associated with MRSA-related postoperative infections. Strong consideration should be given to preoperative MRSA screening in patients at high risk for surgical complications.
Use of Medicare Diagnosis and Procedure Codes to Improve Detection of Surgical Site Infections following Hip Arthroplasty, Knee Arthroplasty, and Vascular Surgery
- Michael S. Calderwood, Allen Ma, Yosef M. Khan, Margaret A. Olsen, Dale W. Bratzler, Deborah S. Yokoe, David C. Hooper, Kurt Stevenson, Victoria J. Fraser, Richard Platt, Susan S. Huang, CDC Prevention Epicenters Program
- Published online by Cambridge University Press: 02 January 2015, pp. 40-49
Objective.
To evaluate the use of routinely collected electronic health data in Medicare claims to identify surgical site infections (SSIs) following hip arthroplasty, knee arthroplasty, and vascular surgery.
Design.Retrospective cohort study.
Setting.Four academic hospitals that perform prospective SSI surveillance.
Methods.We developed lists of International Classification of Diseases, Ninth Revision, and Current Procedural Terminology diagnosis and procedure codes to identify potential SSIs. We then screened for these codes in Medicare claims submitted by each hospital on patients older than 65 years of age who had undergone 1 of the study procedures during 2007. Each site reviewed medical records of patients identified by either claims codes or traditional infection control surveillance to confirm SSI using Centers for Disease Control and Prevention/National Healthcare Safety Network criteria. We assessed the performance of both methods against all chart-confirmed SSIs identified by either method.
Results.Claims-based surveillance detected 1.8–4.7-fold more SSIs than traditional surveillance, including detection of all previously identified cases. For hip and vascular surgery, there was a 5-fold and 1.6-fold increase in detection of deep and organ/space infections, respectively, with no increased detection of deep and organ/space infections following knee surgery. Use of claims to trigger chart review led to confirmation of SSI in 1 out of 3 charts for hip arthroplasty, 1 out of 5 charts for knee arthroplasty, and 1 out of 2 charts for vascular surgery.
Conclusion.Claims-based SSI surveillance markedly increased the number of SSIs detected following hip arthroplasty, knee arthroplasty, and vascular surgery. It deserves consideration as a more effective approach to target chart reviews for identifying SSIs.
Infect Control Hosp Epidemiol 2012;33(1):40-49
Findings of the International Nosocomial Infection Control Consortium (INICC), Part II: Impact of a Multidimensional Strategy to Reduce Ventilator-Associated Pneumonia in Neonatal Intensive Care Units in 10 Developing Countries
- Victor D. Rosenthal, Maria E. Rodríguez-Calderón, Marena Rodríguez-Ferrer, Tanu Singhal, Mandakini Pawar, Martha Sobreyra-Oropeza, Amina Barkat, Teodora Atencio-Espinoza, Regina Berba, J. A. Navoa-Ng, Lourdes Dueñas, Nejla Ben-Jaballah, Davut Ozdemir, Gulden Ersoz, Canan Aygun
- Published online by Cambridge University Press:
- 02 January 2015, pp. 704-710
Design.
Before-after prospective surveillance study to assess the efficacy of the International Nosocomial Infection Control Consortium (INICC) multidimensional infection control program to reduce the rate of occurrence of ventilator-associated pneumonia (VAP).
Setting.Neonatal intensive care units (NICUs) of INICC member hospitals from 15 cities in the following 10 developing countries: Argentina, Colombia, El Salvador, India, Mexico, Morocco, Peru, Philippines, Tunisia, and Turkey.
Patients.NICU inpatients.
Methods.VAP rates were determined during a first period of active surveillance, before implementation of the multidimensional approach (phase 1), and compared with VAP rates after implementation of the INICC multidimensional infection control program (phase 2), which included the following practices: a bundle of infection control interventions, education, outcome surveillance, process surveillance, feedback on VAP rates, and performance feedback on infection control practices. This study was conducted by infection control professionals who applied National Healthcare Safety Network (NHSN) definitions for healthcare-associated infections and INICC surveillance methodology.
Results.During phase 1, we recorded 3,153 mechanical ventilation (MV)–days, and during phase 2, after the implementation of the bundle of interventions, we recorded 15,981 MV-days. The VAP rate was 17.8 cases per 1,000 MV-days during phase 1 and 12.0 cases per 1,000 MV-days during phase 2 (relative risk, 0.67 [95% confidence interval, 0.50–0.91]; P = .001), indicating a 33% reduction in VAP rate.
Conclusions.Our results demonstrate that implementation of the INICC multidimensional infection control program was associated with a significant reduction in VAP rate in NICUs in developing countries.
Gaseous Chlorine Dioxide as an Alternative for Bedbug Control
- Shawn G. Gibbs, John J. Lowe, Philip W. Smith, Angela L. Hewlett
- Published online by Cambridge University Press:
- 02 January 2015, pp. 495-499
Objective.
This study evaluated the efficacy of gaseous chlorine dioxide (ClO2) for extermination of bedbugs (Cimex lectularius and Cimex hemipterus).
Background.Bedbugs have received attention because of recent outbreaks. Bedbug eradication is difficult and often requires a time-consuming multifaceted approach.
Setting.Laboratory and hospital room.
Methods.Bedbugs were exposed to concentrations of ClO2 of 362, 724, and 1,086 parts per million (ppm) in an exposure chamber. Bedbug mortality was then evaluated. The ability of ClO2 to penetrate various spaces in a hospital room was evaluated using Bacillus atrophaeus as a surrogate organism.
Results.Concentrations of 1,086 and 724 ppm of ClO2 yielded 100% bedbug mortality assessed immediately after exposure. Live young were not observed for any eggs exposed to ClO2 gas. ClO2 at a concentration of 362 ppm for 1,029 parts per million hours (ppm-hours) achieved 100% mortality 6 hours after exposure, and a ClO2 concentration of 362 ppm for 519 ppm-hours achieved 100% mortality 18 hours after exposure. Up to a 6-log reduction in B. atrophaeus spores was achieved using similar concentrations of ClO2 in a hospital room, indicating that the concentrations needed to kill bedbugs can be achieved throughout a hospital room.
Conclusions.ClO2 is effective at killing bedbugs in the laboratory, and similar concentrations of ClO2 gas can be achieved in a hospital room. ClO2 can be removed from the room without residuals.
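The ppm-hour figures above express a cumulative gas dose: concentration multiplied by exposure time, so the same dose can be reached with a higher concentration for a shorter time or vice versa. A minimal sketch of that arithmetic (illustrative only; the abstract reports doses and concentration, and the exposure times below are simply dose divided by concentration):

```python
def ppm_hours(concentration_ppm, exposure_hours):
    """Cumulative gas dose: concentration (ppm) multiplied by time (hours)."""
    return concentration_ppm * exposure_hours

# Exposure times implied by the reported doses at 362 ppm:
for dose in (1029, 519):
    hours = dose / 362
    print(f"{dose} ppm-hours at 362 ppm corresponds to about {hours:.1f} h")
```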
Temporary Central Venous Catheter Utilization Patterns in a Large Tertiary Care Center: Tracking the “Idle Central Venous Catheter”
- Sheri Chernetsky Tejedor, David Tong, Jason Stein, Christina Payne, Daniel Dressler, Wenqiong Xue, James P. Steinberg
- Published online by Cambridge University Press:
- 02 January 2015, pp. 50-57
Objectives.
Although central venous catheter (CVC) dwell time is a major risk factor for catheter-related bloodstream infections (CR-BSIs), few studies reveal how often CVCs are retained when not needed (“idle”). We describe use patterns for temporary CVCs, including peripherally inserted central catheters (PICCs), on non-ICU wards.
Design.A retrospective observational study.
Setting.A 579-bed acute care, academic tertiary care facility.
Methods.A retrospective observational study of a random sample of patients on hospital wards who had a temporary, nonimplanted CVC, with a focus on daily ward CVC justification. A uniform definition of idle CVC-days was used.
Results.We analyzed 89 patients with 146 CVCs (56% of which were PICCs); of 1,433 ward CVC-days, 361 (25.2%) were idle. At least 1 idle day was observed for 63% of patients. Patients had a mean of 4.1 idle days and a mean of 3.4 days with both a CVC and a peripheral intravenous catheter (PIV). After adjusting for ward length of stay, mean CVC dwell time was 14.4 days for patients with PICCs versus 9.0 days for patients with non-PICC temporary CVCs (other CVCs; P < .001). Patients with a PICC had 5.4 days in which they also had a PIV, compared with 10 days for patients with other CVCs (P < .001). Patients with PICCs had more days in which the only justification for the CVC was intravenous administration of antimicrobial agents (8.5 vs 1.6 days; P = .0013).
Conclusions.Significant proportions of ward CVC-days were unjustified. Reducing “idle CVC-days” and facilitating the appropriate use of PIVs may reduce CVC-days and CR-BSI risk.
Infect Control Hosp Epidemiol 2012;33(1):50-57
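The headline 25.2% figure is pooled arithmetic over catheter-days rather than patients: idle days divided by total ward CVC-days. A minimal sketch reproducing it from the totals reported in the abstract:

```python
def idle_fraction(idle_days, total_days):
    """Proportion of catheter-days with no documented indication ("idle")."""
    return idle_days / total_days

# Totals from the abstract: 361 idle days out of 1,433 ward CVC-days.
frac = idle_fraction(361, 1433)
print(f"{frac:.1%} of ward CVC-days were idle")
```

Pooling by catheter-days weights long dwell times more heavily than a per-patient average would, which is the appropriate denominator when the exposure of interest (CR-BSI risk) accrues per day of catheterization.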
The Epidemiology of Methicillin-Resistant Staphylococcus aureus on a Burn Trauma Unit
- Marin Schweizer, Melissa Ward, Sandra Cobb, Jennifer McDanel, Laurie Leder, Lucy Wibbenmeyer, Barbara Latenser, Daniel Diekema, Loreen Herwaldt
- Published online by Cambridge University Press:
- 02 January 2015, pp. 1118-1125
Objective.
We assessed the frequency and relatedness of methicillin-resistant Staphylococcus aureus (MRSA) isolates to determine whether healthcare workers, the environment, or admitted patients could be a reservoir for MRSA on a burn trauma unit (BTU). We also assessed risk factors for MRSA colonization among BTU patients.
Design.Prospective cohort study and surveillance for MRSA carriage.
Setting.BTU of a Midwestern academic medical center.
Patients and Participants.Patients admitted to a BTU from February 2009 through January 2010 and healthcare workers on this unit during the same time period.
Methods.Samples for MRSA culture were collected on admission from the nares and wounds of all BTU patients. We also collected culture samples from the throat, axilla, antecubital fossa, groin, and perianal area of 12 patients per month. Samples collected from healthcare workers' nares and from environmental sites were cultured quarterly. MRSA isolates were typed by pulsed-field gel electrophoresis.
Results.Of 144 patients, 24 (17%) carried MRSA in their nares on admission. Male sex (odds ratio [OR], 5.51; 95% confidence interval [95% CI], 1.25–24.30), admission for necrotizing fasciitis (OR, 7.66; 95% CI, 1.64–35.81), and MRSA colonization of a site other than the nares (OR, 23.40; 95% CI, 6.93–79.01) were independent predictors of MRSA nasal carriage. Cultures of samples from 4 healthcare workers and from 4 environmental sites were positive. Two patients were colonized with strains that were indistinguishable from strains collected from a healthcare worker or the environment.
Conclusions.Patients were a major reservoir for MRSA. Infection control efforts should focus on preventing transmission of MRSA from patients who are MRSA carriers to other patients on the unit.