Original Articles
Pseudomonas aeruginosa Nosocomial Pneumonia: Impact of Pneumonia Classification
- Scott T. Micek, Marin H. Kollef, Antoni Torres, Catherine Chen, Jordi Rello, Jean Chastre, Massimo Antonelli, Tobias Welte, Bernard Clair, Helmut Ostermann, Esther Calbo, Richard Wunderink, Francesco Menichetti, Garrett Schramm, Vandana Menon
- Published online by Cambridge University Press: 20 July 2015, pp. 1190–1197
OBJECTIVE
To describe and compare the mortality associated with nosocomial pneumonia due to Pseudomonas aeruginosa (Pa-NP) according to pneumonia classification (community-onset pneumonia [COP], hospital-acquired pneumonia [HAP], and ventilator-associated pneumonia [VAP]).
DESIGN
We conducted a retrospective cohort study of adults with Pa-NP. We compared mortality for Pa-NP among patients with COP, HAP, and VAP and used logistic regression to identify risk factors for hospital mortality and inappropriate initial antibiotic therapy (IIAT).
SETTING
Twelve acute care hospitals in 5 countries (United States, 3; France, 2; Germany, 2; Italy, 2; and Spain, 3).
PATIENTS/PARTICIPANTS
A total of 742 patients with Pa-NP.
RESULTS
Hospital mortality was greater for those with VAP (41.9%) and HAP (40.1%) than for those with COP (24.5%) (P<.001). In multivariate analyses, independent predictors of hospital mortality differed by pneumonia classification (COP: need for mechanical ventilation and intensive care; HAP: multidrug-resistant isolate; VAP: IIAT, increasing age, increasing Charlson comorbidity score, bacteremia, and use of vasopressors). Presence of multidrug resistance was identified as an independent predictor of IIAT for patients with COP and HAP, whereas recent antibiotic administration was protective in patients with VAP.
CONCLUSIONS
Among patients with Pa-NP, pneumonia classification identified patients with different risks for hospital mortality. Specific risk factors for hospital mortality also differed by pneumonia classification, and multidrug resistance appeared to be an important risk factor for IIAT. These findings suggest that pneumonia classification for P. aeruginosa identifies patients with distinct mortality risks and distinct risk factors for outcome and IIAT.
Infect Control Hosp Epidemiol 2015;36(10):1190–1197
Vaccination Policies Among Health Professional Schools: Evidence of Immunity and Allowance of Vaccination Exemptions
- Samantha B. Dolan, Tanya E. Libby, Megan C. Lindley, Faruque Ahmed, John Stevenson, Raymond A. Strikas
- Published online by Cambridge University Press: 29 December 2014, pp. 186–191
OBJECTIVE
To characterize health professional schools by their vaccination policies for acceptable forms of evidence of immunity and exemptions permitted.
METHODS
Data were collected between September 2011 and April 2012 using an Internet-based survey e-mailed to selected types of accredited health professional programs. Schools were identified through accrediting associations for each type of health professional program. Analysis was limited to schools requiring ≥1 vaccine recommended by the Advisory Committee on Immunization Practices (ACIP): measles, mumps, rubella, hepatitis B, varicella, pertussis, and influenza. Weighted bivariate frequencies were generated using SAS 9.3.
RESULTS
Of 2,775 schools surveyed, 75% (n=2,077) responded; of responding schools, 93% (n=1,947) required ≥1 ACIP-recommended vaccination. The proportion of schools accepting ≥1 non–ACIP-recommended form of evidence of immunity varied by vaccine: 42% for pertussis, 37% for influenza, 30% for rubella, 22% for hepatitis B, 18% for varicella, and 9% for measles and mumps. Among schools with ≥1 vaccination requirement, medical exemptions were permitted for ≥1 vaccine by 75% of schools; 54% permitted religious exemptions; 35% permitted personal belief exemptions; and 58% permitted any nonmedical exemption.
CONCLUSIONS
Many schools accept non–ACIP-recommended forms of evidence of immunity, which could lead some students to believe they are protected from vaccine-preventable diseases when they may be susceptible. Additional efforts are needed to better educate school officials about current ACIP recommendations for acceptable forms of evidence of immunity so that school policies can be revised as needed.
Infect Control Hosp Epidemiol 2014;00(0): 1–6
Impact of Universal Gowning and Gloving on Health Care Worker Clothing Contamination
- Calvin Williams, Patty McGraw, Elyse E. Schneck, Anna LaFae, Jesse T. Jacob, Daniela Moreno, Katherine Reyes, G. Fernando Cubillos, Daniel H. Kett, Ronald Estrella, Daniel J. Morgan, Anthony D. Harris, Marci Drees
- Published online by Cambridge University Press: 02 January 2015, pp. 431–437
OBJECTIVE
To determine whether gowning and gloving for all patient care reduces contamination of healthcare worker (HCW) clothing, compared to usual practice.
DESIGN
Cross-sectional surveys.
SETTING
Five study sites were recruited from intensive care units (ICUs) randomized to the intervention arm of the Benefits of Universal Gown and Glove (BUGG) study.
PARTICIPANTS
All HCWs performing direct patient care in the study ICUs were eligible to participate.
METHODS
Surveys were performed first during the BUGG intervention study period (July–September 2012), with universal gowning/gloving, and again after BUGG study conclusion (October–December 2012), with resumption of usual care. During each phase, HCW clothing was sampled at the beginning and near the end of each shift. Cultures were performed using broth enrichment followed by selective media. Acquisition was defined as a negative clothing culture for samples taken at the beginning of a shift and a positive clothing culture for samples taken at the end of the shift.
RESULTS
A total of 348 HCWs participated (21–92 per site), including 179 (51%) during the universal gowning/gloving phase. Overall, 51 (15%) HCWs acquired commonly pathogenic bacteria on their clothing: 13 (7.1%) HCWs acquired bacteria during universal gowning/gloving, and 38 (23%) HCWs acquired bacteria during usual care (odds ratio [OR], 0.3; 95% confidence interval [CI], 0.2–0.6). Pathogens identified included S. aureus (25 isolates, including 7 methicillin-resistant S. aureus [MRSA]), Enterococcus spp. (25, including 1 vancomycin-resistant Enterococcus [VRE]), Pseudomonas spp. (4), Acinetobacter spp. (4), and Klebsiella spp. (2).
CONCLUSION
Nearly 25% of HCWs practicing usual care (gowning and gloving only for patients with known resistant bacteria) contaminate their clothing during their shift. This contamination was reduced by 70% by gowning and gloving for all patient interactions.
Infect Control Hosp Epidemiol 2014;00(0): 1–7
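The gowning/gloving comparison above reduces to a 2×2 table. A minimal sketch that approximately reproduces the reported odds ratio from counts back-calculated from the abstract's percentages (13 of 179 HCWs acquiring bacteria under universal gowning/gloving vs 38 of 169 under usual care); the paper's reported OR of 0.3 (95% CI, 0.2–0.6) may reflect additional adjustment, so this crude calculation is illustrative only:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Crude odds ratio with a Woolf (log-scale) confidence interval for a
    2x2 table: a/b = events/non-events in the exposed group, c/d in the
    comparison group."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Counts reconstructed from the abstract (approximate):
or_, lo, hi = odds_ratio_ci(13, 179 - 13, 38, 169 - 38)  # crude OR ~0.27
```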
Multidrug-Resistant Gram-Negative Bacteria: Inter- and Intradissemination Among Nursing Homes of Residents With Advanced Dementia
- Erika M. C. D’Agata, Daniel Habtemariam, Susan Mitchell
- Published online by Cambridge University Press: 29 April 2015, pp. 930–935
OBJECTIVE
To quantify the extent of inter– and intra–nursing home transmission of multidrug-resistant gram-negative bacteria (MDRGN) among residents with advanced dementia and characterize MDRGN colonization among these residents.
DESIGN
Prospective cohort study.
SETTING
Twenty-two nursing homes in the greater Boston, Massachusetts, area.
PATIENTS
Residents with advanced dementia.
METHODS
Serial rectal surveillance cultures for MDRGN and resident characteristics were obtained every 3 months for 12 months or until death. Molecular typing of MDRGN isolates was performed by pulsed-field gel electrophoresis.
RESULTS
A total of 190 MDRGN isolates from 152 residents with advanced dementia were included in the analyses. Both intra– and inter–nursing home transmission were identified. Genetically related MDRGN strains, recovered from different residents, were detected in 18 (82%) of the 22 nursing homes. The percentage of clonally related strains in these nursing homes ranged from 0% to 86% (average, 35%); more than 50% of strains were clonally related in 3 nursing homes. Co-colonization with more than 1 MDRGN species occurred in 28 residents (18.4%). A total of 168 (88.4%), 20 (10.5%), and 2 (1.0%) MDRGN isolates were resistant to 3, 4, and 5 different antimicrobials or antimicrobial classes, respectively.
CONCLUSIONS
MDRGN are spread both within and between nursing homes among residents with advanced dementia. Infection control interventions should begin to target this high-risk group of nursing home residents.
Infect Control Hosp Epidemiol 2015;36(8):930–935
Short Operative Duration and Surgical Site Infection Risk in Hip and Knee Arthroplasty Procedures
- Kristen V. Dicks, Arthur W. Baker, Michael J. Durkin, Deverick J. Anderson, Rebekah W. Moehring, Luke F. Chen, Daniel J. Sexton, David J. Weber, Sarah S. Lewis
- Published online by Cambridge University Press: 22 September 2015, pp. 1431–1436
OBJECTIVE
To determine the association (1) between shorter operative duration and surgical site infection (SSI) and (2) between surgeon median operative duration and SSI risk among first-time hip and knee arthroplasties.
DESIGN
Retrospective cohort study.
SETTING
A total of 43 community hospitals located in the southeastern United States.
PATIENTS
Adults who developed SSIs according to National Healthcare Safety Network criteria within 365 days of first-time knee or hip arthroplasties performed between January 1, 2008, and December 31, 2012.
METHODS
Log-binomial regression models estimated the association (1) between operative duration and SSI outcome and (2) between surgeon median operative duration and SSI outcome. Hip and knee arthroplasties were evaluated in separate models. Each model was adjusted for American Society of Anesthesiology score and patient age.
RESULTS
A total of 25,531 hip arthroplasties and 42,187 knee arthroplasties were included in the study. The risk of SSI in knee arthroplasties with an operative duration shorter than the 25th percentile was 0.40 times the risk of SSI in knee arthroplasties with an operative duration between the 25th and 75th percentiles (risk ratio [RR], 0.40; 95% confidence interval [CI], 0.38–0.56; P<.01). Short operative duration did not demonstrate a significant association with SSI for hip arthroplasties (RR, 1.04; 95% CI, 0.79–1.37; P=.36). Knee arthroplasty surgeons with shorter median operative durations had a lower risk of SSI than surgeons with typical median operative durations (RR, 0.52; 95% CI, 0.43–0.64; P<.01).
CONCLUSIONS
Short operative durations were not associated with a higher SSI risk for knee or hip arthroplasty procedures in our analysis.
Infect. Control Hosp. Epidemiol. 2015;36(12):1431–1436
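The RR estimates above are ratios of cumulative SSI incidence between operative-duration groups. A minimal sketch with hypothetical counts (the study itself used adjusted log-binomial models, so this crude version is for illustration only):

```python
def risk_ratio(events_a, n_a, events_b, n_b):
    """Crude risk ratio: cumulative incidence in group A over group B."""
    return (events_a / n_a) / (events_b / n_b)

# Hypothetical counts chosen to reproduce the reported knee RR of 0.40:
# 20 SSIs in 10,000 short-duration procedures vs 50 in 10,000 typical ones.
rr = risk_ratio(20, 10_000, 50, 10_000)
```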
Long-Term Outcomes of an Antimicrobial Stewardship Program Implemented in a Hospital with Low Baseline Antibiotic Use
- Timothy C. Jenkins, Bryan C. Knepper, Katherine Shihadeh, Michelle K. Haas, Allison L. Sabel, Andrew W. Steele, Michael L. Wilson, Connie S. Price, William J. Burman, Philip S. Mehler
- Published online by Cambridge University Press: 05 March 2015, pp. 664–672
OBJECTIVE
To evaluate the long-term outcomes of an antimicrobial stewardship program (ASP) implemented in a hospital with low baseline antibiotic use.
DESIGN
Quasi-experimental, interrupted time-series study.
SETTING
Public safety net hospital with 525 beds.
INTERVENTION
Implementation of a formal ASP in July 2008.
METHODS
We conducted a time-series analysis to evaluate the impact of the ASP over a 6.25-year period (July 1, 2008–September 30, 2014) while controlling for trends during a 3-year preintervention period (July 1, 2005–June 30, 2008). The primary outcome measures were total antibacterial and antipseudomonal use in days of therapy (DOT) per 1,000 patient-days (PD). Secondary outcomes included antimicrobial costs and resistance, hospital-onset Clostridium difficile infection, and other patient-centered measures.
RESULTS
During the preintervention period, total antibacterial and antipseudomonal use were declining (−9.2 and −5.5 DOT/1,000 PD per quarter, respectively). During the stewardship period, both continued to decline, although at lower rates (−3.7 and −2.2 DOT/1,000 PD, respectively), resulting in a slope change of 5.5 DOT/1,000 PD per quarter for total antibacterial use (P=.10) and 3.3 DOT/1,000 PD per quarter for antipseudomonal use (P=.01). Antibiotic expenditures declined markedly during the stewardship period (−$295.42/1,000 PD per quarter, P=.002). There were variable changes in antimicrobial resistance and few apparent changes in C. difficile infection and other patient-centered outcomes.
CONCLUSION
In a hospital with low baseline antibiotic use, implementation of an ASP was associated with sustained reductions in total antibacterial and antipseudomonal use and declining antibiotic expenditures. Common ASP outcome measures have limitations.
Infect Control Hosp Epidemiol 2015;00(0): 1–9
Implementing a Multifaceted Intervention to Decrease Central Line–Associated Bloodstream Infections in SEHA (Abu Dhabi Health Services Company) Intensive Care Units: The Abu Dhabi Experience
- Asad Latif, Bernadette Kelly, Hanan Edrees, Paula S. Kent, Sallie J. Weaver, Branislava Jovanovic, Hadeel Attallah, Kristin K. de Grouchy, Ali Al-Obaidli, Christine A. Goeschel, Sean M. Berenholtz
- Published online by Cambridge University Press: 14 April 2015, pp. 816–822
OBJECTIVE
To determine whether implementation of a multifaceted intervention would significantly reduce the incidence of central line–associated bloodstream infections.
DESIGN
Prospective cohort collaborative.
SETTING AND PARTICIPANTS
Intensive care units of the Abu Dhabi Health Services Company hospitals in the Emirate of Abu Dhabi.
INTERVENTIONS
A bundled intervention consisting of 3 components was implemented. First, a multifaceted approach targeted clinician use of evidence-based infection prevention recommendations, with tools that supported the identification of local barriers to these practices and implementation ideas to help ensure patients received the practices. Second, comprehensive unit-based safety teams were created to improve safety culture and teamwork. Finally, the measurement and feedback of monthly infection rate data to safety teams, senior leaders, and staff in participating intensive care units was encouraged. The main outcome measure was the quarterly rate of central line–associated bloodstream infections.
RESULTS
Eighteen intensive care units from 7 hospitals in Abu Dhabi implemented the program and achieved an overall 38% reduction in their central line–associated bloodstream infection rate, adjusted at the hospital and unit level. The number of units with a quarterly central line–associated bloodstream infection rate of less than 1 infection per 1,000 catheter-days increased by almost 40% between the baseline and postintervention periods.
CONCLUSION
A significant reduction in the global morbidity and mortality associated with central line–associated bloodstream infections is possible across intensive care units in disparate settings using a multifaceted intervention.
Infect. Control Hosp. Epidemiol. 2015;36(7):816–822
Environmental Transmission of Clostridium difficile: Association Between Hospital Room Size and C. difficile Infection
- Justine Jou, John Ebrahim, Frances S. Shofer, Keith W. Hamilton, John Stern, Jennifer H. Han
- Published online by Cambridge University Press: 05 February 2015, pp. 564–568
OBJECTIVE
To evaluate the association between hospital room square footage and acquisition of nosocomial Clostridium difficile infection (CDI).
METHODS
A case-control study was conducted at a university hospital during the calendar year of 2011. Case patients were adult inpatients with nosocomial CDI. Control patients were hospitalized patients without CDI and were randomly selected and matched to cases in a 2:1 ratio on the basis of hospital length of stay in 3-day strata. A multivariate model was developed using conditional logistic regression to evaluate risk factors for nosocomial CDI.
RESULTS
A total of 75 case patients and 150 control patients were included. On multivariate analyses, greater square footage of the hospital room was associated with a significantly increased risk of acquiring CDI (odds ratio for every 50 ft2 increase, 3.00; 95% CI, 1.75–5.16; P<.001). Other factors associated with an increased risk of CDI were location in a single room (odds ratio, 3.43; 95% CI, 1.31–9.05; P=.01), malignant tumor (4.56; 1.82–11.4; P=.001), and receipt of cefepime (2.48; 1.06–5.82; P=.04) or immunosuppressants (6.90; 2.07–23.0; P=.002) within the previous 30 days.
CONCLUSIONS
Greater room square footage increased the risk of acquisition of CDI in the hospital setting, likely owing to increased environmental contamination and/or difficulty in effective disinfection. Future studies are needed to determine feasible and effective cleaning protocols based on patient and room characteristics.
Infect Control Hosp Epidemiol 2015;00(0): 1–5
In the Endemic Setting, Clostridium difficile Ribotype 027 Is Virulent But Not Hypervirulent
- Samuel L. Aitken, M. Jahangir Alam, Mohammed Khaleduzzuman, Seth T. Walk, William L. Musick, Vy P. Pham, Jennifer L. Christensen, Robert L. Atmar, Yang Xie, Kevin W. Garey
- Published online by Cambridge University Press: 20 August 2015, pp. 1318–1323
BACKGROUND
Conflicting reports have been published on the association between Clostridium difficile ribotypes and severe disease outcomes in patients with C. difficile infection (CDI); several so-called hypervirulent ribotypes have been described. We performed a multicenter study to assess severe disease presentation and severe outcomes among CDI patients infected with different ribotypes.
METHODS
Stool samples that tested positive for C. difficile toxin were collected and cultured from patients who presented to any of 7 different hospitals in Houston, Texas (2011–2013). C. difficile was characterized using a fluorescent PCR ribotyping method. Medical records were reviewed to determine clinical characteristics and ribotype association with severe CDI presentation (ie, leukocytosis and/or hypoalbuminemia) and severe CDI outcomes (ie, ICU admission, ileus, toxic megacolon, colectomy, and/or in-hospital death).
RESULTS
Our study included 715 patients aged 61±18 years (female: 63%; median Charlson comorbidity index: 2.5±2.4; hospital-onset CDI: 45%; severe CDI: 36.7%; severe CDI outcomes: 12.3%). The most common ribotypes were 027, 014-020, FP311, 002, 078-126, and 001. Ribotype 027 was a significant independent predictor of severe disease (adjusted odds ratio [aOR], 2.24; 95% confidence interval [CI], 1.53–3.29; P<.001) and severe CDI outcomes (aOR, 1.71; 95% CI, 1.02–2.85; P=.041) compared with all other ribotypes in aggregate. However, in an analysis using all common ribotypes as individual variables, ribotype 027 was not associated with severe CDI outcomes more often than other ribotypes.
CONCLUSION
Ribotype 027 showed virulence equal to that of other ribotypes identified in this endemic setting. Clinical severity markers of CDI may be more predictive of severe CDI outcomes than a particular ribotype.
Infect. Control Hosp. Epidemiol. 2015;36(11):1318–1323
Clinical Diagnoses and Antimicrobials Predictive of Pediatric Antimicrobial Stewardship Recommendations: A Program Evaluation
- Jennifer L. Goldman, Brian R. Lee, Adam L. Hersh, Diana Yu, Leslie M. Stach, Angela L. Myers, Mary Anne Jackson, James C. Day, Russell J. McCulloh, Jason G. Newland
- Published online by Cambridge University Press: 16 March 2015, pp. 673–680
BACKGROUND
The number of pediatric antimicrobial stewardship programs (ASPs) is increasing, and program evaluation is a key component to improve efficiency and enhance stewardship strategies.
OBJECTIVE
To determine the antimicrobials and diagnoses most strongly associated with a recommendation provided by a well-established pediatric ASP.
DESIGN AND SETTING
Retrospective cohort study from March 3, 2008, to March 2, 2013, of all ASP reviews performed at a free-standing pediatric hospital.
METHODS
ASP recommendations were classified as follows: stop therapy, modify therapy, optimize therapy, or consult infectious diseases. A multinomial distribution model was fit to determine the probability of each ASP recommendation category on the basis of the specific antimicrobial agent or disease category. A logistic model was used to determine the odds of recommendation disagreement by the prescribing clinician.
RESULTS
The ASP made 2,317 recommendations: stop therapy (45%), modify therapy (26%), optimize therapy (19%), or consult infectious diseases (10%). Third-generation cephalosporins (0.20) were the antimicrobials with the highest predictive probability of an ASP recommendation, whereas linezolid (0.05) had the lowest probability. Community-acquired pneumonia (0.26) was the diagnosis with the highest predictive probability of an ASP recommendation, whereas fever/neutropenia (0.04) had the lowest probability. Disagreement with ASP recommendations by the prescribing clinician occurred 22% of the time, most commonly involving community-acquired pneumonia and ear/nose/throat infections.
CONCLUSIONS
Evaluation of our pediatric ASP identified specific clinical diagnoses and antimicrobials associated with an increased likelihood of an ASP recommendation. Focused interventions targeting these high-yield areas may result in increased program efficiency and efficacy.
Infect Control Hosp Epidemiol 2015;00(0): 1–8
Evaluation of a Pulsed Xenon Ultraviolet Disinfection System for Reduction of Healthcare-Associated Pathogens in Hospital Rooms
- Michelle M. Nerandzic, Priyaleela Thota, Thriveen Sankar C., Annette Jencson, Jennifer L. Cadnum, Amy J. Ray, Robert A. Salata, Richard R. Watkins, Curtis J. Donskey
- Published online by Cambridge University Press: 05 January 2015, pp. 192–197
OBJECTIVE
To determine the effectiveness of a pulsed xenon ultraviolet (PX-UV) disinfection device for reduction in recovery of healthcare-associated pathogens.
SETTING
Two acute-care hospitals.
METHODS
We examined the effectiveness of PX-UV for killing of Clostridium difficile spores, methicillin-resistant Staphylococcus aureus (MRSA), and vancomycin-resistant Enterococcus (VRE) on glass carriers and evaluated the impact of pathogen concentration, distance from the device, organic load, and shading from the direct field of radiation on killing efficacy. We compared the effectiveness of PX-UV and ultraviolet-C (UV-C) irradiation, each delivered for 10 minutes at 4 feet. In hospital rooms, the frequency of native pathogen contamination on high-touch surfaces was assessed before and after 10 minutes of PX-UV irradiation.
RESULTS
On carriers, irradiation delivered for 10 minutes at 4 feet from the PX-UV device reduced recovery of C. difficile spores, MRSA, and VRE by 0.55±0.34, 1.85±0.49, and 0.6±0.25 log10 colony-forming units (CFU)/cm2, respectively. Increasing distance from the PX-UV device dramatically reduced killing efficacy, whereas pathogen concentration, organic load, and shading did not. Continuous UV-C achieved significantly greater log10 CFU reductions than PX-UV irradiation on glass carriers. On frequently touched surfaces, PX-UV significantly reduced the frequency of positive C. difficile, VRE, and MRSA culture results.
CONCLUSIONS
The PX-UV device reduced recovery of MRSA, C. difficile, and VRE on glass carriers and on frequently touched surfaces in hospital rooms with a 10-minute UV exposure time. PX-UV was not more effective than continuous UV-C in reducing pathogen recovery on glass slides, suggesting that both forms of UV have some effectiveness at relatively short exposure times.
Infect Control Hosp Epidemiol 2014;00(0): 1–6
Interferon-γ Release Assay vs. Tuberculin Skin Test for Tuberculosis Screening in Exposed Healthcare Workers: A Longitudinal Multicenter Comparative Study
- Jean-Christophe Lucet, Dominique Abiteboul, Candice Estellat, Carine Roy, Sylvie Chollet-Martin, Florence Tubach, Guislaine Carcelain, and the QUANTIPS Study Group
- Published online by Cambridge University Press: 16 February 2015, pp. 569–574
OBJECTIVE
Healthcare workers (HCWs), especially those caring for patients with tuberculosis (TB), are at high risk of acquiring that disease. The poor specificity of tuberculin skin testing (TST) prompted us to evaluate the effectiveness of the interferon-γ release assay (IGRA) in comparison with TST in a large prospective, multicenter, 1-year study of HCWs with occupational exposure to TB.
METHODS
HCWs from high-risk units at 14 university hospitals were invited to participate and underwent both TST and IGRA (first QuantiFERON-TB Gold In-Tube [QFT-G], then T-SPOT.TB if QFT-G was indeterminate) at baseline and after 1 year. We collected demographic characteristics, country of birth, history of TB, immunosuppression, past exposure to TB, history of BCG vaccination, results of the most recent TST, job category, and duration of current function.
RESULTS
Among 807 HCWs enrolled, current or past TST at baseline was positive (≥15 mm) in 282 (34.9%); the IGRA was positive in 113 (14.0%) and indeterminate in 3 (0.4%). After 1 year, 594 HCWs had both an IGRA and a TST (or prior TST ≥15 mm) at baseline and an IGRA and TST (if indicated). The conversion rate was 2.5% (9 of 367) with TST and 7.6% (45 of 594) with IGRA, with poor agreement between the 2 tests. Using only QFT-G, conversion (9.9%) and reversion (17.8%) rates were higher for baseline QFT-G-positive quantitative values <1 IU/mL.
CONCLUSION
TST and the IGRA yielded discordant results. The value of IGRA in addition to TST remains undetermined; the two should be jointly interpreted in decision-making (clinical trial registration NCT00797836).
Infect Control Hosp Epidemiol 2015;00(0): 1–6
Prevention of Needle-Stick Injuries in Healthcare Facilities: A Meta-Analysis
- Lukman H. Tarigan, Manuel Cifuentes, Margaret Quinn, David Kriebel
- Published online by Cambridge University Press: 13 March 2015, pp. 823–829
OBJECTIVE
To estimate the summary effectiveness of different needle-stick injury (NSI)-prevention interventions.
DESIGN
We conducted a meta-analysis of English-language articles evaluating methods for reducing needle-stick, sharp, or percutaneous injuries published from 2002 to 2012, identified using the PubMed and Medline EBSCO databases. Data were extracted using a standardized instrument. Random effects models were used to estimate the summary effectiveness of 3 interventions: training alone, safety-engineered devices (SEDs) alone, and the combination of training and SEDs.
SETTING
Healthcare facilities, mainly hospitals.
PARTICIPANTS
Healthcare workers, including physicians, midwives, and nurses.
RESULTS
From an initial pool of 250 potentially relevant studies, 17 studies met our inclusion criteria. Six eligible studies evaluated the effectiveness of training interventions; the summary effect of the training intervention was 0.66 (95% CI, 0.50–0.89). The summary effect across the 5 studies that assessed the efficacy of SEDs was 0.51 (95% CI, 0.40–0.64). A total of 8 studies evaluated the effectiveness of training plus SEDs, with a summary effect of 0.38 (95% CI, 0.28–0.50).
CONCLUSION
Training combined with SEDs can substantially reduce the risk of NSIs.
Infect Control Hosp Epidemiol 2015;36(7):823–829
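The summary effects above come from random-effects pooling. A minimal sketch of DerSimonian-Laird pooling of risk ratios, with within-study variances back-calculated from 95% CI widths; the input values below are hypothetical, not the studies in this meta-analysis:

```python
import math

def pooled_rr(studies):
    """DerSimonian-Laird random-effects pooled risk ratio.
    `studies` is a list of (rr, ci_lower, ci_upper) tuples."""
    y = [math.log(rr) for rr, lo, hi in studies]
    # Back-calculate each study's log-scale variance from its 95% CI width.
    v = [((math.log(hi) - math.log(lo)) / (2 * 1.96)) ** 2 for _, lo, hi in studies]
    w = [1.0 / vi for vi in v]                              # fixed-effect weights
    ybar = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
    q = sum(wi * (yi - ybar) ** 2 for wi, yi in zip(w, y)) # Cochran's Q
    c = sum(w) - sum(wi * wi for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(studies) - 1)) / c) if c > 0 else 0.0
    w_re = [1.0 / (vi + tau2) for vi in v]                  # random-effects weights
    return math.exp(sum(wi * yi for wi, yi in zip(w_re, y)) / sum(w_re))

# Hypothetical example: three studies of a training intervention.
summary = pooled_rr([(0.7, 0.5, 0.98), (0.6, 0.4, 0.9), (0.66, 0.45, 0.97)])
```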
Surgical Site Infection After Primary Hip and Knee Arthroplasty: A Cohort Study Using a Hospital Database
- Leslie Grammatico-Guillon, Sabine Baron, Philippe Rosset, Christophe Gaborit, Louis Bernard, Emmanuel Rusch, Pascal Astagneau
- Published online by Cambridge University Press: 08 July 2015, pp. 1198–1207
BACKGROUND
Although rare, hip or knee arthroplasty infection (HKAI) has serious medical consequences.
OBJECTIVE
To assess the routine use of a hospital discharge detection algorithm for prosthetic joint infection as a novel additional tool for surveillance.
METHODS
A historic 5-year cohort study was built using a hospital database of people undergoing a first hip or knee arthroplasty in 1 French region (2.5 million inhabitants; 39 private and public hospitals): 32,678 patients with an arthroplasty code plus the corresponding prosthetic material code were tagged. HKAI occurrence was then tracked during follow-up on the basis of a previously validated algorithm using International Statistical Classification of Diseases, Tenth Revision, codes as well as the coded surgical procedures. HKAI incidence density was estimated during the follow-up (up to 4 years after surgery); risk factors were analyzed using Cox regression.
RESULTS
A total of 604 HKAI patients were identified: 1-year HKAI incidence was 1.31%, and incidence density was 2.2/100 person-years for hip and 2.5/100 person-years for knee arthroplasty. HKAI occurred within the first 30 days after surgery in 30% of cases but more than 1 year after replacement in 29%. Patients who were aged 75 years or older, male, or had liver disease, alcohol abuse, or ulcer sores had a higher risk of infection. The inpatient case fatality in HKAI patients was 11.4%.
CONCLUSIONS
The hospital database method used to measure occurrence and risk factors of prosthetic joint infection helped to survey HKAI and could optimize healthcare delivery.
Infect Control Hosp Epidemiol 2015;36(10):1198–1207
Cost-Effectiveness Analysis of Fecal Microbiota Transplantation for Recurrent Clostridium difficile Infection
- Raghu U. Varier, Eman Biltaji, Kenneth J. Smith, Mark S. Roberts, M. Kyle Jensen, Joanne LaFleur, Richard E. Nelson
- Published online by Cambridge University Press: 02 January 2015, pp. 438–444
OBJECTIVE
Clostridium difficile infection (CDI) places a high burden on the US healthcare system. Recurrent CDI (RCDI) occurs frequently. Recently proposed guidelines from the American College of Gastroenterology (ACG) and the American Gastroenterology Association (AGA) include fecal microbiota transplantation (FMT) as a therapeutic option for RCDI. The purpose of this study was to estimate the cost-effectiveness of FMT compared with vancomycin for the treatment of RCDI in adults, specifically following guidelines proposed by the ACG and AGA.
DESIGN
We constructed a decision-analytic computer simulation using inputs from the published literature to compare the standard approach of tapered vancomycin with FMT for RCDI from the third-party payer perspective. Our effectiveness measure was quality-adjusted life years (QALYs). Because simulated patients were followed for 90 days, discounting was not necessary. One-way and probabilistic sensitivity analyses were performed.
RESULTS
Base-case analysis showed that FMT was less costly ($1,669 vs $3,788) and more effective (0.242 QALYs vs 0.235 QALYs) than vancomycin for RCDI. One-way sensitivity analyses showed that FMT was the dominant strategy (both less expensive and more effective) if cure rates for FMT and vancomycin were ≥70% and <91%, respectively, and if the cost of FMT was <$3,206. Probabilistic sensitivity analysis, varying all parameters simultaneously, showed that FMT was the dominant strategy over 10,000 second-order Monte Carlo simulations.
CONCLUSIONS
Our results suggest that FMT may be a cost-saving intervention in managing RCDI. Implementation of FMT for RCDI may help decrease the economic burden to the healthcare system.
Infect Control Hosp Epidemiol 2014;00(0): 1–7
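The base-case finding above is an example of strong dominance: FMT is both cheaper and more effective, so no incremental cost-effectiveness ratio (ICER) is needed. A minimal sketch of that decision logic (the function name and structure are illustrative, not the authors' model):

```python
def compare_strategies(cost_a, qaly_a, cost_b, qaly_b):
    """Dominance check for two strategies; falls back to the ICER of A vs B
    when neither strategy dominates."""
    if cost_a <= cost_b and qaly_a >= qaly_b:
        return "A dominates B"
    if cost_b <= cost_a and qaly_b >= qaly_a:
        return "B dominates A"
    icer = (cost_a - cost_b) / (qaly_a - qaly_b)
    return f"ICER of A vs B: ${icer:,.0f} per QALY"

# Base-case values from the abstract: FMT (A) $1,669 / 0.242 QALYs vs
# vancomycin (B) $3,788 / 0.235 QALYs.
result = compare_strategies(1_669, 0.242, 3_788, 0.235)  # FMT dominates
```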
Factors Influencing Field Testing of Alcohol-Based Hand Rubs
- Raphaële Girard, Emmanuelle Carre, Valérie Mermet, Crespin C. Adjide, Sylviane Blaise, Monique Dagain, Christine Debeuret, Stéphane Delande, Valérie Dubois, Pascal Fascia, Caroline Hadjadj, Marianne Honnart, Christelle Labrande, Agnès Lasheras Bauduin, Adeline Martin, Françoise Petiteau Moreau, Nicole Roattino, Estelle Rougeot, Jacqueline Shum Cheong Sing, Martine Urban, Marie Laure Valdeyron
- Published online by Cambridge University Press:
- 29 December 2014, pp. 302-310
BACKGROUND
According to the World Health Organization guidelines, field tests, in the context of a bid for the supply of alcohol-based hand rubs, should take into account climatic region, test period, products already in use, and type of use (hygienic or surgical) when assessing tolerance. This laborious method is often contested.
OBJECTIVE
To conduct a post hoc analysis of the data from a large bid, covering 5 factors, to validate the relevance of their inclusion.
METHODS
For the purposes of the bid, products were compared in terms of the 4 World Health Organization tolerance criteria (appearance, intactness, moisture content, sensation) during product testing and were separated into groups on the basis of the studied factors. The post hoc analysis method included (1) comparison of the mean before-and-after difference in self-evaluation of the skin on the 4 World Health Organization tolerance criteria between climatic regions, test periods, products already in use, test products, and types of use; and (2) generalized linear models taking into account all studied factors.
RESULTS
The analysis included data for 1,925 pairs of professionals. The means of the differences observed were independently and significantly associated with the test period (P<.001), hygienic versus surgical use (P=.010 to .041, not significant for appearance), the product already in use (significant for appearance, P=.021), and the test product (P<.001). The association with climatic region was significant only in the nonadjusted analysis.
CONCLUSION
The type of use, the test period, and the product already in use should be taken into account when designing field tests of alcohol-based hand rubs.
Infect Control Hosp Epidemiol 2014;00(0): 1–9
Usefulness of Adenosine Triphosphate Bioluminescence Assay (ATPmetry) for Monitoring the Reprocessing of Endoscopes
- Pierre Batailler, Philippe Saviuc, Romain Picot-Gueraud, Jean-Luc Bosson, Marie-Reine Mallaret
- Published online by Cambridge University Press:
- 20 October 2015, pp. 1437-1443
OBJECTIVE
To assess the diagnostic value of an adenosine triphosphate bioluminescence assay (ATPmetry), compared with microbiologic sampling, for monitoring the effectiveness of endoscope reprocessing.
DESIGN
Diagnostic study.
SETTING
A 2,200-bed teaching hospital performing 5,000 to 6,000 endoscopic procedures annually.
INCLUSION CRITERIA
All samples from bronchial or gastrointestinal endoscopes, whatever the context.
METHODS
Samples for microbiologic analysis and ATPmetry measurements were taken when each endoscope was inspected following reprocessing. Sampling was performed by flushing each endoscope with 300 mL of Neutralizing Pharmacopeia Diluent thiosulfate rinsing solution divided equally between the endoscope channels. For each endoscope, a first series of 3 ATPmetry measurements was made on a vial containing the first jet from each channel and a second series on the whole sample.
RESULTS
Of 165 samples from endoscopes, 11 exceeded the acceptability threshold of 25 colony-forming units/endoscope. In the first jet collected, the median (interquartile range) level of ATPmetry was 30.5 (15.3–37.7) relative light units (RLU) for samples with 25 or fewer colony-forming units compared with 37.0 (34.7–39.3) RLU for samples with more than 25 colony-forming units (P=.008). For the whole sample, the corresponding medians were 24.8 (14.3–36.3) RLU and 36.3 (36.0–38.3) RLU (P=.006). After adjusting for the batch of cleansing solution used, no difference in ATPmetry values was found between microbiologically acceptable and unacceptable samples.
CONCLUSION
ATPmetry cannot be used as an alternative or complementary approach to microbiologic tests for monitoring the reprocessing of endoscopes in France.
Infect Control Hosp Epidemiol 2015;36(12):1437–1443
Whole Genome Sequencing in Real-Time Investigation and Management of a Pseudomonas aeruginosa Outbreak on a Neonatal Intensive Care Unit
- Rebecca J. Davis, Slade O. Jensen, Sebastiaan Van Hal, Björn Espedido, Adrienne Gordon, Rima Farhat, Raymond Chan
- Published online by Cambridge University Press:
- 08 June 2015, pp. 1058-1064
OBJECTIVE
To use whole genome sequencing to describe the likely origin of an outbreak of Pseudomonas aeruginosa in a neonatal unit.
DESIGN
Outbreak investigation.
SETTING
The neonatal intensive care unit service of a major obstetric tertiary referral center.
PATIENTS
Infants admitted to the neonatal unit who developed P. aeruginosa colonization or infection.
METHODS
We undertook whole genome sequencing of P. aeruginosa strains isolated from colonized infants and from the neonatal unit environment.
RESULTS
Eighteen infants were colonized with P. aeruginosa. Isolates from 12 infants and 7 environmental samples were sequenced. All but one of the clinical isolates clustered within ST253, and no differences were detected between unmapped reads. The environmental isolates revealed a variety of sequence types, indicating a large and diverse bioburden within the unit, which was subsequently confirmed via enterobacterial repetitive intergenic consensus–polymerase chain reaction typing of post-outbreak isolates. One environmental isolate, obtained from a sink in the unit, clustered within ST253 and differed from the outbreak strain by only 9 single-nucleotide polymorphisms. This information allowed us to focus infection control activities on this sink.
CONCLUSIONS
Whole genome sequencing can provide detailed information in a clinically relevant time frame to aid management of outbreaks in critical patient management areas. The superior discriminatory power of this method makes it a powerful tool in infection control.
Infect Control Hosp Epidemiol 2015;36(9):1058–1064
Validation of an Automated Surveillance Approach for Drain-Related Meningitis: A Multicenter Study
- Maaike S. M. van Mourik, Annet Troelstra, Jan Willem Berkelbach van der Sprenkel, Marischka C. E. van der Jagt-Zwetsloot, Jolande H. Nelson, Piet Vos, Mark P. Arts, Paul J. W. Dennesen, Karel G. M. Moons, Marc J. M. Bonten
- Published online by Cambridge University Press:
- 05 January 2015, pp. 65-75
OBJECTIVE
Manual surveillance of healthcare-associated infections is cumbersome and vulnerable to subjective interpretation. Automated systems are under development to improve the efficiency and reliability of surveillance, for example, by selecting high-risk patients requiring manual chart review. In this study, we aimed to validate a previously developed multivariable prediction modeling approach for detecting drain-related meningitis (DRM) in neurosurgical patients and to assess its merits compared with conventional methods of automated surveillance.
METHODS
Prospective cohort study in 3 hospitals assessing the accuracy and efficiency of 2 automated surveillance methods for detecting DRM, the multivariable prediction model and a classification algorithm, using manual chart review as the reference standard. All 3 methods of surveillance were performed independently. Patients receiving cerebrospinal fluid drains were included (2012–2013); children, patients who died within 24 hours, and patients with pre-existing meningitis were excluded. Data required by the automated surveillance methods were extracted from routine-care clinical data warehouses.
RESULTS
In total, DRM occurred in 37 of 366 external cerebrospinal fluid drainage episodes (12.3/1,000 drain days at risk). The multivariable prediction model had good discriminatory power (area under the ROC curve, 0.91–1.00 by hospital) and adequate overall calibration, and it could identify high-risk patients requiring manual confirmation with 97.3% sensitivity and 52.2% positive predictive value, decreasing the workload for manual surveillance by 81%. The multivariable approach was more efficient than the classification algorithm in 2 of 3 hospitals.
CONCLUSIONS
Automated surveillance of DRM using a multivariable prediction model in multiple hospitals considerably reduced the burden of manual chart review at near-perfect sensitivity.
Infect Control Hosp Epidemiol 2015;36(1): 65–75
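The accuracy figures in the abstract above fit together arithmetically, which a short Python sketch can show. The counts of flagged episodes (69) and true positives (36) are back-calculated assumptions consistent with the reported percentages, not numbers stated in the abstract.

```python
# Reconstructed counts (an assumption back-calculated from the abstract's
# percentages): 366 drainage episodes, 37 with DRM; suppose the model
# flags 69 episodes for manual review, 36 of which are true DRM cases.
episodes, drm_cases = 366, 37
flagged, true_positives = 69, 36

sensitivity = true_positives / drm_cases       # fraction of DRM cases caught
ppv = true_positives / flagged                 # fraction of flags that are DRM
workload_reduction = 1 - flagged / episodes    # charts no longer reviewed

print(f"sensitivity        {sensitivity:.1%}")          # prints 97.3%
print(f"PPV                {ppv:.1%}")                  # prints 52.2%
print(f"workload reduction {workload_reduction:.1%}")   # prints 81.1%
```

The point of such a model is visible in the last line: manual review shrinks from 366 charts to 69, at the cost of one missed DRM case (36 of 37 detected).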
Risk Factors for Recurrence of Carbapenem-Resistant Enterobacteriaceae Carriage: Case-Control Study
- Yossi Bart, Mical Paul, Orna Eluk, Yuval Geffen, Galit Rabino, Khetam Hussein
- Published online by Cambridge University Press:
- 14 April 2015, pp. 936-941
BACKGROUND
The natural history of carbapenem-resistant Enterobacteriaceae (CRE) carriage and the timing and procedures required to safely presume a CRE-free status are unclear.
OBJECTIVE
To determine risk factors for recurrence of CRE among presumed CRE-free patients.
METHODS
Case-control study including CRE carriers in whom CRE carriage had presumably ended, following at least 2 negative screening samples on separate days. Recurrence of CRE carriage was identified through clinical samples and repeated rectal screening during subsequent admissions to any healthcare facility in Israel. Patients with CRE recurrence (cases) were compared with recurrence-free patients (controls). The duration of follow-up was 1 year for all surviving patients.
RESULTS
Included were 276 prior CRE carriers who were declared CRE-free. Thirty-six persons (13%) experienced recurrence of CRE carriage within a year after presumed eradication. Factors significantly associated with CRE recurrence on multivariable analysis were the time in months between the last positive CRE sample and presumed eradication (odds ratio, 0.94 [95% CI, 0.89–0.99] per month), the presence of foreign bodies at the time of presumed eradication (4.6 [1.64–12.85]), and recurrent admissions to healthcare facilities during follow-up (3.15 [1.05–9.47]). The rate of CRE recurrence was 25% (11/44) when the carrier status was presumed to be eradicated 6 months after the last known CRE-positive sample, compared with 7.5% (10/134) if presumed to be eradicated after 1 year.
CONCLUSIONS
We suggest that the CRE-carrier status be maintained for at least 1 year following the last positive sample. Screening of all prior CRE carriers, regardless of current carriage status, is advised.
Infect Control Hosp Epidemiol 2015;36(8):936–941
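The dose-response between waiting time and recurrence can be recomputed from the counts reported above. A minimal Python sketch; the crude odds ratio is an illustrative unadjusted calculation, not the paper's multivariable (adjusted) estimate.

```python
# Counts from the abstract: recurrence among carriers presumed eradicated
# ~6 months vs ~1 year after the last CRE-positive sample.
recur_6mo, total_6mo = 11, 44
recur_1yr, total_1yr = 10, 134

rate_6mo = recur_6mo / total_6mo    # 25%
rate_1yr = recur_1yr / total_1yr    # ~7.5%

def odds_ratio(a, n_a, b, n_b):
    """Crude (unadjusted) odds ratio of recurrence, group A vs group B --
    an illustrative calculation, not the paper's adjusted estimate."""
    return (a / (n_a - a)) / (b / (n_b - b))

print(f"recurrence at 6 months:  {rate_6mo:.1%}")
print(f"recurrence at 1 year:    {rate_1yr:.1%}")
print(f"crude OR (6 mo vs 1 yr): "
      f"{odds_ratio(recur_6mo, total_6mo, recur_1yr, total_1yr):.1f}")
```

The crude OR of roughly 4 between the 6-month and 1-year groups is consistent in direction with the adjusted per-month estimate (OR 0.94 per additional month of waiting), supporting the authors' recommendation to maintain carrier status for at least 1 year.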