Original Articles
Comparison of the Use of Administrative Data and an Active System for Surveillance of Invasive Aspergillosis
- Douglas C. Chang, Lauren A. Burwell, G. Marshall Lyon, Peter G. Pappas, Tom M. Chiller, Kathleen A. Wannemuehler, Scott K. Fridkin, Benjamin J. Park
- Published online by Cambridge University Press: 02 January 2015, pp. 25-30
Background.
Administrative data, such as International Classification of Diseases, Ninth Revision (ICD-9) codes, are readily available and are an attractive option for surveillance and quality assessment within a single institution or for interinstitutional comparisons. To understand the usefulness of administrative data for the surveillance of invasive aspergillosis, we compared information obtained from a system based on ICD-9 codes with information obtained from an active, prospective surveillance system, which used more extensive case-finding methods (Transplant Associated Infection Surveillance Network).
Methods. Patients with suspected invasive aspergillosis were identified by aspergillosis-related ICD-9 codes assigned to hematopoietic stem cell transplant recipients and solid organ transplant recipients at a single hospital from April 1, 2001, through January 31, 2005. Suspected cases were classified as proven or probable invasive aspergillosis by medical record review using standard definitions. We calculated the sensitivity and positive predictive value (PPV) of identifying invasive aspergillosis by individual ICD-9 codes and by combinations of codes.
Results. The sensitivity of code 117.3 was modest (63% [95% confidence interval {CI}, 38%-84%]), as was the PPV (71% [95% CI, 44%-90%]); the sensitivity of code 117.9 was poor (32% [95% CI, 13%-57%]), as was the PPV (15% [95% CI, 6%-31%]). The sensitivity of codes 117.3 and 117.9 combined was 84% (95% CI, 60%-97%); the PPV of the combined codes was 30% (95% CI, 18%-44%). Overall, ICD-9 codes triggered a review of medical records for 64 patients, only 16 (25%) of whom had proven or probable invasive aspergillosis.
Conclusions. A surveillance system that involved multiple ICD-9 codes was sufficiently sensitive to identify most cases of invasive aspergillosis; however, the poor PPV of ICD-9 codes means that this approach is not adequate as the sole tool used to classify cases. Screening ICD-9 codes to trigger a medical record review might be a useful method of surveillance for invasive aspergillosis and quality assessment, although more investigation is needed.
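The sensitivity and PPV figures above reduce to simple ratios of true cases detected and flagged records confirmed. A minimal sketch in Python, using illustrative counts back-calculated from the reported percentages (16 of 19 true cases flagged; 16 of 53 flagged records confirmed); note that the paper reports exact binomial CIs, whereas this sketch uses the Wilson approximation:

```python
from math import sqrt

def sensitivity(tp, fn):
    """Fraction of gold-standard cases that the codes flagged."""
    return tp / (tp + fn)

def ppv(tp, fp):
    """Fraction of code-flagged records confirmed on chart review."""
    return tp / (tp + fp)

def wilson_ci(k, n, z=1.96):
    """Approximate 95% Wilson score interval for a proportion k/n."""
    p = k / n
    denom = 1 + z * z / n
    centre = (p + z * z / (2 * n)) / denom
    half = z * sqrt(p * (1 - p) / n + z * z / (4 * n * n)) / denom
    return centre - half, centre + half

print(f"{sensitivity(16, 3):.2f}")  # 0.84
print(f"{ppv(16, 37):.2f}")         # 0.30
```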
Extended Use of Urinary Catheters in Older Surgical Patients: A Patient Safety Problem?
- Heidi L. Wald, Anne M. Epstein, Tiffany A. Radcliff, Andrew M. Kramer
- Published online by Cambridge University Press: 02 January 2015, pp. 116-124
Objectives.
To explore the relationship between the extended postoperative use of indwelling urinary catheters and outcomes for older patients who have undergone cardiac, vascular, gastrointestinal, or orthopedic surgery in skilled nursing facilities and to describe patient and hospital characteristics associated with the extended use of indwelling urinary catheters.
Design. Retrospective cohort study.
Setting. US acute care hospitals and skilled nursing facilities.
Patients. A total of 170,791 Medicare patients aged 65 years or older who were admitted to skilled nursing facilities after discharge from a hospital with a primary diagnosis code indicating major cardiac, vascular, orthopedic, or gastrointestinal surgery in 2001.
Main Outcome Measures. Patient-specific 30-day rate of rehospitalization for urinary tract infection (UTI) and 30-day mortality rate, as well as the risk of having an indwelling urinary catheter at the time of admission to a skilled nursing facility.
Results. A total of 39,282 (23.0%) of the postoperative patients discharged to skilled nursing facilities had indwelling urinary catheters. After adjusting for patient characteristics, the patients with catheters had greater odds of rehospitalization for UTI and death within 30 days than patients who did not have catheters. The adjusted odds ratios (aORs) for UTI ranged from 1.34 for patients who underwent gastrointestinal surgery (P < .001) to 1.85 for patients who underwent cardiac surgery (P < .001); the aORs for death ranged from 1.25 for cardiac surgery (P = .01) to 1.48 for orthopedic surgery (P = .002) and for gastrointestinal surgery (P < .001). After controlling for patient characteristics, hospitalization in the northeastern or southern region of the United States was associated with a lower likelihood of having an indwelling urinary catheter, compared with hospitalization in the western region (P = .002 and P = .03, respectively).
Conclusions. Extended postoperative use of indwelling urinary catheters is associated with poor outcomes for older patients. The likelihood of having an indwelling urinary catheter at the time of discharge after major surgery is strongly associated with a hospital's geographic region, which reflects a variation in practice that deserves further study.
Original Article
Rising Economic Impact of Clostridium difficile-Associated Disease in Adult Hospitalized Patient Population
- Xiaoyan Song, John G Bartlett, Kathleen Speck, April Naegeli, Karen Carroll, Trish M. Perl
- Published online by Cambridge University Press: 02 January 2015, pp. 823-828
Background.
Clostridium difficile-associated disease (CDAD) is responsible for increased morbidity and a substantial economic burden. Incidences of CDAD, including those with a severe course of illness, have been increasing rapidly.
Objective. To evaluate the excess mortality, increased length of stay (LOS) in the hospital, and additional costs associated with CDAD.
Design. A retrospective matched cohort study.
Patients. Adult patients admitted to a large tertiary care hospital between January 2000 and October 2005.
Methods. Adult patients were tested with a C. difficile laboratory assay at admission or 72 hours after admission. Infected patients had 1 or more positive assay results and were individually matched to 1 uninfected patient who had negative assay results, by exposure time, age, ward, and at least 2 measurements for comorbidity and severity of illness.
Results. The incidence rate of CDAD among adult patients increased from 0.57 cases per 1,000 patient-days at risk before 2004 to 0.88 cases per 1,000 patient-days at risk after 2004 (P < .001). The 630 infected patients had a mortality rate of 11.9%; the 630 uninfected patients had a mortality rate of 15.1% (P = .02). After adjustment in the multivariate analysis, we found that the LOS for infected patients was 4 days longer than that for uninfected patients (P < .001). If CDAD occurred after 2004, the additional LOS increased to 5.5 days. The direct cost associated with CDAD was $306 per case; after year 2004, it increased to $6,326 per case.
Conclusions. There may be no excess mortality among patients with CDAD, compared with patients without it, but the economic burden of CDAD is increasing. By 2004, CDAD-associated medical expenditures approached $1,000,000 per year at our institution alone.
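The reported rise from 0.57 to 0.88 cases per 1,000 patient-days can be expressed as an incidence-rate ratio. A sketch with hypothetical patient-day denominators chosen only to reproduce those rates (the abstract does not give the actual denominators), using the standard log-scale Wald interval for Poisson counts:

```python
from math import exp, log, sqrt

def rate_ratio(cases1, pd1, cases0, pd0, z=1.96):
    """Incidence-rate ratio with an approximate 95% CI on the log scale."""
    rr = (cases1 / pd1) / (cases0 / pd0)
    se = sqrt(1 / cases1 + 1 / cases0)  # SE of log(rr) for Poisson counts
    return rr, exp(log(rr) - z * se), exp(log(rr) + z * se)

# 88 and 57 cases per 100,000 patient-days reproduce the reported rates
# of 0.88 and 0.57 per 1,000 patient-days (denominators are assumptions).
rr, lo, hi = rate_ratio(cases1=88, pd1=100_000, cases0=57, pd0=100_000)
print(f"{rr:.2f}")  # 1.54
```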
Use of a Mandatory Declination Form in a Program for Influenza Vaccination of Healthcare Workers
- Bruce S. Ribner, Cynthia Hall, James P. Steinberg, William A. Bornstein, Rosette Chakkalakal, Amir Emamifar, Irving Eichel, Peter C. Lee, Penny Z. Castellano, Gilbert D. Grossman
- Published online by Cambridge University Press: 02 January 2015, pp. 302-308
Objective.
To evaluate the utility and impact of using a declination form in the context of an influenza immunization program for healthcare workers.
Methods. A combined form for documentation of vaccination consent, medical contraindication(s) for vaccination, or vaccination declination was used during the 2006-2007 influenza season in a healthcare system employing approximately 9,200 nonphysician employees in 3 hospitals; a skilled nursing care facility; a large, multisite, faculty-practice plan; and an administrative building. Responses were entered into a database that contained files from human resources departments, which allowed correlation with job category and work location.
Results. The overall level of influenza vaccination coverage of employees increased from 43% (3,892 of 9,050) during the 2005-2006 season to 66.5% (6,123 of 9,214) during the 2006-2007 season. Of 9,214 employees, 1,898 (20.6%) signed the declination statement. Among the occupation groups, nurses had the lowest rate of declining vaccination (13.2% [393 of 2,970]; P < .0001), followed by pharmacy personnel (18.1% [40 of 221]), ancillary personnel with frequent patient contact (21.9% [169 of 771]), and all others (24.7% [1,296 of 5,252]). Among the employees who declined vaccination, nurses were the least likely to select the reasons “afraid of needles” (3.8% [15 of 393], vs. 9.1% [137 of 1,505] for all other groups; P < .001) and “fear of getting influenza from the vaccine” (13.5% [53 of 393], vs. 20.5% [309 of 1,505]; P = .002). Seven pregnant nurses had been advised by their obstetricians to avoid vaccination. When declination of influenza vaccination was analyzed by age, 16% of personnel (797 of 4,980) 50 years of age and older declined to be vaccinated, compared with 26% of personnel (1,101 of 4,234) younger than 50 years of age (P < .0001).
Conclusions. Implementing use of the declination form during the 2006-2007 influenza season was one of several measures that led to a 55% increase in the acceptance of influenza vaccination by healthcare workers in our healthcare system. Although we cannot determine to what degree use of the declination form contributed to the increased rate of vaccination, use of this form helped the vaccination program assess the reasons for declination and will help to focus future vaccination campaigns.
SHEA/IDSA Practice Recommendations
Strategies to Prevent Central Line–Associated Bloodstream Infections in Acute Care Hospitals
- Jonas Marschall, Leonard A. Mermel, David Classen, Kathleen M. Arias, Kelly Podgorny, Deverick J. Anderson, Helen Burstin, David P. Calfee, Susan E. Coffin, Erik R. Dubberke, Victoria Fraser, Dale N. Gerding, Frances A. Griffin, Peter Gross, Keith S. Kaye, Michael Klompas, Evelyn Lo, Lindsay Nicolle, David A. Pegues, Trish M. Perl, Sanjay Saint, Cassandra D. Salgado, Robert A. Weinstein, Robert Wise, Deborah S. Yokoe
- Published online by Cambridge University Press: 02 January 2015, pp. S22-S30
Previously published guidelines are available that provide comprehensive recommendations for detecting and preventing healthcare-associated infections. The intent of this document is to highlight practical recommendations in a concise format designed to assist acute care hospitals in implementing and prioritizing their central line–associated bloodstream infection (CLABSI) prevention efforts. Refer to the Society for Healthcare Epidemiology of America/Infectious Diseases Society of America “Compendium of Strategies to Prevent Healthcare-Associated Infections” Executive Summary and Introduction and accompanying editorial for additional discussion.
1. Patients at risk for CLABSIs in acute care facilities
a. Intensive care unit (ICU) population: The risk of CLABSI in ICU patients is high. Reasons for this include the frequent insertion of multiple catheters, the use of specific types of catheters that are almost exclusively inserted in ICU patients and associated with substantial risk (eg, arterial catheters), and the fact that catheters are frequently placed in emergency circumstances, repeatedly accessed each day, and often needed for extended periods.
b. Non-ICU population: Although the primary focus of attention over the past 2 decades has been the ICU setting, recent data suggest that the greatest numbers of patients with central lines are in hospital units outside the ICU, where there is a substantial risk of CLABSI.
2. Outcomes associated with hospital-acquired CLABSI
a. Increased length of hospital stay
b. Increased cost; the non-inflation-adjusted attributable cost of CLABSIs has been found to vary from $3,700 to $29,000 per episode
Original Articles
Rates of Surgical Site Infection After Hip Replacement as a Hospital Performance Indicator: Analysis of Data From the English Mandatory Surveillance System
- J. Wilson, A. Charlett, G. Leong, C. McDougall, G. Duckworth
- Published online by Cambridge University Press: 02 January 2015, pp. 219-226
Objective.
To describe rates of surgical site infection (SSI) after hip replacement and to use these data to provide a simple mechanism for identifying poorly performing hospitals that takes into account variations in sample size.
Design. Prospective surveillance study.
Setting. A total of 125 acute care hospitals in England that participated in mandatory SSI surveillance from April 1, 2004, through March 31, 2005.
Patients. Patients who underwent total hip replacement (THR) or hip hemiarthroplasty (HH).
Methods. A standard data set was collected for all eligible operations at participating hospitals for a minimum of 3 months annually. Defined methods were used to identify SSIs that occurred during the inpatient stay. Data were checked for quality and accuracy, and funnel plots were constructed by plotting the incidence of SSI against the number of operations.
Results. Data were collected on 16,765 THRs and 5,395 HHs. The cumulative SSI incidence rates were 1.26% for THR and 4.06% for HH; the incidence densities were 1.38 SSIs per 1,000 postoperative inpatient days for THR and 2.3 SSIs per 1,000 postoperative inpatient days for HH. The risk of infection associated with revision surgery was significantly higher than that associated with primary surgery (2.7% [95% confidence interval, 2.0%-3.5%] vs. 1.1% [95% confidence interval, 1.0%-1.2%]; P = .003). Rates varied considerably among hospitals. Nineteen hospitals had rates above the 90th percentile. However, the use of funnel plots to adjust for the precision of estimated SSI rates identified 7 hospitals that warranted further investigation, including 2 with crude rates below the 90th percentile.
Conclusions. Funnel plots of rates of SSI after hip replacement provide a valuable method of presenting hospital performance data, clearly identifying hospitals with unusually high or low rates while adjusting for the precision of the estimated rate. This information can be used to target and support local interventions to reduce the risk of infection.
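Funnel-plot control limits widen as the number of operations shrinks, which is why small hospitals with noisy crude rates are not flagged spuriously. A sketch of the usual construction under a normal approximation (the surveillance programme's exact limit formula is not given in the abstract; the baseline below is the overall THR rate reported above):

```python
from math import sqrt

def funnel_limits(p0, n, z=3.0):
    """Control limits for an SSI proportion around baseline p0 at n
    operations; z=3 gives roughly 99.8% limits, a common funnel choice."""
    half = z * sqrt(p0 * (1 - p0) / n)
    return max(0.0, p0 - half), min(1.0, p0 + half)

baseline = 0.0126  # overall cumulative SSI incidence for THR
for n_ops in (50, 200, 1000):
    lo, hi = funnel_limits(baseline, n_ops)
    print(n_ops, round(lo, 4), round(hi, 4))
```

A hospital's crude rate is compared against the limits at its own caseload, so precision is adjusted for automatically.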
Original Article
Implementation of an Industrial Systems-Engineering Approach to Reduce the Incidence of Methicillin-Resistant Staphylococcus aureus Infection
- Robert R. Muder, Candace Cunningham, Ellesha McCray, Cheryl Squier, Peter Perreiah, Rajiv Jain, Ronda L. Sinkowitz-Cochran, John A. Jernigan
- Published online by Cambridge University Press: 02 January 2015, pp. 702-708
Objective.
To measure the effectiveness of an industrial systems-engineering approach to a methicillin-resistant Staphylococcus aureus (MRSA) prevention program.
Design. Before-after intervention study.
Setting. An intensive care unit (ICU) and a non-ICU surgical unit in the Pittsburgh Veterans Administration hospital.
Patients. All patients admitted to the study units.
Intervention. We implemented an MRSA infection control program that consisted of the following 4 elements: (1) the use of standard precautions for all patient contact, with emphasis on hand hygiene; (2) the use of contact precautions for interactions with patients known to be infected or colonized with MRSA; (3) the use of active surveillance cultures to identify patients who were asymptomatically colonized with MRSA; and (4) the use of an industrial systems-engineering approach, the Toyota Production System, to facilitate consistent and reliable adherence to the infection control program.
Results. The rate of healthcare-associated MRSA infection in the surgical unit decreased from 1.56 infections per 1,000 patient-days in the 2 years before the intervention to 0.63 infections per 1,000 patient-days in the 4 years after the intervention (a 60% reduction; P = .003). The rate of healthcare-associated MRSA infection in the ICU decreased from 5.45 infections per 1,000 patient-days in the 2 years before the intervention to 1.35 infections per 1,000 patient-days in the 3 years after the intervention (a 75% reduction; P = .001). The combined estimate for reduction in the incidence of infection after the intervention in the 2 units was 68% (95% confidence interval, 50%-79%; P < .001).
Conclusions. Sustained reduction in the incidence of MRSA infection is possible in a setting where this pathogen is endemic. An industrial systems-engineering approach can be adapted to facilitate consistent and reliable adherence to MRSA infection prevention practices in healthcare facilities.
Original Articles
Risk Factors for Methicillin-Resistant Staphylococcus aureus (MRSA) Acquisition in Roommate Contacts of Patients Colonized or Infected With MRSA in an Acute-Care Hospital
- Christine Moore, Jastej Dhaliwal, Agnes Tong, Sarah Eden, Cindi Wigston, Barbara Willey, Mount Sinai Hospital Infection Control Team, Allison McGeer
- Published online by Cambridge University Press: 02 January 2015, pp. 600-606
Objective.
To identify risk factors for acquisition of methicillin-resistant Staphylococcus aureus (MRSA) in patients exposed to an MRSA-colonized roommate.
Design. Retrospective cohort study.
Setting. A 472-bed acute-care teaching hospital in Toronto, Canada.
Patients. Inpatients who shared a room between 1996 and 2004 with a patient who had unrecognized MRSA colonization.
Methods. Exposed roommates were identified from infection-control logs and from results of screening for MRSA in the microbiology database. Completed follow-up was defined as completion of at least 2 sets of screening cultures (swab samples from the nares, the rectum, and skin lesions), with at least 1 set of samples obtained 7–10 days after the last exposure. Chart reviews were performed to compare those who did and did not become colonized with MRSA.
Results. Of 326 roommates, 198 (61.7%) had completed follow-up, and 25 (12.6%) acquired MRSA by day 7–10 after exposure was recognized, all with strains indistinguishable by pulsed-field gel electrophoresis from those of their roommate. Two (2%) of 101 patients were not colonized at day 7–10 but, with subsequent testing, were identified as being colonized with the same strain as their roommate (one at day 16 and one at day 18 after exposure). A history of alcohol abuse (odds ratio [OR], 9.8 [95% confidence limits {CLs}, 1.8, 53]), exposure to a patient with nosocomially acquired MRSA (OR, 20 [95% CLs, 2.4, 171]), increasing care dependency (OR per activity of daily living, 1.7 [95% CLs, 1.1, 2.7]), and having received levofloxacin (OR, 3.6 [95% CLs, 1.1, 12]) were associated with MRSA acquisition.
Conclusions. Roommates of patients with MRSA are at significant risk for becoming colonized. Further study is needed of the impact of hospital antimicrobial formulary decisions on the risk of acquisition of MRSA.
Original Article
Epidemiology of Vancomycin-Resistant Enterococci Among Patients on an Adult Stem Cell Transplant Unit: Observations From an Active Surveillance Program
- Michael S. Calderwood, Andreas Mauer, Jocelyn Tolentino, Ernesto Flores, Koen van Besien, Ken Pursell, Stephen G. Weber
- Published online by Cambridge University Press: 02 January 2015, pp. 1019-1025
Objective.
To use the findings of an active surveillance program to delineate the unique epidemiology of vancomycin-resistant enterococci (VRE) in a mixed population of transplant and nontransplant patients hospitalized on a single patient care unit.
Design. Surveillance survey and case-control analysis.
Setting. A 19-bed adult bone marrow and stem cell transplant unit at a referral and primary-care center.
Patients. The study included patients undergoing transplantation, patients who had previously received bone marrow or stem cell transplants, and patients with other malignancies and hematological disorders who were admitted to the study unit.
Methods. Patients not previously identified as colonized with VRE had perirectal swab specimens collected at admission and once weekly while hospitalized on the unit. The prevalence of VRE colonization at admission and the incidence throughout the hospital stay, genotypes of VRE specimens as determined by pulsed-field gel electrophoresis, and risk factors related to colonization were analyzed.
Results. There was no significant difference in the prevalence or incidence of new colonization between nontransplant patients and prior or current transplant recipients, although overall prevalence at admission was significantly higher in the prior transplant group. Preliminary genotypic analysis of VRE isolates from transplant patients suggests that a proportion of cases of newly detected VRE carriage may represent prior colonization not detected at admission, with different risk factors suggestive of a potential epidemiological distinction.
Conclusion. Examination of epidemiological and microbiological data collected by an active surveillance program provides useful information about the epidemiology of VRE that can be applied to inform rational infection control strategies.
Original Articles
Ecological Study of the Effectiveness of Isolation Precautions in the Management of Hospitalized Patients Colonized or Infected With Acinetobacter baumannii
- Houssein Gbaguidi-Haore, Sophie Legast, Michelle Thouverez, Xavier Bertrand, Daniel Talon
- Published online by Cambridge University Press: 02 January 2015, pp. 1118-1123
Objective.
To assess the impact of isolation precautions on the incidence of patients colonized or infected with Acinetobacter baumannii (case patients) in a university hospital during the period from 1999 to 2006.
Design. Ecological study.
Setting. The Besançon University Hospital in France, a 1,200-bed acute care hospital with approximately 50,000 admissions per year.
Methods. Using Poisson regression analysis, we evaluated a total of 350,000 patient-days to determine the annual incidence of case patients. This annual incidence was used as the outcome variable, and infection control practices, antibiotic use, and other aggregated data regarding patients' age, sex, McCabe score, and immune status were used as covariates.
Results. The implementation of isolation precautions was independently and negatively associated with the incidence of patients colonized or infected with A. baumannii (relative risk, 0.50 [95% confidence interval, 0.40–0.64]; P < .001).
Conclusions. Our study suggests that the implementation of isolation precautions, in addition to standard precautions, effectively prevents the spread of A. baumannii in a hospital setting.
Risk of Vancomycin-Resistant Enterococcus (VRE) Bloodstream Infection Among Patients Colonized With VRE
- Chamion N. Olivier, Ruth K. Blake, Lisa L. Steed, Cassandra D. Salgado
- Published online by Cambridge University Press: 02 January 2015, pp. 404-409
Background.
Colonization with vancomycin-resistant Enterococcus (VRE) is a risk factor for subsequent VRE bloodstream infection (BSI); however, risk factors for BSI among colonized patients have not been adequately described. We sought to determine the proportion of VRE-colonized patients who subsequently develop VRE BSI and to identify risk factors for VRE BSI among these patients.
Methods. Records of 768 patients colonized with VRE from January 2002 through June 2005 were reviewed. The proportion of patients who developed VRE BSI was calculated, and the characteristics of these patients were compared, in a 2:1 ratio, with those of patients who did not develop VRE BSI. To identify risk factors for VRE BSI and for death, we used univariate logistic regression analysis and then multivariate logistic regression analysis. Using pulsed-field gel electrophoresis (PFGE), we compared the isolate recovered when the patient was colonized and the isolate recovered when the patient developed VRE BSI.
Results. Of the 768 patients colonized with VRE, 31 (4.0%) developed VRE BSI. Multivariate analysis identified the following independent risk factors for developing VRE BSI: infection of an additional body site other than blood (adjusted odds ratio [aOR], 3.9; P = .04), admission to the hospital from a long-term care facility (aOR, 12.6; P = .04), and receipt of vancomycin (aOR, 10.6; P < .001). The independent risk factors for death among patients colonized with VRE were immunosuppression (aOR, 12.9; P = .001) and VRE BSI (aOR, 9.1; P = .002). Of the 31 patients who developed VRE BSI, 23 (74%) had a pair of isolates representing VRE colonization and VRE BSI. For 19 (83%) of these 23 patients, the isolate representing BSI was genetically related to the isolate representing VRE colonization: 12 pairs of isolates (52%) had identical banding patterns, 5 had closely related patterns, and 2 had possibly related patterns.
Conclusion. Of the 768 patients colonized with VRE, 31 (4.0%) developed VRE BSI, usually due to a related strain. Independent risk factors for BSI among colonized patients were admission from a long-term care facility, infection of an additional body site, and exposure to vancomycin. Independent risk factors for death were immunosuppression and VRE BSI.
Original Article
Derivation and Validation of a Clinical Prediction Score for Isolation of Inpatients With Suspected Pulmonary Tuberculosis
- Kara S. Rakoczy, Stuart H. Cohen, Hien H. Nguyen
- Published online by Cambridge University Press: 02 January 2015, pp. 927-932
Background.
The use of a clinical prediction score to improve the practice of instituting airborne-transmission precautions in patients with suspected tuberculosis holds promise for increasing appropriate isolation and decreasing unnecessary isolation. The objective of this study was to derive and validate a clinical prediction score for patients with suspected tuberculosis.
Methods. We used a case-control study design to evaluate differences between patients with a diagnosis of tuberculosis and those placed under airborne precautions who had negative culture results. We developed risk scores based on a multivariable analysis of independently significant factors associated with tuberculosis. Subsequently, we evaluated the sensitivity and specificity of the score in a separate (validation) cohort of patients.
Results. Within our population, we found 4 clinical factors associated with tuberculosis: chronic symptoms (odds ratio [OR], 10.2 [95% confidence interval {CI}, 2.95-35.4]), upper lobe disease on chest radiograph (OR, 5.27 [95% CI, 1.6-17.23]), foreign-born status (OR, 7.01 [95% CI, 2.1-23.8]), and immunocompromised state other than human immunodeficiency virus infection (OR, 8.14 [95% CI, 2.08-31.8]). Shortness of breath (OR, 0.13 [95% CI, 0.04-0.45]) was found to be associated with non-tuberculosis diagnoses and considered a negative predictor in the model. Using a cut-off point to maximize sensitivity, we applied the prediction rule to the validation cohort, resulting in a sensitivity of 97% and a specificity of 42%.
Conclusion. The tuberculosis prediction rule derived from our patient population could improve utilization of airborne precautions. Clinical prediction rules continue to show their utility for improvement in isolation practices in different demographic areas.
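Clinical prediction scores of this kind are often built by rounding each factor's log odds ratio to integer points and summing over the findings present. The weights below are hypothetical, derived only from the ORs reported above; the paper's actual point assignments and cut-off are not stated in the abstract:

```python
from math import log

# Hypothetical integer weights: natural-log odds ratios, rounded.
WEIGHTS = {
    "chronic_symptoms": round(log(10.2)),           # 2
    "upper_lobe_disease": round(log(5.27)),         # 2
    "foreign_born": round(log(7.01)),               # 2
    "immunocompromised_non_hiv": round(log(8.14)),  # 2
    "shortness_of_breath": round(log(0.13)),        # -2 (negative predictor)
}

def tb_score(findings):
    """Sum the point weights of the clinical findings present."""
    return sum(WEIGHTS[f] for f in findings)

print(tb_score({"chronic_symptoms", "foreign_born"}))  # 4
```

Isolation would then be triggered whenever the total meets a cut-off chosen to maximize sensitivity.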
Original Articles
Validation Study of Artificial Neural Network Models for Prediction of Methicillin-Resistant Staphylococcus aureus Carriage
- Cheng-Chuan Hsu, Yusen E. Lin, Yao-Shen Chen, Yung-Ching Liu, Robert R. Muder
- Published online by Cambridge University Press: 02 January 2015, pp. 607-614
Objective.
Use of active surveillance cultures for methicillin-resistant Staphylococcus aureus (MRSA) for all patients admitted to the intensive care unit has been shown to reduce nosocomial transmission. However, the cost-effectiveness and the utility of implementing use of active surveillance cultures nationwide remain controversial. We sought to develop an artificial neural network (ANN) model that would predict the likelihood of MRSA colonization.
Setting. Two acute care hospitals, one in Pittsburgh (hospital A) and one in Kaohsiung, Taiwan (hospital B).
Methods. Nasal cultures were performed for all patients admitted to the hospitals. A total of 46 potential risk factors in hospital A and 86 potential risk factors in hospital B associated with MRSA colonization were assessed. Culture results were obtained; 75% of the data were used for training our ANN model, and the remaining 25% were used for validating our ANN model. The culture results were the “gold standard” for determining the accuracy of the model predictions.
Results. The ANN model predictions were accurate 95.2% of the time for hospital A (sensitivity, 94.3%; specificity, 96.0%) and 94.2% of the time for hospital B (sensitivity, 96.6%; specificity, 91.8%), integrating all potential risk factors into the model. Only 17 potential risk factors were needed for the hospital A ANN model (accuracy, 90.9%; sensitivity, 98.5%; specificity, 83.4%), and only 20 potential risk factors were needed for the hospital B ANN model (accuracy, 90.5%; sensitivity, 96.6%; specificity, 84.3%), if the minimal risk factor method was used. Cross-validation analysis showed an average accuracy of 85.6% (sensitivity, 91.3%; specificity, 80.0%).
Conclusion. Our ANN model can be used to predict with an accuracy of more than 90% which patients carry MRSA. The false-negative rates were significantly lower than the false-positive rates in the ANN predictions, which can serve as a safety buffer in case of patient misclassification.
Original Article
Antibiotic Exposure and Room Contamination Among Patients Colonized With Vancomycin-Resistant Enterococci
- Marci Drees, David R. Snydman, Christopher H. Schmid, Laurie Barefoot, Karen Hansjosten, Padade M. Vue, Michel Cronin, Stanley A. Nasraway, Yoav Golan
- Published online by Cambridge University Press: 02 January 2015, pp. 709-715
Objective.
To determine whether total and antianaerobic antibiotic exposure increases the risk of room contamination among vancomycin-resistant enterococci (VRE)–colonized patients.
Design and Setting. A 14-month study in 2 intensive care units at an academic tertiary care hospital in Boston, Massachusetts.
Patients. All patients who acquired VRE or were VRE-colonized on admission and who had environmental cultures performed.
Methods. We performed weekly environmental cultures (2 sites per room) and considered a room to be contaminated if there was a VRE-positive environmental culture during the patient's stay. We determined risk factors for room contamination by use of the Cox proportional hazards model.
Results. Of 142 VRE-colonized patients, 35 (25%) had an associated VRE-positive environmental culture. Patients who contaminated their rooms were more likely to have diarrhea than those who did not contaminate their rooms (23 [66%] of 35 vs 41 [38%] of 107; P = .005) and more likely to have received antibiotics while VRE colonized (33 [94%] of 35 vs 86 [80%] of 107; P = .02). There was no significant difference in room contamination rates between patients exposed to antianaerobic regimens and patients exposed to nonantianaerobic regimens or between patients with and patients without diarrhea, but patients without any antibiotic exposure were unlikely to contaminate their rooms. Diarrhea and antibiotic use were strongly confounded; although two-thirds of room contamination occurred in rooms of patients with diarrhea, nearly all of these patients received antibiotics. In multivariable analysis, higher mean colonization pressure in the ICU increased the risk of room contamination (adjusted hazard ratio per 10% increase, 1.44 [95% confidence interval, 1.04–2.04]), whereas no antibiotic use during VRE colonization was protective (adjusted hazard ratio, 0.21 [95% confidence interval, 0.05–0.89]).
Conclusions. Room contamination with VRE was associated with increased mean colonization pressure in the ICU and diarrhea in the VRE-colonized patient, whereas no use of any antibiotics during VRE colonization was protective.
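The key exposure in this analysis, mean colonization pressure, is conventionally computed as the daily proportion of the unit census colonized with VRE, averaged over the patient's stay. A minimal sketch of that conventional definition (the authors' exact computation may differ):

```python
def mean_colonization_pressure(daily_colonized: list[int], daily_census: list[int]) -> float:
    # Colonization pressure on a given day: fraction of the ICU census
    # known to be VRE colonized. The mean is taken over the days of
    # the index patient's stay.
    daily = [c / n for c, n in zip(daily_colonized, daily_census) if n > 0]
    return sum(daily) / len(daily)

# Illustrative 2-day stay: 2 of 10, then 3 of 10 patients colonized.
pressure = mean_colonization_pressure([2, 3], [10, 10])  # 0.25
```

A 10% (0.10) increase in this quantity corresponds to the hazard-ratio increment reported above.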
Commentary
Preventing Clostridium difficile–Associated Disease: Is It Time to Pay the Piper?
- Eli N. Perencevich, Kerri A. Thom
- Published online by Cambridge University Press: 02 January 2015, pp. 829-831
Original Article
Clinical and Molecular Epidemiology of Healthcare-Associated Infections Due to Extended-Spectrum β-Lactamase (ESBL)–Producing Strains of Escherichia coli and Klebsiella pneumoniae That Harbor Multiple ESBL Genes
- Anucha Apisarnthanarak, Pattarachai Kiratisin, Linda M. Mundy
- Published online by Cambridge University Press: 02 January 2015, pp. 1026-1034
Objectives.
To characterize healthcare-associated infections due to extended-spectrum β-lactamase (ESBL)-producing strains of Escherichia coli and Klebsiella pneumoniae that harbor multiple ESBL genes, as opposed to a single ESBL gene.
Methods. All patients with a confirmed healthcare-associated infection due to an ESBL-producing strain of E. coli or K. pneumoniae were enrolled in the study. Molecular typing of isolates was performed, and the comparative risks and outcomes of patients were analyzed.
Results. Among 71 patients with healthcare-associated infection due to an ESBL-producing strain of E. coli or K. pneumoniae, the gene for CTX-M, with or without other ESBL genes, was identified in all 51 (100%) of the patients infected with an E. coli strain and in 18 (90%) of the 20 patients infected with a K. pneumoniae strain. Of these 71 patients, 17 (24%) met the definition of healthcare-associated infection due to an ESBL-producing strain that harbored multiple genes; in multivariate analysis, previous exposure to 3 or more classes of antibiotics (adjusted odds ratio, 4.5 [95% confidence interval, 1.7-75.2]) was the sole risk factor for healthcare-associated infection due to an ESBL-producing strain that harbored multiple ESBL genes. Isolates recovered from patients with healthcare-associated infection due to an ESBL-producing strain that harbored multiple ESBL genes were more resistant to various antibiotic classes, and, compared with patients with healthcare-associated infection due to an ESBL-producing strain that harbored a single ESBL gene, these patients were more likely to have received ineffective initial empirical antimicrobial therapy (94% vs 52%; odds ratio, 5.1 [95% confidence interval, 1.04-14.5]).
Conclusions. CTX-M ESBL is highly prevalent in Thailand. Patients with healthcare-associated infection due to an ESBL-producing strain that harbored multiple ESBL genes were more likely to have had ineffective initial empirical antimicrobial therapy, and, given that antibiotic selection pressure was the only associated risk factor, we suggest focused antimicrobial stewardship programs to limit the emergence and spread of healthcare-associated infection due to ESBL-producing strains in this middle-income country.
Original Articles
Downward Trends in Surgical Site and Urinary Tract Infections After Cesarean Delivery in a French Surveillance Network, 1997–2003
- Agnès Vincent, Louis Ayzac, Raphaële Girard, Emmanuelle Caillat-Vallet, Catherine Chapuis, Florence Depaix, Anne-Marie Dumas, Chantal Gignoux, Catherine Haond, Joëlle Lafarge-Leboucher, Carine Launay, Françoise Tissot-Guerraz, Jacques Fabry, Mater Sud-Est Study Group
- Published online by Cambridge University Press: 02 January 2015, pp. 227-233
Objective.
To evaluate whether the adjusted rates of surgical site infection (SSI) and urinary tract infection (UTI) after cesarean delivery decrease in maternity units that perform active healthcare-associated infection surveillance.
Design. Trend analysis by means of multiple logistic regression.
Setting. A total of 80 maternity units participating in the Mater Sud-Est surveillance network.
Patients. A total of 37,074 cesarean deliveries were included in the surveillance from January 1, 1997, through December 31, 2003.
Methods. We used a logistic regression model to estimate risk-adjusted post–cesarean delivery infection odds ratios. The variables included were the maternity units' annual rate of operative procedures, the level of dispensed neonatal care, the year of delivery, maternal risk factors, and the characteristics of cesarean delivery. The trend of risk-adjusted odds ratios for SSI and UTI during the study period was studied by linear regression.
Results. The crude rates of SSI and UTI after cesarean delivery were 1.5% (571 of 37,074 patients) and 1.8% (685 of 37,074 patients), respectively. During the study period, the decrease in SSI and UTI adjusted odds ratios was statistically significant (R = −0.823 [P = .023] and R = −0.906 [P = .005], respectively).
Conclusion. Reductions of 48% in the SSI rate and 52% in the UTI rate were observed in the maternity units. These unbiased trends could be related to progress in preventive practices as a result of the increased dissemination of national standards and collaborative surveillance with benchmarking of rates.
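The trend step described above regresses the yearly risk-adjusted odds ratios on calendar year and reports the correlation coefficient R. A minimal least-squares sketch of that second-stage regression, with illustrative inputs rather than the study's data:

```python
def linear_trend(years: list[float], ors: list[float]) -> tuple[float, float]:
    # Ordinary least-squares fit of adjusted odds ratio versus year.
    # Returns (slope, Pearson r); a negative r indicates a downward trend,
    # as reported for both SSI and UTI in the abstract.
    n = len(years)
    mx = sum(years) / n
    my = sum(ors) / n
    sxx = sum((x - mx) ** 2 for x in years)
    sxy = sum((x - mx) * (y - my) for x, y in zip(years, ors))
    syy = sum((y - my) ** 2 for y in ors)
    return sxy / sxx, sxy / (sxx * syy) ** 0.5

# Illustrative perfectly linear decline: slope -1, r -1.
slope, r = linear_trend([1.0, 2.0, 3.0], [3.0, 2.0, 1.0])
```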
Salvage of Long-Term Central Venous Catheters During an Outbreak of Pseudomonas putida and Stenotrophomonas maltophilia Infections Associated With Contaminated Heparin Catheter-Lock Solution
- M. Beatriz Souza Dias, Alina Bernardes Habert, Vera Borrasca, Valeska Stempliuk, Aina Ciolli, M. Rita E. Araújo, Silvia F. Costa, Anna S. Levin
- Published online by Cambridge University Press: 02 January 2015, pp. 125-130
Objective.
To describe the management of patients with long-term central venous catheters (CVCs) during an outbreak of infection due to Pseudomonas putida and Stenotrophomonas maltophilia associated with contaminated heparin catheter-lock solution.
Design. Descriptive study.
Setting. Private, 250-bed tertiary-care hospital.
Methods. In March 2003, we identified 2 febrile cancer patients with P. putida bacteremia. Over 2 days, 7 cases of bacteremia were identified; lots of syringes prefilled with heparin catheter-lock solution, supplied by a compounding pharmacy, were recalled and samples were cultured. More cases of bacteremia appeared during the following days, and any patient who had had a catheter lock infused with the suspect solution was asked to provide blood samples for culture, even if the patient was asymptomatic. Isolates recovered from culture were typed by pulsed-field gel electrophoresis. Antimicrobial salvage treatment of long-term CVCs was attempted.
Results. A total of 154 patients had had their catheter lock infused with solution from the lots suspected of being contaminated. Only 48 of these patients had CVCs. By day 7 of the outbreak, 18 of these patients had become symptomatic. Twenty-six of the remaining 30 asymptomatic patients then also provided blood samples for culture, 10 of whom developed fever shortly after samples were collected. Thirty-two patients with P. putida bacteremia were identified; 9 also had infection due to S. maltophilia. Samples from 1 of the 3 lots of prefilled syringes in use at the time of the outbreak also grew P. putida on culture. Molecular typing identified 3 different clones of P. putida from patients and heparin catheter-lock solution, and 1 clone of S. maltophilia. A total of 27 patients received antimicrobial therapy regimens, some of which included decontamination of the catheter lock with anti-infective lock solution. Of these 27 patients, 19 (70%) retained their long-term CVC during the 6-month follow-up period.
Conclusions. To our knowledge, this is one of the largest prospective experiences in the management of bloodstream infection associated with long-term CVCs. The infections were caused by gram-negative bacilli and were managed without catheter removal, with a high response rate. We emphasize the risks of using intravenous formulations of medications supplied by compounding pharmacies that produce large quantities of drugs.
Summer Peaks in the Incidences of Gram-Negative Bacterial Infection Among Hospitalized Patients
- Eli N. Perencevich, Jessina C. McGregor, Michelle Shardell, Jon P. Furuno, Anthony D. Harris, J. Glenn Morris, David N. Fisman, Judith A. Johnson
- Published online by Cambridge University Press: 02 January 2015, pp. 1124-1131
Objective.
Recognition of seasonal trends in hospital infections may improve diagnosis, use of empirical therapy, and infection prevention interventions. There are very few data available regarding the seasonal variability of these infections. We quantified the seasonal variation in the incidences of hospital infection caused by common bacterial pathogens and estimated the association between temperature changes and infection rates.
Methods. A cohort of all adult patients admitted to the University of Maryland Medical Center during the period from 1998 through 2005 was analyzed. Time-series analyses were used to estimate the association of the number of infections per month caused by Pseudomonas aeruginosa, Acinetobacter baumannii, Enterobacter cloacae, Escherichia coli, Staphylococcus aureus, and enterococci with season and temperature, while controlling for long-term trends.
Results. There were 218,594 admissions to the index hospital, and analysis of 26,624 unique clinical cultures that grew the organisms of interest identified increases in the mean monthly rates of infection caused by P. aeruginosa (28% of isolates recovered; P < .01), E. cloacae (46%; P < .01), E. coli (12%; P < .01), and A. baumannii (21%; P = .06). For each 10°F increase, we observed a 17% increase in the monthly rates of infection caused by P. aeruginosa (P = .01) and A. baumannii (P = .05).
Conclusion. Significantly higher rates of gram-negative infection were observed during the summer months, compared with other seasons. For some pathogens, higher temperatures were associated with higher infection rates, independent of seasonality. These findings have important implications for infection prevention, such as enhanced surveillance during the warmer months, and for the choice of empirical antimicrobial therapy among hospitalized adults. Future, quasi-experimental investigations of gram-negative infection prevention initiatives should control for seasonal variation.
Development of an Algorithm for Surveillance of Ventilator-Associated Pneumonia With Electronic Data and Comparison of Algorithm Results With Clinician Diagnoses
- Michael Klompas, Ken Kleinman, Richard Platt
- Published online by Cambridge University Press: 02 January 2015, pp. 31-37
Objective.
Surveillance for ventilator-associated pneumonia (VAP) using standard Centers for Disease Control and Prevention (CDC) criteria is labor intensive and involves many subjective assessments. We sought to improve the efficiency and objectivity of VAP surveillance by adapting the CDC criteria to make them amenable to evaluation with electronic data.
Design. Prospective comparison of the accuracy of VAP surveillance using an algorithm with the accuracy of surveillance based on prospective queries made to intensive care physicians. CDC criteria for VAP were used as a reference standard to evaluate the algorithm and clinicians' reports.
Setting. Three surgical intensive care units and 2 medical intensive care units at an academic hospital.
Methods. A total of 459 consecutive patients who received mechanical ventilation for a total of 2,540 days underwent surveillance by both methods during consecutive 3-month periods. Electronic surveillance criteria were chosen to mirror the CDC definition: quantitative thresholds were substituted for qualitative criteria, and purely subjective criteria were eliminated. Increases in ventilator-control settings were taken to indicate worsening oxygenation, and semiquantitative Gram stain of pulmonary secretion samples was used to assess whether there was sputum purulence.
Results. The algorithm applied to electronic data detected 20 patients with possible VAP, all of whom were confirmed in accordance with standard CDC criteria (100% positive predictive value). Prospective survey of clinicians detected 33 patients with possible VAP; 17 of these 33 possible cases were confirmed (52% positive predictive value). Overall, 21 cases of confirmed VAP were identified by either method. The algorithm identified 20 (95%) of the 21 known cases, whereas the survey of clinicians identified 17 (81%) of 21 cases.
Conclusions. Surveillance for VAP using electronic data is feasible and has high positive predictive value for cases that meet CDC criteria. Further validation of this method is warranted.
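The abstract does not specify the algorithm's actual rules, but the general approach (quantitative thresholds on electronic data standing in for qualitative CDC criteria) can be illustrated with a hedged sketch. All field names and cutoffs below are assumptions for illustration, not the authors' parameters:

```python
from dataclasses import dataclass

@dataclass
class VentDay:
    """One ventilator day of electronic data for a patient (illustrative fields)."""
    day: int
    fio2: float                 # fraction of inspired oxygen setting
    peep: float                 # positive end-expiratory pressure, cm H2O
    wbc: float                  # white blood cell count, x10^3/uL
    temp_c: float               # maximum temperature, Celsius
    purulent_gram_stain: bool   # semiquantitative Gram stain of secretions

def worsening_oxygenation(prev: VentDay, cur: VentDay) -> bool:
    # Increased ventilator-control settings taken as a proxy for worsening
    # oxygenation, as described in the abstract (thresholds are assumed).
    return cur.fio2 >= prev.fio2 + 0.15 or cur.peep >= prev.peep + 2.5

def flags_possible_vap(days: list[VentDay]) -> bool:
    # Screen consecutive day pairs: worsening oxygenation plus a systemic
    # sign (fever or leukocytosis) plus purulent secretions flags the
    # patient for confirmatory review.
    for prev, cur in zip(days, days[1:]):
        systemic = cur.temp_c >= 38.0 or cur.wbc >= 12.0
        if worsening_oxygenation(prev, cur) and systemic and cur.purulent_gram_stain:
            return True
    return False
```

In such a design, flagged patients would still undergo medical-record review against the reference-standard CDC criteria, which is how the abstract's positive predictive values were derived.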