Original Articles
Differences in the Risk Factors for Surgical Site Infection between Total Hip Arthroplasty and Total Knee Arthroplasty in the Korean Nosocomial Infections Surveillance System (KONIS)
- Kyoung-Ho Song, Eu Suk Kim, Young Keun Kim, Hye Young Jin, Sun Young Jeong, Yee Gyung Kwak, Yong Kyun Cho, Joohon Sung, Yeong-Seon Lee, Hee-Bok Oh, Tae Kyun Kim, Kyung-Hoi Koo, Eui-Chong Kim, June Myung Kim, Tae Yeol Choi, Hyo Youl Kim, Hee Jung Choi, Hong Bin Kim, KONIS Study Group
- Published online by Cambridge University Press: 02 January 2015, pp. 1086-1093
Objective.
To compare the characteristics and risk factors for surgical site infections (SSIs) after total hip arthroplasty (THA) and total knee arthroplasty (TKA) in a nationwide survey, using shared case detection and recording systems.
Design. Retrospective cohort study.
Setting. Twenty-six hospitals participating in the Korean Nosocomial Infections Surveillance System (KONIS).
Patients. From 2006 to 2009, all patients undergoing THA and TKA in KONIS were enrolled.
Results. SSI occurred in 161 (2.35%) of 6,848 cases (3,422 THAs and 3,426 TKAs). Pooled mean SSI rates were 1.69% and 2.82% for THA and TKA, respectively. Of the cases we examined, 42 (26%) were superficial-incisional SSIs and 119 (74%) were “severe” SSIs; of the latter, 24 (15%) were deep-incisional SSIs and 95 (59%) were organ/space SSIs. In multivariate analysis, a duration of preoperative hospital stay of greater than 3 days was a risk factor for total SSI after both THA and TKA. Diabetes mellitus, revision surgery, prolonged duration of surgery (above the 75th percentile), and the need for surgery due to trauma were independent risk factors for total and severe SSI after THA, while male sex and an operating room without artificial ventilation were independent risk factors for total and severe SSI after TKA. A large volume of surgeries (more than 10 procedures per month) protected against total and severe SSI, but only in patients who underwent TKA.
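The pooled mean SSI rate used above is simply the total number of SSIs per 100 procedures, aggregated across participating hospitals; a minimal Python sketch using only the reported overall totals:

```python
# Pooled mean SSI rate: total SSIs per 100 procedures, pooled across hospitals.
def pooled_mean_ssi_rate(ssi_count, procedures):
    return 100 * ssi_count / procedures

print(f"{pooled_mean_ssi_rate(161, 6848):.2f}%")  # 2.35%, matching the overall rate
```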
Conclusions. Risk factors for SSI after arthroplasty differ according to the site of the arthroplasty. Therefore, clinicians should take into account the site of arthroplasty in the analysis of SSI and the development of strategies for reducing SSI.
Original Article
Incidence, Secular Trends, and Outcomes of Prosthetic Joint Infection: A Population-Based Study, Olmsted County, Minnesota, 1969–2007
- Geoffrey Tsaras, Douglas R. Osmon, Tad Mabry, Brian Lahr, Jennifer St. Sauveur, Barbara Yawn, Robert Kurland, Elie F. Berbari
- Published online by Cambridge University Press: 02 January 2015, pp. 1207-1212
Context.
The epidemiology of prosthetic joint infection (PJI) in a population-based cohort has not been studied in the United States.
Objectives. To provide an accurate assessment of the true incidence, secular trends, clinical manifestations, microbiology, and treatment outcomes of PJI in a population-based cohort.
Design. Historical cohort study.
Setting. Olmsted County, Minnesota.
Participants. Residents who underwent total knee arthroplasty (TKA) or total hip arthroplasty (THA) between January 1, 1969, and December 31, 2007.
Methods. Incidence rates and trends in PJI were assessed using the Kaplan-Meier method and log-rank test, as were treatment outcomes among PJI case patients.
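As a hedged illustration of the survival machinery named here (not the authors' code), a Python sketch using the lifelines library on invented follow-up data:

```python
# Kaplan-Meier cumulative incidence of PJI and a log-rank comparison,
# on simulated data: times in years; event=True means PJI observed, False censored.
import numpy as np
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

rng = np.random.default_rng(0)
time = rng.exponential(50, size=500)     # hypothetical follow-up times (years)
event = rng.random(500) < 0.02           # hypothetical PJI indicator (~1%-2%)

kmf = KaplanMeierFitter()
kmf.fit(time, event_observed=event)
for t in (1, 5, 10):                     # cumulative incidence = 1 - S(t)
    print(t, "years:", 1 - kmf.predict(t))

# Log-rank test comparing two hypothetical eras of implantation.
era = rng.integers(0, 2, size=500)
res = logrank_test(time[era == 0], time[era == 1],
                   event_observed_A=event[era == 0],
                   event_observed_B=event[era == 1])
print("log-rank p =", res.p_value)
```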
Results. A total of 7,375 THAs or TKAs were implanted in residents of Olmsted County during the study period. Seventy-five discrete joints in 70 individuals developed PJI during a mean ± SD follow-up of 6.8 ± 6.1 years. The cumulative incidence of PJI was 0.5%, 0.8%, and 1.4% at 1, 5, and 10 years after arthroplasty, respectively. Overall, the rate of survival free of clinical failure after treatment of PJI was 76.8% (95% confidence interval [CI], 64.3–85.2) and 65.2% (95% CI, 33.1–76.2) at 3 and 5 years, respectively. The incidence and treatment outcomes did not significantly differ by decade of implantation, patient age at implantation, gender, or joint location.
Conclusions. The incidence of PJI is relatively low in a population-based cohort and is a function of the age of the prosthesis. Incidence trends and outcomes have not significantly changed over the past 40 years.
Assessing the Burden of Acinetobacter baumannii in Maryland: A Statewide Cross-Sectional Period Prevalence Survey
- Kerri A. Thorn, Lisa L. Maragakis, Katie Richards, J. Kristie Johnson, Brenda Roup, Patricia Lawson, Anthony D. Harris, Elizabeth P. Fuss, Margaret A. Pass, David Blythe, Eli N. Perencevich, Lucy Wilson, Maryland MDRO Prevention Collaborative
- Published online by Cambridge University Press: 02 January 2015, pp. 883-888
Objective.
To determine the prevalence of Acinetobacter baumannii, an important healthcare-associated pathogen, among mechanically ventilated patients in Maryland.
Design. The Maryland MDRO Prevention Collaborative performed a statewide cross-sectional active surveillance survey of mechanically ventilated patients residing in acute care and long-term care (LTC) facilities. Surveillance cultures (sputum and perianal) were obtained from all mechanically ventilated inpatients at participating facilities during a 2-week period.
Setting. All healthcare facilities in Maryland that provide care for mechanically ventilated patients were invited to participate.
Patients. Mechanically ventilated patients, known to be at high risk for colonization and infection with A. baumannii, were included.
Results. Seventy percent (40/57) of all eligible healthcare facilities participated in the survey, representing both acute care (n = 30) and LTC (n = 10) facilities in all geographic regions of Maryland. Surveillance cultures were obtained from 92% (358/390) of eligible patients. A. baumannii was identified in 34% of all mechanically ventilated patients in Maryland; multidrug-resistant A. baumannii was found in 27% of all patients. A. baumannii was detected in at least 1 patient in 49% of participating facilities; 100% of LTC facilities had at least 1 patient with A. baumannii, compared with 31% of acute care facilities. A. baumannii was identified from all facilities in which 10 or more patients were sampled.
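A small sketch of the point-prevalence arithmetic behind these figures, with a Wilson 95% CI; the count of 122 positives is reconstructed from the stated 34% of 358 cultured patients and is therefore only illustrative:

```python
# Point prevalence with a Wilson 95% confidence interval (statsmodels).
from statsmodels.stats.proportion import proportion_confint

def prevalence(positive, sampled):
    lo, hi = proportion_confint(positive, sampled, alpha=0.05, method="wilson")
    return positive / sampled, lo, hi

print(prevalence(122, 358))  # ~0.34 with its 95% CI; 122 is an assumed count
```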
Conclusions. A. baumannii is common among mechanically ventilated patients in both acute care and LTC facilities throughout Maryland, with a high proportion of isolates demonstrating multidrug resistance.
Outbreak of Carbapenem-Resistant Enterobacteriaceae at a Long-Term Acute Care Hospital: Sustained Reductions in Transmission through Active Surveillance and Targeted Interventions
- Amit S. Chitnis, Pam S. Caruthers, Agam K. Rao, JoAnne Lamb, Robert Lurvey, Valery Beau De Rochars, Brandon Kitchel, Margarita Cancio, Thomas J. Török, Alice Y. Guh, Carolyn V. Gould, Matthew E. Wise
- Published online by Cambridge University Press: 02 January 2015, pp. 984-992
Objective.
To describe a Klebsiella pneumoniae carbapenemase (KPC)–producing carbapenem-resistant Enterobacteriaceae (CRE) outbreak and interventions to prevent transmission.
Design, Setting, and Patients. Epidemiologic investigation of a CRE outbreak among patients at a long-term acute care hospital (LTACH).
Methods. Microbiology records at LTACH A from March 2009 through February 2011 were reviewed to identify CRE transmission cases and cases admitted with CRE. CRE bacteremia episodes were identified during March 2009–July 2011. Biweekly CRE prevalence surveys were conducted during July 2010–July 2011, and interventions to prevent transmission were implemented, including education and auditing of staff and isolation and cohorting of CRE patients with dedicated nursing staff and shared medical equipment. Trends were evaluated using weighted linear or Poisson regression. CRE transmission cases were included in a case-control study to evaluate risk factors for acquisition. A real-time polymerase chain reaction assay was used to detect the blaKPC gene, and pulsed-field gel electrophoresis was performed to assess the genetic relatedness of isolates.
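A hedged sketch of the trend analysis named above: Poisson regression of newly detected CRE counts on survey number, with the number of patients screened as an exposure offset. All data below are invented:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
survey = np.arange(26)                     # ~biweekly surveys over a year
screened = np.full(26, 60)                 # hypothetical patients screened per survey
new_cre = np.maximum(0, (12 - survey // 2) + rng.integers(-2, 3, 26))

X = sm.add_constant(survey)
fit = sm.GLM(new_cre, X, family=sm.families.Poisson(), exposure=screened).fit()
print(fit.params)    # a negative slope on the survey term indicates declining detection
print(fit.pvalues)
```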
Results. Ninety-nine CRE transmission cases, 16 admission cases (from 7 acute care hospitals), and 29 CRE bacteremia episodes were identified. Significant reductions were observed in CRE prevalence (49% vs 8%), percentage of patients screened with newly detected CRE (44% vs 0%), and CRE bacteremia episodes (2.5 vs 0.0 per 1,000 patient-days). Cases were more likely to have received β-lactams, have diabetes, and require mechanical ventilation. All tested isolates were KPC-producing K. pneumoniae, and nearly all isolates were genetically related.
Conclusion. CRE transmission can be reduced in LTACHs through surveillance testing and targeted interventions. Sustainable reductions within and across healthcare facilities may require a regional public health approach.
Infect Control Hosp Epidemiol 2012;33(10):984-992
The Utility of Acute Physiology and Chronic Health Evaluation II Scores for Prediction of Mortality among Intensive Care Unit (ICU) and Non-ICU Patients with Methicillin-Resistant Staphylococcus aureus Bacteremia
- Vanessa Stevens, Thomas P. Lodise, Brian Tsuji, Meagan Stringham, Jill Butterfield, Elizabeth Dodds Ashley, Kristen Brown, Alan Forrest, Jack Brown
- Published online by Cambridge University Press: 02 January 2015, pp. 558-564
Objective.
Bloodstream infections due to methicillin-resistant Staphylococcus aureus (MRSA) have been associated with significant risk of in-hospital mortality. The acute physiology and chronic health evaluation (APACHE) II score was developed and validated for use among intensive care unit (ICU) patients, but its utility among non-ICU patients is unknown. The aim of this study was to determine the ability of APACHE II to predict death at multiple time points among ICU and non-ICU patients with MRSA bacteremia.
Design. Retrospective cohort study.
Participants. Secondary analysis of data from 200 patients with MRSA bacteremia at 2 hospitals.
Methods. Logistic regression models were constructed to predict overall in-hospital mortality and mortality at 48 hours, 7 days, 14 days, and 30 days using APACHE II scores separately in ICU and non-ICU patients. The performance of APACHE II scores was compared with age adjustment alone among all patients. Discriminatory ability was assessed using the c-statistic and was compared at each time point using χ2 tests. Model calibration was assessed using the Hosmer-Lemeshow goodness-of-fit test.
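For concreteness, a sketch (invented data, not the study's) of the two model checks described: discrimination via the c-statistic (ROC AUC) and calibration via a hand-rolled Hosmer-Lemeshow test:

```python
import numpy as np
from scipy.stats import chi2
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(2)
apache = rng.integers(0, 40, 200)                # hypothetical APACHE II scores
p_hat = 1 / (1 + np.exp(-(apache - 20) / 5))     # model-predicted death risk
died = rng.random(200) < p_hat                   # simulated outcomes

print("c-statistic:", roc_auc_score(died, apache))

def hosmer_lemeshow(p, y, groups=10):
    """Chi-square over deciles of predicted risk; df = groups - 2."""
    stat = 0.0
    for idx in np.array_split(np.argsort(p), groups):
        n, exp = len(idx), p[idx].sum()
        stat += (y[idx].sum() - exp) ** 2 / (exp * (1 - exp / n) + 1e-12)
    return stat, chi2.sf(stat, groups - 2)

print("Hosmer-Lemeshow stat, p:", hosmer_lemeshow(p_hat, died))
```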
Results. APACHE II was a significant predictor of death at all time points in both ICU and non-ICU patients. Discrimination was high in all models, with c-statistics ranging from 0.72 to 0.84, and was similar between ICU and non-ICU patients at all time points. APACHE II scores significantly improved the prediction of overall and 48-hour mortality compared with age adjustment alone.
Conclusions. The APACHE II score may be a valid tool to control for confounding or for the prediction of death among ICU and non-ICU patients with MRSA bacteremia.
Original Articles
Healthcare-Associated Bloodstream Infections Secondary to a Urinary Focus: The Québec Provincial Surveillance Results
- Elise Fortin, Isabelle Rocher, Charles Frenette, Claude Tremblay, Caroline Quach
- Published online by Cambridge University Press: 02 January 2015, pp. 456-462
Objective.
Urinary tract infections (UTIs) are an important source of secondary healthcare-associated bloodstream infections (BSIs), where a potential for prevention exists. This study describes the epidemiology of BSIs secondary to a urinary source (U-BSIs) in the province of Québec and predictors of mortality.
Design. Dynamic cohort of 9,377,830 patient-days followed through a provincial voluntary surveillance program targeting all episodes of healthcare-associated BSIs occurring in acute care hospitals.
Setting. Sixty-one hospitals in Québec, followed between April 1, 2007, and March 31, 2010.
Participants. Patients admitted to participating hospitals for 48 hours or longer.
Methods. Descriptive statistics were used to summarize characteristics of U-BSIs and microorganisms involved. Wilcoxon and χ2 tests were used to compare U-BSI episodes with other BSIs. Negative binomial regression was used to identify hospital characteristics associated with higher rates. We explored determinants of mortality using logistic regression.
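A sketch of the hospital-level model named above: negative binomial regression of U-BSI counts with patient-days as exposure. Variable names and data are assumed, not the program's:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
df = pd.DataFrame({
    "ubsi": rng.poisson(8, 61),                        # hypothetical U-BSI counts
    "patient_days": rng.integers(30_000, 200_000, 61),
    "teaching": rng.integers(0, 2, 61),
    "beds": rng.integers(100, 800, 61),
    "avg_los": rng.uniform(4, 12, 61),
})
fit = smf.glm("ubsi ~ teaching + beds + avg_los", data=df,
              family=sm.families.NegativeBinomial(),
              exposure=df["patient_days"]).fit()
print(fit.summary())
```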
Results. Of the 7,217 reported BSIs, 1,510 were U-BSIs (21%), with an annual rate of 1.4 U-BSIs per 10,000 patient-days. A urinary device was used in 71% of U-BSI episodes. Identified institutional risk factors were average length of stay, teaching status, and hospital size. Increasing hospital size was influential only in nonteaching hospitals. Age, nonhematogenous neoplasia, Staphylococcus aureus, and Foley catheters were associated with mortality at 30 days.
Conclusion. U-BSI characteristics suggest that urinary catheters may remain in patients for ease of care or because practitioners forget to remove them. Ongoing surveillance will enable hospitals to monitor trends in U-BSIs and impacts of process surveillance that will be implemented shortly.
Association of Bacillus cereus Infection with Contaminated Alcohol Prep Pads
- Susan A. Dolan, Cynthia Littlehorn, Mary P. Glodé, Elaine Dowell, Karen Xavier, Ann-Christine Nyquist, James K. Todd
- Published online by Cambridge University Press: 02 January 2015, pp. 666-671
Background.
Bacillus species have caused healthcare-associated outbreaks of invasive disease as well as pseudo-outbreaks. We report the investigation of an outbreak of blood cultures positive for Bacillus cereus that was associated with alcohol prep pads (APPs) contaminated with B. cereus and other Bacillus species, which resulted in a rapid internal product recall and a subsequent international product recall.
Design. Epidemiologic and microbiologic outbreak investigation.
Setting. A 300-bed tertiary care children's hospital in Aurora, Colorado.
Patients. Patients with blood or cerebrospinal fluid cultures positive for B. cereus.
Methods. Three patients with blood cultures positive for B. cereus were identified in late 2010. Breaches in procedural and surgical techniques, common interventions, and products were explored. The following 3 common products were cultured: sterile saline syringes, chlorhexidine/alcohol skin preparation solution, and APPs. Repetitive sequence-based polymerase chain reaction (Rep-PCR) was used to compare isolates obtained from patients and from APPs and was confirmed by independent pulsed-field gel electrophoresis.
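Rep-PCR and PFGE patterns are conventionally compared with band-based similarity coefficients; the sketch below (not the authors' pipeline) scores two hypothetical banding patterns with the Dice coefficient:

```python
def dice(bands_a: set, bands_b: set) -> float:
    """Dice coefficient: 2 x shared bands / total bands in both patterns."""
    if not bands_a and not bands_b:
        return 1.0
    return 2 * len(bands_a & bands_b) / (len(bands_a) + len(bands_b))

patient_isolate = {150, 220, 310, 480, 620}   # hypothetical band sizes (bp)
app_isolate = {150, 220, 300, 480, 700}
print(f"Dice similarity: {dice(patient_isolate, app_isolate):.2f}")
```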
Results. There appeared to be a significant increase in blood cultures positive for B. cereus during 2009-2010. B. cereus and other Bacillus species were cultured from the internal contents of 63.3% of APPs not labeled as sterile, and 8 of the 10 positive lots were manufactured after 2007. None of the isolates obtained from the patients matched strains isolated from the APPs. However, some lots of APPs had strains that were indistinguishable from one another.
Conclusions. APPs that were not labeled as sterile were contaminated with Bacillus species. The product was immediately recalled internally and replaced with APPs from another manufacturer that were labeled as sterile. On January 3, 2011, the manufacturer voluntarily recalled its APPs. Healthcare facilities, healthcare providers, and users of APPs should avoid the use of APPs not specifically labeled as sterile.
Review Article
Antimicrobial Stewardship—the State of the Art in 2011: Focus on Outcome and Methods
- John E. McGowan, Jr
- Published online by Cambridge University Press: 02 January 2015, pp. 331-337
Antimicrobial stewardship programs attempt to optimize antimicrobial prescribing to benefit both current and future patients. Recent regulatory and other incentives have led to widespread adoption of such programs. Measurements of the success of these programs have focused primarily on process measures. However, evaluation of outcome measures will be needed to ensure the sustainability of these efforts. Outcome evaluations to date provide some evidence for improved care of individual patients, some evidence for minimizing the emergence of resistance, and ample evidence for cost reduction. Attention to evaluation methods must be increased to provide convincing evidence for the continuation of such programs.
Original Article
On the Role of Length of Stay in Healthcare-Associated Bloodstream Infection
- Christie Y. Jeon, Matthew Neidell, Haomiao Jia, Matt Sinisi, Elaine Larson
- Published online by Cambridge University Press: 02 January 2015, pp. 1213-1218
Design.
We conducted a retrospective cohort study to examine the role played by length of hospital stay in the risk of healthcare-associated bloodstream infection (BSI), independent of demographic and clinical risk factors for BSI.
Patients. We used data from 113,893 admissions of inpatients discharged between 2006 and 2008.
Setting. Large tertiary healthcare center in New York City.
Methods. We estimated the crude and adjusted hazard of BSI by conducting logistic regression using a person-day data structure. The fully adjusted model included age, sex, Charlson comorbidity score, renal failure, and malignancy as static covariates and central venous catheterization, mechanical ventilation, and intensive care unit stay as time-varying covariates.
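A hedged sketch of the person-day design described here: each admission is expanded into one row per hospital day, and a logistic model of the daily BSI hazard is fit with static and time-varying covariates. Names and data are illustrative, not the authors':

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
adm = pd.DataFrame({"id": range(2000),
                    "los": rng.integers(1, 30, 2000),   # length of stay (days)
                    "age": rng.integers(18, 95, 2000)})

rows = adm.loc[adm.index.repeat(adm["los"])].copy()     # one row per person-day
rows["day"] = rows.groupby("id").cumcount() + 1
rows["cvc"] = (rng.random(len(rows)) < 0.2).astype(int)    # time-varying covariate
rows["bsi"] = (rng.random(len(rows)) < 0.002).astype(int)  # daily event indicator

fit = smf.logit("bsi ~ day + age + cvc", data=rows).fit(disp=False)
print(fit.params)   # the day coefficient carries the length-of-stay hazard trend
```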
Results. In the crude model, we observed a nonlinear, increasing hazard of BSI with increasing hospital stay. This trend flattened to a constant hazard when we fully adjusted for demographic and clinical risk factors for BSI.
Conclusion. The association between longer length of hospital stay and increased risk of infection can largely be explained by the increased duration of stay among those who have underlying morbidity and require invasive procedures. Caution is warranted in attributing the association between length of stay and BSI to a direct negative impact of the healthcare environment.
Antimicrobial Stewardship at a Large Tertiary Care Academic Medical Center: Cost Analysis Before, During, and After a 7-Year Program
- Harold C. Standiford, Shannon Chan, Megan Tripoli, Elizabeth Weekes, Graeme N. Forrest
- Published online by Cambridge University Press: 02 January 2015, pp. 338-345
Background.
An antimicrobial stewardship program was fully implemented at the University of Maryland Medical Center in July 2001 (beginning of fiscal year [FY] 2002). Essential to the program was an antimicrobial monitoring team (AMT), consisting of an infectious diseases-trained clinical pharmacist and a part-time infectious diseases physician, that provided real-time monitoring of antimicrobial orders and active intervention and education when necessary. The program continued for 7 years and was terminated in order to use the resources to increase infectious diseases consults throughout the medical center as an alternative mode of stewardship.
Design. A descriptive cost analysis before, during, and after the program.
Patients/Setting. A large tertiary care teaching medical center.
Methods. Monitoring the utilization (dispensing) costs of the antimicrobial agents quarterly for each FY.
Results. The utilization costs decreased from $44,181 per 1,000 patient-days at baseline prior to the full implementation of the program (FY 2001) to $23,933 (a 45.8% decrease) by the end of the program (FY 2008). There was a reduction of approximately $3 million within the first 3 years, much of which was the result of a decrease in the use of antifungal agents in the cancer center. After the program was discontinued at the end of FY 2008, antimicrobial costs increased from $23,933 to $31,653 per 1,000 patient-days, a 32.3% increase within 2 years that is equivalent to a $2 million increase for the medical center, mostly in the antibacterial category.
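A quick check of the percentage changes quoted above, using the reported costs per 1,000 patient-days:

```python
def pct_change(old, new):
    return (new - old) / old * 100

print(f"{pct_change(44_181, 23_933):.1f}%")   # -45.8% over the program
print(f"{pct_change(23_933, 31_653):.1f}%")   # +32.3% after termination
```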
Conclusions. The antimicrobial stewardship program, using an antimicrobial monitoring team, was extremely cost effective over this 7-year period.
Original Articles
Positive Cultures of Organ Preservation Fluid Predict Postoperative Infections in Solid Organ Transplantation Recipients
- Cedric P. Yansouni, Nandini Dendukuri, Guoyuan Liu, Myriam Fernandez, Charles Frenette, Steven Paraskevas, Donald C. Sheppard
- Published online by Cambridge University Press: 02 January 2015, pp. 672-680
Objective.
The significance of positive cultures of organ preservation fluid (OPF) in solid organ transplantation is not known. We sought to describe the microbiology and define the clinical impact of positive OPF cultures.
Design. Retrospective cohort study.
Setting. Tertiary care hospital.
Patients. A consecutive sample of all solid organ transplantations at our center between July 2006 and January 2009 was reviewed. A total of 331 allografts (185 kidneys, 104 livers, 31 pancreases, and 11 hearts) met the inclusion criterion of having OPF cultures taken from the transplanted allograft.
Methods. Organisms recovered from OPF were classified as high or low risk according to their virulence. Clinical outcomes were compared between recipients of organs with positive OPF cultures and recipients of organs with negative OPF cultures.
Results. OPF cultures were positive in 62.2% of allografts and yielded high-risk organisms in 17.8%. Normal skin flora constituted the majority of positive OPF cultures, while Enterobacteriaceae spp. and Staphylococcus aureus made up the majority of high-risk organisms. Recipients of allografts with positive OPF cultures developed more frequent bacterial infections, regardless of allograft type (relative risk, 2.39; 95% confidence interval [CI], 1.61–3.54). Moreover, isolation of a given organism in OPF samples was associated with the development of a clinical infection with the same organism, regardless of allograft type.
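The relative risk reported here can be reproduced from a 2x2 table with a log-RR confidence interval; the counts below are hypothetical stand-ins, not the study's data:

```python
import math

def relative_risk(a, b, c, d):
    """a/b: infected/not, positive-OPF group; c/d: infected/not, negative-OPF group."""
    rr = (a / (a + b)) / (c / (c + d))
    se = math.sqrt(1 / a - 1 / (a + b) + 1 / c - 1 / (c + d))  # SE of log(RR)
    lo, hi = (rr * math.exp(s * 1.96 * se) for s in (-1, 1))
    return rr, lo, hi

print(relative_risk(60, 146, 15, 110))   # RR with 95% CI, illustrative counts
```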
Conclusions. Positive cultures of OPF are common events in solid organ transplantation, frequently involve high-risk organisms, and are associated with the development of postoperative clinical bacterial infections. Further study is required to determine the optimal strategies for their prevention and management.
Performance, Revision, and Extension of the National Nosocomial Infections Surveillance System's Risk Index in Brazilian Hospitals
- Fernando Martín Biscione, Renato Camargos Couto, Tânia M. G. Pedrosa
- Published online by Cambridge University Press: 02 January 2015, pp. 124-134
Objective.
To assess the benefit of using procedure-specific alternative cutoff points for National Nosocomial Infections Surveillance (NNIS) risk index variables and of extending surgical site infection (SSI) risk prediction models with a postdischarge surveillance indicator.
Design. Open, retrospective, validation cohort study.
Setting. Five private, nonuniversity Brazilian hospitals.
Patients. Consecutive inpatients operated on between January 1993 and May 2006 (other operations of the genitourinary system [n = 20,723], integumentary system [n = 12,408], or musculoskeletal system [n = 15,714] and abdominal hysterectomy [n = 11,847]).
Methods. For each procedure category, development and validation samples were defined nonrandomly. In the development samples, alternative SSI prognostic scores were constructed using logistic regression: (i) alternative NNIS scores used NNIS risk index covariates and cutoff points but locally derived SSI risk strata and rates, (ii) revised scores used procedure-specific alternative cutoff points, and (iii) extended scores expanded revised scores with a postdischarge surveillance indicator. Performances were compared in the validation samples using calibration, discrimination, and overall performance measures.
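For orientation, the standard NNIS risk index awards one point each for ASA class of 3 or more, a contaminated or dirty wound, and an operation lasting longer than the procedure-specific 75th-percentile cut point; the sketch below contrasts that with the kind of alternative duration cutoff the authors test (thresholds illustrative):

```python
def nnis_score(asa, wound_class, duration_min, t_point_min):
    """Classic NNIS risk index: 0-3 points."""
    return ((asa >= 3)
            + (wound_class in ("contaminated", "dirty"))
            + (duration_min > t_point_min))

def revised_score(asa, wound_class, duration_min, alt_cutoff_min):
    """Same covariates, but a procedure-specific alternative duration cutoff."""
    return ((asa >= 3)
            + (wound_class in ("contaminated", "dirty"))
            + (duration_min > alt_cutoff_min))

print(nnis_score(3, "clean", 130, t_point_min=120))        # -> 2
print(revised_score(3, "clean", 130, alt_cutoff_min=150))  # -> 1
```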
Results. The NNIS risk index showed low discrimination, inadequate calibration, and predictions with high variability. The most consistent advantage of alternative NNIS scores was in calibration (prevalence and dispersion components). Revised scores performed slightly better than the NNIS risk index for most procedures and measures, mainly in calibration. Extended scores clearly performed better than the NNIS risk index, irrespective of the measure or operative procedure.
Conclusions. Locally derived SSI risk strata and rates improved the NNIS risk index's calibration. Alternative cutoff points further improved the specification of the intrinsic SSI risk component. Controlling for incomplete postdischarge SSI surveillance provided consistently more accurate SSI risk adjustment.
Infect Control Hosp Epidemiol 2012;33(2):124-134
Original Article
Electronic Surveillance for Infectious Disease Trend Analysis following a Quality Improvement Intervention
- Kari E. Peterson, Donna M. Hacek, Ari Robicsek, Richard B. Thomson, Jr, Lance R. Peterson
- Published online by Cambridge University Press: 02 January 2015, pp. 790-795
Objective.
Interventions for reducing methicillin-resistant Staphylococcus aureus (MRSA) healthcare-associated disease require outcome assessment; this is typically done by manual chart review to determine infection, which can be labor intensive. The purpose of this study was to validate electronic tools for MRSA healthcare-associated infection (HAI) trending that can replace manual medical record review.
Design and Setting. This was an observational study comparing manual medical record review with 3 electronic methods: raw culture data from the laboratory information system (LIS) in use by our healthcare organization, LIS data combined with admission-discharge-transfer (ADT) data to determine which cultures were healthcare associated (LIS + ADT), and the CareFusion MedMined Nosocomial Infection Marker (NIM). Each method was used for the same 7-year period, from August 2003 through July 2010.
Patients. The data set was from a 3-hospital organization covering 342,492 admissions.
Results. Correlation coefficients for raw LIS, LIS + ADT, and NIM were 0.976, 0.957, and 0.953, respectively, when assessed on an annual basis. Quarterly performance for disease trending was also good, with R2 values exceeding 0.7 for all methods.
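The agreement analysis reduces to correlating counts from the manual and electronic methods; a minimal sketch on invented annual counts:

```python
import numpy as np
from scipy.stats import pearsonr

manual = np.array([112, 98, 87, 75, 60, 52, 41])       # hypothetical yearly counts
electronic = np.array([115, 96, 90, 71, 63, 50, 44])

r, p = pearsonr(manual, electronic)
print(f"r = {r:.3f}, R2 = {r**2:.3f}, p = {p:.4f}")
```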
Conclusions. The electronic tools accurately identified trends in MRSA HAI incidence density when all infections were combined as quarterly or annual data, with excellent performance for annual assessment. These electronic surveillance systems can substantially reduce (by 93% [in-house-developed program] to more than 99.9999% [commercially available systems]) the personnel resources needed to monitor the impact of a disease control program.
Relationship between Chlorhexidine Gluconate Skin Concentration and Microbial Density on the Skin of Critically Ill Patients Bathed Daily with Chlorhexidine Gluconate
- Kyle J. Popovich, Rosie Lyles, Robert Hayes, Bala Hota, William Trick, Robert A. Weinstein, Mary K. Hayden
- Published online by Cambridge University Press: 02 January 2015, pp. 889-896
Objective and Design.
Previous work has shown that daily skin cleansing with chlorhexidine gluconate (CHG) is effective in preventing infection in the medical intensive care unit (MICU). A colorimetric, semiquantitative indicator was used to measure CHG concentration on the skin (neck, antecubital fossae, and inguinal areas) of patients bathed daily with CHG during their MICU stay and after discharge from the MICU, when CHG bathing stopped.
Patients and Setting. MICU patients at Rush University Medical Center.
Methods. CHG concentration on skin was measured, and skin sites were cultured quantitatively. The relationship between CHG concentration and microbial density on skin was explored in a mixed-effects model using gram-positive colony-forming unit (CFU) counts.
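A hedged sketch of the mixed-effects analysis named here: log gram-positive CFU counts regressed on CHG skin concentration with a random intercept per patient, since measurements are repeated. Variable names and data are assumed:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(5)
n_pt, n_obs = 20, 12
df = pd.DataFrame({
    "patient": np.repeat(range(n_pt), n_obs),          # repeated measures per patient
    "chg": rng.uniform(0, 300, n_pt * n_obs),          # ug/mL on skin
})
df["log_cfu"] = 1.0 - 0.002 * df["chg"] + rng.normal(0, 0.3, len(df))

fit = smf.mixedlm("log_cfu ~ chg", data=df, groups=df["patient"]).fit()
print(fit.params)   # a negative chg coefficient mirrors the inverse association
```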
Results. For 20 MICU patients studied (240 measurements), the lowest CHG concentrations (0–18.75 μg/mL) and the highest gram-positive CFU counts were on the neck (median, 1.07 log10 CFUs; P = .014). CHG concentration increased postbath and decreased over 24 hours (P < .001). In parallel, median log10 CFUs decreased pre- to postbath (0.78 to 0) and then increased over 24 hours to the baseline of 0.78 (P = .001). A CHG concentration above 18.75 μg/mL was associated with decreased gram-positive CFUs (P = .004). In all but 2 instances, CHG was detected on patient skin during the entire interbath (approximately 24-hour) period (18 [90%] of 20 patients). In 11 patients studied after MICU discharge (80 measurements), CHG skin concentrations fell below effective levels after 1–3 days.
Conclusion. In MICU patients bathed daily with CHG, CHG concentration was inversely associated with microbial density on skin; residual antimicrobial activity on skin persisted up to 24 hours. Determination of CHG concentration on the skin of patients may be useful in monitoring the adequacy of skin cleansing by healthcare workers.
Original Articles
Electronic-Eye Faucets: Legionella Species Contamination in Healthcare Settings
- Emily R. M. Sydnor, Gregory Bova, Anatoly Gimburg, Sara E. Cosgrove, Trish M. Perl, Lisa L. Maragakis
- Published online by Cambridge University Press: 02 January 2015, pp. 235-240
Objective.
To compare heterotrophic plate counts (HPCs) and Legionella species growth from electronic and manual faucet water samples.
Design. Proportions of water samples with growth and colony-forming unit counts were compared using Fisher's exact test and the Wilcoxon rank-sum test, respectively.
Setting. Two psychiatric units and 1 medical unit in a 1,000-bed university hospital.
Methods. Water samples were collected from 20 newly installed electronic faucets and 20 existing manual faucets in 3 hospital units. Manual faucets were located in rooms adjacent to the electronic faucets and received water from the same source. Water samples were collected between December 15, 2008, and January 29, 2009. Four electronic faucets were dismantled, and faucet components were cultured. Legionella species and HPC cultures were performed using standard methods.
Results. Nearly all electronic faucets (19/20 [95%]) grew Legionella species from at least 1 water sample, compared with less than half (9/20 [45%]) of manual faucets (P = .001). Fifty-four (50%) of 108 electronic faucet water cultures grew Legionella species, compared with 11 (15%) of 75 manual faucet water cultures (P < .001). After chlorine dioxide remediation, 4 (14%) of 28 electronic faucet and 1 (3%) of 30 manual faucet water cultures grew Legionella species (P = .19), and 8 (29%) electronic faucet and 2 (7%) manual faucet cultures had significant HPC growth (P = .04). All 12 (100%) of the internal faucet components from 2 electronic faucets grew Legionella species.
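The headline faucet comparison can be checked directly from the reported counts with Fisher's exact test:

```python
from scipy.stats import fisher_exact

table = [[19, 1],    # electronic faucets: Legionella-positive, negative
         [9, 11]]    # manual faucets
odds_ratio, p = fisher_exact(table)
print(f"OR = {odds_ratio:.1f}, p = {p:.4f}")   # p ~ .001, consistent with the text
```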
Conclusions. Electronic faucets were more commonly contaminated with Legionella species and other bacteria and were less likely to be disinfected after chlorine dioxide remediation. Electronic faucet components may provide points of concentrated bacterial growth.
Infect Control Hosp Epidemiol 2012;33(3):235-240
Original Article
Use of Administrative Data in Efficient Auditing of Hospital-Acquired Surgical Site Infections, New York State 2009–2010
- Valerie B. Haley, Carole Van Antwerpen, Boldtsetseg Tserenpuntsag, Kathleen A. Gase, Peggy Hazamy, Diana Doughty, Marie Tsivitis, Rachel L. Stricof
- Published online by Cambridge University Press: 02 January 2015, pp. 565-571
Objective.
To efficiently validate the accuracy of surgical site infection (SSI) data reported to the National Healthcare Safety Network (NHSN) by New York State (NYS) hospitals.
Design. Validation study.
Setting. 176 NYS hospitals.
Methods. NYS Department of Health staff validated the data reported to NHSN by review of a stratified sample of medical records from each hospital. The four strata were (1) SSIs reported to NHSN; (2) records with an indication of infection from diagnosis codes in administrative data but not reported to NHSN as SSIs; (3) records with discordant procedure codes in NHSN and state data sets; and (4) records not in the other three strata.
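A small sketch of drawing such a stratified chart sample with pandas; the strata labels and sampling fractions are illustrative, not the state's actual protocol:

```python
import pandas as pd

records = pd.DataFrame({
    "chart_id": range(1, 101),
    "stratum": ["reported_ssi"] * 10 + ["admin_flag"] * 20
               + ["discordant_code"] * 20 + ["other"] * 50,
})
fractions = {"reported_ssi": 1.0, "admin_flag": 0.5,    # oversample suspect strata
             "discordant_code": 0.5, "other": 0.05}
sample = (records.groupby("stratum", group_keys=False)
                 .apply(lambda g: g.sample(frac=fractions[g.name], random_state=0)))
print(sample["stratum"].value_counts())
```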
Results. A total of 7,059 surgical charts (6% of the procedures reported by hospitals) were reviewed. In stratum 1, 7% of reported SSIs did not meet the criteria for inclusion in NHSN and were subsequently removed. In stratum 2, 24% of records indicated missed SSIs not reported to NHSN, whereas in strata 3 and 4, only 1% of records indicated missed SSIs; these SSIs were subsequently added to NHSN. Also, in stratum 3, 75% of records were not coded for the correct NHSN procedure. Errors were highest for colon data; the NYS colon SSI rate increased by 7.5% as a result of hospital audits.
Conclusions. Audits are vital for ensuring the accuracy of hospital-acquired infection (HAI) data so that hospital HAI rates can be fairly compared. Use of administrative data increased the efficiency of identifying problems in hospitals' SSI surveillance that caused SSIs to go unreported and introduced errors in denominator data.
Device-Associated Infection Rates, Device Utilization, and Antimicrobial Resistance in Long-Term Acute Care Hospitals Reporting to the National Healthcare Safety Network, 2010
- Amit S. Chitnis, Jonathan R. Edwards, Phillip M. Ricks, Dawn M. Sievert, Scott K. Fridkin, Carolyn V. Gould
- Published online by Cambridge University Press: 02 January 2015, pp. 993-1000
Objective.
To evaluate national data on healthcare-associated infections (HAIs), device utilization, and antimicrobial resistance in long-term acute care hospitals (LTACHs).
Design and Setting. Comparison of data from LTACHs and from medical and medical-surgical intensive care units (ICUs) in short-stay acute care hospitals reporting to the National Healthcare Safety Network (NHSN) during 2010.
Methods. Rates of central line–associated bloodstream infections (CLABSIs), catheter-associated urinary tract infections (CAUTIs), and ventilator-associated pneumonia (VAP), as well as device utilization ratios, were calculated. For each HAI, pathogen profiles and antimicrobial resistance prevalence were evaluated. Comparisons were made using Poisson regression and the Mood median and χ2 tests.
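The two NHSN quantities referenced here have simple definitions, sketched below; the example quarter is hypothetical:

```python
def rate_per_1000_device_days(events, device_days):
    return 1000 * events / device_days

def device_utilization_ratio(device_days, patient_days):
    return device_days / patient_days

# hypothetical LTACH quarter: 4 CLABSIs over 3,200 central-line days
print(rate_per_1000_device_days(4, 3_200))    # 1.25, the median rate quoted below
print(device_utilization_ratio(3_200, 5_000))
```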
Results. In 2010, 104 LTACHs reported CLABSIs and 57 reported CAUTIs and VAP to the NHSN. Median CLABSI rates in LTACHs (1.25 events per 1,000 device-days reported; range, 0.0-5.96) were comparable to rates in major teaching ICUs and were higher than those in other ICUs. CAUTI rates in LTACHs (median, 2.61; range, 0.0-9.92) were higher and VAP rates (median, 0.0; range, 0.0-3.29) were generally lower than those in ICUs. Central line utilization in LTACHs was higher than that in ICUs, whereas urinary catheter and ventilator utilization was lower. Methicillin resistance among Staphylococcus aureus CLABSIs (83%) and vancomycin resistance among Enterococcus faecalis CAUTIs (44%) were higher in LTACHs than in ICUs. Multidrug resistance among Pseudomonas aeruginosa CAUTIs (25%) was higher in LTACHs than in most ICUs.
Conclusions. CLABSIs and CAUTIs associated with multidrug-resistant organisms present a challenge in LTACHs. Continued HAI surveillance with pathogen-level data can guide prevention efforts in LTACHs.
Infect Control Hosp Epidemiol 2012;33(10):993-1000
Original Articles
Effect of Hospital-Wide Chlorhexidine Patient Bathing on Healthcare-Associated Infections
- Mark E. Rupp, R. Jennifer Cavalieri, Elizabeth Lyden, Jennifer Kucera, MaryAnn Martin, Teresa Fitzgerald, Kate Tyner, James R. Anderson, Trevor C. VanSchooneveld
- Published online by Cambridge University Press: 02 January 2015, pp. 1094-1100
Background.
Chlorhexidine gluconate (CHG) bathing has been used primarily in critical care to prevent central line-associated bloodstream infections and infections due to multidrug-resistant organisms. The objective was to determine the effect of hospital-wide CHG patient bathing on healthcare-associated infections (HAIs).
Design. Quasi-experimental, staged, dose-escalation study for 19 months followed by a 4-month washout period, in 3 cohorts.
Setting. Academic medical center.
Patients. All patients except neonates and infants.
Intervention and Measurements. CHG bathing in the form of bed basin baths or showers administered 3 days per week or daily. CHG bathing compliance was monitored, and the rate of HAIs was measured.
Results. Over 188,859 patient-days, 68,302 CHG baths were administered. Adherence to CHG bathing in the adult critical care units (90%) was better than that observed in other units (57.7%; P < .001). A significant decrease in infections due to Clostridium difficile was observed in all cohorts of patients during the intervention period, followed by a significant rise during the washout period. For all cohorts, the relative risk of C. difficile infection compared to baseline was 0.71 (95% confidence interval [CI], 0.57–0.89; P = .003) for 3-days-per-week CHG bathing and 0.41 (95% CI, 0.29–0.59; P < .001) for daily CHG bathing. During the washout period, the relative risk of infection was 1.85 (95% CI, 1.38–2.53; P < .001), compared to that with daily CHG bathing. A consistent effect of CHG bathing on other HAIs was not observed. No adverse events related to CHG bathing were reported.
Conclusions. CHG bathing was well tolerated and was associated with a significant decrease in C. difficile infections in hospitalized patients.
Improved Risk Adjustment in Public Reporting: Coronary Artery Bypass Graft Surgical Site Infections
- Sandra I. Berríos-Torres, Yi Mu, Jonathan R. Edwards, Teresa C. Horan, Scott K. Fridkin
- Published online by Cambridge University Press: 02 January 2015, pp. 463-469
Objective.
The objective was to develop a new National Healthcare Safety Network (NHSN) risk model for sternal, deep incisional, and organ/space (complex) surgical site infections (SSIs) following coronary artery bypass graft (CABG) procedures, detected on admission and readmission, consistent with public reporting requirements.
Patients and Setting. A total of 133,503 CABG procedures with 4,008 associated complex SSIs reported by 293 NHSN hospitals in the United States.
Methods. CABG procedures performed from January 1, 2006, through December 31, 2008, were analyzed. Potential SSI risk factors were identified by univariate analysis. Multivariate analysis with forward stepwise logistic regression modeling was used to develop the new model. The c-index was used to compare the predictive power of the new and NHSN risk index models.
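As a hedged illustration of forward stepwise model building scored by c-index (on invented data, not the NHSN implementation):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(6)
n = 5000
X = {"asa": rng.integers(1, 5, n), "duration": rng.normal(240, 60, n),
     "female": rng.integers(0, 2, n), "age": rng.integers(30, 90, n)}
logit = -6 + 0.5 * (X["asa"] >= 3) + 0.004 * X["duration"] + 0.3 * X["female"]
y = rng.random(n) < 1 / (1 + np.exp(-logit))           # simulated complex SSI

selected, remaining, best_auc = [], list(X), 0.5
while remaining:
    scores = {}
    for var in remaining:                              # try adding each candidate
        cols = np.column_stack([X[v] for v in selected + [var]])
        p = LogisticRegression(max_iter=1000).fit(cols, y).predict_proba(cols)[:, 1]
        scores[var] = roc_auc_score(y, p)              # training-set c-index
    var, auc = max(scores.items(), key=lambda kv: kv[1])
    if auc <= best_auc + 1e-4:                         # stop: no real improvement
        break
    selected.append(var); remaining.remove(var); best_auc = auc
print(selected, round(best_auc, 3))
```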
Results. Independent risk factors in the multivariate analysis included ASA score, procedure duration, female gender, age, and medical school affiliation. The new risk model has significantly improved predictive performance over the NHSN risk index (c-index, 0.62 and 0.56, respectively).
Conclusions. Traditionally, the NHSN surveillance system has used a risk index to provide procedure-specific risk-stratified SSI rates to hospitals. A new CABG sternal, complex SSI risk model developed by multivariate analysis has improved predictive performance over the traditional NHSN risk index and is being considered for endorsement as a measure for public reporting.
Frequent Hospital Readmissions for Clostridium difficile Infection and the Impact on Estimates of Hospital-Associated C. difficile Burden
- Courtney R. Murphy, Taliser R. Avery, Erik R. Dubberke, Susan S. Huang
- Published online by Cambridge University Press: 02 January 2015, pp. 20-28
Objective.
Clostridium difficile infection (CDI) is associated with hospitalization and may cause readmission following admission for any reason. We aimed to measure the incidence of readmissions due to CDI.
Design. Retrospective cohort study.
Patients. Adult inpatients in Orange County, California, who presented with new-onset CDI within 12 weeks of discharge.
Methods. We assessed mandatory 2000–2007 hospital discharge data for trends in hospital-associated CDI (HA-CDI) incidence, with and without inclusion of postdischarge CDI (PD-CDI) events resulting in rehospitalization within 12 weeks of discharge. We measured the effect of including PD-CDI events on hospital-specific CDI incidence, a mandatory reporting measure in California, and on relative hospital ranks by CDI incidence.
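A sketch of the linkage step implied here: flag a post-discharge CDI (PD-CDI) when a new-onset CDI admission begins within 12 weeks of the same patient's prior discharge. Field names are assumed, not the study's:

```python
import pandas as pd

adm = pd.DataFrame({
    "patient": [1, 1, 2, 3, 3],
    "admit": pd.to_datetime(["2006-01-05", "2006-02-10", "2006-03-01",
                             "2006-05-01", "2006-09-01"]),
    "discharge": pd.to_datetime(["2006-01-12", "2006-02-20", "2006-03-08",
                                 "2006-05-10", "2006-09-09"]),
    "new_onset_cdi": [False, True, False, False, True],
}).sort_values(["patient", "admit"])

# Gap between this admission and the same patient's previous discharge.
gap = adm["admit"] - adm.groupby("patient")["discharge"].shift()
adm["pd_cdi"] = adm["new_onset_cdi"] & (gap <= pd.Timedelta(weeks=12))
print(adm[["patient", "admit", "pd_cdi"]])   # patient 1's readmission flags True
```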
Results. From 2000 to 2007, countywide hospital-onset CDI (HO-CDI) incidence increased from 15 per 10,000 to 22 per 10,000 admissions. When PD-CDI events were included, HA-CDI incidence doubled (29 per 10,000 in 2000 and 52 per 10,000 in 2007). Overall, including PD-CDI events resulted in significantly higher hospital-specific CDI incidence, although hospitals had disproportionate amounts of HA-CDI occurring postdischarge. This resulted in substantial shifts in some hospitals' rankings by CDI incidence. In multivariate models, both HO-CDI and PD-CDI were associated with increasing age, longer length of stay, and select comorbidities. Race and Hispanic ethnicity were predictive of PD-CDI but not HO-CDI.
Conclusions. PD-CDI events associated with rehospitalization are increasingly common. The majority of HA-CDI cases may occur postdischarge, raising important questions about both accurate reporting and effective prevention strategies. Some risk factors for PD-CDI may differ from those for HO-CDI, allowing additional identification of high-risk groups before discharge.
Infect Control Hosp Epidemiol 2012;33(1):20-28