Original Articles
Lack of Patient Understanding of Hospital-Acquired Infection Data Published on the Centers for Medicare and Medicaid Services Hospital Compare Website
- Max Masnick, Daniel J. Morgan, John D. Sorkin, Elizabeth Kim, Jessica P. Brown, Penny Rheingans, Anthony D. Harris
- Published online by Cambridge University Press: 23 November 2015, pp. 182-187
BACKGROUND
Public reporting of hospital quality data is a key element of US healthcare reform. Data for hospital-acquired infections (HAIs) are especially complex.
OBJECTIVE
To assess interpretability of HAI data as presented on the Centers for Medicare and Medicaid Services Hospital Compare website among patients who might benefit from access to these data.
METHODS
We randomly selected inpatients at a large tertiary referral hospital from June to September 2014. Participants performed 4 distinct tasks comparing hypothetical HAI data for 2 hospitals, and the accuracy of their comparisons was assessed. Data were presented using the same tabular formats used by Centers for Medicare and Medicaid Services. Demographic characteristics and healthcare experience data were also collected.
RESULTS
Participants (N=110) correctly identified the better of 2 hospitals when given written descriptions of the HAI measure in 72% of the responses (95% CI, 66%–79%). Adding the underlying numerical data (number of infections, patient-time, and standardized infection ratio) to the written descriptions reduced correct responses to 60% (55%–66%). When the written HAI measure description was not informative (identical for both hospitals), 50% answered correctly (42%–58%). When no written HAI measure description was provided and hospitals differed by denominator for infection rate, 38% answered correctly (31%–45%).
CONCLUSIONS
Current public HAI data presentation methods may be inadequate. When presented with numeric HAI data, study participants incorrectly compared hospitals on the basis of HAI data in more than 40% of the responses. Research is needed to identify better ways to convey these data to the public.
Infect. Control Hosp. Epidemiol. 2016;37(2):182–187
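The standardized infection ratio (SIR) in the numerical data above is, under the NHSN/CMS convention, the number of observed infections divided by the number predicted from national baseline data. A minimal illustrative sketch; the hospital figures below are hypothetical, not taken from the study:

```python
def standardized_infection_ratio(observed: int, predicted: float) -> float:
    """SIR = observed infections / predicted infections.

    SIR < 1 means fewer infections than the national baseline predicts;
    SIR > 1 means more. Comparing two hospitals on this quantity is the
    kind of task study participants were asked to perform.
    """
    if predicted <= 0:
        raise ValueError("predicted infections must be positive")
    return observed / predicted

# Hypothetical hospitals: B has the lower SIR and is the better performer.
hospital_a = standardized_infection_ratio(observed=12, predicted=10.0)  # 1.2
hospital_b = standardized_infection_ratio(observed=6, predicted=10.0)   # 0.6
```

The study's finding is that many patients misread exactly this kind of ratio when it is shown alongside raw counts and patient-time denominators.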
Central Line–Associated Bloodstream Infection Reduction and Bundle Compliance in Intensive Care Units: A National Study
- E. Yoko Furuya, Andrew W. Dick, Carolyn T. A. Herzig, Monika Pogorzelska-Maziarz, Elaine L. Larson, Patricia W. Stone
- Published online by Cambridge University Press: 07 April 2016, pp. 805-810
OBJECTIVES
To describe compliance with the central line (CL) insertion bundle overall and with individual bundle elements in US adult intensive care units (ICUs) and to determine the relationship between bundle compliance and central line–associated bloodstream infection (CLABSI) rates.
DESIGN
Cross-sectional study.
PARTICIPANTS
National sample of adult ICUs participating in National Healthcare Safety Network (NHSN) surveillance.
METHODS
Hospitals were surveyed to determine compliance with CL insertion bundle elements in ICUs. Corresponding NHSN ICU CLABSI rates were obtained. Multivariate Poisson regression models were used to assess associations between CL bundle compliance and CLABSI rates, controlling for hospital and ICU characteristics.
RESULTS
A total of 984 adult ICUs in 632 hospitals were included. Most ICUs had CL bundle policies, but only 69% reported excellent compliance (≥95%) with at least 1 element. Lower CLABSI rates were associated with compliance with just 1 element (incidence rate ratio [IRR], 0.77; 95% confidence interval [CI], 0.64–0.92); however, ≥95% compliance with all 5 elements was associated with the greatest reduction (IRR, 0.67; 95% CI, 0.59–0.77). CLABSI rates were not associated with simply having a written CL bundle policy or with bundle compliance <75%. Additionally, better-resourced infection prevention departments were associated with lower CLABSI rates.
CONCLUSIONS
Our findings demonstrate the impact of transferring infection prevention interventions to the real-world setting. Compliance with the entire bundle was most effective, although excellent compliance with even 1 bundle element was associated with lower CLABSI rates. The variability in compliance across ICUs suggests that, at the national level, there is still room for improvement in CLABSI reduction.
Infect Control Hosp Epidemiol 2016;37:805–810
Mortality and Costs in Clostridium difficile Infection Among the Elderly in the United States
- Andrew F. Shorr, Marya D. Zilberberg, Li Wang, Onur Baser, Holly Yu
- Published online by Cambridge University Press: 30 August 2016, pp. 1331-1336
OBJECTIVE
To examine attributable mortality and costs of Clostridium difficile infection (CDI) in the Medicare population.
DESIGN
A population-based cohort study among US adults aged at least 65 years in the 2008–2010 Medicare 5% sample, with follow-up of 12 months.
PATIENTS
An incident CDI episode was defined by International Classification of Diseases, Ninth Revision, Clinical Modification code 008.45 and no other occurrence within the preceding 12 months. To quantify adjusted mortality and costs, we developed a 1:1 propensity-matched sample of CDI and non-CDI patients.
RESULTS
Among 1,165,165 patients included, 6,838 (0.6%) had a CDI episode in 2009 (82.5% healthcare-associated). Patients with CDI were older (mean [SD] age, 81.0±8.0 vs 77.0±7.7 years, P<.001), were more likely to come from the Northeast (27.4% vs 18.6%, P<.001), and had a higher comorbidity burden (Charlson score, 4.6±3.3 vs 1.7±2.1, P<.001). Hospitalizations (63.2% vs 6.0%, P<.001) and antibiotics (33.9% vs 12.5%, P<.001) within the prior 90 days were more common in the group with CDI. In the propensity-adjusted analysis, CDI was associated with a near doubling of both mortality (42.6% vs 23.4%, P<.001) and total healthcare costs ($64,807±$66,480 vs $38,128±$46,485, P<.001).
CONCLUSIONS
Among elderly patients, CDI is associated with increased adjusted mortality and healthcare costs following a CDI episode. Extrapolated nationwide, this equals 240,000 patients with CDI annually, 46,000 potential deaths, and more than $6 billion in costs.
Infect Control Hosp Epidemiol 2016;1–6
Risk Factors for Central Venous Catheter–Associated Bloodstream Infection in Pediatric Patients: A Cohort Study
- Jillian Hansen Carter, Joanne Marie Langley, Stefan Kuhle, Susan Kirkland
- Published online by Cambridge University Press: 03 May 2016, pp. 939-945
OBJECTIVE
To examine the incidence of central-line–associated bloodstream infection (CLABSI) over time and to determine risk factors for CLABSI in hospitalized children.
DESIGN
Prospective cohort study.
SETTING
Pediatric tertiary care referral center in Halifax, Nova Scotia, serving a population of 2.3 million.
PARTICIPANTS
Patients ages 0–18 years with central venous catheters (CVCs) inserted at this facility between 1995 and 2013.
METHODS
Participants were followed from CVC insertion to CLABSI event or until CVC removal. Data were prospectively collected by clinicians, infection prevention and control staff, and nursing staff for the purposes of patient care, surveillance, and quality improvement. Cox proportional hazards regression was used to identify risk factors for CLABSI.
RESULTS
Among 5,648 patients, 385 developed CLABSI (0.74 CLABSI per 1,000 line days; or 3.87 per 1,000 in-hospital line days). Most infections occurred within 60 days of insertion. CLABSI rates decreased from 4.87 per 1,000 in-hospital line days in 1995 to 0.78 per 1,000 in-hospital line days in 2013, corresponding to an 84% reduction. A temporal association of CLABSI reduction with a hand hygiene promotion campaign was identified. CVC type, number of lumens, dressing type, insertion vein, and being in the critical care unit were statistically significantly associated with CLABSI.
CONCLUSIONS
Hospital-wide surveillance over an 18-year period identified children at highest risk for CLABSI and decreasing risk over time; this decrease was temporally associated with a hand hygiene campaign.
Infect Control Hosp Epidemiol 2016;37:939–945
“Bundle” Practices and Ventilator-Associated Events: Not Enough
- John C. O’Horo, Haitao Lan, Charat Thongprayoon, Louis Schenck, Adil Ahmed, Mikhail Dziadzko, Ognjen Gajic, Priya Sampathkumar
- Published online by Cambridge University Press: 19 September 2016, pp. 1453-1457
OBJECTIVE
Ventilator-associated events (VAEs) are nosocomial events correlated with length of stay, costs, and mortality. Current ventilator bundle practices target the older definition of ventilator-associated pneumonia and have not been systematically evaluated for their impact on VAEs.
DESIGN
Retrospective cohort study.
SETTING
Tertiary medical center between January 2012 and August 2014.
PARTICIPANTS
All adult patients ventilated for at least 24 hours at our institution.
INTERVENTIONS
We conducted univariate analyses for compliance with each element; we focused on VAEs occurring within a 2-day window of failure to meet any ventilator bundle element. We used Cox proportional hazard models to assess the effect of stress ulcer prophylaxis, deep vein thrombosis (DVT) prophylaxis, oral care, and sedation breaks on VAEs. We adjusted models for gender, age, and Acute Physiology and Chronic Health Evaluation (APACHE) III scores.
RESULTS
Our cohort comprised 2,660 patients with 16,858 ventilator days and 77 VAEs. Adjusting for APACHE score and gender, only oral care was associated with a reduction in the risk of VAE (hazard ratio [HR], 0.44; 95% confidence interval [CI], 0.26–0.77). DVT prophylaxis and sedation breaks did not show any significant impact on VAEs. Stress ulcer prophylaxis trended toward an increased risk of VAE (HR, 1.59; 95% CI, 1.00–2.56).
CONCLUSION
Although limited by a low baseline rate of VAEs, existing ventilator bundle practices do not appear to target VAEs well. Oral care is clearly important, but the impact of DVT prophylaxis, sedation breaks, and especially stress ulcer prophylaxis is questionable at best.
Infect Control Hosp Epidemiol 2016;1453–1457
Increasing Incidence of Extended-Spectrum β-Lactamase-Producing Escherichia coli in Community Hospitals throughout the Southeastern United States
- Joshua T. Thaden, Vance G. Fowler, Jr, Daniel J. Sexton, Deverick J. Anderson
- Published online by Cambridge University Press: 13 October 2015, pp. 49-54
OBJECTIVE
To describe the epidemiology of extended-spectrum β-lactamase (ESBL)-producing Escherichia coli (ESBL-EC) and Klebsiella pneumoniae (ESBL-KP) infections.
DESIGN
Retrospective cohort study.
SETTING
Inpatient care at community hospitals.
PATIENTS
All patients with ESBL-EC or ESBL-KP infections.
METHODS
ESBL-EC and ESBL-KP infections from 26 community hospitals were prospectively entered into a centralized database from January 2009 to December 2014.
RESULTS
A total of 925 infections caused by ESBL-EC (10.5 infections per 100,000 patient days) and 463 infections caused by ESBL-KP (5.3 infections per 100,000 patient days) were identified during 8,791,243 patient days of surveillance. The incidence of ESBL-EC infections increased from 5.28 to 10.5 infections per 100,000 patient days during the study period (P=.006). The number of community hospitals with ESBL-EC infections increased from 17 (65%) in 2009 to 20 (77%) in 2014. The median ESBL-EC infection rate among individual hospitals with ≥1 ESBL-EC infection increased from 11.1 infections per 100,000 patient days (range, 2.2–33.9) in 2009 to 22.1 infections per 100,000 patient days (range, 0.66–134) in 2014 (P=.05). The incidence of ESBL-KP infections remained constant over the study period (P=.14). Community-associated and healthcare-associated ESBL-EC infections trended upward (P=.006 and P=.02, respectively), while hospital-onset infections remained stable (P=.07). ESBL-EC infections were more common in females (54% vs 44%, P<.001) and Caucasians (50% vs 40%, P<.0001), and were more likely to be isolated from the urinary tract (61% vs 52%, P<.0001) than ESBL-KP infections.
CONCLUSIONS
The incidence of ESBL-EC infection has increased in community hospitals throughout the southeastern United States, while the incidence of ESBL-KP infection has remained stable. Community- and healthcare-associated ESBL-EC infections are driving the upward trend.
Infect. Control Hosp. Epidemiol. 2015;37(1):49–54
Central-Line–Associated Bloodstream Infections in Québec Intensive Care Units: Results from the Provincial Healthcare-Associated Infections Surveillance Program (SPIN)
- Lynne Li, Elise Fortin, Claude Tremblay, Muleka Ngenda-Muadi, Caroline Quach, for SPIN-BACC
- Published online by Cambridge University Press: 19 July 2016, pp. 1186-1194
BACKGROUND
Following the implementation of bundled practices in Québec and Canadian intensive care units (ICUs) in 2009, we describe CLABSI epidemiology over the last 8 years in the province of Québec, Canada, and compare rates with Canadian and American benchmarks.
METHODS
CLABSI incidence rates (IRs) and central venous catheter utilization ratios (CVCURs) by year and ICU type were calculated using 2007–2014 data from the Surveillance Provinciale des Infections Nosocomiales (SPIN) program. Using American and Canadian surveillance data, we compared SPIN IRs to rates in other jurisdictions using standardized incidence ratios (SIRs).
RESULTS
In total, 1,355 lab-confirmed CLABSIs over 911,205 central venous catheter days (CVC days) were recorded. The overall pooled incidence rate (IR) was 1.49 cases per 1,000 CVC days. IRs for adult teaching ICUs, nonteaching ICUs, neonatal ICUs (NICUs), and pediatric ICUs (PICUs) were 1.04, 0.91, 4.20, and 2.15 cases per 1,000 CVC days, respectively. Using fixed SPIN 2007–2009 benchmarks, CLABSI rates had decreased significantly in all ICUs except for PICUs by 2014. Rates declined by 55% in adult teaching ICUs, 52% in adult nonteaching ICUs, and 38% in NICUs. Using dynamic American and Canadian CLABSI rates as benchmarks, SPIN adult teaching ICU rates were significantly lower and adult nonteaching ICUs had lower or comparable rates, whereas NICU and PICU rates were higher.
CONCLUSION
Québec ICU CLABSI surveillance shows declining CLABSI rates in adult ICUs. The absence of a decrease in CLABSI rate in NICUs and PICUs highlights the need for continued surveillance and analysis of factors contributing to higher rates in these populations.
Infect Control Hosp Epidemiol 2016;1–9
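The pooled incidence rate reported above follows the standard device-day formula: cases per 1,000 central-venous-catheter days. A one-line sketch using the abstract's own figures:

```python
def incidence_rate_per_1000(cases: int, device_days: int) -> float:
    """Incidence rate expressed as cases per 1,000 device (CVC) days."""
    return 1000.0 * cases / device_days

# Figures reported in the SPIN abstract: 1,355 CLABSIs over 911,205 CVC days.
pooled_rate = incidence_rate_per_1000(1355, 911_205)
print(round(pooled_rate, 2))  # 1.49, matching the reported pooled IR
```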
The Economic Burden of Hospital-Acquired Clostridium difficile Infection: A Population-Based Matched Cohort Study
- Natasha Nanwa, Jeffrey C. Kwong, Murray Krahn, Nick Daneman, Hong Lu, Peter C. Austin, Anand Govindarajan, Laura C. Rosella, Suzanne M. Cadarette, Beate Sander
- Published online by Cambridge University Press: 20 June 2016, pp. 1068-1078
BACKGROUND
High-quality cost estimates for hospital-acquired Clostridium difficile infection (CDI) are vital evidence for healthcare policy and decision-making.
OBJECTIVE
To evaluate the costs attributable to hospital-acquired CDI from the healthcare payer perspective.
METHODS
We conducted a population-based propensity-score matched cohort study of incident hospitalized subjects diagnosed with CDI (those with the International Statistical Classification of Diseases and Related Health Problems, 10th Revision, Canada code A04.7) from January 1, 2003, through December 31, 2010, in Ontario, Canada. Infected subjects were matched to uninfected subjects (those without the code A04.7) on age, sex, comorbidities, geography, and other variables, and followed up through December 31, 2011. We stratified results by elective and nonelective admissions. The main study outcomes were up-to-3-year costs, which were evaluated in 2014 Canadian dollars.
RESULTS
We identified 28,308 infected subjects (mean annual incidence, 27.9 per 100,000 population, 3.3 per 1,000 admissions), with a mean age of 71.5 years (range, 0–107 years), 54.0% female, and 8.0% elective admissions. For elective admission subjects, cumulative mean attributable 1-, 2-, and 3-year costs adjusted for survival (undiscounted) were $32,151 (95% CI, $28,192–$36,005), $34,843 ($29,298–$40,027), and $37,171 ($30,364–$43,415), respectively. For nonelective admission subjects, the corresponding costs were $21,909 ($21,221–$22,609), $26,074 ($25,180–$27,014), and $29,944 ($28,873–$31,086), respectively.
CONCLUSIONS
Hospital-acquired CDI is associated with substantial healthcare costs. To the best of our knowledge, this study is the first CDI costing study to present longitudinal costs. New strategies may be warranted to mitigate this costly infectious disease.
Infect Control Hosp Epidemiol 2016;37:1068–1078
Limiting the Number of Lumens in Peripherally Inserted Central Catheters to Improve Outcomes and Reduce Cost: A Simulation Study
- David Ratz, Timothy Hofer, Scott A. Flanders, Sanjay Saint, Vineet Chopra
- Published online by Cambridge University Press: 01 April 2016, pp. 811-817
BACKGROUND
The number of peripherally inserted central catheter (PICC) lumens is associated with thrombotic and infectious complications. Because multilumen PICCs are not necessary in all patients, policies that limit their use may improve safety and cost.
OBJECTIVE
To design a simulation-based analysis to estimate outcomes and cost associated with a policy that encourages single-lumen PICC use.
METHODS
Model inputs, including risk of complications and costs associated with single- and multilumen PICCs, were obtained from available literature and a multihospital collaborative quality improvement project. Cost savings and reduction in central line–associated bloodstream infection and deep vein thrombosis events from institution of a single-lumen PICC default policy were reported.
RESULTS
According to our model, a hospital that places 1,000 PICCs per year (25% of which are single-lumen and 75% multilumen) experiences annual PICC-related maintenance and complication costs of $1,228,598 (95% CI, $1,053,175–$1,430,958). In such facilities, every 5% increase in single-lumen PICC use would prevent 0.5 PICC-related central line–associated bloodstream infections and 0.5 PICC-related deep vein thrombosis events, while saving $23,500. Moving from 25% to 50% single-lumen PICC utilization would result in total savings of $119,283 (95% CI, $74,030–$184,170) per year. Regardless of baseline prevalence, a single-lumen default PICC policy would be associated with approximately 10% cost savings. Findings remained robust in multiway sensitivity analyses.
CONCLUSION
Hospital policies that limit the number of PICC lumens may enhance patient safety and reduce healthcare costs. Studies measuring intended and unintended consequences of this approach, followed by rapid adoption, appear necessary.
Infect Control Hosp Epidemiol 2016;37:811–817
Transmission Clusters of Methicillin-Resistant Staphylococcus aureus in Long-Term Care Facilities Based on Whole-Genome Sequencing
- O. Colin Stine, Shana Burrowes, Sophia David, J. Kristie Johnson, Mary-Claire Roghmann
- Published online by Cambridge University Press: 04 March 2016, pp. 685-691
OBJECTIVE
To define how often methicillin-resistant Staphylococcus aureus (MRSA) is spread from resident to resident in long-term care facilities using whole-genome sequencing.
DESIGN
Prospective cohort study.
SETTING
A long-term care facility.
PARTICIPANTS
Elderly residents in a long-term care facility.
METHODS
Cultures for MRSA were obtained weekly from multiple body sites from residents with known MRSA colonization over 12-week study periods. Simultaneously, cultures to detect MRSA acquisition were obtained weekly from 2 body sites in residents without known MRSA colonization. During the first 12-week cycle on a single unit, we sequenced 8 MRSA isolates per swab for 2 body sites from each of 6 residents. During the second 12-week cycle, we sequenced 30 MRSA isolates from 13 residents with known MRSA colonization and 3 residents who had acquired MRSA colonization.
RESULTS
MRSA isolates from the same swab showed little genetic variation between isolates, with the exception of isolates from wounds. The genetic variation of isolates between body sites on an individual was greater than that within a single body site, with the exception of 1 sample, which had 2 unrelated strains among the 8 isolates. In the second cycle, 10 of 16 residents colonized with MRSA (63%) shared 1 of 3 closely related strains. Of the 3 residents with newly acquired MRSA, 2 residents harbored isolates that were members of these clusters.
CONCLUSIONS
Point prevalence surveys with whole-genome sequencing of MRSA isolates may detect resident-to-resident transmission more accurately than routine surveillance cultures for MRSA in long-term care facilities.
Infect Control Hosp Epidemiol 2016;37:685–691
Hydrogen Peroxide Vapor Decontamination in a Patient Room Using Feline Calicivirus and Murine Norovirus as Surrogate Markers for Human Norovirus
- Torsten Holmdahl, Mats Walder, Nathalie Uzcátegui, Inga Odenholt, Peter Lanbeck, Patrik Medstrand, Anders Widell
- Published online by Cambridge University Press: 10 February 2016, pp. 561-566
OBJECTIVE
To determine whether hydrogen peroxide vapor (HPV) could be used to decontaminate caliciviruses from surfaces in a patient room.
DESIGN
Feline calicivirus (FCV) and murine norovirus (MNV) were used as surrogate viability markers to mimic the noncultivable human norovirus. Cell culture supernatants of FCV and MNV were dried in triplicate 35-mm wells of 6-well plastic plates. These plates were placed in various positions in a nonoccupied patient room that was subsequently exposed to HPV. Control plates were positioned in a similar room but were never exposed to HPV.
METHODS
Virucidal activity was measured in cell culture by reduction in 50% tissue culture infective dose titer for FCV and by both 50% tissue culture infective dose titer and plaque reduction for MNV.
RESULTS
Neither viable FCV nor viable MNV could be detected in the test room after HPV treatment. At least 3.65 log reduction for FCV and at least 3.67 log reduction for MNV were found by 50% tissue culture infective dose. With plaque assay, measurable reduction for MNV was at least 2.85 log units.
CONCLUSIONS
The successful inactivation of both surrogate viruses indicates that HPV could be a useful tool for surface decontamination of a patient room contaminated by norovirus. Hence, nosocomial spread to subsequent patients can be avoided.
Infect Control Hosp Epidemiol 2016;37:561–566
Seasonal Variation of Escherichia coli, Staphylococcus aureus, and Streptococcus pneumoniae Bacteremia According to Acquisition and Patient Characteristics: A Population-Based Study
- Kim Oren Gradel, Stig Lønberg Nielsen, Court Pedersen, Jenny Dahl Knudsen, Christian Østergaard, Magnus Arpi, Thøger Gorm Jensen, Hans Jørn Kolmos, Mette Søgaard, Annmarie Touborg Lassen, Henrik Carl Schønheyder, for the Danish Collaborative Bacteraemia Network (DACOBAN) and the Danish Observational Registry of Infectious Syndromes (DORIS)
- Published online by Cambridge University Press: 04 May 2016, pp. 946-953
OBJECTIVE
Seasonal variation is a characteristic of many infectious diseases, but relatively little is known about determinants thereof. We studied the impact of place of acquisition and patient characteristics on seasonal variation of bacteremia caused by the 3 most common pathogens.
DESIGN
Seasonal variation analysis.
METHODS
In 3 Danish health regions (2.3 million total inhabitants), patients with bacteremia were identified from 2000 through 2011 using information from laboratory information systems. Analyses were confined to Escherichia coli, Staphylococcus aureus, and Streptococcus pneumoniae. Additional data were obtained from the Danish National Hospital Registry for the construction of admission histories and calculation of the Charlson comorbidity index (CCI). Bacteremias were categorized as community acquired, healthcare associated (HCA), and hospital acquired. We defined multiple subgroups by combining the following characteristics: species, acquisition, age group, gender, CCI level, and location of infection. Assuming a sinusoidal model, seasonal variation was assessed by the peak-to-trough (PTT) ratio with a 95% confidence interval (CI).
RESULTS
In total, we included 16,006 E. coli, 6,924 S. aureus, and 4,884 S. pneumoniae bacteremia cases. For E. coli, the seasonal variation was highest for community-acquired cases (PTT ratio, 1.24; 95% CI, 1.17–1.32), was diminished for HCA (PTT ratio, 1.14; 95% CI, 1.04–1.25), and was missing for hospital-acquired cases. No seasonal variation was observed for S. aureus. S. pneumoniae showed high seasonal variation, which did not differ according to acquisition (overall PTT ratio, 3.42; 95% CI, 3.10–3.83).
CONCLUSIONS
Seasonal variation was mainly related to the species, although the place of acquisition was important for E. coli.
Infect Control Hosp Epidemiol 2016;37:946–953
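The peak-to-trough (PTT) ratio in the abstract above comes from an assumed sinusoidal (cosinor) model of monthly incidence. The sketch below shows one way such a ratio can be computed; the study's actual fitting procedure and its confidence intervals may differ, and the monthly counts here are synthetic, not from the Danish data.

```python
import math

import numpy as np


def peak_to_trough(monthly_counts):
    """Fit mu(t) = mesor + a*cos(2*pi*t/12) + b*sin(2*pi*t/12) by least
    squares and return the peak-to-trough ratio
    (mesor + amplitude) / (mesor - amplitude).

    Assumes a 12-month period and amplitude < mesor (otherwise the
    trough would be nonpositive and the ratio undefined)."""
    t = np.arange(len(monthly_counts), dtype=float)
    design = np.column_stack([
        np.ones_like(t),                 # mesor (baseline level)
        np.cos(2 * math.pi * t / 12),    # cosine component
        np.sin(2 * math.pi * t / 12),    # sine component
    ])
    y = np.asarray(monthly_counts, dtype=float)
    mesor, a, b = np.linalg.lstsq(design, y, rcond=None)[0]
    amplitude = math.hypot(a, b)
    return (mesor + amplitude) / (mesor - amplitude)


# Synthetic 2-year series oscillating between ~80 and ~120 cases per month:
counts = [100 + 20 * math.cos(2 * math.pi * m / 12) for m in range(24)]
ptt = peak_to_trough(counts)  # 120/80 = 1.5
```

A PTT ratio of 1.0 would mean no seasonal variation; the S. pneumoniae estimate of 3.42 above means peak-month incidence was more than triple the trough.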
Current Capabilities and Capacity of Ebola Treatment Centers in the United States
- Jocelyn J. Herstein, Paul D. Biddinger, Colleen S. Kraft, Lisa Saiman, Shawn G. Gibbs, Aurora B. Le, Philip W. Smith, Angela L. Hewlett, John J. Lowe
- Published online by Cambridge University Press: 08 December 2015, pp. 313-318
OBJECTIVE
To describe current Ebola treatment center (ETC) locations, their capacity to care for Ebola virus disease patients, and infection control infrastructure features.
DESIGN
A 19-question survey was distributed electronically in April 2015. Responses were collected via email by June 2015 and analyzed in an electronic spreadsheet.
SETTING
The survey was sent to and completed by site representatives of each ETC.
PARTICIPANTS
The survey was sent to all 55 ETCs; 47 (85%) responded.
RESULTS
Of the 47 responding ETCs, there are 84 isolation beds available for adults and 91 for children; of these pediatric beds, 35 (38%) are in children’s hospitals. In total, the simultaneous capacity of the 47 reporting ETCs is 121 beds. On the basis of the current US census, there are 0.38 beds per million population. Most ETCs have negative pressure isolation rooms, anterooms, and a process for category A waste sterilization, although only 11 facilities (23%) have the capability to sterilize infectious waste on site.
CONCLUSIONS
Facilities developed ETCs on the basis of Centers for Disease Control and Prevention guidance, but specific capabilities are not mandated at present. Owing to the complex and costly nature of Ebola virus disease treatment, the variability in capabilities from facility to facility, and the lack of regulations, nationwide capacity in specialized facilities is limited. Further assessments should determine whether ETCs can adapt to safely manage other highly infectious disease threats.
Infect. Control Hosp. Epidemiol. 2016;37(3):313–318
Predictors of Persistent Carbapenem-Resistant Enterobacteriaceae Carriage upon Readmission and Score Development
- Pnina Ciobotaro, Natalie Flaks-Manov, Maly Oved, Ami Schattner, Moshe Hoshen, Eli Ben-Yosef, Ran D. Balicer, Oren Zimhony
- Published online by Cambridge University Press: 11 January 2016, pp. 188-196
BACKGROUND
Carriers of carbapenem-resistant Enterobacteriaceae (CRE) are often readmitted, exposing patients to CRE cross-transmission.
OBJECTIVE
To identify predictors of persistent CRE carriage upon readmission, directing a risk prediction score.
DESIGN
Retrospective cohort study.
SETTING
University-affiliated general hospital.
PATIENTS
A cohort of 168 CRE carriers with 474 readmissions.
METHODS
The primary and secondary outcomes were CRE carriage status at readmission and length of CRE carriage. Predictors of persistent CRE carriage upon readmission were analyzed using a generalized estimating equations (GEE) multivariable model. Readmissions were randomly divided into derivation and validation sets. A CRE readmission score was derived to predict persistent CRE carriage in 3 risk groups: high, intermediate, and low. The discriminatory ability of the model and the score were expressed as C statistics.
RESULTS
CRE carrier status persisted for 1 year in 33% of CRE carriers. Positive CRE status was detected in 202 of 474 readmissions (42.6%). The following 4 variables were associated with persistent CRE carriage at readmission: readmission within 1 month (odds ratio [OR], 6.95; 95% confidence interval [CI], 2.79–17.30), positive CRE status on preceding admission (OR, 5.46; 95% CI, 3.06–9.75), low Norton score (OR, 3.07; 95% CI, 1.26–7.47), and diabetes mellitus (OR, 1.84; 95% CI, 0.98–3.44). The C statistics were 0.791 and 0.789 for the derivation set (n=322) model and score, respectively, and the C statistic was 0.861 for the validation set of the score (n=152). The rates of CRE carriage at readmissions (validation set) for the groups with low, intermediate, and high scores were 8.6%, 38.9%, and 77.6%, respectively.
CONCLUSIONS
CRE carrier state commonly persists upon readmission, and this risk can be estimated to guide screening policy and infection control measures.
Infect. Control Hosp. Epidemiol. 2016;37(2):188–196
Cost-Effectiveness Analysis of the Use of Probiotics for the Prevention of Clostridium difficile–Associated Diarrhea in a Provincial Healthcare System
- Jenine R. Leal, Steven J. Heitman, John M. Conly, Elizabeth A. Henderson, Braden J. Manns
- Published online by Cambridge University Press: 05 July 2016, pp. 1079-1086
OBJECTIVE
To conduct a full economic evaluation assessing the costs and consequences related to probiotic use for the primary prevention of Clostridium difficile–associated diarrhea (CDAD).
DESIGN
Cost-effectiveness analysis using decision analytic modeling.
METHODS
A cost-effectiveness analysis was used to evaluate the risk of CDAD and the costs of receiving oral probiotics versus not receiving them over a time horizon of 30 days. The target population modeled was all adult inpatients receiving any therapeutic course of antibiotics, from a publicly funded healthcare system perspective. Effectiveness estimates were based on a recent systematic review of probiotics for the primary prevention of CDAD. Additional estimates came from local data and the literature. Sensitivity analyses were conducted to assess how plausible changes in variables impacted the results.
RESULTS
Treatment with oral probiotics led to direct costs of CDN $24 per course of treatment per patient. On average, patients treated with oral probiotics had a lower overall cost compared with usual care (CDN $327 vs $845). The risk of CDAD was reduced from 5.5% in those not receiving oral probiotics to 2% in those receiving oral probiotics. These results were robust to plausible variation in all estimates.
CONCLUSIONS
Oral probiotics as a preventive strategy for CDAD resulted in a lower risk of CDAD as well as cost-savings. The cost-savings may be greater in other healthcare systems that experience a higher incidence and cost associated with CDAD.
Infect Control Hosp Epidemiol 2016;37:1079–1086
A Regional Outbreak of Clostridium difficile PCR-Ribotype 027 Infections in Southeastern France from a Single Long-Term Care Facility
- Nadim Cassir, Jean-Christophe Delarozière, Gregory Dubourg, Marion Delord, Jean-Christophe Lagier, Phillipe Brouqui, Florence Fenollar, Didier Raoult, Pierre Edouard Fournier
- Published online by Cambridge University Press: 03 August 2016, pp. 1337-1341
OBJECTIVE
To describe and analyze a large outbreak of Clostridium difficile 027 (CD-027) infections.
METHODS
Confirmed CD-027 cases were defined as CD infection plus a positive real-time polymerase chain reaction (PCR) assay for CD-027. Clinical and microbiological data on patients with CD-027 infection were collected from January 2013 to December 2015 in the Provence-Alpes-Côte-d’Azur region (southeastern France).
RESULTS
In total, 19 healthcare facilities reported 144 CD-027 infections (112 confirmed and 32 probable) during a 22-month outbreak. Although the incidence rate per 10,000 bed days was lower in long-term care facilities (LTCFs) than in acute care facilities (0.05 vs 0.14; P<.001), cases occurred mainly in LTCFs, one of which was the probable source of this outbreak. After centralization of CD testing, the proportion of confirmed CD-027 cases from LTCFs or residential-care homes increased significantly (69% vs 92%; P<.001). Among confirmed CD-027 patients, the sex ratio was 0.53 and the median age was 84.2 years. The 30-day crude mortality rate was 31%. Most patients (96%) had received antibiotics within 3 months prior to the CD colitis diagnosis. During the study period, the proportion of patients with CD-027 (among all patients tested in the point-of-care laboratories) decreased significantly (P=.03).
CONCLUSIONS
A large CD-027 outbreak occurred in southeastern France as a consequence of an initial cluster of cases in a single LTCF. Successful interventions included rapid isolation and testing of residents with potentially infectious diarrhea and cohorting of case patients in a specialized infectious diseases ward to optimize management.
Infect Control Hosp Epidemiol 2016;1–5
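The rates quoted above (0.05 vs 0.14 per 10,000 bed days) follow the standard incidence-density formula: case count divided by bed days, scaled to a convenient denominator. A minimal sketch with hypothetical counts (the abstract does not report the raw numerators and denominators):

```python
def incidence_rate(cases: int, bed_days: int, per: int = 10_000) -> float:
    """Incidence density: cases per `per` bed days of exposure."""
    return cases / bed_days * per

# Hypothetical counts for illustration, not the study's actual denominators
print(round(incidence_rate(7, 500_000), 2))    # 0.14 per 10,000 bed days
print(round(incidence_rate(5, 1_000_000), 2))  # 0.05 per 10,000 bed days
```

Using bed days rather than admissions as the denominator is what makes LTCF and acute-care rates comparable despite very different lengths of stay.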
The Epidemiology of Carbapenem-Resistant Klebsiella pneumoniae Colonization and Infection among Long-Term Acute Care Hospital Residents
- John P. Mills, Naasha J. Talati, Kevin Alby, Jennifer H. Han
- Published online by Cambridge University Press: 12 October 2015, pp. 55–60
OBJECTIVE
An improved understanding of carbapenem-resistant Klebsiella pneumoniae (CRKP) in long-term acute care hospitals (LTACHs) is needed. The objective of this study was to assess risk factors for colonization or infection with CRKP in LTACH residents.
METHODS
A case-control study was performed at a university-affiliated LTACH from 2008 to 2013. Cases were defined as all patients with clinical cultures positive for CRKP, and controls were those with clinical cultures positive for carbapenem-susceptible K. pneumoniae (CSKP). A multivariate model was developed to identify risk factors for CRKP infection or colonization.
RESULTS
A total of 222 patients were identified with K. pneumoniae clinical cultures during the study period; 99 (45%) were case patients and 123 (55%) were control patients. Our multivariate analysis identified factors associated with a significant risk for CRKP colonization or infection: solid organ or stem cell transplantation (OR, 5.05; 95% CI, 1.23–20.8; P=.03), mechanical ventilation (OR, 2.56; 95% CI, 1.24–5.28; P=.01), fecal incontinence (OR, 5.78; 95% CI, 1.52–22.0; P=.01), and exposure in the prior 30 days to meropenem (OR, 3.55; 95% CI, 1.04–12.1; P=.04), vancomycin (OR, 2.94; 95% CI, 1.18–7.32; P=.02), and metronidazole (OR, 4.22; 95% CI, 1.28–14.0; P=.02).
CONCLUSIONS
Rates of colonization and infection with CRKP were high in the LTACH setting, with nearly half of K. pneumoniae cultures demonstrating carbapenem resistance. Further studies are needed on interventions to limit the emergence of CRKP in LTACHs, including targeted surveillance screening of high-risk patients and effective antibiotic stewardship measures.
Infect Control Hosp Epidemiol 2015;37(1):55–60
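The odds ratios above come from a multivariate model, but for a single exposure the unadjusted version reduces to the cross-product of a 2×2 case-control table, with a confidence interval from the normal approximation on the log scale. A sketch with hypothetical counts (the abstract reports only the adjusted estimates, so these numbers are invented):

```python
import math

def odds_ratio(exposed_cases: int, unexposed_cases: int,
               exposed_controls: int, unexposed_controls: int):
    """Unadjusted OR with a 95% CI via the log-OR normal approximation (Woolf)."""
    or_ = (exposed_cases * unexposed_controls) / (unexposed_cases * exposed_controls)
    se = math.sqrt(1/exposed_cases + 1/unexposed_cases
                   + 1/exposed_controls + 1/unexposed_controls)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, (lo, hi)

# Hypothetical 2x2 table: exposure among 99 CRKP cases vs 123 CSKP controls
or_, ci = odds_ratio(30, 69, 15, 108)
print(f"OR {or_:.2f}, 95% CI {ci[0]:.2f}-{ci[1]:.2f}")
```

The multivariate model in the study adjusts each such estimate for the other covariates, which is why the published ORs cannot be reproduced from any single 2×2 table.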
Current State of Antimicrobial Stewardship at Solid Organ and Hematopoietic Cell Transplant Centers in the United States
- Susan K. Seo, Kaming Lo, Lilian M. Abbo
- Published online by Cambridge University Press: 26 July 2016, pp. 1195–1200
OBJECTIVE
To assess the extent of antimicrobial stewardship programs (ASPs) at solid organ transplant (SOT) and hematopoietic cell transplant (HCT) centers in the United States.
DESIGN
An 18-item voluntary survey was developed to gauge current antimicrobial stewardship practices in transplant patients, examine the availability and perceived usefulness of novel diagnostics and azole levels to guide therapy, and identify challenges for implementation of ASPs at these centers.
PARTICIPANTS
The survey was distributed electronically to infectious disease physicians and pharmacists at adult and pediatric SOT and HCT centers during May 1–22, 2015. Facilities were deidentified.
RESULTS
After duplicate removal, responses from 71 (56%) of 127 unique transplant centers in 32 states were analyzed. Forty-four sites (62%) performed at least 100 SOTs annually, and 40 (56%) performed at least 100 HCTs annually. The top 5 stewardship activities encompassing transplant patients were formulary restriction, guideline development, prospective audit and feedback, education, and dose optimization. Respiratory viral panels (66/66 [100%]), azole levels (64/66 [97%]), and serum/bronchoalveolar lavage galactomannan (58/66 [88%]) were perceived as most useful to guide therapy. Reported challenges to antimicrobial stewardship included undefined treatment durations for certain infections (53/59 [90%]), diagnostic uncertainty (47/59 [80%]), the perception that antibiotic-resistant infections required escalation (42/59 [71%]), prescriber opposition (41/59 [69%]), and costly drugs (37/59 [63%]).
CONCLUSIONS
ASP activities were performed at many adult and pediatric SOT and HCT centers in the United States. Diagnostic and therapeutic uncertainty in transplant patients is challenging for ASPs. Collaborative research should examine the impact of antimicrobial stewardship practices in SOT and HCT.
Infect Control Hosp Epidemiol 2016;1–6
Risk Factors for Surgical Site Infections Following Adult Spine Operations
- Ambar Haleem, Hsiu-Yin Chiang, Ravindhar Vodela, Andrew Behan, Jean M. Pottinger, Joseph Smucker, Jeremy D. Greenlee, Charles Clark, Loreen A. Herwaldt
- Published online by Cambridge University Press: 30 August 2016, pp. 1458–1467
OBJECTIVE
To identify risk factors for surgical site infections (SSIs) after spine operations.
DESIGN
Case-control study of SSIs among patients undergoing spine operations.
SETTING
An academic health center.
PATIENTS
We studied patients undergoing spinal fusions or laminectomies at the University of Iowa Hospitals and Clinics from January 1, 2007, through June 30, 2009. We included patients who acquired SSIs meeting the National Healthcare Safety Network definition. We randomly selected controls among patients who had spine operations during the study period and did not meet the SSI definition.
RESULTS
In total, 54 patients acquired SSIs after 2,309 spine operations (2.3 per 100 procedures). SSIs were identified a median of 20 days after spinal fusions and 17 days after laminectomies; 90.7% were identified after discharge, and 72.2% were deep incisional or organ-space infections. Staphylococcus aureus caused 53.7% of SSIs. Of patients with SSIs, 64.9% (fusion) and 70.6% (laminectomy) were readmitted, and 59.5% (fusion) and 64.7% (laminectomy) underwent reoperation. By multivariable analysis, increased body mass index, Surgical Department A, fusion of 4–8 vertebrae, and operation at a thoracic or lumbar/sacral level were significant risk factors for SSIs after spinal fusions. Lack of private insurance and hypertension were significant risk factors for SSIs after laminectomies. Surgeons from Department A were more likely to use nafcillin or vancomycin for perioperative prophylaxis and to perform more multilevel fusions than surgeons from Department B.
CONCLUSIONS
SSIs after spine operations significantly increase the utilization of healthcare resources. Possibly remediable risk factors include obesity, hypertension, and perioperative antimicrobial prophylaxis.
Infect Control Hosp Epidemiol 2016;1458–1467
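The headline rate above is cumulative incidence per 100 procedures: infections divided by operations, scaled. A one-line check using the numbers reported in the abstract:

```python
def ssi_rate_per_100(infections: int, procedures: int) -> float:
    """Cumulative SSI incidence per 100 procedures."""
    return infections / procedures * 100

print(round(ssi_rate_per_100(54, 2309), 1))  # 2.3, matching the reported rate
```

Note this is a per-procedure risk, not a rate per unit of follow-up time, which is why the denominator is operations rather than patient-days.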
Antimicrobial Stewardship in a Long-Term Acute Care Hospital Using Offsite Electronic Medical Record Audit
- Kirthana Beaulac, Silvia Corcione, Lauren Epstein, Lisa E. Davidson, Shira Doron
- Published online by Cambridge University Press: 11 January 2016, pp. 433–439
OBJECTIVE
To offer antimicrobial stewardship to a long-term acute care hospital using telemedicine.
METHODS
We conducted an interrupted time-series analysis to measure the impact of antimicrobial stewardship on hospital-acquired Clostridium difficile infection (CDI) rates and antimicrobial use. Simple linear regression was used to analyze changes in antimicrobial use; Poisson regression was used to estimate the incidence rate ratio for CDI. The preimplementation period was April 1, 2010–March 31, 2011; the postimplementation period was April 1, 2011–March 31, 2014.
RESULTS
During the preimplementation period, total antimicrobial usage was 266 defined daily doses (DDD)/1,000 patient-days (PD); it rose 4.54 (95% CI, −0.19 to 9.28) per month, then decreased significantly from preimplementation to postimplementation (−6.58 DDD/1,000 PD [95% CI, −11.48 to −1.67]; P=.01). The same trend was observed for antibiotics against methicillin-resistant Staphylococcus aureus (−2.97 DDD/1,000 PD per month [95% CI, −5.65 to −0.30]; P=.03). Usage of anti-CDI antibiotics decreased by 50.4 DDD/1,000 PD per month (95% CI, −71.4 to −29.2; P<.001) at program implementation, a decrease that was maintained afterward. Anti-Pseudomonas antibiotic use increased after implementation (30.6 DDD/1,000 PD per month [95% CI, 4.9–56.3]; P=.02), but with ongoing education this trend reversed. The intervention was associated with a decrease in hospital-acquired CDI (incidence rate ratio, 0.57 [95% CI, 0.35–0.92]; P=.02).
CONCLUSION
Antimicrobial stewardship using an electronic medical record via remote access led to a significant decrease in antibacterial usage and a decrease in CDI rates.
Infect Control Hosp Epidemiol 2016;37(4):433–439
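The usage metric in the study above, DDD per 1,000 patient-days, normalizes total drug dispensed by the WHO-assigned defined daily dose and by census. A sketch with invented dispensing figures; the 2 g DDD shown for parenteral ceftriaxone is the WHO standard, but verify against the current ATC/DDD index before relying on it:

```python
def ddd_per_1000_pd(grams_dispensed: float, who_ddd_grams: float, patient_days: int) -> float:
    """Defined daily doses per 1,000 patient-days of antimicrobial use."""
    return grams_dispensed / who_ddd_grams / patient_days * 1000

# Hypothetical month: 1,600 g of ceftriaxone (WHO DDD = 2 g) over 3,000 patient-days
print(round(ddd_per_1000_pd(1600, 2.0, 3000), 1))  # 266.7
```

The illustrative inputs were chosen so the result lands near the study's 266 DDD/1,000 PD baseline, showing the scale such figures live on; a facility-wide total would sum this calculation across all agents.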