The Sixth Decennial International Conference on Healthcare-Associated Infections Abstracts, March 2020: Global Solutions to Antibiotic Resistance in Healthcare
Oral Presentations
Regional Public Health Response to Emerging Carbapenemase-Producing Organisms in Central Florida, 2019
- Danielle A. Rankin, Justine M. Celli, Danielle L. Walden, Allison Chan, Albert Burks, Nicole Castro, Sasha Nelson, Marion A. Kainer, Alvina K. Chu, Nychie Q. Dotson, Maroya S. Walters
- Published online by Cambridge University Press: 02 November 2020, p. s50
Background: Detection of unusual carbapenemase-producing organisms (CPOs) in a healthcare facility may signify broader regional spread. During investigation of a VIM-producing Pseudomonas aeruginosa (VIM-CRPA) outbreak in a long-term acute-care hospital in central Florida, enhanced surveillance identified VIM-CRPA from multiple facilities, denoting potential regional emergence. We evaluated infection control and performed screening for CPOs in skilled nursing facilities (SNFs) across the region to identify potential CPO reservoirs and improve practices. Methods: All SNFs in 2 central Florida counties were offered a facility-wide point-prevalence survey (PPS) for CPOs and a nonregulatory infection control consultation. PPSs were conducted using a PCR-based screening method; specimens with a carbapenemase gene detected were cultured to identify the organisms. Infection control assessments focused on direct observations of hand hygiene (HH), environmental cleaning, and the sink splash zone. Thoroughness of environmental cleaning was evaluated using fluorescent markers applied to 6 standardized high-touch surfaces in at least 2 rooms per facility. Results: Overall, 21 (48%) SNFs in the 2-county region participated; 18 conducted a PPS. Bed size ranged from 40 to 391; 5 (24%) facilities were ventilator-capable SNFs (vSNFs), and 12 had short-stay inpatient rehabilitation units. Of 1,338 residents approached, 649 agreed to rectal screening, and 14 (2.2%) carried CPOs. CPO-colonized residents were from the ventilator-capable units of 3 vSNFs (KPC-CRE, n = 7; KPC-CRPA, n = 1) and from short-stay units of 2 additional facilities (VIM-CRPA, n = 5; KPC-CRE, n = 1). Among the 5 facilities where CPO colonization was identified, the prevalence ranged from 1.1% in a short-stay unit to 16.1% in a ventilator unit. All facilities had access to soap and water in resident bathrooms; 14 (67%) had alcohol-based hand rubs accessible.
Overall, mean facility HH adherence was 52% (range, 37%–66%; mean observations per facility = 106) (Fig. 1). We observed the use of non–EPA-registered disinfectants and cross contamination from dirty to clean areas during environmental cleaning; the overall surface cleaning rate was 46% (n = 178 rooms); only 1 room had all 6 markers removed. Resident supplies were frequently stored in the sink splash zone. Conclusions: A regional assessment conducted in response to emergence of VIM-CRPA identified a relatively low CPO prevalence at participating SNFs; CPOs were primarily identified in vSNFs and among short-stay residents. Across facilities, we observed low adherence to core infection control practices that could facilitate spread of CPOs and other resistant organisms. In this region, targeting ventilator and short-stay units of SNFs for surveillance and infection control efforts may have the greatest prevention impact.
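The colonization figures above reduce to simple point-prevalence arithmetic; a minimal sketch (the function name is illustrative, not from the abstract):

```python
def prevalence_pct(positives: int, screened: int) -> float:
    """Point prevalence as a percentage of residents screened."""
    return 100.0 * positives / screened

# Region-wide figure reported above: 14 CPO carriers among 649 residents screened
regional = prevalence_pct(14, 649)  # ≈ 2.2%
```

The same calculation, applied per unit, yields the 1.1%–16.1% facility range reported above.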
Funding: None
Disclosures: None
Room for Improvement: Results of a Baseline Evaluation of Environmental Cleaning in a Resource-Limited Neonatal Unit
- Angela Dramowski, Marina Aucamp, Adrie Bekker, Kedisaletse Moloto, Sheylyn Pillay, Mark Cotton, Susan Coffin, Andrew Whitelaw
- Published online by Cambridge University Press: 02 November 2020, pp. s50-s51
Background: Contamination of the near-patient hospital environment, including work surfaces and equipment, contributes to skin colonization and subsequent invasive bacterial infections in hospitalized neonates. In resource-limited settings, cleaning of the neonatal ward environment and equipment is seldom standardized and infrequently audited. Methods: A baseline multimodal assessment of surface and equipment cleaning was performed in a 30-bed high-care neonatal ward in Cape Town, South Africa, October 7–9, 2019. Adequacy of routine cleaning was evaluated using ATP bioluminescence assays, fluorescent ultraviolet (UV) markers, and quantitative bacterial surface cultures. For flat surfaces (eg, tables, incubators, trolleys), a 10×10-cm template was used to standardize the swab inoculum; for small equipment and devices with complex surfaces (eg, humidifiers, suction apparatus, stethoscopes), a standard swabbing protocol was developed for each item. Swabs in liquid transport medium were processed in the laboratory by vortexing for 30 seconds, plating onto blood and MacConkey agars, and incubating at 37°C for 48 hours. Manual counting of bacterial colony forming units was performed, followed by conventional biochemical testing and/or VITEK automated identification. Results: Of 100 swabs (58 from surfaces and 42 from equipment), 11 yielded growth of known neonatal pathogens (Enterobacteriaceae, A. baumannii, P. aeruginosa, S. aureus, S. agalactiae, and enterococci), 36 yielded potential neonatal pathogens (mostly coagulase-negative staphylococci), 4 grew environmental organisms, and 49 showed no growth. The highest aerobic colony counts (ACCs) were obtained from swabs of suction tubing, milk kitchen surfaces, humidifiers, and sinks; the median ACC from swabs with any bacterial growth (n = 51) was 3 (IQR, 1–22). Only 40% of the 100 surface and equipment swabs had ATP values below the 200 relative light unit (RLU) threshold for cleanliness.
Median ATP values were 301 (IQR, 179–732) RLU for surface swabs versus 230 (IQR, 78–699) RLU for equipment swabs (P = .233). Of the 100 fluorescent UV markers placed on near-patient surfaces and high-touch equipment, only 23% had been removed after 2 staff shift changes (24 hours later). Surfaces had a higher proportion of UV marker removal than equipment (19 of 58 [32.8%] vs 4 of 42 [9.5%]; P = .008). Conclusions: Environmental cleaning of this neonatal ward was suboptimal, especially for equipment. Improvement of environmental cleaning practices is an important intervention for neonatal infection prevention in resource-limited settings. Future studies should evaluate the impact of staff training, environmental cleaning tools, and repeated audits with feedback on the adequacy of cleaning in neonatal wards.
Funding: Funding for the laboratory work was provided by the Society for Healthcare Epidemiology of America (SHEA) International Ambassador Alumni Research Award and a South African Medical Research Council Self-Initiated Research (SIR) grant to Angela Dramowski, who is supported by an NIH Fogarty Emerging Global Leader Award (K43 TW010682).
Disclosures: None
Significant Regional Differences in Antibiotic Use Across 576 US Hospitals and 11,701,326 Admissions, 2016–2017
- Katherine Goodman, Sara Cosgrove, Lisa Pineles, Laurence Magder, Deverick John Anderson, Elizabeth Dodds Ashley, Ronald Polk, Hude Quan, William Trick, Keith Woeltje, Surbhi Leekha, Anthony Harris
- Published online by Cambridge University Press: 02 November 2020, pp. s51-s52
Background: Reducing inappropriate antibiotic use is critical for fighting antibiotic resistance. Quantifying the amount and diversity of antibiotic use in US hospitals is foundational to these efforts but hampered by limited national surveillance. The current study aims to address this knowledge gap by examining adult inpatient antibiotic usage, including regional, facility, and case-mix differences, across 576 hospitals and nearly 12 million encounters in 2016–2017. Methods: We conducted a retrospective cohort study of patients aged ≥18 years discharged from hospitals in the Premier Healthcare Database, a repository of nearly 1 of every 4 annual US hospitalizations, between January 1, 2016, and December 31, 2017. Detailed hospital- and patient-level data were extracted for each admission. Facilities were classified geographically by census division. Using daily antibiotic charge data, we mapped antibiotics to 18 mutually exclusive classes and to categories based upon spectrum of activity. Patient-level data were transformed into hospital case-mix variables (eg, hospital mean patient age), and relationships between antibiotic days of therapy (DOTs) and these and other facility-level variables were evaluated in negative binomial regression models. Results: The study included 11,701,326 adult admissions, totaling 64,064,632 patient days across 576 US hospitals. Overall, antibiotics were used in 65% of all hospitalizations, at a rate of 870 DOTs per 1,000 patient days. The most commonly used classes per patient days were
β-lactam/β-lactamase inhibitor combinations (206 DOTs), third- and fourth-generation cephalosporins (128 DOTs), and glycopeptides (113 DOTs) (Fig. 1). By spectrum of activity, antipseudomonal agents (245 DOTs) were the most common. Crude usage rates varied by geographic region (Fig. 2). In multivariable analyses, teaching status and larger bed size were independently associated with lower use across a range of antibiotic classes (adjusted IRR ranges, 0.90–0.94 and 0.96–0.98, respectively). Significant regional differences also persisted. Compared to the South Atlantic region (chosen as the reference category because it had the largest representation in the cohort), rates of total antibiotic use were 6%, 15%, and 18% lower on average in the Pacific, New England, and Middle Atlantic regions, respectively. By class, carbapenems showed the most geographic variability. Conclusions: In a large, diverse cohort of US hospitals, adult inpatients received antibiotics at a rate similar to, but higher than, previously published estimates. In adjusted models, lower antibiotic use was frequently associated with facilities likely to have robust antibiotic stewardship programs—those with teaching status and larger bed size. Further research is needed to understand other drivers of regional differences in antibiotic use, such as differing rates of resistance.
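The usage rate and the regional comparisons above are two small calculations; a back-of-the-envelope sketch (function names and the 4,350/5,000 facility are illustrative, not from the study):

```python
def dot_per_1000(dot_total: float, patient_days: float) -> float:
    """Antibiotic days of therapy (DOT) per 1,000 patient days."""
    return 1000.0 * dot_total / patient_days

def pct_lower(irr: float) -> float:
    """Percent lower use implied by an incidence rate ratio < 1."""
    return (1.0 - irr) * 100.0

# A hypothetical facility with 4,350 DOTs over 5,000 patient days
# matches the overall cohort rate of 870 DOTs per 1,000 patient days.
rate = dot_per_1000(4350, 5000)

# The Pacific region's 6% lower total use corresponds to an IRR of 0.94
# relative to the South Atlantic reference region.
pacific = pct_lower(0.94)
```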
Funding: This work was supported by funding from the Agency for Healthcare Research and Quality (AHRQ) (R01-HS026205 to A.D.H.).
Disclosures: None
Surgical Antibiotic Prophylaxis in Hysterectomy: Is Cefazolin Still the Best?
- Alyssa Valentyne, Fauzia Osman, Ahmed Al-Niaimi, Aurora Pop-Vicas
- Published online by Cambridge University Press: 02 November 2020, pp. s52-s53
Background: Prior studies suggest that cefazolin, widely used for antibiotic prophylaxis in hysterectomy, might not prevent surgical site infections (SSIs) as well as antibiotics with a broader antianaerobic antimicrobial spectrum. We compared the effectiveness of cefazolin versus antibiotic regimens with a broader antimicrobial spectrum in a ≥500-bed regional referral center. Methods: Study design: retrospective cohort. Population and setting: patients ≥18 years old who underwent hysterectomy between 1998 and 2018 at the University of Wisconsin Hospital. Analysis: propensity score matching with a caliper of 0.2 to select controls for cefazolin treatment, matching on age, body mass index (BMI), diabetes, length of stay, duration of surgery, and preoperative renal function. We conducted a crude SSI incidence analysis and adjusted for additional covariates (malignancy, intraoperative temperature, and preoperative glucose level) using a Cox proportional hazards model. All analyses were conducted using STATA SE v15 software. Results: The cohort included 4,087 hysterectomy patients, 123 (3%) of whom developed an SSI. Among these SSIs, 46%, 11%, and 42% were superficial, deep, and organ-space, respectively. Malignancies were present in 83% of SSI patients, with 30% being ovarian cancer. Risk factors for SSI in the unmatched sample multivariable analysis (MV) were length of stay (aHR, 1.1; 95% CI, 1.05–1.1; P < .001), duration of surgery (aHR, 1.2; 95% CI, 1.1–1.32; P < .001), and BMI (aHR, 1.04; 95% CI, 1.02–1.06; P < .001). After propensity matching, 2,282 hysterectomies remained. In the crude incidence analysis, cefazolin (IR, 6.0) had a protective SSI effect compared to cefoxitin (IR, 7.1), ciprofloxacin/metronidazole (IR, 27.2), clindamycin/gentamicin (IR, 14.1), any antianaerobic regimen (IR, 8.0), and regimens not guideline recommended (IR, 11.7).
In our MV analyses of cefazolin versus comparator antibiotic regimens, we found that hypothermia was consistently associated with a higher SSI risk (P ≤ .03). Receipt of a β-lactam antibiotic regimen was associated with a significantly lower SSI risk (aHR, 0.31; 95% CI, 0.11–0.89; P = .03), but cefazolin’s protective SSI effect was no longer statistically significant. Conclusions: We found that cefazolin had a lower SSI risk compared to other antibiotic regimens, including those with a broader antianaerobic spectrum, in our tertiary-care hospital’s 11-year high-risk cohort. Our analysis suggests that maintaining intraoperative normothermia and administering β-lactam antibiotic prophylaxis are important modifiable factors for SSI prevention.
Funding: None
Disclosures: None
Transmission of Listeriosis in a Neonatal Intensive Care Unit Supported by Whole-Genome Sequencing
- Janice Kim, Hilary Rosen, Kristen Angel, Azarnoush Maroufi, Samantha Tweeten, Jacqueline Lui, John Crandall, Tracy Lanier, Jane Siegel, Akiko Kimura
- Published online by Cambridge University Press: 02 November 2020, p. s53
Background: Listeriosis is a rare but serious infectious disease caused by Listeria monocytogenes (LM) and predominantly transmitted through contaminated food. In the United States, 15% of listeriosis cases are pregnancy associated; nosocomial neonatal transmission is extremely rare. In July 2018, the California Department of Public Health (CDPH) was notified of 4 patients, a mother–neonate pair and twin neonates, with listeriosis at the same hospital. The CDPH and San Diego County Health and Human Services Agency initiated an investigation to determine transmission and prevent additional infections. Methods: We reviewed medical records of the neonates and their mothers, interviewed the mothers with a detailed food exposure questionnaire, interviewed healthcare personnel (HCP), and performed an infection control assessment of the neonatal intensive care unit (NICU). CDPH performed whole-genome sequencing (WGS) on LM isolates, which were then analyzed by whole-genome multilocus sequence typing (wgMLST) by the Centers for Disease Control and Prevention (CDC) to assess relatedness in PulseNet, a public health laboratory database. The CDC also performed testing for LM on formalin-fixed placentas from the mother of the twins. Results: During a 1-week period, 4 patients with LM were identified at the hospital. A mother was admitted at 31 weeks gestation with acute abdominal and back pain that progressed with precipitous vaginal delivery and postpartum sepsis. Her neonate was resuscitated, transported to the NICU, underwent a sepsis evaluation, received antibiotics, and was transferred to another hospital within 6 hours. Maternal blood, placenta, and neonatal blood cultures grew LM. Twin neonates, born to an asymptomatic mother and present in the NICU during the index neonate’s stay, developed acute infection 4 and 6 days after the index neonate’s transfer; blood cultures confirmed LM.
The LM isolates from the 4 patients were indistinguishable by wgMLST and were not related to other PulseNet isolates. LM was not detected in the twin placentas. There were no common food exposures between the mothers. At least 1 common HCP cared for all 3 neonates. Infection control lapses included lack of proper hand hygiene during the index neonate’s resuscitation and potentially after cleaning and disinfection of the neonate’s incubator. Conclusions: This report provides supportive evidence that nosocomial transmission of LM can occur during a brief NICU stay due to lapses in infection control practices. Strict adherence to standard precautions in the delivery room and NICU is imperative to prevent cross transmission.
Funding: None
Disclosures: None
Trends in Hospital-Onset Clostridioides difficile Infection Incidence, National Healthcare Safety Network, 2010–2018
- Yi Mu, Margaret Dudeck, Karen Jones, Qunna Li, Minn Soe, Allan Nkwata, Jonathan Edwards
- Published online by Cambridge University Press: 02 November 2020, pp. s53-s54
Background: Clostridioides difficile infection (CDI) is one of the most common laboratory-identified (LabID) healthcare-associated events reported to the National Healthcare Safety Network (NHSN). CDI prevention remains a national priority, and efforts to reduce infection burden and improve antibiotic stewardship continue to expand across the healthcare spectrum. Beginning in 2013, the Centers for Medicare and Medicaid Services (CMS) required acute-care hospitals participating in CMS’ Inpatient Quality Reporting program to report CDI LabID data to NHSN and, in 2015, extended this reporting requirement to emergency departments (ED) and 24-hour observation units. To assess national progress, we evaluated changes in hospital-onset CDI (HO-CDI) incidence during 2010–2018. Methods: Cases of HO-CDI were reported to NHSN by hospitals using the NHSN’s LabID criteria. Generalized linear mixed-effects modeling was used to assess trends of HO-CDI by treating the hospital as a random intercept to account for the correlation of the repeated responses over time. The data were summarized at the quarterly level, the main effect was time, and the covariates of interest were the following: CDI test type, inpatient community-onset (CO) infection rate, hospital type, average length of stay, medical school affiliation, number of beds, number of ICU beds, number of infection control professionals, presence of an ED or observation unit, and an indicator for 2015 to account for CDI protocol changes that required hospitals to conduct surveillance in both inpatient and ED or observation unit settings. Results: During 2010–2013, the number of hospitals reporting CDI increased and then stabilized after 2013 (Table 1). Crude HO-CDI rates decreased over time, except for an increase in 2015 with a steeper reduction thereafter (Table 2). During 2010–2014, the adjusted quarterly rate of change was −0.45% (95% CI, −0.57% to −0.33%; P < .0001).
The rate of reduction in 2010–2014 was smaller than that in 2015–2018 (−2.82%; 95% CI, −3.10% to −2.54%; P < .0001). Compared to 2014, the adjusted rate in 2015 increased by 79.14% (95% CI, 72.42%–86.11%; P < .0001). Conclusions: The number of hospitals reporting CDI LabID data grew substantially in 2013 as a result of the CMS reporting requirement. Adjusted HO-CDI rates decreased over time, with an increase in 2015 and a rapid decrease thereafter. The increase in 2015 may be explained by changes in the NHSN CDI surveillance protocol and better test-type classification in later years. Overall decreases in HO-CDI rates may be influenced by prevention strategies.
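Quarterly percent changes like −0.45% are typically back-transformed from a log-link regression coefficient and can be compounded to a yearly figure; a sketch under that assumption (the abstract does not report the raw coefficients, so the helper names and worked numbers are illustrative):

```python
import math

def pct_change(beta: float) -> float:
    """Percent change per unit time implied by a log-link coefficient."""
    return (math.exp(beta) - 1.0) * 100.0

def annualize(quarterly_pct: float) -> float:
    """Compound a quarterly percent change over 4 quarters."""
    return ((1.0 + quarterly_pct / 100.0) ** 4 - 1.0) * 100.0

# A quarterly change of -0.45% (2010-2014) compounds to roughly -1.8%
# per year; -2.82% per quarter (2015-2018) to roughly -10.8% per year.
early_annual = annualize(-0.45)
late_annual = annualize(-2.82)
```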
Funding: None
Disclosures: None
Universal Decolonization Reduces MDRO Burden on High-Touch Objects in Nursing Home Resident Rooms and Common Areas
- Gabrielle M. Gussin, Raveena D. Singh, Raheeb Saavedra, Tabitha D. Catuna, Lauren Heim, Job Mendez, Ryan Franco, Marlene Estevez, Harold Custodio, Kaye D. Evans, Ellena M. Peterson, James A. McKinnell, Loren Miller, Susan Huang
- Published online by Cambridge University Press: 02 November 2020, pp. s54-s55
Background: More than half of nursing home (NH) residents harbor a multidrug-resistant organism (MDRO), and MDRO contamination of the environment is common. Whether decolonization of NH residents reduces MDRO contamination remains unclear. The PROTECT trial was a cluster-randomized trial of decolonization versus routine care in 28 California NHs from April 2017 through December 2018. Decolonization involved chlorhexidine bathing plus nasal iodophor (Monday–Friday, every other week), and it reduced resident nares and skin MDRO colonization by 36%. Methods: We swabbed high-touch objects in resident rooms and common areas for MDROs before and after the 3-month decolonization phase-in (April–July 2017). Five high-touch objects (bedrail, call button and TV remote, doorknob, light switch, and bathroom handles) were swabbed in 3 resident rooms per NH, selected by care needs: Alzheimer’s disease and related dementias (ADRD) requiring total care; ADRD, ambulatory; and short stay. Five high-touch objects were also swabbed in the common area (nursing station, table, chair, railing, and drinking fountain). Swabs were processed for methicillin-resistant S. aureus (MRSA), vancomycin-resistant Enterococcus (VRE), extended-spectrum β-lactamase (ESBL)-producing Enterobacteriaceae, and carbapenem-resistant Enterobacteriaceae (CRE). We used generalized linear mixed models to assess the impact of decolonization on MDRO environmental contamination, clustering by NH and room and adjusting for room type and object, because unclustered, unadjusted estimates are likely to be biased. Results: A high proportion of rooms were contaminated with any MDRO in control NHs: 43 of 56 (77%) in the baseline period and 46 of 56 (82%) in the intervention period. In contrast, decolonization NHs had similar baseline contamination (45 of 56, 80%) but lower intervention MDRO contamination (29 of 48, 60%).
When evaluating the intervention impact using multivariable models, decolonization was associated with significantly less room contamination for any MDRO (OR, 0.25; 95% CI, 0.06–0.96; P = .04) and MRSA (OR, 0.16; 95% CI, 0.05–0.55; P = .004) but nonsignificant reductions in VRE contamination (OR, 0.86; 95% CI, 0.23–3.13) and ESBL contamination (OR, 0.13; 95% CI, 0.01–1.62). CRE was not modeled due to rare counts (2 rooms total). In addition, room type was important, with common areas associated with 5-fold, 9-fold, and 3-fold higher contamination with any MDRO, MRSA, and VRE, respectively, compared with short-stay rooms. Conclusions: The high burden of MDROs in NHs calls for universal prevention strategies that can protect all residents. Although decolonization was associated with an 84% reduction in odds of MRSA contamination of inanimate room objects, significant reductions in VRE or ESBL contamination were not seen, possibly due to the lower proportion of baseline contamination due to these organisms. Multimodal strategies are needed to address high levels of MDRO contamination in NHs.
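The "84% reduction in odds" quoted in the conclusions is simply the complement of the MRSA odds ratio; a minimal sketch (the helper name is ours):

```python
def odds_reduction_pct(odds_ratio: float) -> float:
    """Percent reduction in odds implied by an odds ratio < 1."""
    return (1.0 - odds_ratio) * 100.0

# MRSA room contamination: OR 0.16 -> 84% lower odds, as stated above.
mrsa = odds_reduction_pct(0.16)
# Any MDRO: OR 0.25 -> 75% lower odds.
any_mdro = odds_reduction_pct(0.25)
```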
Funding: None
Disclosures: Gabrielle Gussin: Stryker (Sage Products): Conducting studies in which contributed antiseptic product is provided to participating hospitals and nursing homes. Clorox: Conducting studies in which contributed antiseptic product is provided to participating hospitals and nursing homes. Medline: Conducting studies in which contributed antiseptic product is provided to participating hospitals and nursing homes. Xttrium: Conducting studies in which contributed antiseptic product is provided to participating hospitals and nursing homes.
VA Antibiotic Stewardship Intervention to Improve Outpatient Antibiotic Use for ARIs: A Cost-Effectiveness Analysis
- Minkyoung Yoo, Richard Nelson, McKenna Nevers, Karl Madaras-Kelly, Katherine Fleming-Dutra, Adam Hersh, Jian Ying, Benjamin Haaland, Matthew Samore
- Published online by Cambridge University Press: 02 November 2020, p. s55
Background: The Core Elements of Outpatient Antibiotic Stewardship provide a framework to improve antibiotic use, but cost-effectiveness data on interventions to improve antibiotic use are limited. Beginning in September 2017, an antibiotic stewardship intervention was launched within 10 outpatient Veterans Health Administration clinics. The intervention was based on the Core Elements and used an academic detailing (AD) and audit and feedback (AF) approach to encourage appropriate use of antibiotics. The objective of this analysis was to evaluate the cost-effectiveness of the intervention among patients with uncomplicated acute respiratory tract infections (ARIs). Methods: We developed an economic simulation model from the VA’s perspective for patients presenting for an index outpatient clinic visit with an ARI (Fig. 1). Effectiveness was measured as quality-adjusted life-years (QALYs). Cost and utility parameters for antibiotic treatment, adverse drug reactions (ADRs), and healthcare utilization were obtained from the published literature. Probability parameters for antibiotic treatment, appropriateness of treatment, antibiotic ADRs, hospitalization, and return ARI visits were estimated using VA Corporate Data Warehouse data from a total of 22,137 patients in the 10 clinics during 2014–2019, before and after the intervention. Detailed cost data on the development of the AD and AF materials, together with electronically captured time and effort by specific providers for National AD Service activities in a national ARI campaign, were used as a proxy for the cost of similar activities conducted in this intervention. We performed 1-way and probabilistic sensitivity analyses (PSAs) using 10,000 second-order Monte Carlo simulations on costs and utility values using their means and standard deviations.
Results: The proportion of uncomplicated ARI visits with an antibiotic prescribed was lower after the intervention (59% before vs 40% after), and the proportion with appropriate treatment was higher (24% vs 32%). The intervention was estimated to cost $110,846 (2018 USD) over a 2-year period. Compared to no intervention, the intervention had lower mean costs ($517 vs $880) and higher mean QALYs (0.863 vs 0.837) per patient because of reduced inappropriate treatment, ADRs, and subsequent healthcare utilization, including hospitalization. In threshold analyses, the antibiotic stewardship strategy was no longer dominant if the intervention cost was >$64,415,000 or the number of patients cared for was <3,672. In the PSA, the antibiotic stewardship intervention was dominant in 100% of the 10,000 Monte Carlo iterations (Fig. 2). Conclusions: In every scenario, the VA outpatient AD and AF antibiotic stewardship intervention was a dominant strategy compared to no intervention.
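In cost-effectiveness terms, "dominant" means lower cost and higher effectiveness at the same time; a minimal sketch using the per-patient means reported above (assuming, per the prose, that the intervention is the lower-cost, higher-QALY pair of $517 and 0.863):

```python
def dominates(cost_a: float, qaly_a: float,
              cost_b: float, qaly_b: float) -> bool:
    """True if strategy A dominates B: it costs less AND yields more QALYs."""
    return cost_a < cost_b and qaly_a > qaly_b

# Intervention (A) vs no intervention (B), per-patient means from the text.
is_dominant = dominates(517, 0.863, 880, 0.837)  # True
```

When neither strategy dominates, analysts instead report an incremental cost-effectiveness ratio; here no ratio is needed because one option wins on both axes.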
Funding: None
Disclosures: None
Validation of a Surgical Site Infection Detection Algorithm for Use in Cardiac and Orthopedic Surgery Research
- Hiroyuki Suzuki, Erin Balkenende, Eli Perencevich, Gosia Clore, Kelly Richardson, Rajeshwari Nair, Michihiko Goto, Westyn Branch-Elliman, Kalpana Gupta, Stacey Hockett Sherlock, Marin Schweizer
- Published online by Cambridge University Press: 02 November 2020, pp. s55-s56
Background: Studies of interventions to decrease rates of surgical site infections (SSIs) must include thousands of patients to be statistically powered to demonstrate a significant reduction. Therefore, it is important to develop methodology to extract data available in the electronic medical record (EMR) to accurately measure SSI rates. Prior studies have created tools that optimize sensitivity to prioritize chart review for infection control purposes. However, for research studies, a high positive predictive value (PPV) with reasonable sensitivity is preferred to limit the impact of false-positive results on the assessment of intervention effectiveness. Using information from these prior tools, we aimed to determine whether an algorithm using data available in the Veterans Affairs (VA) EMR could accurately and efficiently identify deep incisional or organ-space SSIs found in the VA Surgical Quality Improvement Program (VASQIP) data set for cardiac and orthopedic surgery patients. Methods: We conducted a retrospective cohort study of patients who underwent cardiac surgery or total joint arthroplasty (TJA) at 11 VA hospitals between January 1, 2007, and April 30, 2017. We used EMR data recorded in the 30 days after surgery on inflammatory markers; microbiology; antibiotics prescribed after surgery; International Classification of Diseases (ICD) and Current Procedural Terminology (CPT) codes for reoperation for an infection-related purpose; and ICD codes for mediastinitis, prosthetic joint infection, and other SSIs. These metrics were used in an algorithm to determine whether a patient had a deep or organ-space SSI. Sensitivity, specificity, PPV, and negative predictive value (NPV) were calculated for the algorithm through comparison with 30-day SSI outcomes collected by nurse chart review in the VASQIP data set. Results: Among the 11 VA hospitals, there were 18,224 cardiac surgeries and 16,592 TJAs during the study period.
Of these, 20,043 were evaluated by VASQIP nurses and were included in our final cohort. Of the 8,803 cardiac surgeries included, manual review identified 44 (0.50%) mediastinitis cases. Of the 11,240 TJAs, manual review identified 71 (0.63%) deep or organ-space SSIs. Our algorithm identified 32 of the mediastinitis cases (73%) and 58 of the deep or organ-space SSI cases (82%). Sensitivity, specificity, PPV, and NPV are shown in Table 1. Of the patients whom our algorithm flagged as having a deep or organ-space SSI, only 21% (the PPV) actually had an SSI after cardiac surgery or TJA. Conclusions: The algorithm can identify most complex SSIs (73%–82%), but additional data are necessary to separate false-positive from true-positive cases and to improve the efficiency of case detection to support research questions.
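The low 21% PPV despite good sensitivity follows from the rarity of deep or organ-space SSIs; a back-of-the-envelope Bayes' rule sketch (the 98.3% specificity is an illustrative assumption chosen to reproduce the reported PPV, not a figure from the abstract):

```python
def ppv(sensitivity: float, specificity: float, prevalence: float) -> float:
    """Positive predictive value via Bayes' rule."""
    true_pos = sensitivity * prevalence
    false_pos = (1.0 - specificity) * (1.0 - prevalence)
    return true_pos / (true_pos + false_pos)

# 115 confirmed SSIs among 20,043 reviewed surgeries (prevalence ~0.57%);
# with ~78% sensitivity and an assumed 98.3% specificity, PPV falls to ~21%.
estimate = ppv(0.78, 0.983, 115 / 20043)
```

Even a very specific algorithm yields mostly false positives when the event is this rare, which is why the authors call for additional data to confirm flagged cases.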
Funding: None
Disclosures: None
Variation in Hospitalist-Specific Antibiotic Prescribing at Four Hospitals: A Novel Tool for Antibiotic Stewardship
- Julianne Kubes, Jesse Jacob, Scott Fridkin, Raymund Dantes, K. Ashley Jones, Sujit Suchindran, Mary Elizabeth Sexton, Zanthia Wiley
- Published online by Cambridge University Press: 02 November 2020, pp. s56-s57
Background: Hospitalists play a critical role in antimicrobial stewardship as the primary antibiotic prescriber for many inpatients. We sought to describe antibiotic prescribing variation among hospitalists within a healthcare system. Methods: We created a novel metric of hospitalist-specific antibiotic prescribing by linking hospitalist billing data to hospital medication administration records in 4 hospitals [two 500-bed academic hospitals (AMC1 and AMC2), one 400-bed community hospital (CH1), and one 100-bed community hospital (CH2)] from January 2016 to December 2018. We attributed dates on which a hospitalist electronically billed for a given patient as billed patient days (bPD) and mapped each antibiotic day of therapy (DOT) to a bPD. Each DOT was classified according to National Healthcare Safety Network antibiotic categories: broad-spectrum hospital-onset (BS-HO), broad-spectrum community-onset (BS-CO), anti-MRSA, and highest risk for Clostridioides difficile infection (CDI). DOT and bPD were pooled to calculate hospitalist-specific DOT per 1,000 bPD. Best-subsets regression was performed to assess model fit and generate hospital- and antibiotic category-specific models adjusting for patient-level factors (eg, age ≥65 years, ICD-10 codes for comorbidities and infections). The models were used to calculate predicted hospitalist-specific DOT and observed-to-expected ratios (O:E) for each antibiotic category. Kruskal-Wallis tests and pairwise Wilcoxon rank-sum tests were used to determine significant differences in median DOT per 1,000 bPD and O:E between hospitals for each antibiotic category. Results: During the study period, 116 hospitalists across the 4 hospitals contributed a total of 437,303 bPD. Median DOT per 1,000 bPD varied between hospitals (BS-HO range, 46.7–84.2; BS-CO range, 63.3–100; anti-MRSA range, 48.4–65.4; CDI range, 82.0–129.4).
CH2 had a significantly higher median DOT per 1,000 bPD compared to the academic hospitals (all antibiotic categories P < .001) and CH1 (BS-HO, P = .01; anti-MRSA, P = .02) (Fig. 1A). The 4 antibiotic groups at 4 hospitals resulted in 16 models, with good model fit for CH2 (R2 > 0.55 for all models), modest model fit for AMC2 (R2 = 0.46–0.55), fair model fit for CH1 (R2 = 0.19–0.35), and poor model fit for AMC1 (R2 < 0.12 for all models). Variation in hospitalist-specific O:E was moderate (IQR, 0.9–1.1). AMC1 showed greater variation than other hospitals, but we detected no significant differences in median O:E between hospitals (all antibiotic categories P > .10) (Fig. 1B). Conclusions: Adjusting for patient-level factors explained much of the variation in hospitalist-specific DOT per 1,000 bPD in some but not all hospitals, suggesting that unmeasured factors may also drive antibiotic prescribing. This metric may represent a target for stewardship interventions, such as hospitalist-specific feedback on antibiotic prescribing practices.
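The core arithmetic of the metric above (normalizing days of therapy to billed patient days, then comparing observed with model-predicted prescribing) can be sketched in a few lines; the numbers below are hypothetical, not drawn from the study:

```python
def dot_per_1000_bpd(dot: int, bpd: int) -> float:
    """Antibiotic days of therapy (DOT) per 1,000 billed patient days (bPD)."""
    if bpd <= 0:
        raise ValueError("billed patient days must be positive")
    return 1000 * dot / bpd

def observed_to_expected(observed: float, expected: float) -> float:
    """O:E ratio; values near 1.0 indicate prescribing close to the model's prediction."""
    return observed / expected

# A hypothetical hospitalist: 120 DOT over 1,500 bPD, with 100 DOT predicted by the model
rate = dot_per_1000_bpd(120, 1500)    # 80.0 DOT per 1,000 bPD
oe = observed_to_expected(120, 100)   # 1.2, i.e., 20% above expected
```

In the study, the expected DOT came from best subsets regression models adjusted for patient-level factors; any such model could supply the denominator here.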
Funding: None
Disclosures: Scott Fridkin, consulting fee - vaccine industry (various) (spouse)
Whole-Genome Sequencing Reveals a Novel Subclade of Pansusceptible Candida auris in Ontario, Canada
- Susy Hota, Lalitha Gade, Kaitlin Forsberg, Erica Susky, Susan Poutanen, Kevin Katz, Jerome Leis, Jeya Nadarajah, Chingiz Amirov, Vydia Nankoosingh, Victoria Williams, Natasha Salt, Anastasia Litvintseva, Julianne Kus
- Published online by Cambridge University Press:
- 02 November 2020, pp. s57-s58
Background: Candida auris is an emerging pathogen that has recently disseminated globally and caused challenging outbreaks in healthcare facilities (HCFs), in part because it is commonly multidrug-resistant. Candida auris remains rare in Canada, with ~20 known cases to date. We describe the emergence of a novel subclade of C. auris in Ontario, Canada, using whole-genome sequencing (WGS). Methods: In Ontario, many HCFs submit yeast isolates from sterile sites requiring species-level characterization and antifungal susceptibility testing (AFST) to the provincial reference laboratory. Yeasts were identified using a combination of standard methods (morphology, API 20C, MALDI-ToF MS) and ITS2 sequencing. Sensititre YO9 panels were used for AFST. Genomic analysis of C. auris was performed using an Illumina HiSeq platform with at least 50× coverage; variants were called against the reference genome using the previously published Northern Arizona SNP Pipeline (NASP). Phylogenetic trees were produced by the maximum parsimony method (MEGA 7.0). Results: Between 2014 and 2018, yeast isolates from 5 different patients at 4 HCFs in the same region of Ontario were confirmed to be C. auris by ITS2 PCR and sequence analysis (Table 1). Based on interim CDC criteria for antifungal drug breakpoints, all isolates were pansusceptible to common antifungals. WGS analysis demonstrated that the C. auris isolates were part of the South American clade (IV) and formed an isolated subclade that is well supported by bootstrap analysis, indicating clonal relationships among these isolates (Fig. 1). Conclusions: Although C. auris isolates are usually drug resistant, all 5 initial Ontario isolates were pansusceptible. WGS determined that these isolates clustered within clade IV and were clonal. This cluster of C. auris appears to represent a new subclade of the South American clade that has been transmitted among patients within a region of Ontario. C.
auris may have been present in Ontario for some time, escaping earlier detection due to lack of screening programs in HCFs, historical challenges with microbiologic detection of C. auris, and the antifungal susceptibility of the circulating isolates. Investigations are underway to determine clinical features and epidemiologic relatedness among patients in this cluster.
Funding: None
Disclosures: Susy Hota, Contracted Research - Finch Therapeutics
Top Rated Poster Presentations
Assessing the Efficacy and Unintended Consequences of Utilizing a Behavioral Approach to Reduce Inappropriate Clostridioides difficile Testing
- Lana Dbeibo, Allison Brinkman, Cole Beeler, Kristen Kelley, William Fadel, Yun Wang, William Snyderman, Nicole Hatfield, Josh Sadowski, Areeba Kara
- Published online by Cambridge University Press:
- 02 November 2020, pp. s58-s59
Background: Effective strategies to improve diagnostic stewardship around C. difficile infection (CDI) remain elusive. Electronic medical record-based solutions, such as ‘hard’ and ‘soft’ stops, have been associated with reductions in testing but may not be sustainable due to alert fatigue. Additionally, data on the potential for undertesting, missed diagnoses, and the implications regarding patient harm or clusters of transmission are limited. In this study, we assessed the efficacy of a behavioral approach to diagnostic stewardship while monitoring for unintended consequences. Methods: This quality improvement study was conducted January 2018–May 2019 (baseline period, January–April 2018; implementation period, May–December 2018; sustainment period, January 2019–May 2019). First, we conducted an internal analysis and identified 3 barriers to appropriate testing: clinicians’ perceived risk of CDI, inconsistent definition of diarrhea, and lack of involvement of nurses in diagnostic stewardship. A multidisciplinary team was then convened to address these barriers. The team used the Bristol stool scale to improve the reliability of diarrhea description and created a guideline-concordant testing algorithm with clinicians and nurses. The primary outcome was the number of tests ordered. The secondary outcomes were the proportion of inappropriate tests and the proportion of delayed tests. Delayed tests were defined as CDI-compatible diarrhea, based on the algorithm, for which the test was sent >24 hours after symptom onset. Results: During the baseline period, we detected no significant change in the number of tests ordered from month to month, with 194.2 tests ordered per month on average. During the postimplementation period, the number of tests ordered decreased by ~4.5 each month between January 2018 and May 2019 (P < .0001).
The proportion of inappropriate tests steadily decreased from 54% to 30% across the 3 study periods, and the proportion of delayed tests decreased from 11% to 1% and then increased to 20% in the sustainment period. There were no cases of toxic megacolon associated with delayed testing. Conclusions: The decision to test for CDI is complex. Interventions that treat this issue as a simple matter of ‘right’ and ‘wrong’ fail to address the root cause of CDI overdiagnosis, and they have no embedded mechanism to detect unintended consequences. Our study demonstrates that by taking a behavioral approach and addressing clinicians’ safety concerns, we were able to sustain a significant reduction in testing. Given the low numbers, we could not determine the significance of the increase in delayed testing; further studies are needed to evaluate the safety of CDI reduction strategies through diagnostic stewardship alone.
Funding: None
Disclosures: None
Characteristics Associated with Death in Patients with Carbapenem-Resistant Acinetobacter baumannii, United States, 2012–2017
- Hannah E. Reses, Kelly Hatfield, Jesse Jacob, Chris Bower, Elisabeth Vaeth, Jacquelyn Mounsey, Daniel Muleta, Medora Witwer, Ghinwa Dumyati, Emily Hancock, James Baggs, Maroya Walters, Sandra Bulens
- Published online by Cambridge University Press:
- 02 November 2020, pp. s59-s60
Background: Carbapenem-resistant Acinetobacter baumannii (CRAB) is an important cause of healthcare-associated infections with limited treatment options and high mortality. To describe risk factors for mortality, we evaluated characteristics associated with 30-day mortality in patients with CRAB identified through the Emerging Infections Program (EIP). Methods: From January 2012 through December 2017, 8 EIP sites (CO, GA, MD, MN, NM, NY, OR, TN) participated in active, laboratory-, and population-based surveillance for CRAB. An incident case was defined as a patient’s first isolation, in a 30-day period, of A. baumannii complex from sterile sites or urine with resistance to ≥1 carbapenem (excluding ertapenem). Medical records were abstracted. Patients were matched to state vital records to assess mortality within 30 days of incident culture collection. We developed 2 multivariable logistic regression models (1 for sterile site cases and 1 for urine cases) to evaluate characteristics associated with 30-day mortality. Results: We identified 744 patients contributing 863 cases; 185 of 863 cases (21.4%) died within 30 days of culture, including 113 of 257 cases (44.0%) isolated from a sterile site and 72 of 606 cases (11.9%) isolated from urine. Among 628 hospitalized cases, death occurred in 159 cases (25.3%). Among hospitalized fatal cases, death occurred after hospital discharge in 27 of 57 urine cases (47.4%) and 21 of 102 cases from sterile sites (20.6%). Among sterile site cases, female sex, intensive care unit (ICU) stay after culture, location in a healthcare facility (including a long-term care facility [LTCF]) 3 days before culture, and diagnosis of septic shock were associated with increased odds of death in the model (Fig. 1).
In urine cases, age 40–54 or ≥75 years, ICU stay after culture, presence of an indwelling device other than a urinary catheter or central line (eg, endotracheal tube), location in an LTCF 3 days before culture, diagnosis of septic shock, and Charlson comorbidity score ≥3 were associated with increased odds of mortality (Fig. 2). Conclusion: Overall 30-day mortality was high among patients with CRAB, including patients with CRAB isolated from urine. A substantial fraction of deaths occurred after discharge, especially among patients with urine cases. Although there were some differences in the characteristics associated with mortality in patients with CRAB isolated from sterile sites versus urine, LTCF exposure and severe illness were associated with mortality in both patient groups. CRAB was associated with substantial mortality in these patients, who had evidence of healthcare exposure and complex illness. More work is needed to determine whether prevention of CRAB infections would improve outcomes.
Funding: None
Disclosures: None
Characteristics Associated With Invasive Staphylococcus aureus Infection Rates in Nursing Homes, Emerging Infections Program
- Runa Gokhale, Kelly Jackson, Kelly Hatfield, Susan Petit, Susan Ray, Joelle Nadle, Christina B. Felsen, William Schaffner, Isaac See, Prabasaj Paul
- Published online by Cambridge University Press:
- 02 November 2020, pp. s60-s61
Background: Most invasive methicillin-resistant Staphylococcus aureus (iMRSA) infections have onset in the community but are associated with healthcare exposures. More than 25% of cases with healthcare exposure occur in nursing homes (NHs), where facility-specific iMRSA rates vary widely. We assessed associations between nursing home characteristics and iMRSA incidence rates to help target prevention efforts in NHs. Methods: We used active, laboratory- and population-based surveillance data collected through the Emerging Infections Program during 2011–2015 from 25 counties in 7 states. NH-onset cases were defined as isolation of MRSA from a normally sterile site in a surveillance-area resident who was in a NH within 3 days before the index culture. We calculated iMRSA incidence (cases per NH resident day) using Centers for Medicare & Medicaid Services (CMS) skilled nursing facility cost reports and described variation in iMRSA incidence by NH. We used Poisson regression with backward selection, assessing variables for collinearity, to estimate adjusted rate ratios (aRRs) for NH characteristics (obtained from the CMS minimum dataset) associated with iMRSA rates. Results: Of 590 surveillance-area NHs included in the analysis, 89 (15%) had no NH-onset iMRSA infections. Rates ranged from 0 to 23.4 infections per 100,000 resident days. Increased rates of NH-onset iMRSA infection occurred with an increased percentage of residents with short stays of ≤30 days (aRR, 1.09), exhibiting wounds or infection (surgical wound [aRR, 1.08]; vascular ulcer/foot infection [aRR, 1.09]; multidrug-resistant organism infection [aRR, 1.13]; receipt of antibiotics [aRR, 1.06]), using medical devices or invasive support (ostomy [aRR, 1.07]; dialysis [aRR, 1.07]; ventilator support [aRR, 1.17]), carrying neurologic diagnoses (cerebral palsy [aRR, 1.14]; brain injury [aRR, 1.1]), and demonstrating debility (requiring considerable assistance with bed mobility [aRR, 1.05]) (Table).
iMRSA rates decreased with an increased percentage of residents receiving influenza vaccination (aRR, 0.96) and with the presence of any patients in isolation for any active infection (aRR, 0.83). Conclusions: iMRSA incidence varies greatly across nursing homes, with many NH patient and facility characteristics associated with differences in NH-onset iMRSA rates. Some associations (short stay, wounds and infection, medical device use and invasive support) suggest that targeted interventions using known strategies to decrease transmission may help reduce infection rates, whereas others (neurologic diagnoses, influenza vaccination, presence of patients in isolation) require further exploration to determine their role. These findings can help identify NHs in other areas that are more likely to have higher rates of NH-onset iMRSA and could benefit from interventions to reduce infection rates.
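The incidence calculation described in the methods (cases per NH resident day, reported per 100,000 resident days) reduces to simple arithmetic; this sketch uses hypothetical counts, not facilities from the study:

```python
def imrsa_rate_per_100k(cases: int, resident_days: int) -> float:
    """NH-onset iMRSA incidence expressed per 100,000 resident days."""
    if resident_days <= 0:
        raise ValueError("resident days must be positive")
    return 100_000 * cases / resident_days

# A hypothetical facility with 3 NH-onset cases over 45,000 resident days
imrsa_rate_per_100k(3, 45_000)  # ~6.7 per 100,000 resident days
```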
Funding: None
Disclosures: None
Enhancing Influenza Vaccination of Hospital Workers from 30% to 80% Through Application of Behavior Change Theories
- Egil Lingaas, Ylva Sandness, Ragnhild Raastad
- Published online by Cambridge University Press:
- 02 November 2020, p. s61
Background: Historically, influenza vaccination coverage among Norwegian healthcare workers has been low. In 2014–2015 and 2015–2016, the national averages were 9% and 12%, respectively. Although the figures for Oslo University Hospital were higher (30% in 2015–2016), we were still far from the goal of 75% set by the WHO. The same year, <10% of employees at Vestre Viken Hospital Trust were vaccinated. Before the 2016–2017 influenza season, we therefore launched a campaign using methods based on behavior change theories and social marketing to enhance vaccination coverage. Methods: In May–June 2016, a questionnaire was sent by e-mail to all employees at Oslo University Hospital (n = ~25,000) and Vestre Viken Hospital Trust (n = 9,000). The questionnaire was structured according to the theory of planned behavior, asking questions related to attitude, subjective norms, and perceived control. The respondents were asked to grade each answer from 1 to 5, and we could then calculate a score for each question based on the proportion (%) of respondents across the 5 grades. Thus, a score between 0 and 500 was possible. We then selected the questions with the highest and lowest scores for intervention and applied stages-of-change principles and social marketing for implementation. In May–June 2017, the same questionnaire was sent to all employees, and the procedure was repeated before the 2017–2018 influenza season. Finally, for the third time, the procedure was repeated before the 2019–2020 season. This time, additional questions were added about which sources the employees were using for information on influenza vaccination. Results: In 2017–2018, vaccination coverage increased from 30% to 54%. The following year we reached 73%, and at the time of abstract submission (November 12, 2019), we had passed 80% for the 2019–2020 season, with more vaccines still to be given.
Among Norwegian healthcare workers, attitudes and perceived control seemed to have a stronger impact on behavior (vaccination) than subjective norms. Conclusions: We were able to significantly increase voluntary influenza vaccination, reaching the WHO goal of at least 75%, by the application of behavior change theories and social marketing.
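One plausible reading of the scoring scheme described above is a grade-weighted sum of response percentages. This is an illustrative sketch, not the authors' exact formula; note that with grades 1 to 5 and percentages summing to 100, this particular reading yields scores between 100 and 500, whereas the abstract states a range of 0 to 500:

```python
def question_score(pct_by_grade: list[float]) -> float:
    """Score for one question: each grade (1-5) weighted by the percentage
    of respondents who chose it."""
    if len(pct_by_grade) != 5:
        raise ValueError("expected percentages for grades 1 through 5")
    return sum(grade * pct for grade, pct in enumerate(pct_by_grade, start=1))

# Hypothetical responses: 10% chose grade 1, 20% grade 2, 30% grade 3,
# 25% grade 4, and 15% grade 5
question_score([10, 20, 30, 25, 15])  # 315
```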
Funding: None
Disclosure: Egil Lingaas, Salary - 3M Healthcare
Epidemiologic and Microbiologic Characteristics of 28 Hospitalized Patients Cocolonized With Multiple Carbapenem-Resistant Enterobacteriaceae (CRE) in the United States
- Timileyin Adediran, Anthony Harris, J. Kristie Johnson, David Calfee, Loren Miller, Minh-Hong Nguyen, Katherine Goodman, Lisa Pineles
- Published online by Cambridge University Press:
- 02 November 2020, p. s62
Background: As carbapenem-resistant Enterobacteriaceae (CRE) prevalence increases in the United States, the risk of cocolonization with multiple CRE may also be increasing, with unknown clinical and epidemiological significance. In this study, we aimed to describe the epidemiologic and microbiologic characteristics of inpatients cocolonized with multiple CRE. Methods: We conducted a secondary analysis of a large, multicenter prospective cohort study evaluating risk factors for CRE transmission to healthcare personnel gowns and gloves. Patients were identified between January 2016 and June 2019 from 4 states. Patients enrolled in the study had a clinical or surveillance culture positive for CRE within 7 days of enrollment. We collected and cultured samples from the following sites from each CRE-colonized patient: stool, perianal area, and skin. A modified carbapenem inactivation method (mCIM) was used to detect the presence or absence of carbapenemase(s). EDTA-modified CIM (eCIM) was used to differentiate between serine and metal-dependent carbapenemases. Results: Of the 313 CRE-colonized patients enrolled in the study, 28 (8.9%) were cocolonized with at least 2 different CRE. Additionally, 3 patients (1.0%) were cocolonized with >2 different CRE. Of the 28 patients, 19 (67.6%) were enrolled with positive clinical cultures. Table 1 summarizes the demographic and clinical characteristics of these patients. The most frequently used antibiotic prior to positive culture was vancomycin (n = 33, 18.3%). Among the 62 isolates from 59 samples from the 28 cocolonized patients, the most common CRE species were Klebsiella pneumoniae (n = 18, 29.0%), Escherichia coli (n = 10, 16.1%), and Enterobacter cloacae (n = 9, 14.5%). Of the 62 isolates, 38 (61.3%) were mCIM positive and 8 (12.9%) were eCIM positive. Of the 38 mCIM-positive isolates, 33 (86.8%) were KPC positive, 4 (10.5%) were NDM positive, and 1 (2.6%) was negative for both KPC and NDM. Also, 2 E. coli, 1 K.
pneumoniae, and 1 E. cloacae were NDM-producing CRE. Conclusion: Cocolonization with multiple CRE occurs frequently in the acute-care setting. Characterizing patients with CRE cocolonization may be important to informing infection control practices and interventions to limit the spread of these organisms, but further study is needed.
Funding: None
Disclosures: None
Epidemiology and Clinical Outcomes Associated With Extensively Drug-Resistant (XDR) Acinetobacter in US Veterans’ Affairs Health Care
- Margaret Fitzpatrick, Katie Suda, Linda Poggensee, Amanda Vivo, Marissa Gutkowski, Geneva Wilson, Charlesnika Evans
- Published online by Cambridge University Press:
- 02 November 2020, pp. s62-s63
Background: Infections caused by Acinetobacter spp are often healthcare acquired, difficult to treat, and associated with high mortality. Extensively drug-resistant (XDR) Acinetobacter are nonsusceptible to at least 1 agent in all but 2 or fewer antimicrobial classes. Epidemiologic and outcome data for XDR Acinetobacter are limited and have largely been reported outside the United States. This national cohort study describes the epidemiology, clinical characteristics, and outcomes of patients with XDR Acinetobacter in VA health care. Methods: This was a retrospective cohort study including microbiology and clinical data from all patients hospitalized between 2012 and 2018 at any VA medical center who had cultures that grew XDR Acinetobacter spp. Bacterial speciation and antibiotic susceptibility testing were performed and reported by each VA laboratory according to its own protocol. Descriptive statistics were used to summarize the data. Results: Of 11,541 unique patients with 15,358 cultures that grew Acinetobacter spp during the study period, 410 patients (3.6%) had 670 cultures (4.4%) that grew XDR Acinetobacter. Mean age was 68 years (SD, 12.2 years), and the median Charlson comorbidity index was 3 (IQR, 1–5). The greatest proportion of isolates were from the respiratory tract (n = 235, 35%), followed by urine (n = 184, 28%). The South had the greatest proportion of patients with XDR Acinetobacter (n = 162, 40%); almost all patients were seen at urban VA medical centers (n = 406, 99%). Most patients (n = 335, 82%) had had antibiotic exposure in the prior 90 days, most commonly vancomycin (n = 238, 65%) and third- or fourth-generation cephalosporins (n = 155, 38%). Most patients (n = 334, 81%) also had a hospital or long-term care admission in the prior 90 days. Fig. 1 shows antibiotic susceptibilities of the XDR Acinetobacter isolates; polymyxins, tigecycline, and minocycline demonstrated the highest susceptibility.
In-hospital mortality occurred in 90 patients (22%), 30-day mortality in 97 patients (24%), and 1-year mortality in 198 patients (48%). Overall, 93 patients (23%) were readmitted to the hospital within 90 days. Conclusions: Providers should maintain a heightened suspicion for infection with XDR Acinetobacter spp in older patients seen at urban medical centers who have had recent healthcare and antibiotic exposures, particularly if they have respiratory or urinary tract infections. Isolation of XDR Acinetobacter is associated with high in-hospital and 30-day mortality. New antibiotics targeting MDR gram-negative bacteria generally lack activity against Acinetobacter, leaving polymyxins, tigecycline, and minocycline as the only remaining treatment options. Therefore, novel antibiotics for XDR Acinetobacter are urgently needed.
Funding: None
Disclosures: None
ESBL Types and Plasmid Heterogeneity in Urinary E. coli Isolates: Results From a Nationwide Multicenter Study in Croatia
- Tomislav Mestrovic, Marija Krilanovic, Maja Tomic-Paradzik, Natasa Beader, Zoran Herljevic, Rick Conzemius, Ivan Barisic, Jasmina Vranes, Vesna Elvedi-Gasparovic, Branka Bedenic
- Published online by Cambridge University Press:
- 02 November 2020, pp. s63-s64
Background: The prevalence of Escherichia coli strains producing extended-spectrum β-lactamases (ESBLs) has increased both in the community and in healthcare settings. Furthermore, recent studies in nursing homes and long-term care facilities have shown that these institutions can act as potential reservoirs of ESBL- and CTX-M–producing E. coli. Consequently, we aimed to characterize ESBLs produced by E. coli isolates causing hospital-onset, long-term care facility, and community infections throughout Croatia (Europe), as well as to compare antimicrobial sensitivity patterns, molecular specificities, plasmid types, and epidemiological features. Methods: From a total pool of 16,333 E. coli isolates, 164 ESBL-producing strains with reduced susceptibility to third-generation cephalosporins were used for further appraisal. Phenotypic tests for the detection of ESBLs and plasmid-mediated AmpC β-lactamases were initially pursued (including a novel modification of the CIM test, the cephalosporin inactivation method), followed by conjugation experiments, molecular detection of resistance genes, plasmid extraction with PCR-based replicon typing, serotyping, genotyping with pulsed-field gel electrophoresis, and whole-genome sequencing (WGS). Results: The isolates in this study exhibited a high level of resistance to expanded-spectrum cephalosporins and carried CTX-M or TEM β-lactamases, and all of them were classified as multidrug resistant based on their resistance to other antimicrobial drugs. The β-lactamase content did not differ among E. coli strains isolated from various sources (ie, hospitals, nursing homes, and the community). According to the genotyping results, the isolates were allocated into 8 clusters, which contained subclusters. Serotyping revealed that the O25 antigen was dominant; furthermore, the isolates subjected to WGS belonged to the ST131 sequence type.
The most pervasive plasmid types in the isolates from the country’s capital (Zagreb) were IncFII and FIA, whereas FIA alone was a dominant plasmid type in the southern part of the country. Conversely, eastern parts were characterized by plasmids belonging to IncB/O and IncW groups. Conclusions: Our study demonstrated the dissemination of group 1 CTX-M–positive E. coli not only in different geographic regions of Croatia but also in different arms of the healthcare system (ie, hospitals, nursing homes, and the community). Our results also confirmed the switch from previously pervasive SHV-2 and SHV-5 ESBLs to the nationwide predominance of group 1 CTX-M β-lactamases; however, regional distribution was associated with different plasmid types carrying blaCTX-M genes. These types of nationwide studies are indispensable for informing global decision making that addresses the issue of antimicrobial resistance.
Funding: None
Disclosures: None
Financial and Mortality Modeling as a Tool to Present Infection Prevention Data: What a SIR of 1.2 Means for the Hospital
- Vidya Mony, Kevin Hultquist, Supriya Narasimhan
- Published online by Cambridge University Press:
- 02 November 2020, pp. s64-s65
Background: Presenting to hospital leadership is an annual requirement of many infection prevention (IP) programs. Most presentations include current statistical data on hospital-acquired infections (HAIs) and whether the hospital has met its goals according to National Healthcare Safety Network (NHSN) criteria. We presented HAI data in a novel way, with financial and mortality modeling, to show the impact of IP interventions to leadership not attuned to NHSN metrics. Methods: We examined 4 HAIs, their trends, and their effect on our hospital, Santa Clara Valley Medical Center (SCVMC). To estimate the impact of specific HAIs, we used 2 metrics derived from a meta-analysis by the US Department of Health and Human Services (HHS): excess mortality and excess cost. Excess mortality is defined as the difference between the underlying population mortality and the affected population mortality, expressed as deaths per 1,000 population. Excess cost is defined as the additional cost incurred per patient with a specific HAI versus a similarly admitted patient without that HAI. HHS data were multiplied by the number of HAI events at SCVMC to generate estimates. Results: In our presentation, we elucidated previously unrecognized cost savings and decreased mortality for 2 HAIs, central-line–associated bloodstream infections (CLABSIs) and catheter-associated urinary tract infections (CAUTIs), which were below NHSN targets due to IP-led interventions. We then showed 2 other HAIs, Clostridium difficile infection (CDI) and surgical site infections (SSIs), which did not meet our expected NHSN and institutional goals and were estimated to increase costs and potential mortalities in the upcoming year. We argued that proactive monies directed toward expanding our IP program and HAI mitigation efforts would cost a fraction of the impending healthcare expenditures predicted by the model.
Conclusion: By applying financial and mortality modeling, we helped our leadership perceive the concrete effect of IP-led interventions versus presenting abstract NHSN metrics. We also emphasized that without proactive leadership investment, we would continue to overspend healthcare dollars while not meeting our goals. This format of presentation gave us critical leverage to advocate for and successfully expand our IP department. Further SHEA-led cost-analysis modeling and education are needed to help IP departments promote their efforts in an effective manner.
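The modeling step described above (multiplying per-event excess cost and excess mortality estimates by a facility's HAI event counts) can be sketched as follows; the event counts and per-event figures below are placeholders, not SCVMC's data or the HHS meta-analysis values:

```python
def projected_burden(events, cost_per_event, deaths_per_event):
    """Estimate excess cost and excess deaths by multiplying each HAI's
    event count by its per-event estimates."""
    cost = {hai: n * cost_per_event[hai] for hai, n in events.items()}
    deaths = {hai: n * deaths_per_event[hai] for hai, n in events.items()}
    return cost, deaths

# Hypothetical counts and per-event estimates (for illustration only)
events = {"CDI": 40, "SSI": 25}
cost, deaths = projected_burden(
    events,
    cost_per_event={"CDI": 15_000, "SSI": 25_000},   # dollars per event
    deaths_per_event={"CDI": 0.05, "SSI": 0.03},     # excess deaths per event
)
# cost == {"CDI": 600_000, "SSI": 625_000}
```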
Funding: None
Disclosures: None
Getting the Most Out of the ICAR Visit by Using a Scoring Report to Provide Feedback
- Rehab Abdelfattah, Virgie Fields, Carol Jamerson, Sarah Lineberger
- Published online by Cambridge University Press:
- 02 November 2020, pp. s65-s66
Background: The Centers for Disease Control and Prevention developed the Infection Control Assessment and Response (ICAR) tools to assist health departments in assessing infection prevention practices and to guide quality improvement activities. ICAR tools are available for the following healthcare settings: acute care (including hospitals and long-term acute-care hospitals), outpatient, long-term care, and hemodialysis. The Virginia Healthcare-Associated Infections and Antimicrobial Resistance (HAI/AR) Program developed a scoring report that provides a quantitative measure for each infection control domain and summarizes strengths and opportunities for improvement. The scoring report aims to provide feedback to facility administration in a simple, user-friendly way to increase their engagement, to prioritize follow-up actions for areas in need of improvement, and to analyze statewide data systematically to identify and address major defects. Methods: Scoring reports were developed for acute care, long-term care, and hemodialysis facilities. Each report includes 2 tables: infection control domains for gap assessment and direct observation of facility practices. The first table has rows for infection control assessment domains, and the second table summarizes direct observations conducted during the ICAR visit, such as hand hygiene, point-of-care testing, and wound dressing change. Each row presents the score for each domain or observation (determined by responses to the ICAR tool), an interpretation of the score, strengths, and opportunities for improvement. Stoplight colors with assigned percentages are used for score interpretation. ICAR visit results from 5 long-term care facilities (LTCFs) and 3 hemodialysis centers were entered into a REDCap database and analyzed. Results: Data from these visits elucidated consistent gaps in infection prevention and control programs and identified which practices were most lacking.
The low-performance areas in LTCFs included hand hygiene, personal protective equipment (PPE), environmental cleaning and disinfection, and antimicrobial stewardship. In hemodialysis centers, respiratory hygiene and cough etiquette, injection safety, and surveillance and disease reporting had the lowest scores. Positive feedback on the scoring report was received from facilities and other state HAI programs. Conclusion: The Virginia HAI/AR Program developed a scoring report that engaged healthcare facility administration, including corporate leadership, by providing a composite score with interpretation. The report prioritized areas for improvement and guided public health follow-up visits. Common gaps in infection prevention practices were identified across facilities, and this information has been used to determine statewide training needs by facility type. The scoring report is an effective method to help allocate state resources and improve communication and engagement of healthcare facilities. Reports can be adapted for use in other jurisdictions.
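A minimal sketch of the stoplight-style score interpretation described above; the cut points here are assumptions for illustration, not the Virginia HAI/AR program's actual thresholds:

```python
def interpret_score(score_pct: float) -> str:
    """Map a domain or observation score (0-100%) to a stoplight color.
    Thresholds are illustrative assumptions, not the program's real cut points."""
    if not 0 <= score_pct <= 100:
        raise ValueError("score must be a percentage between 0 and 100")
    if score_pct >= 80:
        return "green"   # strength
    if score_pct >= 50:
        return "yellow"  # opportunity for improvement
    return "red"         # priority for follow-up
```

For example, a hand hygiene domain scoring 62% would render as yellow, flagging it for follow-up without treating it as a critical failure.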
Funding: None
Disclosures: None