The Sixth Decennial International Conference on Healthcare-Associated Infections Abstracts, March 2020: Global Solutions to Antibiotic Resistance in Healthcare
Poster Presentations
VRE Screening and Isolation: One Size Does Not Fit All
- Alon Vaisman, Smit Mistry, Sarah Zanchettin, Susy Hota
- Published online by Cambridge University Press: 02 November 2020, pp. s511-s512
Background: The need for screening and isolation of patients colonized with vancomycin-resistant Enterococcus (VRE) remains controversial. In this study, we examined the effects of discontinuing and reinstating these practices on VRE infection and colonization incidence within a multisite, tertiary-care hospital center, including the effects on specific at-risk groups. Methods: We retrospectively analyzed VRE clinical isolate, infection, and bacteremia incidence rates at our hospital (1) prior to discontinuation of universal screening and isolation (January 2010–June 2012), (2) during discontinuation (July 2012–April 2017), and (3) after reinstatement of screening and isolation in high-risk wards (intensive care and multiple-organ transplant units, June 2017–April 2019). Monthly incidence rates were calculated for each of 3 sites at our tertiary-care hospital: site A, which includes the transplant program; site B, an adult cancer hospital; and site C, which includes the orthopedic and neurology programs. To understand the differential effect of screening and isolation on various risk groups, incidence rates were also calculated for individual programs within our hospital, including the medicine, surgery, intensive care, oncology, and transplant programs. Results: During the study period, 3,167 VRE isolates were identified. VRE colonization across the institution increased throughout the study period, with the monthly number of newly colonized patients increasing from 10.4 in the first period to 20.6 in the last period. The overall VRE clinical isolate, infection, and bacteremia incidence rates did not increase following the cessation of VRE screening and isolation precautions; however, a significant increase was seen among patients at site B (Fig. 1, infection rates).
Furthermore, there was a significant decrease in VRE clinical isolate, infection, and bacteremia incidence following the reinstatement of screening in the ICU and transplant programs at site A, but no effect was seen in the other programs (Fig. 2, infection rates). Conclusions: The risk associated with discontinuing VRE screening and isolation measures appears to depend on the subgroup of patients within a hospital environment. Furthermore, risk-based or unit-based VRE screening and isolation appears to be effective at controlling VRE incidence, even after these measures had previously been discontinued. Additional study of other inpatient settings is warranted to determine the effects of VRE screening and isolation on other patient subgroups.
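The monthly colonization figures cited above reduce to simple period arithmetic. The sketch below reproduces them from hypothetical colonization totals; only the 10.4 and 20.6 monthly rates and the period windows come from the abstract, and the per-period totals are invented for illustration.

```python
def monthly_colonization_rate(cases_by_period):
    """Mean number of newly colonized patients per month in each period."""
    return {period: round(total, 1) if months == 0 else round(total / months, 1)
            for period, (total, months) in cases_by_period.items()}

# Hypothetical totals chosen to land on the reported monthly figures;
# the month counts follow the abstract's study windows.
periods = {
    "universal screening (Jan 2010-Jun 2012)": (312, 30),
    "discontinuation (Jul 2012-Apr 2017)": (889, 58),
    "risk-based reinstatement (Jun 2017-Apr 2019)": (474, 23),
}
rates = monthly_colonization_rate(periods)
```

Dividing each period's new-colonization total by its length in months yields the per-month incidence the abstract compares across periods.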
Funding: None
Disclosures: Susy Hota reports contract research for Finch Therapeutics.
When Legionnaires’ Disease Isn’t: Case Presentation and Implications of the Council of State and Territorial Epidemiologists (CSTE) Changes to Case Definitions
- Janet E. Stout, Anurag Malani
- Published online by Cambridge University Press: 02 November 2020, pp. s512-s513
Background: Most cases of Legionnaires’ disease are diagnosed by the urinary antigen test (UAT). Single cases of suspected healthcare-acquired Legionnaires’ disease are often investigated by local and state health departments. Such investigations can result in disruptive and expensive interventions. We report a case of a urine-antigen–positive patient whose clinical presentation was inconsistent with Legionnaires’ disease. Within the same year, an employee at this hospital was diagnosed with presumed community-acquired Legionnaires’ disease; however, the case was considered by the health department to be healthcare acquired. The occurrence of 2 cases, as determined by the health department, fulfilled the definition for an outbreak investigation and triggered water restrictions and extensive testing of the environment and patients for Legionella. The cases and the implications of these actions are reviewed in the context of new information about false-positive urinary antigen tests and changes to the outbreak case definitions for Legionnaires’ disease by the Council of State and Territorial Epidemiologists (CSTE). This includes “probable” cases that have no positive diagnostic tests.
Funding: None
Disclosures: Janet E. Stout reports salary from the Special Pathogens Laboratory and is an owner.
Whole-Genome Sequencing Reveals Diversity of Carbapenem-Resistant Pseudomonas aeruginosa Collected Through the Emerging Infections Program
- Richard Stanton, Jonathan Daniels, Erin Breaker, Davina Campbell, Joseph Lutgring, Maria Karlsson, Kyle Schutz, Jesse Jacob, Lucy Wilson, Elisabeth Vaeth, Linda Li, Ruth Lynfield, Erin C. Phipps, Emily Hancock, Ghinwa Dumyati, Rebecca Tsay, P. Maureen Cassidy, Jacquelyn Mounsey, Julian Grass, Maroya Walters, Alison Halpin
- Published online by Cambridge University Press: 02 November 2020, pp. s513-s514
Background: Carbapenem-resistant Pseudomonas aeruginosa (CRPA) is a frequent cause of healthcare-associated infections (HAIs). The CDC Emerging Infections Program (EIP) conducted population- and laboratory-based surveillance of CRPA in selected areas in 8 states from August 1, 2016, through July 31, 2018. We aimed to describe the molecular epidemiology and mechanisms of resistance of CRPA isolates collected through this surveillance. Methods: We defined a case as the first isolate of P. aeruginosa resistant to imipenem, meropenem, or doripenem from the lower respiratory tract, urine, wounds, or normally sterile sites identified from a resident of the EIP catchment area in a 30-day period; EIP sites submitted a systematic random sample of isolates to the CDC for further characterization. Of 1,021 CRPA clinical isolates submitted, 707 have been sequenced to date using an Illumina MiSeq. Sequenced genomes were classified using the 7-gene multilocus sequence typing (MLST) scheme, and a core genome MLST (cgMLST) scheme was used to determine phylogeny. Antimicrobial resistance genes were identified using publicly available databases, and chromosomal mechanisms of carbapenem resistance were determined using previously validated genetic markers. Results: There were 189 sequence types (STs) among the 707 sequenced genomes (Fig. 1). The most frequently occurring were the high-risk clones ST235 (8.5%) and ST298 (4.7%), which were found across all EIP sites. Carbapenemase genes were identified in 5 (<1%) isolates. Overall, 95.6% of the isolates had chromosomal mutations associated with carbapenem resistance: 93.2% had porin OprD-associated mutations that decrease membrane permeability to the drugs; 24.8% had mutations associated with overexpression of the multidrug efflux pump MexAB-OprM; and 22.9% had mutations associated with overexpression of the endogenous β-lactamase AmpC. More than 1 such chromosomal resistance mutation type was present in 37.8% of the isolates.
Conclusions: The diversity of the sequence types demonstrates that HAIs caused by CRPA can arise from a variety of strains and that high-risk clones are broadly disseminated across the EIP sites but are a minority of CRPA strains overall. Carbapenem resistance in P. aeruginosa was predominantly driven by chromosomal mutations rather than acquired mechanisms (ie, carbapenemases). The diversity of the CRPA isolates and the lack of carbapenemase genes suggest that this ubiquitous pathogen can readily evolve chromosomal resistance mechanisms, but unlike carbapenemases, these cannot be easily spread through horizontal transfer.
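Summary figures such as the 37.8% mechanism co-occurrence rate come from tallying per-isolate mechanism calls. A minimal sketch with invented marker sets follows; the mechanism names echo those discussed above, but the six example isolates and all counts are illustrative, not the study's data.

```python
from collections import Counter

# Hypothetical chromosomal-mechanism calls per isolate, as might be
# extracted from WGS marker detection (illustrative only).
isolates = [
    {"oprD"}, {"oprD", "mexAB-oprM"}, {"oprD", "ampC"},
    {"oprD", "mexAB-oprM", "ampC"}, {"mexAB-oprM"}, set(),
]

# How many isolates carry each mechanism, and how many carry more than one
mechanism_counts = Counter(m for iso in isolates for m in iso)
multi_mechanism = sum(1 for iso in isolates if len(iso) > 1)
pct_multi = 100 * multi_mechanism / len(isolates)
```

The same pattern scales to the 707-genome dataset: one set of detected mechanisms per isolate, one `Counter` pass for prevalence, one filter for co-occurrence.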
Funding: None
Disclosures: None
Competency of Infection Preventionists in Japan
- Hanako Misao, Kazumi Kawakami
- Published online by Cambridge University Press: 02 November 2020, p. s514
Background: In the United States, the Association for Professionals in Infection Control and Epidemiology (APIC) announced a competency model for infection preventionists (IPs) in 2011. In contrast, IPs in Japan must develop their careers on their own because no career development guidelines exist for Japanese IPs. In recent years, infectious diseases and infection control issues have become more global. Objective: Aiming for international collaboration among IPs, the purposes of this study were to clarify the actual competencies of IPs in Japan and the United States and to compare the two. Here, we report on the competencies of IPs in Japan. Methods: Semistructured interviews were conducted with 67 certified nurses in infection control (CNICs) who responded to the translated version of the APIC Competency Model Assessment Tool. From the qualitative descriptive analysis of the verbatim interview records, we extracted behavioral characteristics and compiled the questionnaire “Survey of Competency for Infection Preventionist,” which consisted of 130 items. A survey form was created using SurveyMonkey. We e-mailed invitations to participate anonymously in the survey, including the URL of the survey form, to 2,284 CNICs and certified nurse specialists (CNSs) in infection control. The research was approved by the research ethics committee of the facility to which the researcher belongs (Juntendo University, approval no. 30–49). Results: We received 648 responses (response rate, 28.4%). The mean nursing experience of the 648 respondents was 24.7 years (SD, 6.9), and >60% worked in general hospitals.
The mean (SD) scores for each category were as follows: “Clarification of infectious disease process” (mean, 79.1; SD, 13.2); “HAI surveillance and epidemiological survey” (mean, 49.3; SD, 12.3); “Prevention and control of transmission of infectious microorganisms” (mean, 93.8; SD, 17.3); “Management and communication” (mean, 128.5; SD, 23.7); “Education and research” (mean, 56.8; SD, 11.0); “Employee and occupational health” (mean, 40.6; SD, 9.6); and the total score of all categories (mean, 449.4; SD, 74.4). Based on years of experience as infection preventionists, we divided respondents into 3 groups: beginner, competent, and expert. As the career level increased, each category score for competency increased (ANOVA, P < .001). However, the mean competency scores did not reach 70% of the total possible score for the following categories: “Prevention and control of transmission of infectious microorganisms,” “Education and research,” and “Employee and occupational health.” Conclusions: The competencies that need to be strengthened for the career development of Japanese IPs have been clarified.
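The career-level comparison rests on a one-way ANOVA. The sketch below computes the F statistic from scratch on toy total-competency scores for three career groups; the study's actual scores are not reproduced here, and the group values are invented solely to show the calculation.

```python
def one_way_anova_F(groups):
    """F statistic for a one-way ANOVA across independent groups."""
    k = len(groups)                                  # number of groups
    n = sum(len(g) for g in groups)                  # total observations
    grand_mean = sum(sum(g) for g in groups) / n
    # Between-group sum of squares (weighted by group size)
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2
                     for g in groups)
    # Within-group sum of squares
    ss_within = sum((x - sum(g) / len(g)) ** 2 for g in groups for x in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Toy total competency scores by career level (not the study's data)
beginner  = [380, 400, 390]
competent = [440, 455, 450]
expert    = [500, 515, 520]
F = one_way_anova_F([beginner, competent, expert])
```

A large F here reflects group means that differ far more than the within-group spread, which is the pattern behind the reported P < .001.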
Funding: This study was supported by JSPS KAKENHI.
Disclosures: None
Developing a Competency Model for Nurses Certified in Infection Control in Japan
- Kazumi Kawakami, Hanako Misao
- Published online by Cambridge University Press: 02 November 2020, pp. s514-s515
Background: As of July 2019, 2,793 nurses were registered as certified nurses in infection control (CNICs) with the Japanese Nursing Association (JNA). Most CNICs work as full-time infection preventionists (IPs) in hospitals. However, a competency model for CNICs had not yet been developed in Japan; therefore, we developed one. Methods: We conducted a 2-phase explanatory sequential mixed-methods study between November 2013 and October 2019. The participants were 1,711 CNICs listed on the JNA website. Phase 1 was a cross-sectional study using self-administered questionnaires that included 10 competency domains based on the Association for Professionals in Infection Control and Epidemiology Competency Assessment Tool. Based on years of experience as an IP and full-time status, participants’ career stages were classified as novice, competent, proficient, or expert. CNICs who answered the questionnaire were eligible for interviews in phase 2, a descriptive qualitative study. Specifically, 10–30 participants were selected from each career stage. Semistructured individual interviews were conducted, and verbatim transcripts were analyzed qualitatively. The knowledge, skills, and abilities of CNICs were extracted at each career stage. This study was approved by the Research Ethics Committee of Juntendo University (approval no. 25-27). Results: During phase 1, 1,711 CNICs were invited to participate: 975 returned the questionnaire (57% response rate), and 969 (99.3%) responses were valid and used in the analysis. Only 257 participants agreed to attend the interviews. In phase 2, interviews were conducted with 67 CNICs: 30 novice, 20 competent, 13 proficient, and 4 expert. The mean years of experience as a nurse and as a CNIC were 22.2 (SD, 7.0) and 5.3 (SD, 3.1), respectively. As the career stage advanced, the content and range of infection prevention roles and activities in the hospital or community expanded across competency domains.
In clarification of the infectious disease process, one of the crucial competencies, novices needed to consult reference material about each infectious disease because of a lack of knowledge. Competent CNICs understood frequently occurring infectious diseases but still needed specialist advice. In contrast, proficient and expert CNICs could interpret information independently, and importantly, expert CNICs could distinguish between what they did and did not know. Conclusions: Using an explanatory sequential mixed-methods approach, we developed a competency model for CNICs that may encourage CNICs to develop their expertise and that is useful for assessing the qualities and abilities of CNICs. In the future, this model can be used to develop systematic educational programs for CNICs.
Funding: This study was supported by JSPS KAKENHI.
Disclosures: None
Development of a Risk Prediction Model for Central-Line–Associated Bloodstream Infection (CLABSI) in Patients With Continuous Renal Replacement Therapy
- Hui Zhang
- Published online by Cambridge University Press: 02 November 2020, p. s515
Background: The number of patients with end-stage renal disease and acute kidney injury in China is large and increasing year by year. Continuous renal replacement therapy (CRRT) is one of the important treatment methods. However, long-term CRRT inevitably carries a risk of CLABSI, which can seriously affect the treatment and prognosis of the patient. Although CLABSIs can be prevented and controlled, the CLABSI rate in China remains higher than in other countries. Therefore, it is urgent to find new intervention methods to supplement existing ones. Surveillance is the prerequisite of infection prevention and control. We sought to develop a risk prediction model for CLABSI in patients undergoing CRRT based on uncontrollable risk factors, which could be used for early assessment and screening of groups at high risk of infection. Such a tool would bring supervision and infection control to the forefront in addressing these issues. Methods: We selected 3,103 CRRT patients treated at the West China Hospital of Sichuan University from January 2013 to December 2018, using the hospital infection and infectious disease monitoring module of the electronic medical record (EMR) system and applying inclusion and exclusion criteria. Data mining and feature selection were performed using Weka software. Prediction models developed separately with Weka software and SPSS software were compared using the area under the curve (AUC) to assess their performance. Results: The incidence of CLABSI in CRRT patients was 8.01 per 1,000 catheter days (238 of 29,711). According to the multivariable regression analysis in SPSS software, dialysis catheter dwell time, C-reactive protein (CRP) level, total bilirubin, acute pancreatitis, and systemic inflammatory response syndrome were the risk factors.
According to Youden’s index, the cutoff point for dialysis catheter dwell time was 5.5 days, the cutoff point for CRP was 112.5 mg/L, and the cutoff point for total bilirubin was 14.15 μmol/L. Prediction models of CLABSI for CRRT patients were developed; the AUC of the model developed with SPSS software was 0.763 (95% CI, 0.717–0.809). Receiver operating characteristic (ROC) curve analysis showed that the AUCs of the models developed with Weka software (using Bayes, logistic regression, multilayer perceptron, and J48 algorithms) and with SPSS software (logistic regression) were between 0.6 and 0.8. Using the down-sampling technique, the AUCs ranged between 0.7 and 0.9, and the sensitivity, precision, and κ value increased. Thus, these models have definite clinical significance. Conclusion: The CLABSI prediction models for CRRT patients, developed from big healthcare data, not only had good discriminative ability but also good application value for individual evaluation in the target population.
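Youden's index chooses the cutoff that maximizes J = sensitivity + specificity − 1 along the ROC curve. A self-contained sketch on toy catheter-dwell data follows; the ten patients and outcomes are invented, and note that statistical packages often report the midpoint between adjacent observed values, which is how a cutoff like 5.5 days arises from whole-day data.

```python
def youden_cutoff(values, labels):
    """Threshold maximizing Youden's J = sensitivity + specificity - 1.
    values: continuous predictor (e.g. catheter days); labels: 1 = CLABSI."""
    pos = sum(labels)
    neg = len(labels) - pos
    best_j, best_cut = -1.0, None
    for cut in sorted(set(values)):
        # Classify "predictor > cut" as predicted-infected
        tp = sum(1 for v, y in zip(values, labels) if v > cut and y == 1)
        fp = sum(1 for v, y in zip(values, labels) if v > cut and y == 0)
        j = tp / pos - fp / neg          # sensitivity - (1 - specificity)
        if j > best_j:
            best_j, best_cut = j, cut
    return best_cut, best_j

# Toy catheter dwell times (days) and infection outcomes (invented)
days   = [2, 3, 4, 5, 6, 7, 8, 9, 10, 3]
labels = [0, 0, 0, 0, 1, 1, 1, 1, 1, 0]
cut, j = youden_cutoff(days, labels)
```

On this perfectly separable toy data the search lands on the last uninfected dwell time; the abstract's 5.5-day cutoff corresponds to the midpoint convention on real, overlapping data.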
Funding: This study was supported by the Health Commission of Sichuan Province.
Disclosures: None
Effectiveness of Stewardship Intervention for Urinary Tract Infections in Primary Care: A Difference in Differences Study
- Larissa Grigoryan, George Germanos, Roger Zoorob, Mohamad Sidani, Haijun Wang, Mohammad Zare, Melanie Goebel, Barbara Trautner
- Published online by Cambridge University Press: 02 November 2020, pp. s515-s516
Background: Adherence to the 2011 Infectious Diseases Society of America (IDSA) guidelines for urinary tract infections (UTIs) remains low in primary care. Fluoroquinolones are commonly prescribed to treat uncomplicated cystitis, and most antibiotic prescriptions have durations that exceed current recommendations. We performed a difference-in-differences study to assess the effectiveness of a stewardship intervention in a family medicine clinic at an academic outpatient center from August 2016 to March 2019. During our intervention period, the FDA released 2 additional warnings about the side effects of fluoroquinolones. Methods: The study had 2 sites (intervention and comparison) and 3 periods: baseline, preintervention, and intervention. During the first 2 years, we obtained baseline data and performed interviews (preintervention period) exploring provider prescribing decisions for cystitis at both sites. During the intervention period, at the intervention site only, we presented an educational lecture including an overview of the IDSA guidelines, definitions of various UTI syndromes with actual clinical examples, and instruction on the use of a decision aid. During the audit-and-feedback phase, providers were contacted once per month in person or by phone for follow-up on whether their treatment decisions adhered to the IDSA guidelines. We performed a log-binomial regression analysis of the primary outcome, adherence to the IDSA guidelines for management of uncomplicated cystitis with respect to both antibiotic choice and duration of therapy. Results: We performed 156 audit-and-feedback sessions with 13 providers during the intervention period (March 2018–March 2019). Patients at both sites were similar in terms of age and Charlson comorbidity index. Adherence to the guidelines for antibiotic choice and duration increased during the intervention period at both sites (Fig. 1).
The treatment of cystitis in the intervention period of the intervention site was 11.5 times (95% CI, 6.1–21.6) as likely to be guideline-adherent as the treatment in the baseline period of the comparison site (Fig. 2). Conclusions: Adherence to IDSA guidelines for the choice of antibiotic and duration increased in both intervention and comparison sites. Even though the intervention site started with higher compliance, improvement was also greater in the intervention site. FDA warnings about the side effects of fluoroquinolones released during the intervention period may have contributed to the avoidance of fluoroquinolones at both sites. Our intervention was effective at improving antibiotic choice and duration, so our future plans include incorporating our decision-support algorithm into the electronic medical record.
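On the relative-risk scale, a difference-in-differences estimate is a ratio of risk ratios: each site's relative change in adherence, then the ratio between sites. The sketch below uses hypothetical adherent/total prescription counts; the study's 11.5 estimate came from log-binomial regression with confidence intervals, which this crude calculation only approximates.

```python
# Hypothetical (adherent, total) prescription counts per site and period;
# only the study design, not these numbers, comes from the abstract.
baseline = {"intervention": (30, 100), "comparison": (20, 100)}
followup = {"intervention": (80, 100), "comparison": (30, 100)}

def adherence(site_counts):
    """Proportion of prescriptions adherent to the IDSA guidelines."""
    adherent, total = site_counts
    return adherent / total

# Relative change at each site, then their ratio (multiplicative DiD)
rr_intervention = (adherence(followup["intervention"])
                   / adherence(baseline["intervention"]))
rr_comparison = (adherence(followup["comparison"])
                 / adherence(baseline["comparison"]))
did_ratio = rr_intervention / rr_comparison
```

A `did_ratio` above 1 means adherence improved more at the intervention site than would be expected from the comparison site's secular trend (e.g. the FDA warnings affecting both sites).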
Funding: This study was supported by Zambon Pharmaceuticals.
Disclosures: None
Effectiveness of Dry Hydrogen Peroxide on Reducing Environmental Microbial Bioburden Risk in a Pediatric Intensive Care Unit
- Mario Melgar, Marylin Ramirez, Laura Matheu, Miguel Gomez, Jose Amadeo Ferrolino
- Published online by Cambridge University Press: 02 November 2020, pp. s516-s517
Background: Environmental contamination is a known risk factor for healthcare-associated infection acquisition. Transmission of pathogens from the environment can occur from indirect or direct patient contact with the environment or via healthcare workers’ hands. Dry hydrogen peroxide (DHP) has been shown to reduce microbial contamination in the hospital setting. This novel technology allows safe delivery of hydrogen peroxide in an occupied space, using ambient air and humidity to help create DHP. This study describes the implementation of DHP as an adjunct to routine environmental cleaning and disinfection, with the goal of reducing air and surface microbial bioburden in the intensive care unit (ICU) of a pediatric oncology hospital in Guatemala. Methods: A prospective IRB-approved study was conducted. Two rooms served as controls and 2 rooms served as intervention DHP sites. Air and surface cultures (5 high-touch, 2 low-touch) and adenosine triphosphate (ATP) swabs were collected from study areas for 1 week prior to deployment of the portable DHP units and at various time points for 1 month during the intervention phase. Air samples were collected using settling plates. The level of microbial burden was measured using colony-forming units (CFU) and ATP levels. A comparison between groups was carried out using Poisson regression analyses for CFU counts and linear regression analyses for log-transformed ATP levels. Results: In total, 280 surface cultures and ATP surface swabs were collected and analyzed. The overall mean microbial burden was significantly reduced in the intervention group compared to the control group (mean, 5.50 vs 11.77; P ≤ .0001). Reductions in microbial CFU were observed across all sampling sites in the intervention group. ATP readings in both control and intervention group showed passing levels of surface cleanliness. ATP was measured in terms of relative light units (RLU). 
A reduction in the mean RLU levels was also noted in the intervention group compared to the control group (172.08 vs 225.83; P ≤ .006). A reduction in aerobic CFU was seen as well in the air samples in the intervention group but was not statistically significant (P = .139). The ICU census was full, and services were not affected. Conclusions: DHP was effective in reducing surface and air microbial bioburden in an occupied space. Further studies of the impact of DHP decontamination on incidence of nosocomial infections should be performed.
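The CFU comparison between intervention and control rooms can be summarized as a crude rate ratio with a Poisson-based confidence interval, which is the quantity a Poisson regression like the one used in the study estimates (without the covariate adjustment). The plate totals below are invented; only the per-plate means echo the abstract's 5.50 vs 11.77 figures.

```python
import math

def cfu_rate_ratio(cfu_a, n_a, cfu_b, n_b):
    """Crude CFU rate ratio (group A vs group B) with a Wald 95% CI
    computed on the log scale, treating total CFU as Poisson counts."""
    rr = (cfu_a / n_a) / (cfu_b / n_b)
    se = math.sqrt(1 / cfu_a + 1 / cfu_b)   # SE of log rate ratio
    lo = math.exp(math.log(rr) - 1.96 * se)
    hi = math.exp(math.log(rr) + 1.96 * se)
    return rr, (lo, hi)

# Invented totals: 550 CFU over 100 intervention samples (mean 5.50)
# vs 1,177 CFU over 100 control samples (mean 11.77)
rr, ci = cfu_rate_ratio(550, 100, 1177, 100)
```

A rate ratio well below 1 with a confidence interval excluding 1 corresponds to the significant bioburden reduction reported above.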
Funding: This study was supported by Synexis.
Disclosures: None
Efficacy of Double Manual Cleaning Versus Automated Cleaning for Removal of Biofilm of Hinged Surgical Instruments
- Dayane Costa, Roel Castillo, Lillian Kelly Lopes, Anaclara Tipple, Honghua Hu, Karen Vickery
- Published online by Cambridge University Press: 02 November 2020, pp. s518-s519
Objectives: To evaluate the efficacy of double manual cleaning (DMC) with an enzymatic followed by an alkaline detergent for removing biofilm from hinged surgical instruments, compared to automated cleaning in a washer-disinfector. Methods: Biofilm of Staphylococcus aureus (ATCC 25923) was formed in vitro on hemostatic forceps (Fig. 1). Biofilm-covered forceps were rinsed in distilled water and subjected to one of the following cleaning regimens (n = 5 forceps each). Group 1 forceps were soaked in sterile water for 5 minutes. Group 2 (DMC) forceps were soaked in enzymatic detergent, brushed 5 times on each face, rinsed with filtered water (0.2 µm), soaked in alkaline detergent, brushed 5 times on each face, rinsed with filtered water (0.2 µm), and dried with a sterile cloth. Group 3 (DMC plus hinge inner brushing) forceps were soaked in detergents and brushed as in group 2, with the addition of hinge inner brushing (2-mm lumen brush) (Fig. 1). Group 4 (automated cleaning in a washer-disinfector) forceps were prewashed, washed twice, rinsed, thermally rinsed, and dried. After the treatments, forceps were evaluated for microbial load (colony-forming unit counts), residual protein (BCA protein assay kit), and biofilm (scanning electron microscopy). Results: There were no statistically significant differences in microbial load or protein level between forceps subjected to DMC (group 2) and the positive control group. The DMC with hinge inner brushing group (group 3) and the automated cleaning group (group 4) demonstrated significantly reduced microbial loads, with average reductions of 2.8 log10 (P = .038) and 7.6 log10 (P ≤ .001), respectively. The protein level remaining on the forceps also significantly decreased: 2,563 μg (P = .016) and 1,453 μg (P = .001), respectively, compared to the positive control group.
There was no statistically significant difference between DMC with hinge inner brushing and automated cleaning (groups 3 and 4) in any of the tests performed. None of the cleaning methods completely removed biofilm and/or soil from the internal region of the forceps hinge (Fig. 1). Conclusions: Automated cleaning was the most effective method for removing biofilm. However, DMC with hinge inner brushing is an acceptable alternative for sterilizing service units with only manual cleaning available, as is the case in most low- and middle-income countries. Neither automated nor manual cleaning regimens completely removed biofilm and soil from the hinged area of the forceps, and the amount of protein left after automated cleaning and DMC plus hinge brushing was higher than recommended. Cleaning is the most important step in the reprocessing of reusable medical devices; thus, efforts must be undertaken to improve cleaning across different social and economic realities and scenarios.
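A log10 reduction of the kind reported above follows directly from before/after CFU counts. The toy counts below are hypothetical, chosen only to land near the reported 7.6 log10 scale.

```python
import math

def log10_reduction(cfu_before, cfu_after):
    """Log10 reduction in microbial load after a cleaning step."""
    return math.log10(cfu_before) - math.log10(cfu_after)

# Hypothetical counts: cleaning drives 4 x 10^8 CFU down to 10 CFU
reduction = log10_reduction(4 * 10 ** 8, 10)
```

Each whole unit of `reduction` is a tenfold drop in viable organisms, which is why a 7.6 log10 reduction (automated cleaning) is a far stronger result than 2.8 log10 (manual cleaning with hinge brushing).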
Funding: This study was supported by Coordenação de Aperfeiçoamento de Pessoal de Nível Superior – CAPES.
Disclosures: None
Evaluation of a Continuous Decontamination Technology in an Intensive Care Unit
- Tami Inman BSN, David Chansolme
- Published online by Cambridge University Press: 02 November 2020, p. s519
Background: The scientific literature increasingly indicates the need for the development of continuous disinfection to address the persistent contamination and recontamination that occur in patient rooms despite routine cleaning and disinfection. Methods: To determine a baseline microbial burden on patient room surfaces in the intensive care unit (ICU) of a large urban hospital, 50 locations were swabbed for total colony-forming units (CFU) and the prevalence of methicillin-resistant Staphylococcus aureus (MRSA). Once the baseline in ICU patient rooms was established, 5 novel decontamination devices were installed in the HVAC ducts near these patient rooms. The devices provide a continuous low-level application of oxidizing molecules, predominantly hydrogen peroxide. These molecules exit the duct and circulate in the patient room through normal convection, landing on all surfaces. After activation, environmental sampling was conducted every 4 weeks for 4 months. The effects of continuous low levels of oxidizing molecules on the intrinsic microbial burden and the prevalence of MRSA were analyzed. In addition to external laboratory reports, the facility tracked healthcare-associated infections (HAIs) in the unit. HAI data were averaged by month and compared to the preactivation average in the same unit. Results: The preactivation average microbial burden across the 50 locations was 179,000 CFU per 100 in². The prevalence of MRSA was 71%, with an average of 81 CFU per 100 in². After activation of the devices, microbial burden, MRSA prevalence, and average monthly HAI rates were all significantly lower: a 95% reduction in average microbial burden (8,206 CFU per 100 in²), an 81% reduction in the prevalence of MRSA (13% vs 71%), and a 54% reduction in the average of healthcare-onset HAIs. All data were obtained from the averages of sampling data collected every 4 weeks during the 4-month trial period.
Conclusions: The continuous application of low levels of oxidizing molecules throughout the patient rooms of an ICU demonstrated 3 outcomes: reduced overall surface microbial burden, lower MRSA prevalence, and a significantly decreased monthly average HAI rate. Of note, the ICU maintained its other infection prevention interventions during this time, including its standard cleaning and disinfecting techniques.
Funding: This study was supported by the CASPR Group.
Disclosures: None
Firefighters Fighting Germs: Evaluation of a Disinfecting Protocol for Use in the Fire Service
- Christine McGuire-Wolfe
- Published online by Cambridge University Press: 02 November 2020, pp. s519-s520
Background: Pasco County Fire Rescue (PCFR) is a rapidly growing suburban fire department located in Florida. PCFR employs >500 firefighters (all cross-trained as either emergency medical technicians or paramedics) in 27 stations to provide both emergency medical services (EMS) and fire suppression response. Although multiple studies have established that pathogens are present in both apparatus and stations within the fire service, there is a knowledge gap regarding the effectiveness of cleaning and disinfecting protocols in this specific setting. Methods: In total, 65 high-touch surfaces in 11 vehicles (ambulances and engines) and the common areas of 2 fire stations were swabbed before and after disinfection. Vehicle surfaces swabbed included seats, cabinet doors, door handles, stretchers, medical equipment, keyboards, steering wheels, shared headsets, and hand rails. Inside the stations, the refrigerator handle, television remote, radio and alarm buttons, door handles, and locker handles were swabbed. Immediately following the initial swab collection, the surfaces were disinfected with hydrogen peroxide wipes and a disinfectant cleaner sprayed through an electrostatic system. The same surfaces were then swabbed after disinfection. Colony-forming units (CFUs) were quantified by a third-party laboratory using standard microbiological techniques. Statistical analysis of the resulting bacterial counts was performed using Minitab version 18.1 software. Results: We detected statistically significant decreases in total bacteria, yeast, and mold counts following implementation of this disinfection protocol. The predisinfection mean of combined bacteria, yeast, and mold counts for all surfaces was reduced by 96% after disinfection (from 254,637 CFU to 9,392 CFU). Conclusions: Cleaning and disinfection of surfaces in PCFR emergency vehicles and fire station common areas with the agents described above effectively reduced contamination with bacteria, yeast, and mold spores.
The PCFR has implemented this disinfection protocol as a tool in eliminating EMS vehicles and the station environment as reservoirs of infection for patients, visitors, and firefighters. Future efforts will include assessing the impact of regular cleaning and disinfection on baseline levels of bacteria, yeast, and mold spores.
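The 96% figure above can be checked directly from the reported pre- and post-disinfection means:

```python
def percent_reduction(before, after):
    """Percentage drop in mean CFU after disinfection."""
    return 100 * (before - after) / before

# Means reported in the abstract: 254,637 CFU pre vs 9,392 CFU post
reduction = percent_reduction(254_637, 9_392)
```

The same one-line calculation applies to any before/after surveillance pair, making it easy to track protocol performance over repeated swab rounds.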
Funding: This study was supported by Clorox.
Disclosures: None
Impact of Antimicrobial Stewardship Programs in Latin American Adult Intensive Care Units: PROA-LATAM Project
- Rodolfo Quirós, Patricia Angeleri, Jeannete Zurita, Washington Aleman, Marcelo Carneiro, Silvia Guerra, Julio Medina, Ximena Castañda Luquerna, Alexander Guerra, Silvio Vega, Luis Cuéllar, Jose Munita, Gina Maki, Tyler Prentiss, Elvio Escobar, Ángel Foianini, Marcus Zervos, Ana Bardossy
- Published online by Cambridge University Press:
- 02 November 2020, p. s520
Background: Antimicrobial stewardship programs (ASPs) improve clinical outcomes cost-effectively and reduce antimicrobial resistance. Objective: We sought to determine the impact of ASPs in adult medical-surgical intensive care units (MS-ICUs). Methods: A multicenter study was conducted in 77 MS-ICUs in 9 Latin American countries over 12 months (July 2018–June 2019). A self-assessment survey using a tool based on CDC recommendations (0–100 scale) was performed at the beginning, after 6 months, and at the end of the study. The impact of the ASPs was evaluated monthly using the following indicators: antimicrobial consumption (defined daily doses [DDD] per 100 patient days), appropriateness of antimicrobial prescriptions (percentage of total prescriptions), crude mortality rate (events per 100 discharges), and hospital-acquired multidrug-resistant microorganisms (MDRs) and Clostridioides difficile infections (CDI; events per 1,000 patient days). These indicators were compared between MS-ICUs that reached the 75th percentile and those that remained at the 25th percentile on the final self-assessment. Results: Of all indicators evaluated, only surgical prophylaxis ≤24 hours, vancomycin therapeutic monitoring, and aminoglycoside once-daily dosing did not differ significantly between MS-ICUs at the 75th and 25th percentiles. CDI events were significantly higher in 75th-percentile MS-ICUs, probably related to better detection of C. difficile (Table 1). Conclusions: This study confirmed that MS-ICUs with more comprehensive ASPs had significantly better indicators.
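Antimicrobial consumption above is expressed as defined daily doses (DDD) per 100 patient-days. A minimal sketch of that indicator follows; the 900 g total, the 3 g WHO DDD value, and the 1,500 patient-days are invented for the example, not study data:

```python
def ddd_per_100_patient_days(grams_used: float, who_ddd_g: float,
                             patient_days: float) -> float:
    """Defined daily doses (DDD) of one drug per 100 patient-days."""
    ddd = grams_used / who_ddd_g     # convert grams dispensed to DDD
    return ddd / patient_days * 100  # normalize per 100 patient-days

# Illustrative: 900 g dispensed, WHO DDD of 3 g, 1,500 patient-days.
print(ddd_per_100_patient_days(900, 3.0, 1500))  # → 20.0
```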
Funding: This study was supported by Merck.
Disclosures: None
Influence of Antibiotic Susceptibility Testing on Antibiotic Choice in Hospital-Acquired and Ventilator-Associated Pneumonia
- Taissa Zappernick, Robbie Christian, Sharanie Sims, Brigid Wilson, Federico Perez, Robert Bonomo, Robin Jump
- Published online by Cambridge University Press:
- 02 November 2020, pp. s520-s521
Background: The survival of patients with hospital-acquired pneumonia (HAP) and ventilator-associated pneumonia (VAP) is largely determined by the timely administration of effective antibiotic therapy. Guidelines for the treatment of HAP and VAP recommend empiric treatment with broad-spectrum antibiotics and tailoring of antibiotic therapy once results of microbiological testing are available. Objective: We examined the influence of bacterial identification and antibiotic susceptibility testing on antibiotic therapy for patients with HAP or VAP. Methods: We used the US Veterans’ Health Administration (VHA) database to identify a retrospective cohort of patients diagnosed with HAP or VAP between fiscal years 2015 and 2018. We further analyzed patients who were started on empiric antibiotic therapy, for whom microbiological test results from a respiratory sample were available within 7 days, and who were alive 48 hours after sample collection. We used the antibiotic spectrum index (ASI) to compare antibiotics prescribed the day before and the day after availability of bacterial identification and antibiotic susceptibility testing results. Results: We identified 4,669 cases of HAP and VAP in 4,555 VHA patients. The median time from respiratory sample receipt in the laboratory to final results of bacterial identification and antibiotic susceptibility testing was 2.22 days (IQR, 1.31–3.38 days). The most common pathogen was Staphylococcus aureus (n = 994), with methicillin resistance in 58% of isolates tested. The next most common pathogen was Pseudomonas spp (n = 946 isolates). Susceptibility of Pseudomonas isolates to antipseudomonal antibiotics, when tested, was as follows: 64% to carbapenems, 74% to cephalosporins, 75% to β-lactam/β-lactamase inhibitors, 69% to fluoroquinolones, and 95% to amikacin. Lactose-fermenting gram-negative bacteria (296 Escherichia coli and 360 Klebsiella pneumoniae) were also common. 
Among the 3,094 cases who received empiric antibiotic therapy, 607 (20%) had antibiotics stopped the day after antibiotic susceptibility results became available, 920 (30%) had a decrease in ASI, 1,075 (35%) had no change in ASI, and 492 (16%) had an increase in ASI (Fig. 1). Among the 1,098 patients who were not started on empiric antibiotic therapy, only 154 (14%) were started on antibiotic therapy the day after antibiotic susceptibility results became available. Conclusions: Changes in antibiotic therapy occurred in nearly two-thirds of cases the day after bacterial identification and antibiotic susceptibility results became available. These results highlight how respiratory cultures can inform treatment and improve antibiotic stewardship for patients with HAP/VAP.
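The before/after regimen comparison can be sketched as follows. The per-agent spectrum scores below are placeholders, not the published ASI values; the sketch only illustrates summing a regimen's scores and categorizing the change, as the study did:

```python
# Placeholder spectrum scores per agent (the published ASI assigns each
# antibiotic a fixed value; these numbers are illustrative only).
ASI_SCORE = {"vancomycin": 4, "piperacillin-tazobactam": 9, "cefazolin": 3}

def regimen_asi(drugs):
    """Total spectrum index of a regimen: sum of per-agent scores."""
    return sum(ASI_SCORE[d] for d in drugs)

def classify_change(before, after):
    """Categorize a regimen change after susceptibility results return."""
    if not after:
        return "stopped"
    delta = regimen_asi(after) - regimen_asi(before)
    if delta < 0:
        return "decrease"
    return "increase" if delta > 0 else "no change"

# Broad empiric coverage narrowed to cefazolin once results are known.
print(classify_change(["vancomycin", "piperacillin-tazobactam"],
                      ["cefazolin"]))  # → decrease
```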
Funding: This study was supported by Accelerate Diagnostics.
Disclosures: None
Patient-Specific Predictive Antibiogram in Decision Support for Empiric Antibiotic Treatment
- Debarka Sengupta, Vaibhav Singh, Seema Singh, Dinesh Tewari, Mudit Kapoor, Debabrata Ghosh, Shivam Sharma
- Published online by Cambridge University Press:
- 02 November 2020, pp. s521-s522
Background: The rising trend of antibiotic resistance imposes a heavy clinical and economic burden on healthcare (an estimated US$55 billion and 23,000 deaths annually in the United States), as well as increased length of stay and morbidity. Machine-learning methods have recently been used to leverage patients’ clinical histories and demographic information to predict antimicrobial resistance. We developed a machine-learning model ensemble that maximizes the accuracy of drug-sensitive versus drug-resistant classification compared to existing best-practice methods. Methods: We first performed a comprehensive analysis of the association between infecting bacterial species and patient factors, including patient demographics, comorbidities, and certain healthcare-specific features. We leveraged the predictable nature of these complex associations to infer patient-specific antibiotic sensitivities. Various base learners, including k-nearest neighbors (k-NN) and gradient boosting machine (GBM), were used to train an ensemble model for confident prediction of antimicrobial susceptibilities. Base-learner selection and model performance evaluation were performed using a variety of standard metrics, namely accuracy, precision, recall, F1 score, and Cohen κ. Results: We validated performance on the MIMIC-III database, which harbors deidentified clinical data from 53,423 distinct patient admissions to the intensive care units (ICUs) of the Beth Israel Deaconess Medical Center in Boston, Massachusetts, between 2001 and 2012. From ~11,000 positive cultures, we used 4 major specimen types (urine, sputum, blood, and pus swab) to evaluate model performance. Figure 1 shows the receiver operating characteristic (ROC) curves obtained for bloodstream infection cases upon model building and prediction on a 70:30 split of the data. 
We obtained area under the curve (AUC) values of 0.88, 0.92, 0.92, and 0.94 for urine, sputum, blood, and pus swab samples, respectively. Figure 2 shows the comparative performance of our proposed method and several off-the-shelf classification algorithms. Conclusions: A highly accurate, patient-specific predictive antibiogram (PSPA) can significantly aid clinicians in antibiotic recommendation in the ICU, thereby accelerating patient recovery and curbing antimicrobial resistance.
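The modeling approach described above (a voting ensemble of k-NN and GBM base learners, evaluated by ROC AUC on a 70:30 split) can be sketched with scikit-learn. Synthetic data stand in for the MIMIC-III features, and the hyperparameters are library defaults rather than the authors' choices:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier, VotingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

# Synthetic stand-in for patient features (demographics, comorbidities, ...)
# with a binary susceptible-vs-resistant label.
X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# Soft-voting ensemble over the two base learners named in the abstract.
ensemble = VotingClassifier(
    estimators=[("knn", KNeighborsClassifier(n_neighbors=15)),
                ("gbm", GradientBoostingClassifier(random_state=0))],
    voting="soft",
)
ensemble.fit(X_tr, y_tr)

auc = roc_auc_score(y_te, ensemble.predict_proba(X_te)[:, 1])
print(f"AUC = {auc:.2f}")
```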
Funding: This study was supported by Circle of Life Healthcare Pvt. Ltd.
Disclosures: None
Patterns of Oral Antibiotic Use and Excess Duration at Hospital Discharge
- Corey Medler, Nicholas Mercuro, Helina Misikir, Nancy MacDonald, Melinda Neuhauser, Lauri Hicks, Arjun Srinivasan, George Divine, Marcus Zervos
- Published online by Cambridge University Press:
- 02 November 2020, pp. s522-s523
Background: Antimicrobial stewardship (AMS) interventions have predominantly involved inpatient antimicrobial therapy. However, for many hospitalized patients, most antibiotic use occurs after discharge, and unnecessarily prolonged courses of therapy are common. The transition from hospitalization to discharge represents an important opportunity for AMS intervention. We describe patterns of antibiotic selection and duration of therapy (DOT) for common infections, including discharge antibiotics. Methods: This retrospective cross-sectional analysis was derived from an IRB-approved, multihospital quasi-experiment at a 5-hospital health system in southeastern Michigan. The study population included patients discharged from an inpatient general or specialty practice ward on oral antibiotics from November 2018 through April 2019. Patients were included with the following diagnoses: skin and soft-tissue infection (SSTI), community-acquired pneumonia (CAP), hospital-acquired pneumonia (HAP), respiratory viral infection, acute exacerbation of chronic obstructive pulmonary disease (AECOPD), intra-abdominal infection (IAI), and urinary tract infection (UTI). Other diagnoses were excluded. Data extracted from medical records included antibiotic indication, selection, and duration, as well as patient characteristics. Results: In total, 1,574 patients were screened and 800 were eligible for inclusion. The most common antibiotic indications were respiratory tract infections, with 487 patients (60.9%). These included 165 AECOPD cases (20.6%); 200 CAP cases (25%) with no multidrug-resistant organism (MDRO) risk factors; 57 patients (7.1%) with MDRO risk factors; HAP in 7 patients (0.9%); and influenza in 58 patients (7.2%). Also, 205 patients (25.6%) were diagnosed with UTIs: 71 (8.9%) with cystitis, 86 (10.8%) with complicated UTI (cUTI), and 48 (6%) with pyelonephritis. 
Furthermore, 125 patients (15.6%) were diagnosed with SSTIs: 59 (7.4%) purulent and 66 (8.3%) nonpurulent. In addition, 31 patients (3.9%) had an IAI. The most commonly used antibiotics were cephalosporins in 536 patients (67%), azithromycin in 252 patients (31.5%), and fluoroquinolones and tetracyclines in 231 patients (28.9%). Fluoroquinolones were the antibiotics most frequently prescribed at discharge, in 210 patients (26.3%). Figure 1 displays the average DOT relative to specific indications. The median duration of total antibiotic therapy exceeded institutional guideline recommendations for multiple conditions, including AECOPD (7 days vs recommended 5 days), CAP with COPD (8.3 vs 7 days), CAP without COPD (7.7 vs 5 days), and pyelonephritis (11 vs 7–10 days). Also, 269 patients (33.6%) received unnecessary therapy; 218 (27.3%) of these cases were due to excess duration. Conclusions: Among a cross-section of hospitalized patients, the average DOT, including after discharge, exceeded optimal therapy for many patients. Further understanding of the patterns and influences of antibiotic prescribing is necessary to design effective AMS interventions.
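Excess duration in this analysis is simply observed therapy beyond the guideline recommendation; a minimal sketch using the median durations reported above (taking the upper end of the 7–10-day pyelonephritis range):

```python
# Median total DOT vs institutional guideline recommendation (days),
# from the results above.
observed = {"AECOPD": 7.0, "CAP with COPD": 8.3,
            "CAP without COPD": 7.7, "pyelonephritis": 11.0}
recommended = {"AECOPD": 5.0, "CAP with COPD": 7.0,
               "CAP without COPD": 5.0, "pyelonephritis": 10.0}

# Days of therapy beyond the recommended duration, per indication.
excess = {dx: round(observed[dx] - recommended[dx], 1) for dx in observed}
print(excess["CAP with COPD"])  # → 1.3
```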
Funding: This work was completed under CDC contract number 200-2018-02928.
Disclosures: None
Point Prevalence Surveys and Customized Interventions Are Good Strategies to Improve Antimicrobial Use: The Brazilian Experience
- Ana Paula Matos Porto, Icaro Boszczowski, Ann Versporten, Ines Pauwels, Thais THAIS, Evelyne Girão, Patricia Esteves, Claudia Carrilho, Tiago Luiz Ferraz, Camila Donini, Herman Goossens, Silvia Figueiredo
- Published online by Cambridge University Press:
- 02 November 2020, p. s523
Background: Although antimicrobial stewardship is recommended by the Brazilian government, data regarding prescription practices in the country are scarce. Objective: To describe the impact of 2 point-prevalence surveys and customized interventions on antimicrobial consumption among 8 hospitals in 3 regions of Brazil. Methods: In 2017 and 2018, 8 tertiary-care Brazilian hospitals conducted the Global Point Prevalence Survey of Antimicrobial Consumption and Resistance (Global-PPS). All enrolled hospitals were provided the 2017 results. The group discussed intervention strategies via WhatsApp and e-mail. Hospitals customized interventions, including feedback to prescribers, discussion with pharmacists, and use of antimicrobial consumption data in the accreditation process. A web-based program was used for data entry, validation, and reporting of details of antimicrobial prescriptions. The Global-PPS was developed by the University of Antwerp and was funded by bioMérieux. The 1-day prevalences in 2017 and 2018 are presented as risk ratios. The main outcomes are overall antimicrobial use in the hospitals and in intensive care units (ICUs). The prevalence of infections caused by multidrug-resistant organisms (MDROs) was also reported. Results: Overall, 1,716 patients were evaluated, of whom 420 (52.5%) and 429 (46.8%) were using antimicrobials in 2017 and 2018, respectively (P = .02). In 33 ICUs, 170 patients (61.4%) and 204 patients (56.8%) were on antimicrobials in 2017 and 2018, respectively (P = .20). Significant decreases in overall use were observed for vancomycin (from 11% to 7%; P = .01), meropenem (from 12% to 9%; P = .04), and linezolid (from 1.5% to 0.33%; P = .01). There was no significant increase in any single drug or class of drugs. 
Within ICUs, use decreased significantly for vancomycin (from 19% to 11%; P = .005), linezolid (from 2.9% to 0.3%; P = .01), colistin (from 4.3% to 1.7%; P = .05), and metronidazole (from 6.5% to 2.8%; P = .03). We observed a nonsignificant decrease in infections caused by MDROs across the whole hospital (from 8.7% to 6.6%; P = .10) and in the ICUs (from 15.2% to 12.3%; P = .30). The most frequent infectious diagnoses were pneumonia (27%), intra-abdominal sepsis (14%), skin and soft-tissue infection (SSTI; 9.4%), urinary tract infection (9.1%), and sepsis and septic shock with no identified focus (SSNIF; 7.4%). There was a significant increase in SSTI (from 7.6% to 11.4%; P = .03) and a decrease in SSNIF (from 10.7% to 4.1%; P = .00002). In 2018, there were significantly fewer antimicrobial prescriptions for healthcare-acquired infections (from 52.6% to 43.6%; P = .0007) and more for community-acquired infections (from 27.4% to 34.6%; P = .003). We detected no difference for medical or surgical prophylaxis. Conclusions: Feedback on prescription practices might have had an impact on local policies of antimicrobial use, as demonstrated by decreases in antimicrobial use both hospital-wide and in the ICUs.
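The year-over-year comparisons above are two-proportion tests. A minimal stdlib sketch of a pooled two-proportion z-test, using counts reconstructed from the reported overall prevalences (the denominators of 800 and 917 are inferred from the percentages, so treat the result as approximate):

```python
from math import erfc, sqrt

def two_proportion_z(x1: int, n1: int, x2: int, n2: int):
    """Two-sided pooled two-proportion z-test; returns (z, p)."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    return z, erfc(abs(z) / sqrt(2))  # two-sided normal tail

# 2017: 420 of ~800 patients on antimicrobials; 2018: 429 of ~917.
z, p = two_proportion_z(420, 800, 429, 917)
print(f"P = {p:.2f}")  # → P = 0.02
```

The result agrees with the P = .02 reported for overall antimicrobial use.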
Funding: This study was supported by bioMérieux.
Disclosures: None
Profile of Nursing Homes Enrolled in the National Health Safety Network: Focus on Interfacility Communication
- Karen Jones, John Mills, Sarah Krein, Ana Montoya, Jennifer Meddings, Lona Mody
- Published online by Cambridge University Press:
- 02 November 2020, pp. s523-s524
Background: A robust infection prevention infrastructure is critical for creating a safe resident environment in nursing homes. The CDC National Healthcare Safety Network (NHSN) provides a standardized approach to infection surveillance and analysis, which can drive internal quality improvement efforts in nursing homes and could serve as an indicator of a facility’s infection prevention aptitude. The purpose of this study was to compare the characteristics of nursing homes enrolled in the NHSN to those not enrolled, including interfacility communication methods, an essential part of reducing residents’ infection-related risks. Methods: Over a 2-year period, 50 nursing homes participated in a 12-month program designed to reduce healthcare-associated infections (HAIs) by enhancing relationships between nursing homes and hospitals. Overall, 11 demographic surveys were administered to nursing homes prior to the start of the phase 1 pilot year between January and March 2018, and another 39 were administered prior to the beginning of phase 2 in January–February 2019. The survey consisted of 36 questions on facility characteristics, including NHSN enrollment, infection prevention and control (IPC) program and infection preventionist characteristics, and communication methods related to interfacility transfer of care. We compared facility characteristics, IPC program characteristics, and communication methods between nursing homes stratified by NHSN enrollment using the Fisher exact test. Results: In total, 50 nursing homes, varying in size and services provided, completed the demographic survey (Table 1). Of these 50 nursing homes, 11 (22%) were enrolled in the NHSN. Nursing homes enrolled in the NHSN were more likely to use a telephone report prior to resident transfer in and out of the facility (P = .04) and to disseminate infection data to all facility nursing staff (P = .02). 
Overall, less than half of the nursing homes included a telephone report as part of their routine hand-off communication, and most relied only on written transfer forms or discharge documentation. Moreover, 65% of the nursing homes reported use of a standardized method to accept new residents with a history of multidrug-resistant organisms (MDROs), including a review of infection or MDRO type, antibiotic orders, and ambulation status. NHSN-enrolled nursing homes were also more likely to have an antibiotic stewardship program and to use the electronic health record (EHR) to facilitate infection surveillance, although these differences were not statistically significant. Conclusions: A higher percentage of nursing homes enrolled in the NHSN engaged in activities connected with resident safety, including verbal reports prior to interfacility transfer and antimicrobial stewardship programs. Dedicating resources for nursing homes to enhance their IPC programs, including NHSN enrollment, should be encouraged.
Funding: This study was supported by a grant from the AHRQ (grant no. RO1HS25451).
Disclosures: None
Quality Initiative to Reduce Catheter-Associated Urinary Tract Infections Using Cleansing Cloths With a Standardized Method
- Lauren Droske, Parul Patel, Donna Schora, Jignesh Patel, Ruby Barza, Cherie Faith Monsalud, Adrienne Fisher, Rachel Lim, Mona Shah, Bridget Kufner, Shane Zelencik, Mary Alice Lavin, Kamaljit Singh
- Published online by Cambridge University Press:
- 02 November 2020, p. s525
Background: Catheter-associated urinary tract infections (CAUTIs) account for >15% of hospital-acquired infections, resulting in increased length of stay and costs. Consequently, methods to improve indwelling urinary catheter (IUC) care and maintenance are warranted to reduce the risk of hospital-acquired CAUTIs. This study was a prospective quality improvement (QI) project to reduce CAUTIs using prepackaged cloths (ReadyCleanse, Medline Industries) and a simple, standardized cleaning process for care and maintenance of IUCs. Methods: This study is an ongoing QI project at NorthShore University HealthSystem, a 4-hospital system located north of Chicago, Illinois, with 750 beds and ∼64,000 annual admissions. The study consists of 1.5 months of staff training on proper product use (phase 1), followed by an intervention using the cloths for IUC care (phase 2). Each package contains 5 individual cloths corresponding to a simple, 5-step cleansing protocol. IUC care and maintenance are performed twice daily on a routine basis and after each incontinent episode. Beginning in July 2018, the previous practice (soap and washcloth) was replaced with the ReadyCleanse cloths, and on August 1, 2018, data collection began. Adult patients admitted to all 4 NorthShore hospitals with an IUC for >24 hours are enrolled in the study. From patient electronic health records, we collected patient demographics, reason for IUC insertion, days of catheter use, and development of CAUTI (according to the NHSN definition). During the intervention, observations of compliance and performance of catheter care were also conducted. For the analysis described here, results for the first 14 months of the study were compared to CAUTI numbers from the 14-month period prior to the start of the study (February 2017–March 2018); the data presented represent ∼50% of the planned data collection. 
Results: As of September 30, 2019, 4,969 patients had been prospectively enrolled in the study: 1,491 from hospital A, 1,451 from hospital B, 1,091 from hospital C, and 936 from hospital D. The study cohort was 47% female, with a median age of 77 years and an average of 3.9 catheter days per patient. Systemwide, observational audits of compliance with the cloths averaged 95%. Upon completion of study month 14, 22 CAUTIs had been identified, compared to 26 CAUTIs in the comparison period, a 15% reduction. Conclusions: Implementation of this simple, standardized alternative for IUC care is feasible on a large scale and may have potential for reducing CAUTI rates.
Funding: Medline Industries supported this study.
Disclosures: None
Similar Mortality in Patients with Invasive and Noninvasive Pneumonia Due to Group B Streptococcus
- Brigid Wilson, Sunah Song, Taissa Zappernick, Janet Briggs, Richard Banks, Robin Jump, Federico Perez
- Published online by Cambridge University Press:
- 02 November 2020, pp. s525-s526
Background: Rates of invasive infections caused by group B Streptococcus (GBS) are increasing among adults. The burden of noninvasive GBS infections, including pneumonia, has not been well characterized. Here, we compare comorbidities and mortality associated with invasive and noninvasive pneumonia caused by GBS. Methods: Using the Veterans’ Health Administration national data warehouse, we conducted a retrospective cohort study of veterans diagnosed with GBS pneumonia between 2008 and 2017. Invasive pneumonia was defined as blood cultures positive for GBS associated with an order for a chest x-ray and an International Classification of Disease (ICD) code for pneumonia. Noninvasive pneumonia was defined as a respiratory culture positive for GBS associated with both an order for a chest x-ray and an ICD code for pneumonia among patients with negative blood cultures or no blood cultures. Patients with respiratory cultures positive for GBS without either an associated chest x-ray or an ICD code for pneumonia were considered colonized. We compared demographics, comorbid conditions, and mortality among patients with invasive and noninvasive GBS pneumonia. Results: Between 2008 and 2017, we detected 706 cases of invasive GBS pneumonia, 1,244 cases of noninvasive GBS pneumonia, and 1,470 cases of respiratory colonization with GBS. Most patients were male (97%), with an average age of 69.0 years (SD, 12.0 years). The prevalence of several comorbid conditions differed between those with invasive and noninvasive disease: diabetes mellitus (61% and 46%, respectively); chronic pulmonary disease (53% and 65%, respectively); chronic heart disease (58% and 44%, respectively); and chronic kidney disease (43% and 27%, respectively). Mortality was similar among those with invasive and noninvasive GBS pneumonia at 30 days (17% and 18%, respectively) and at 1 year (38% and 43%, respectively) (Fig. 1). 
Conclusions: We identified important differences in underlying comorbid conditions between patients with invasive and noninvasive GBS pneumonia, which may give rise to differences in their clinical presentation. Overall mortality, however, was similar: more than one-third of patients with GBS pneumonia died within 1 year. These findings indicate that noninvasive GBS pneumonia is an important clinical entity.
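The case definitions above reduce to a simple decision rule; a minimal sketch (the flag names are invented for the illustration):

```python
def classify_gbs_case(blood_pos: bool, resp_pos: bool,
                      cxr_ordered: bool, pneumonia_icd: bool) -> str:
    """Apply the study's GBS pneumonia case definitions to one patient.

    blood_pos: blood culture positive for GBS
    resp_pos: respiratory culture positive for GBS
    cxr_ordered: chest x-ray ordered
    pneumonia_icd: ICD code for pneumonia present
    """
    if blood_pos and cxr_ordered and pneumonia_icd:
        return "invasive pneumonia"
    # Reaching here implies blood cultures were negative or absent.
    if resp_pos and cxr_ordered and pneumonia_icd:
        return "noninvasive pneumonia"
    if resp_pos:
        return "colonization"
    return "not a case"

print(classify_gbs_case(False, True, True, False))  # → colonization
```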
Funding: This study was supported by Pfizer.
Disclosures: None
Surgical Site Infections with a Predominance of Multidrug-Resistant Organisms in Benin: A Multicenter Study
- Carine Laurence Yehouenou, Hector Rodriguez-Villalobos, Olivia Dalleur, Anne Simon
-
- Published online by Cambridge University Press:
- 02 November 2020, p. s526
-
- Article
-
- You have access Access
- Export citation
-
Background: Surgical site infections remain common and widespread, and they contribute to increasing antimicrobial resistance among the etiological agents. Antimicrobial resistance is the ability of a microorganism, such as a bacterium, to withstand the action of an antimicrobial. This study was conducted to determine the spectrum of bacterial isolates from surgical site infections and their susceptibility patterns. A secondary outcome was to compare bacterial identification by a local laboratory and a European one. Methods: This descriptive cross-sectional study was conducted between January and August 2019 in 6 public hospitals in Benin. Pus specimens were processed using standard microbiological procedures, and identification was performed using the analytical profile index (API). Antimicrobial susceptibility testing was performed in Benin following the modified Kirby-Bauer disk-diffusion technique, and identification was confirmed in Belgium by MALDI-TOF mass spectrometry. A second antimicrobial susceptibility test was performed using the BD Phoenix automated microbiology system (Becton Dickinson). Clinical data of enrolled patients were obtained from hospital records. Results: The mean age of patients was 32 ± 11 years (range, 18–76). The median time to surgical site infection was 9 postoperative days. Of the 229 patients from whom wound swabs were collected, 195 (85.15%) showed positive aerobic bacterial growth. In total, 164 pathogenic bacteria were isolated, including 41 gram-positive organisms (25%), 78 gram-negative fermentative bacteria (47.5%), and 45 gram-negative nonfermentative bacteria (27.5%). We observed 3 discrepancies between the API technique and MALDI-TOF: 2 isolates identified as Klebsiella pneumoniae and 1 as Pseudomonas spp by API were identified by MALDI-TOF as Klebsiella variicola and Pseudomonas mendocina, respectively. The most prevalent bacterial species was E. coli (31%), followed by S. aureus (25%), Pseudomonas aeruginosa (18%), and Klebsiella pneumoniae (11%). Of the 41 S. aureus isolates, 26 (63.4%) were methicillin-resistant Staphylococcus aureus (MRSA), and 3 of these also carried inducible clindamycin resistance (ICR). Extended-spectrum β-lactamase (ESBL)–producing Enterobacteriaceae were observed in 60 of the 78 isolates tested (77%). Both Morganella morganii isolates and 89% of K. pneumoniae isolates were ESBL producers. Conclusions: Two of 3 S. aureus isolates were MRSA, and almost all K. pneumoniae and E. coli isolates were ESBL producers. Three nonfermentative strains were pan–drug resistant, and no isolate was susceptible to all antibiotics. These findings are of high interest for better management of patients and control of antimicrobial resistance in Benin.
Funding: This study was supported by Académie de Recherche pour l’Enseignement Supérieur (ARES).
Disclosures: None