Background: The Clinical & Laboratory Standards Institute (CLSI) recommends use of annual antibiograms to help guide empiric antibiotic therapy. Because CLSI periodically updates minimum inhibitory concentration (MIC) breakpoints, we assessed the impact of these updates on longitudinal trends in antibiotic susceptibility rates for Escherichia coli and Klebsiella pneumoniae at a single academic medical center in Atlanta, GA. Methods: Susceptibilities for cefepime, ceftazidime, and levofloxacin in E. coli and K. pneumoniae were extracted from hospital antibiograms from 1988 to 2022. Starting in 1995, intensive care units (ICUs) and wards had separate annual antibiograms, which we combined using weighted averages to create annual overall hospital antibiograms. After summarizing the frequency of isolates tested and susceptibilities using medians and interquartile ranges (IQR), we conducted an interrupted time series analysis using linear segmented regression models to evaluate the level changes and trends in susceptibility before and after CLSI MIC breakpoints were updated for ceftazidime (2010 and 2017), cefepime (2014 and 2017), and levofloxacin (2013). Results: Among 21,214 E. coli isolates, a median of 291 (IQR: 104–555) isolates were tested annually; among 8,686 K. pneumoniae isolates, the median was 125 per year (IQR: 76–178). Prior to the MIC breakpoint changes, baseline susceptibility trends of both organisms to all 3 antibiotics declined significantly, at rates between 0.2% and 2.4% per year (Table 1). For cefepime (Figure 1), susceptibility decreased annually during 1988–2013 for both E. coli (-0.5%) and K. pneumoniae (-1.2%). There were no significant level changes, but there were trend changes after 2018 for E. coli (+2.1%) and K. pneumoniae (-5.5%). For ceftazidime (Figure 2), significant level changes occurred after 2010 for both organisms (E. coli: -5.7%; K. pneumoniae: -5.2%).
For levofloxacin (Figure 3), the breakpoint update in 2013 led to a significant level change in susceptibility (E. coli: +8.4%; K. pneumoniae: +11.4%). Conclusion: Overall, we observed a consistent decrease in antibiotic susceptibility in E. coli and K. pneumoniae over three decades, with immediate increases in susceptibility levels when MIC breakpoints were changed, followed by a decreasing trend. These findings highlight the importance of longitudinal surveillance and of accounting for MIC breakpoint changes when informing antimicrobial stewardship strategies.
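The two computational steps described in the Methods — combining unit-level antibiograms by a weighted average of isolates tested, and building the segmented-regression (interrupted time series) covariates around a breakpoint-update year — can be sketched as follows. This is a minimal illustration, not the authors' code; the ICU/ward isolate counts shown are hypothetical.

```python
def combine_antibiograms(units):
    """Weighted average of % susceptible across units,
    weighted by the number of isolates each unit tested."""
    total = sum(n for n, _ in units)
    return sum(n * pct for n, pct in units) / total

def its_covariates(year, start_year, change_year):
    """Covariate row for one year in a segmented regression:
    intercept, baseline trend, post-update level change,
    and post-update trend change."""
    t = year - start_year
    post = 1 if year >= change_year else 0
    return [1, t, post, post * (year - change_year)]

# hypothetical unit antibiograms: (isolates tested, % susceptible)
icu, wards = (40, 90.0), (260, 95.0)
combined = combine_antibiograms([icu, wards])   # ~94.3% susceptible
```

Fitting the four coefficients of the segmented model by ordinary least squares then yields the baseline trend, level change, and trend change reported in Table 1.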
Objective: Mycobacterium tuberculosis (MTB) is a contagious airborne pathogen that spreads from person to person via particles expelled into the air when speaking or coughing1. This retrospective observational study aims to assess the nosocomial transmission of pulmonary MTB among inpatient roommates in a high-risk oncological population over a 14-year period. With limited studies on the transmissibility of MTB in such environments, the investigation focuses on evaluating the risk of nosocomial transmission and the implementation of appropriate infection control measures. Design: A retrospective analysis from 2010 to April 2023 was conducted in an acute care, 500-bed oncological center. Following exposure workups performed by the Department of Infection Prevention and Control, 17 of 57 identified patients with active pulmonary MTB had inpatient stays with roommates. Source infectivity included 7 AFB smear-positive, 4 MTB PCR-positive, and 14 MTB culture-positive results; some index patients had a combination of AFB, PCR, and/or culture positivity. A high-risk exposure was defined as any patient who shared a room with an index patient for >4 cumulative hours during the infectious period, which was determined for each index patient based on the onset of symptoms and laboratory results. Workups identified 33 exposed roommates, who were notified and advised to undergo testing with the QuantiFERON-TB Gold In-Tube (QFT-GIT) blood test or tuberculin skin test (TST, PPD) at least 8 weeks after their last day of exposure. The overlap between inpatient roommates and index patients ranged from 1 to 4 days, averaging 1.5 days. Results: Of the 33 high-risk roommates, 14 (42%) were unable to complete follow-up testing for various reasons, including death prior to testing, transfer to hospice, and loss to follow-up. Nineteen (58%) patients completed post-exposure testing.
Twelve patients (63%) underwent PPD testing and 7 (37%) underwent QuantiFERON testing. None (0%) had a positive QuantiFERON or PPD result following exposure. Three exposed patients (15.8%) had hematologic malignancies, and 16 (84.2%) had solid tumor malignancies. Conclusion: The risk of active pulmonary MTB transmission in an oncological inpatient setting was determined to be low. The absence of positive conversions among roommates of confirmed MTB patients underscores the effectiveness of infection control measures, emphasizing the importance of promptly isolating confirmed or suspected cases. Ongoing efforts should continue to focus on these preventive measures to mitigate the risk of MTB transmission in similar high-risk settings.
Background: Serratia marcescens (S. marcescens) is an environmentally associated organism known for causing healthcare-associated infections and outbreaks in neonatal intensive care units (NICUs). Colonization and infection rates in NICU settings remain uncertain. This study aimed to evaluate the rates of baseline colonization and clinical infection and the relatedness of S. marcescens isolates. Methods: Prospective surveillance of rectal colonization and clinical infection with S. marcescens was conducted on patients admitted to the NICU at Mount Sinai Hospital in Toronto, Ontario, from March 1, 2023, to September 30, 2023. The NICU is a 57-bed unit with all private rooms. Monthly point-prevalence assessments by rectal screening were performed, alongside active surveillance for clinical infections associated with S. marcescens. Isolates from screening or clinical samples were assessed for relatedness using pulsed-field gel electrophoresis (PFGE). Results: Over the 7-month study period, 12 different patients (5.4%) were colonized or infected with S. marcescens. Among these, 10 patients (4.5%) were identified through rectal screening (316 rectal swabs were collected from 224 patients), and two patients (0.9%) had positive clinical specimens (urine and endotracheal aspirate) associated with pyelonephritis and ventilator-associated pneumonia, respectively. Of the two clinical cases, one had a negative preceding rectal swab and the other was detected through a clinical sample before the point-prevalence date. The age at which a positive S. marcescens swab or clinical specimen was identified ranged from 4 to 66 days (median=18 days, IQR 5–38.8) (Figure 1). Sixty-seven infants had repeated screening; three of 67 (4.5%) were colonized with S. marcescens, and the timing and sequence of positive and negative testing are presented in Figure 2.
Females demonstrated a higher positivity rate than males [9.1% (9/99) vs 2.4% (3/125), p=0.04]. PFGE analysis of all 12 (100%) isolates revealed a polyclonal pattern. Most cases (9/12, 75%) were detected from March to May. Ten different strains were identified; notably, two strains each formed a cluster of two cases, one during March and the other during May (Figure 3). No mortality was reported among the cases. Conclusions: The study highlights the polyclonal nature of S. marcescens and raises questions about the utility of point-prevalence screening in anticipating clinical cases or patient-to-patient transmission, especially in patients with clinical infection who had no preceding positive screening tests.
Background: Use of combined parenteral and oral antimicrobial prophylaxis prior to colorectal surgery is recommended to reduce the risk of surgical site infection (SSI). Parenteral antibiotic selection is complicated by the need to target organisms likely to cause infection at the surgical site while mitigating the risk of antimicrobial resistance caused by overuse of broad-spectrum agents. This study aimed to evaluate microbiologic data from colorectal SSIs across an 11-hospital health system. Microbiologic data from SSI events were used to assess the continued appropriateness of the health system's standard recommendations for parenteral antibiotic prophylaxis in colorectal surgery, consisting of either cefazolin with metronidazole or cefoxitin monotherapy. Methods: This multicenter, retrospective, observational study was conducted from January 1, 2019 to March 31, 2023, using data extracted from the National Healthcare Safety Network (NHSN). Microbiologic data from colorectal SSIs from 2019 to 2022 were evaluated for a descriptive review of pathogen and phenotype trends. SSI data excluded patients aged <18 years, those with infection present at time of surgery (PATOS), and outpatient procedures. Results: A total of 8,059 colorectal procedures were evaluated. At least one pathogen was detected in 65% of SSI cases, and most SSIs were polymicrobial. The most commonly identified organisms were E. coli (22.5%), Enterococcus spp. (19.7%), P. aeruginosa (6.5%), Streptococcus spp. (4.9%), and C. albicans (4.7%). Change over time in antimicrobial-resistant phenotypes from 2019 to 2022 was not statistically significant for extended-spectrum cephalosporin-resistant E. coli (p=0.335), extended-spectrum cephalosporin-resistant K. oxytoca/pneumoniae (p=0.189), multidrug-resistant P. aeruginosa (p=0.058), methicillin-resistant S. aureus (p=0.906), or isolates with no identified antimicrobial-resistance phenotype (p=0.096). Among E. coli, the change from 2019 to 2022 in cefazolin non-susceptible, ceftriaxone-susceptible isolates was not statistically significant (p=0.177). No carbapenem-resistant Enterobacterales isolates were identified among non-PATOS cases. Conclusions: These data do not support a change to broader-spectrum agents for parenteral antimicrobial prophylaxis in colorectal surgery. Continued use of cefazolin with metronidazole or cefoxitin as IV antibiotic prophylaxis in colorectal surgery is recommended, with ongoing tracking of microbiologic trends and antimicrobial susceptibility.
Background: Older adults (aged ≥65 years) are at high risk of harm from overdiagnosis and overtreatment of urinary tract infections (UTIs) with antibiotics. Involving patients and caregivers in antibiotic treatment decisions has the potential to improve prescribing. To engage effectively, patients/caregivers must have sufficient knowledge about UTIs, asymptomatic bacteriuria (ASB: bacteria in the urine without signs of UTI), and antibiotics, as well as opportunities to share their concerns and treatment preferences with healthcare staff. Patient education is one of the core elements of antibiotic stewardship recommended by the Centers for Disease Control and Prevention, but there are few resources for patients/caregivers about UTIs and antibiotics, leaving a knowledge gap as to what effective patient/caregiver antibiotic education for UTIs looks like. We sought to better understand the perspectives of patients/caregivers at high risk of antibiotic overuse for UTIs and to create an educational leaflet on UTIs, antibiotics, and ASB. Method: Between 11/2022 and 03/2023, we conducted virtual semi-structured interviews with patients aged ≥65 years who had experienced a UTI, and with caregivers, about their needs, experiences, and preferences for educational support. Interviews lasted ~1 hour, and audio recordings were transcribed verbatim. Data were managed in NVivo software and analyzed using thematic analysis. Results: We conducted 9 interviews (5 patients, 4 caregivers). Interviewees expressed a desire to be involved in their treatment decisions and to learn more about antibiotics and alternative strategies (themes shown in Figure 1). Reported reasons for limited involvement in decisions included lacking the knowledge and confidence to ask questions, emotional factors (e.g., embarrassment/stress), deference to healthcare staff, and time constraints.
Healthcare staff behaviors were described both as barriers (e.g., assertive treatment decisions) and as facilitators (e.g., effective communication) of patient/caregiver engagement. Interviewees were eager for printed and digital educational support that could provide tailored content to improve their knowledge and prepare them for future conversations with healthcare staff. From this feedback we developed an educational leaflet (Figure 2). Conclusions: Involving patients/caregivers in antibiotic treatment decisions represents an opportunity to intervene before patients experience antibiotic overuse. Our findings offer important insights into patients'/caregivers' educational needs and preferences, as well as perceived barriers to engaging in antibiotic treatment decisions for UTI. We used these insights to inform the development of educational materials about UTIs, ASB, and antibiotics for patients/caregivers, and we plan to test their use through multiple media with tailoring for unique patient needs, experiences, and backgrounds.
The capture of wild-living animals can provide valuable information that is critical for developing and implementing effective conservation actions. These capture procedures, however, often require direct handling of individuals by researchers, and conservationists should constantly seek to improve capture methods so that the impacts on animal welfare are minimised. The ngwayir (western ringtail possum; Pseudocheirus occidentalis) is a critically endangered arboreal marsupial in need of effective conservation. It is, however, not amenable to conventional trapping, leading to the use of methods such as nest robbing and tranquilisation using dart guns or pole syringes, which involve potentially serious animal welfare risks and longer exposure of animals to humans compared with conventional trapping. In pursuit of an improved capture method, we investigated opportunistically whether placing traps above the ground would increase the capture success rate for the species, using wire cage traps baited with universal bait and fruit. Between 2010 and 2019, we deployed trapping grids in Locke Nature Reserve and adjacent campsites near Busselton, WA, Australia, with traps placed on the ground for 1,985 trap nights and traps placed on horizontal tree branches, fallen trees or fences, 1–2 m above ground, for 694 trap nights. With the above-ground traps we captured 82 ngwayirs over 694 trap nights (11.8% trap success), 27 in autumn and 55 in spring. We also captured 11 common brushtail possums (Trichosurus vulpecula; 1.6% trap success rate), 12 King’s skinks (Egernia kingii; 1.7%) and five black rats (Rattus rattus; 0.7%). Trapping success was higher in elevated traps (up to 18.3%) than in traps on the ground (0.5%), and using fruit as bait increased the trap success rate. These results suggest that using elevated traps baited with fruit is a practical, effective method for capturing the ngwayir.
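The trap-success percentages above are simply captures per 100 trap nights; as a quick arithmetic check of the reported figures (all values from the abstract):

```python
def trap_success_rate(captures, trap_nights):
    """Trap success expressed as captures per 100 trap nights."""
    return 100 * captures / trap_nights

# reported elevated-trap figures over 694 trap nights
ngwayir = trap_success_rate(82, 694)    # ~11.8%
possum = trap_success_rate(11, 694)     # ~1.6%
skink = trap_success_rate(12, 694)      # ~1.7%
rat = trap_success_rate(5, 694)         # ~0.7%
```

The per-species rates reported in the abstract are consistent with this denominator of 694 elevated-trap nights.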
Background: Identification of clinically non-indicated asymptomatic bacteriuria (ASB) inflates reported catheter-associated urinary tract infection (CAUTI) rates and drives overuse of urinary tract infection (UTI)-directed antimicrobials. Published diagnostic stewardship interventions to reduce ASB detection have mostly been tested individually and heterogeneously; hence, the optimal bundled approach has yet to be defined. Methods: We performed a single-center sequential quasi-experimental study involving hospitalized, emergency, and long-term care patients at a VA healthcare facility, retrospectively comparing standard of care (period 1: 1/1/2022–6/30/2022) to the addition of dedicated provider education on facility-approved urine-culturing indications (period 2: 7/1/2022–1/19/2023), then an electronic clinical decision support (CDS) tool (Figure 1) mandating selection of a urine-culturing indication (period 3: 1/20/2023–6/30/2023), then prospectively adding real-time, case-based, physician-generated audit-and-feedback emails on ordering appropriateness (period 4: 7/1/2023–12/31/2023). We randomly sampled approximately 500 orders from each period and measured the impact on the rate of urine reflex/culture orders, the percentage of non-indicated orders and ASB, UTI-directed antimicrobial usage, and facility-wide CAUTI rates. Results: We analyzed 2,140 urine reflex/culture orders (Table 1 and Figure 2). The mean monthly orders per 1,000 bed-days and the percentage of non-indicated orders decreased with each intervention, to one-fourth of the initial values by period 4 (p=0.0002). The ASB rate among positive cultures was unchanged from periods 1 to 2 but began to decrease in period 3, with the biggest impact in period 4 (p=0.01). Non-indicated and ASB-directed antimicrobial courses followed the same pattern, dropping from 28% and 26% at baseline to 6% and 4%, respectively, by the study's conclusion (p=0.015 and 0.008).
Estimated UTI-directed antimicrobial courses decreased by 34% (363 vs 551), with antimicrobial-days falling from 4,093 to 2,846 per 6-month period. The CAUTI rate declined with each intervention, along with a reduction in ASB-attributed CAUTIs from 45% (5/11) initially to 20% (1/5) in period 4. Conclusion: A stepwise urine-culturing diagnostic stewardship approach of clinical education and an electronic CDS tool, plus real-time audit and feedback, decreased overall urine reflex/cultures, non-indicated ordering, ASB identification, unnecessary antimicrobials, and CAUTI rates, with the greatest impact after bundling all interventions, including audit and feedback on order appropriateness.
Background: In Japan, notifiable disease surveillance of COVID-19 ended and was replaced by sentinel surveillance following the reclassification of COVID-19 in May 2023. Since COVID-19 sentinel surveillance is integrated into seasonal influenza surveillance, the number of reported cases varies depending on the extent to which sentinel facilities provide COVID-19 care. Therefore, we compared COVID-19 sentinel surveillance with school absentee surveillance, which is limited to children of high school age or younger but provides reliable information on absences in the target population. Method: The 17-week period from week 23 (June 5 to June 11) to week 39 (September 25 to October 1) of 2023 was used as the target period. We compared the number of weekly COVID-19 reports from 72 sentinel sites in Mie Prefecture (population 1.7 million) with the number of COVID-19 absentees at 998 facilities (401 kindergartens and nursery schools, and 597 elementary, junior high, and senior high schools) registered for school absentee surveillance, across Mie Prefecture as a whole and its eight health center jurisdictions (Fig 1). Result: Except for the summer vacation period from weeks 29 to 35, sentinel surveillance and school absentee surveillance showed a significant positive correlation. During the summer vacation period, a decrease in the number of COVID-19 absentees relative to sentinel surveillance was observed, especially in the elementary, junior high, and senior high school groups of the school absentee surveillance (Fig 2 and 3). When compared by health center, no regional differences were observed in school absentee surveillance, but in sentinel surveillance some health centers reported significantly more cases than others.
Conclusion: The results of this study suggest that although COVID-19-based school absentee surveillance has some drawbacks, such as the limited number of subjects and the difficulty of evaluation during the summer vacation when schools are closed, it has the advantage of evaluating the entire community without being affected by medical institution practice bias, and it can be used to monitor trends in infectious diseases. Combining and evaluating multiple surveillance indicators is important for accurately monitoring epidemiologic trends of infectious diseases over time.
Background: Frequent use of delayed sternal closure and prolonged stays in critical care units contribute to surgical site infections (SSIs) among pediatric patients undergoing cardiothoracic (CT) procedures. Bundled interventions to prevent or reduce SSIs have shown prior success, but limited data exist on the sustainability of these efforts, especially during the Coronavirus Disease 2019 (COVID-19) pandemic. Here, we re-examine SSI rates for pediatric CT procedures after the onset of the pandemic. Methods: In a single academic center providing regional quaternary care, we created a multidisciplinary CT-surgery SSI prevention workgroup in response to rising CT SSI rates. Bundle elements focused on daily chlorhexidine bathing, environmental cleaning, monthly room changes, linen management, antimicrobial prophylaxis, and sterile techniques for bedside and operating room procedures. CDC surveillance definitions were used to identify superficial, deep, or organ space SSIs. To assess the bundle's sustainability, we compared SSI rates during years impacted by the COVID-19 pandemic (2021–2023, period 2) to pre-pandemic rates (2017–2019, period 1). Data from 2020 were excluded to account for bundle implementation, pandemic restrictions, and a minor decrease in surgical volumes. Rates were calculated as SSI cases per 100 procedures, and mean rates across the two periods were compared using paired t-tests (Stata/SE version 14.2). Results: Excluding the year 2020, the average SSI rate per 100 CT procedures increased from 1.07 in period 1 to 1.56 in period 2 (p=0.55). Concurrently, the average SSI rate per 100 CT procedures with delayed closure increased from 1.49 in period 1 to 1.97 in period 2 (p=0.67). Figure 1 shows SSI rates and procedure counts for 2017–2023. Coagulase-negative staphylococci most frequently caused SSIs in period 1, while methicillin-susceptible Staphylococcus aureus (MSSA) was most frequently identified in period 2.
During period 2, estimated compliance with the SSI prevention bundle remained stable and reached 95% for pre-operative chlorhexidine baths and use of appropriate antimicrobial prophylaxis. Monthly room changes with dedicated environmental cleaning reached 100% compliance. Conclusion: Despite staffing shortages and resource limitations (e.g., discontinuation of contact isolation for MRSA colonization) during the COVID-19 pandemic, SSI rates for pediatric CT surgeries showed a slight but non-statistically significant increase in post-pandemic years compared with pre-pandemic years. Implementation of bundled interventions and improved surveillance methods may have sustainably impacted these SSI rates. Reinforcing bundle adherence, as well as identifying additional prevention interventions to incorporate in the pre-, intra-, and post-operative periods, may improve patient outcomes.
Background: Invasive aspergillosis (IA) poses a substantial threat of morbidity and mortality, particularly among immunocompromised individuals. In 2023, a New York City intensive care unit (ICU) experienced an Aspergillus outbreak following a structural water leak, resulting in two patients diagnosed with invasive aspergillosis, with Aspergillus niger isolated from their bronchial cultures. Immediate interventions, including patient relocation and ICU reconstruction, were implemented to mitigate further impacts. This study aims to assess the impact of timely relocation of patients and renovation of the ICU on the incidence of invasive aspergillosis. Method: A quasi-experimental study of ICU patients over a nine-month period included surveillance by the Infection Prevention department from March 1 to December 1, 2023. Surveillance included review of microbiology reports, environmental cultures, and patient charts. The pre-intervention period spanned March 1 to May 1, and the post-intervention period May 4 to December 1. Indoor mold assessments pre- and post-intervention involved testing wall surfaces for moisture, air sample collection for fungal spores, and surface swabs for direct fungal analysis. The intervention included relocating all seventeen patients from the impacted ICU and comprehensive reconstruction. Reconstruction involved the removal and replacement of all sheetrock within the unit, extending four feet from the floor, with moisture-resistant sheetrock. Additionally, moisture-resistant single-sheet welded vinyl flooring and cove bases were installed. All heating, ventilation, and air-conditioning (HVAC) systems were inspected and cleaned. Construction activities strictly adhered to Infection Control Risk Assessment (ICRA) guidelines, with emphasis on maintaining negative pressure, to ensure a safe environment.
Result: Environmental swab samples from 50% of ICU rooms indicated growth of Aspergillus/Penicillium, Chaetomium, and Stachybotrys/Memnoniella type spores during the pre-intervention phase. Environmental microbiology results strongly suggest the indoor environment as the fungal spore source, with the presence of fruiting structures indicating surface mold growth. Indoor air samples, when compared with outdoor samples collected during pre-intervention, showed rare (2–6 raw count) growth of Aspergillus in 55% of the sampled rooms and subsequently no growth post-intervention. Prospective surveillance revealed no further Aspergillus growth in the ICU population or environment. Conclusion: Our findings highlight a potential correlation between environmental modifications and reduced IA incidence. Swift mitigation and structural interventions are crucial for averting potentially fatal outcomes, marking a significant advancement in prevention strategies for inner-city hospital settings. Although promising, study limitations include the inability to speciate environmental Aspergillus for comparison with patient bronchial cultures and the absence of baseline bronchial cultures for affected patients on admission.
Background: Antimicrobial stewardship programs rely heavily on the electronic medical record (EMR) to carry out daily activities, make interventions, optimize patient care, and collect data. In 2019, the University of Vermont Medical Center transitioned from a third-party platform to the Epic (Verona, WI, www.epic.com) Bugsy module for antimicrobial stewardship. Method: We have spent the past 4 years optimizing the Epic foundation to match our institutional antimicrobial prescribing guidelines and susceptibility patterns, and building reports to extract actionable data. Result: During the build process, we identified three areas needing customization: (1) empiric, definitive, and prophylactic indications of use for all antimicrobials, based on our hospital’s internally published books “Guide to Antimicrobial Therapy for Adults” and “Guide to Antimicrobial Therapy for Pediatrics” (figure 1); (2) an on-demand report capturing all patients with new administrations of antimicrobials in the preceding 72 hours, including ordering clinician, stop date of therapy, and indication (figure 2); and (3) a unique, custom-built SlicerDicer report capturing high-level data on how each antimicrobial is prescribed, by indication, dose, route of administration, ordering clinician, attending physician, and department (figure 3). Conclusion: We have built a system in which we can readily identify patients receiving antimicrobials both within and outside institutional guidelines, and can identify the ordering clinician to contact for in-the-moment feedback. We can also collect retrospective data on which antimicrobial agents were prescribed for all infectious syndromes. These three institutional customizations have provided invaluable information to improve patient care.
Background: Central venous catheter (CVC) utilization and central line-associated bloodstream infections (CLABSI) have increased nationwide. Busy providers can easily overlook the recommended practice of daily assessment of the ongoing indication for a CVC. Prospective audit and feedback (PAF) is widely used and is the gold standard for antibiotic stewardship programs (ASPs), but reports of PAF for device use are scarce. We therefore evaluated the usefulness and feasibility of PAF for reducing device utilization and, ultimately, CLABSI rates at our 528-bed tertiary care hospital. Methods: A PAF-based central line stewardship (CLS) initiative was launched in February 2023, with a team of hospitalists, infectious diseases physicians, health informatics specialists, vascular access nurses, and infection preventionists. On business days, CVC line lists were exported from the electronic medical record (EMR) into a REDCap (Research Electronic Data Capture) database for team members to evaluate, in real time, all non-ICU CVCs in place for >72 hours (Figures 1 and 2). For CVCs eligible for stewardship, CLS members placed a Legal-approved note into the patient’s EMR advising CVC removal or alternative IV access (Figure 3). CVCs continued to be audited until removal or patient discharge, and recommendation outcomes were tracked over the subsequent 72 hours for acceptance. Standardized utilization ratios (SUR) and standardized infection ratios (SIR) were calculated and compared using the National Healthcare Safety Network. Results: Between February and August 2023, the CLS team reviewed 861 CVCs, representing 581 unique patient encounters, and made 622 recommendations. Recommendations to remove or replace the CVC represented 23.5% (146) of reviews, and 57% of these CVCs were removed within 3 days. Of the removed lines, 95% had no adverse outcomes and did not require reinsertion.
Hospital-wide CVC utilization decreased 18.7%, from a SUR of 1.006 in the previous year to 0.818 during the 7-month pilot period (p<0.001). In the 4 months following the pilot period, the decrease in CVC utilization was sustained, with a SUR of 0.797. The non-ICU CLABSI SIR decreased from 1.282 in 2022 to 1.024 in 2023 (p=0.36). Physician time required for CLS review averaged approximately 0.4 full-time equivalents per week. The intervention was well received, with requests for expansion to urinary catheters. Conclusion: CLS safely and significantly reduced device utilization, directly via documentation and recommendations, and indirectly through increased awareness and the Hawthorne effect. Examples of the “art” of CLS include deciding when to leave a discoverable note and how to determine the ongoing need for a CVC in a fragile patient.
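The 18.7% figure above is the relative change between the baseline and pilot-period SURs; a quick check of the reported values (from the abstract):

```python
def relative_reduction(before, after):
    """Percent relative reduction from a baseline value."""
    return 100 * (before - after) / before

# hospital-wide SUR: 1.006 (prior year) -> 0.818 (7-month pilot)
sur_drop = relative_reduction(1.006, 0.818)   # ~18.7%
```

The same calculation applied to the non-ICU CLABSI SIR (1.282 to 1.024) gives the corresponding, non-significant, relative decline.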
Background: Candida auris is often identified in healthcare settings through screening with bilateral composite axilla/groin skin swabs. Re-screening the same patient has demonstrated inconsistent results over time, complicating the understanding of longitudinal colonization and limiting confidence in negative results. Previous studies have described identification of colonized patients using other anatomical sites. Here, we compare bilateral composite nares/hands with bilateral composite axilla/groin screening in a cohort of hospitalized patients in Miami, Florida, to assess the use of other body sites for C. auris surveillance. Methods: This study took place in a 560-bed academic acute-care facility and included patients previously colonized with C. auris who were cohorted on a 30-bed unit. Bilateral composite samples from both the axilla/groin and nares/hands were obtained simultaneously. Swabs were collected at six time points at biweekly intervals between March and May 2023 (Figure 1) and sent to the Centers for Disease Control and Prevention for testing with culture and real-time PCR-based methods. Results: A total of 102 swabs (51 of each swab type) were collected from 19 patients, who were each sampled a median of 2 times (IQR: 1–5). Among the 102 swabs, 35 of 51 (69%) axilla/groin swabs were positive compared with 45 of 51 (88%) nares/hands swabs using culture (Figure 2). Furthermore, 48 of 51 (94%) swabs were positive by culture for both methods, with 15 positive from the nares/hands and one positive from the axilla/groin (Figure 3). Among 11 patients tested ≥2 times with nares/hands swabs, 9/11 (81%) tested positive on all sequential swabs via culture and 10/11 (90%) via PCR (Ct threshold <36.9). Among the same 11 patients, using axilla/groin swabs, 3/11 (27%) tested positive on all sequential swabs using culture and 5/11 (45%) using PCR (Figures 2–4).
On average, samples collected from nares/hands swabs had lower Ct values (mean = 27) than axilla/groin swabs (mean = 31) (p < 0.001) (Figure 5). Discussion: Identifying the swab site with the most consistent C. auris detection is important for surveillance purposes. In our study, nares/hands swabs yielded more positives and more consistent positivity by both culture and PCR-based methods, as well as lower Ct values, suggesting that these swabs provide more reliable detection of C. auris colonization. Alternative screening methods deserve consideration as CDC continues to explore whether swabbing other body sites (e.g., nares, hands) would improve accuracy and consistency in identifying colonized patients.
Background: Methicillin-resistant Staphylococcus aureus (MRSA) is a common pathogen responsible for nosocomial and community-acquired infections with high morbidity and mortality1. MRSA nasal colonization is a major risk factor for developing infection in the hospital setting2,3. Decolonization of MRSA carriers is a strategy to decrease recurrence or to prevent new MRSA infections3,4. Decolonization with nasal mupirocin 2% and chlorhexidine baths has been shown to decrease the risk of MRSA infection after hospital discharge3. Mupirocin is an isoleucyl-tRNA synthetase inhibitor with activity against MRSA5. Resistance of MRSA isolates to mupirocin has been described previously6. As topical disinfectants play a crucial role in the prevention of MRSA infection in a variety of settings, it is important to monitor the emergence of resistance. The goal of this study was to determine the prevalence of mupirocin resistance among MRSA samples isolated from two different regions in the United States (U.S.). Methods: Our study included a total of 474 MRSA samples obtained from hospitals in Detroit, MI (287 samples) and Cleveland, OH (187 samples). After whole-genome sequencing on the NextSeq platform (Illumina Inc., CA), the data were analyzed using ResFinder 4.1 to identify antimicrobial resistance, whether acquired or mediated by chromosomal mutations. Resistance genes of interest were tallied in a spreadsheet. Results: A mupirocin resistance gene was detected in five of 287 (1.74%) MRSA samples from the Detroit hospitals, all associated with the mupA gene. Samples collected from the Cleveland-area hospital demonstrated mupirocin resistance in seven of 187 samples (3.74%), again all associated with the mupA gene. One sample from the Detroit group showed resistance to both mupirocin and chlorhexidine. Conclusions: The prevalence of the mupirocin resistance gene varied between the two hospital locations.
Resistance to mupirocin has been documented in association with the mupA gene as well as chromosomal point mutations, which can lead to either low- or high-level resistance7,8. Although the mechanisms are not fully clear, the mupA gene has been associated with high-level resistance9. Mupirocin resistance among MRSA isolates has increased over time9. MRSA infections remain an important etiology of nosocomial and community-acquired infections, and a common practice to combat this issue is universal decolonization with mupirocin10. It is critical to understand and monitor the development of mupirocin resistance, as mupirocin remains one of the most effective tools to prevent invasive MRSA infection in many patient populations.
Background: Nursing home (NH) residents are at high risk of COVID-19 from exposure to infected staff and other residents. Understanding SARS-CoV-2 viral RNA kinetics in residents and staff can guide testing, isolation, and return-to-work recommendations. We sought to determine the duration of antigen test and polymerase chain reaction (PCR) positivity in a cohort of NH residents and staff. Methods: We prospectively collected data on SARS-CoV-2 viral kinetics from April 2023 through November 2023. Staff and residents could enroll prospectively or upon a positive test (identified through routine clinical testing, screening, or outbreak response testing). Participating facilities performed routine clinical testing; asymptomatic testing of contacts was performed within 48 hours if an outbreak or known exposure occurred and upon (re-) admission. Enrolled participants who tested positive for SARS-CoV-2 were re-tested daily for 14 days with both nasal antigen and nasal PCR tests. All PCR tests were run by a central lab with the same assay. We conducted a Kaplan-Meier survival analysis of time to first negative test, restricted to participants who initially tested positive (day zero) and had at least one test ≥10 days after initially testing positive with the same test type; a participant could contribute to both antigen and PCR survival curves. We compared survival curves for staff and residents using the log-rank test. Results: Twenty-four nursing homes in eight states participated; 587 participants (275 residents, 312 staff) enrolled in the evaluation, and participants were tested only through routine clinical or outbreak response testing. Seventy-two participants tested antigen-positive; of these, 63 tested PCR-positive. Residents were antigen- and PCR-positive longer than staff (Figure 1), but this finding was statistically significant (p = 0.006) only for duration of PCR positivity.
Five days after the first positive test, 56% of 50 residents and 59% of 22 staff remained antigen-positive; 91% of 44 residents and 79% of 19 staff were PCR-positive. Ten days after the first positive test, 22% of 50 residents and 5% of 22 staff remained antigen-positive; 61% of 44 residents and 21% of 19 staff remained PCR-positive. Conclusions: Most NH residents and staff with SARS-CoV-2 remained antigen- or PCR-positive 5 days after the initial positive test; however, differences between staff and resident test positivity were noted at 10 days. These data can inform recommendations for testing, duration of NH resident isolation, and return to work guidance for staff. Additional viral culture data may strengthen these conclusions.
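The time-to-first-negative analysis above uses the Kaplan-Meier product-limit estimator, with participants who never tested negative treated as censored. A minimal sketch of that estimator (not the study's actual code; the follow-up times below are hypothetical toy data):

```python
# Kaplan-Meier product-limit estimate of "survival" in test positivity:
# S(t) = probability of still testing positive t days after the first
# positive test. events=1 means a first negative test was observed at that
# time; events=0 means follow-up ended while the participant was still
# positive (censored).

def kaplan_meier(times, events):
    """Return [(t, S(t))] pairs at each time an event occurs."""
    at_risk = len(times)
    data = sorted(zip(times, events))
    surv, curve, i = 1.0, [], 0
    while i < len(data):
        t = data[i][0]
        n_events = n_leaving = 0
        while i < len(data) and data[i][0] == t:
            n_events += data[i][1]   # first negative tests observed at t
            n_leaving += 1           # everyone with time t exits the risk set
            i += 1
        if n_events:
            surv *= 1 - n_events / at_risk
            curve.append((t, surv))
        at_risk -= n_leaving
    return curve

# Toy data: days to first negative test for 7 hypothetical participants.
times = [3, 5, 5, 7, 10, 14, 14]
events = [1, 1, 0, 1, 1, 1, 0]
for t, s in kaplan_meier(times, events):
    print(f"day {t}: {s:.2f} still positive")
```

The log-rank comparison between staff and residents would then test whether the two groups' curves differ beyond chance.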
Disclosure: Stefan Gravenstein: Received consulting and speaker fees from most vaccine manufacturers (Sanofi, Seqirus, Moderna, Merck, Janssen, Pfizer, Novavax, GSK) and has received or expects to receive grant funding from several (Sanofi, Seqirus, Moderna, Pfizer, GSK). Lona Mody: NIH, VA, CDC, Kahn Foundation; Honoraria: UpToDate; Contracted Research: Nano-Vibronix
Background: Optimizing antimicrobial use (AU) among post-acute and long-term care (PALTC) residents is fundamental to reducing the morbidity and mortality associated with multidrug-resistant organisms (MDROs), as well as unintended social consequences related to infection prevention. Data on AU in PALTC settings remain limited. The U.S. Department of Veterans Affairs (VA) provides PALTC to over 23,000 residents at 134 community living centers (CLCs) across the United States annually. Here, we describe AU in VA CLCs, assessing both class and length of therapy. Methods: Monthly AU between January 1, 2015 and December 31, 2019 was extracted from the VA Corporate Data Warehouse across 134 VA CLCs. Antimicrobials and administration routes were based on the National Healthcare Safety Network AU Option protocol for hospitals. Rates of AU were measured as days of therapy (DOT) per 1,000 resident-days. An antimicrobial course was defined as the same drug and route administered to the same resident with a gap of ≤3 days between administrations. Course duration was measured in days. Results: The most common class of antimicrobial course administered during the study period was beta-lactam/beta-lactamase inhibitor combinations (15%), followed by fluoroquinolones (14%), extended-spectrum cephalosporins (12%), and glycopeptides (11%; Figure 1). Neuraminidase inhibitors had the longest median course duration (10 [IQR 8] days), followed by tetracyclines (8 [IQR 8] days), then folate pathway inhibitors, nitrofurans, and 1st/2nd-generation cephalosporins (7 [IQR 7] days). Overall, 60% of antimicrobial courses were administered orally, with fluoroquinolones the class most frequently administered orally (20%). From 2015 to 2019, the annual rate of total antimicrobial use across VA CLCs decreased slightly, from 213.6 to 202.5 DOT/1,000 resident-days.
During the 5-year study period, fluoroquinolone use decreased from 27.47 to 13.36 DOT/1,000 resident-days. First- and 2nd-generation cephalosporin use remained relatively stable, but use of 3rd- or higher-generation cephalosporins increased from 14.70 to 19.21 DOT/1,000 resident-days (Figure 2). Conclusion: The marked decrease in the use of fluoroquinolones at VA CLCs from 2015 to 2019 is similar to patterns observed for VA hospitals and for non-VA PALTC facilities. The overall use of antibacterial agents at VA CLCs decreased slightly during the study period, but use of other broad-spectrum agents, such as 3rd- or higher-generation cephalosporins, increased over the same period. The strategies used to decrease fluoroquinolone use may have application for other antibiotic classes, both in VA and non-VA PALTC settings.
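The Methods' two definitions, an antimicrobial course (same drug and route, with ≤3 days between administrations) and the DOT rate per 1,000 resident-days, can be sketched as follows. This is an illustrative reconstruction under assumed data shapes, not the VA's actual extraction code; the function names and toy dates are ours:

```python
# Sketch of the abstract's course and rate definitions, assuming each
# (resident, drug, route) combination yields a list of administration dates.

from datetime import date

def split_into_courses(admin_dates, max_gap_days=3):
    """Group administration dates into courses: a new course starts when
    the gap from the previous administration exceeds max_gap_days."""
    courses = []
    for d in sorted(admin_dates):
        if courses and (d - courses[-1][-1]).days <= max_gap_days:
            courses[-1].append(d)  # continue the current course
        else:
            courses.append([d])    # gap too large: start a new course
    return courses

def dot_per_1000(days_of_therapy: int, resident_days: int) -> float:
    """Days of therapy per 1,000 resident-days."""
    return days_of_therapy / resident_days * 1000

# Toy example: four administrations; the 6-day gap starts a second course.
dates = [date(2019, 1, 1), date(2019, 1, 2), date(2019, 1, 4),
         date(2019, 1, 10)]
print(len(split_into_courses(dates)))  # prints 2
print(dot_per_1000(4, 20))             # prints 200.0
```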
Disclosure: Robin Jump: Research support to my institution from Merck and Pfizer; Advisory boards for Pfizer
Background: Longer courses of antibiotics can be associated with antimicrobial resistance and adverse effects. Randomized clinical trials support treating gram-negative bloodstream infections (GN-BSI) for a shorter duration, with a consensus that a seven-day course of antibiotics is appropriate for uncomplicated GN-BSI. Prior to implementation of a GN-BSI treatment guideline at our institution, we aimed to evaluate the characteristics of patients with GN-BSI and their duration of antibiotic therapy (DOT). Methods: We retrospectively reviewed adult inpatients who had a blood culture with at least one gram-negative organism over a 6-month period (November 2022 to April 2023). Patients were excluded if they had a concomitant gram-positive bloodstream infection or if they were transitioned to comfort-focused care within 48 hours of their first positive blood culture. Complicated GN-BSI was defined as exhibiting any of the following: involvement of bone, joint, the endovascular system, or a foreign body; inability to achieve source control; immunocompromised status; or failure to demonstrate clinical improvement or culture clearance within 72 hours. The primary outcome was the mean DOT in patients with GN-BSI. Results: A total of 100 patients met the inclusion criteria. Escherichia coli, identified in 54 cases, was the most frequent organism. Urine (41) was the predominant source of bacteremia, and cefepime (48) was the most common empiric agent. Of the 91 patients with available ceftriaxone susceptibility results, 84% had a susceptible organism. Among the 51 patients classified as having complicated GN-BSI, the leading reason was immunosuppression. Table 1 presents a comparative analysis of complicated vs. uncomplicated GN-BSI. The average DOT for complicated GN-BSI was longer than for uncomplicated infections (20 vs. 11 days, P < 0.005). Additionally, fewer patients transitioned to oral therapy in the complicated group (33% vs. 67%, P < 0.005).
Conclusion: At our institution, patients with uncomplicated GN-BSI had a shorter DOT and were more likely to transition to oral therapy than those with complicated GN-BSI. However, the mean DOT for uncomplicated infections remained longer than seven days, and many patients with uncomplicated GN-BSI did not transition to oral therapy, indicating room for improvement in local practice through antimicrobial stewardship initiatives.
Introduction: Hospital-onset Clostridioides difficile infection (HO-CDI), reported as a laboratory-identified (LabID) event, is common in patients with chronic kidney disease (CKD), especially those with end-stage renal disease (ESRD), and is associated with prolonged hospitalization and more severe disease. CKD patients are at increased risk of developing CDI due to frequent antimicrobial and healthcare exposures. The objective of this study was to assess recent trends in HO-CDI among patients on a nephrology unit at our academic, tertiary care institution. Methods: We conducted a retrospective cross-sectional study of patients with HO-CDI hospitalized on a nephrology unit between January 2021 and December 2023. Collected variables included demographic data; characterization of HO-CDI risk factors, infection, and diagnosis (including prior history of CDI, toxin versus nucleic acid amplification test [NAAT] positivity, and number of loose stools); CDI rate (defined as CDI count/patient-days × 1,000); the standardized antimicrobial administration ratio (SAAR) for antimicrobials at high risk for CDI (as defined by the National Healthcare Safety Network); and infection prevention and control (IPC) practices, including hand hygiene audit rates. Results: A total of 30 HO-CDI cases were reported on the nephrology unit [Table]: 8 in 2021, 5 in 2022, and 17 in 2023. The median age of patients was 70.8 (range: 37-96) years, and most patients (57%) were female. The majority of patients were admitted from home (73%), and two patients (7%) had a history of CDI in the prior 6 months. Among the CDI cases, 60% were NAAT-positive and toxin-negative, and only 50% had >3 bowel movements (BMs) within the 24 hours prior to the positive test. Ten percent received promotility agents prior to testing. Most cases (77%) occurred when other CDI patients were on the unit. Hand hygiene compliance averaged 81% over the three-year period [Figure 1A].
Eighty-four percent of patients received antibiotics within 30 days of CDI diagnosis; the SAAR was >1 for quarters 2 and 4 of 2022 and quarter 1 of 2023 [Figure 1B]. Conclusion: On our nephrology unit, patients often had <3 BMs within the 24 hours before CDI diagnosis, and 60% of cases were toxin-negative and NAAT-positive, suggesting possible C. difficile colonization rather than true infection. In addition, an elevated SAAR correlated with high CDI rates. Multicomponent interventions may be required to reduce HO-CDI rates in CKD patients. Opportunities include emphasis on diagnostic and antimicrobial stewardship, environmental cleaning, and adherence to IPC practices, including hand hygiene.
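The CDI rate defined in the Methods (CDI count/patient-days × 1,000) reduces to a one-line calculation. A sketch with a hypothetical denominator, since the unit's patient-day totals are not reported in this abstract:

```python
# CDI rate as defined above: cases per 1,000 patient-days.

def cdi_rate(cases: int, patient_days: int) -> float:
    """HO-CDI rate per 1,000 patient-days."""
    return cases / patient_days * 1000

# Hypothetical example: 17 cases (the 2023 count above) over an assumed
# 9,500 patient-days; the denominator is illustrative only.
print(round(cdi_rate(17, 9500), 2))  # prints 1.79
```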