Greenland's marine- and land-terminating glaciers are retreating inland due to climate warming, reconfiguring the way the ice sheet interacts with its proglacial environment. Here we use three decades of satellite imagery to determine whether the ice-sheet margin is becoming more or less exposed to marine and lacustrine processes. During our 1990–2019 study period, we find that the length of ice-sheet perimeter in contact with the ocean shrank by 12.3 ± 3.8% (196.2 ± 10.4 km), due to the retreat of marine-terminating glaciers into narrower fjords. On the other hand, we find that the length of the ice-sheet perimeter in contact with freshwater lakes exhibited more divergent trends that are better explored at regional scales. The length of ice–lake boundaries increased in southwest, north and northwest Greenland but declined in southeast and central east Greenland. The magnitude of change we document during our study period leads us to conclude that the ice sheet is poised for further, substantial reconfiguration in the coming decades, with consequences for the flux of fresh water, nutrients and primary productivity in Greenland's terrestrial and oceanic environments.
Background: Carbapenem-resistant Acinetobacter baumannii (CRAB) and Pseudomonas aeruginosa (CRPA) are drug-resistant pathogens causing high mortality rates with limited treatment options. Understanding the incidence of these organisms and laboratory knowledge of testing protocols is important for controlling their spread in healthcare settings. This project assessed how often Veterans Affairs (VA) healthcare facilities identify CRAB and CRPA and which testing practices are used. Method: An electronic survey was distributed to 126 VA acute care facilities September-October 2023. The survey focused on CRAB and CRPA incidence, testing and identification, and availability of testing resources. Responses were analyzed by complexity of patients treated at VA facilities (High, Medium, Low) using Fisher’s exact tests. Result: 77 (61.1%) facilities responded, most in urban settings (85.4%). Most respondents were lead or supervisory laboratory technologists (84.2%) from high complexity facilities (69.0%). Few facilities detected CRAB ≥ once/month (4.4%), with most reporting that they have not seen CRAB at their facility (55.0%). CRPA was detected more frequently: 19% of facilities detected isolates ≥ once/month, 29.2% a few times per year, and 26.9% reported they had not seen the organism. No differences in CRAB or CRPA incidence were found by facility complexity. Nearly all facilities, regardless of complexity, utilize the recommended methods of MIC or disk diffusion to identify CRAB or CRPA (91.9%), with the remaining facilities reporting that testing is done off-site (7.8%). More high complexity facilities perform on-site testing compared to low complexity facilities (32.0% vs 2.7%, p=0.04). 83% of laboratories test for carbapenemase production, with one-fourth using off-site reference labs.
One-fourth of facilities perform additional antibiotic susceptibility testing for CRAB and CRPA isolates, most of which test for susceptibility to combination antibiotics; no differences between complexities were found. Agreement that sufficient laboratory and equipment resources were available was higher in high complexity than in medium complexity facilities (70.7% vs 33.3%, p=0.01), but not low complexity facilities (43.8%). Conclusion: Having timely and accurate testing protocols for CRAB and CRPA is important to quickly control spread and reduce associated mortality. This study shows that most VA protocols follow recommended testing and identification guidelines. Interestingly, there was no difference in CRAB or CRPA incidence between facilities providing higher vs lower complexity of care. While high and low complexity facilities generally reported sufficient resources for CRAB and CRPA evaluation, some medium-complexity labs, which may feel more compelled than low-complexity labs to bring testing in house, reported that additional resources would be required.
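The group comparisons above use Fisher's exact test on small cell counts. As an illustrative sketch (the 2×2 cell counts in the example are hypothetical, since the abstract reports only percentages), a two-sided Fisher's exact p-value can be computed from the hypergeometric distribution with the standard library alone:

```python
from math import comb

def fisher_exact_two_sided(a, b, c, d):
    """Two-sided Fisher's exact test for the 2x2 table [[a, b], [c, d]].

    Sums the hypergeometric probabilities of every table sharing the
    observed margins whose probability does not exceed that of the
    observed table.
    """
    n = a + b + c + d
    row1, col1 = a + b, a + c
    denom = comb(n, col1)

    def prob(x):
        # probability of a table with x in the top-left cell
        return comb(row1, x) * comb(n - row1, col1 - x) / denom

    p_obs = prob(a)
    lo = max(0, col1 - (n - row1))
    hi = min(row1, col1)
    # the (1 + 1e-9) factor guards against floating-point ties
    return sum(prob(x) for x in range(lo, hi + 1)
               if prob(x) <= p_obs * (1 + 1e-9))

# Hypothetical counts: 8/25 high-complexity vs 1/37 low-complexity
# facilities performing a given test on-site.
p = fisher_exact_two_sided(8, 17, 1, 36)
```

For 2×2 tables this agrees with `scipy.stats.fisher_exact`; the enumeration approach is shown only because it makes the test's logic explicit.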
Background: Gastrointestinal conditions are common in hospitalized patients. Decreased mobility, dietary changes, medications and underlying illness may alter patients’ bowel movements. It is important for health care providers to be aware of patients’ bowel habits, especially for early identification of Clostridioides difficile infection (CDI). Prior research has shown that patient modesty may be a barrier to discussing bowel habits with nurses and providers. This can lead to delayed diagnosis of CDI, lack of timely isolation and possible misclassification of community-onset CDI cases as hospital onset (HO-CDI). Methods: A Bowel Habits Assessment Tool (BHAT) was developed to assist health care providers in learning skills to assess and document patient bowel habits accurately. The tool provides a structured approach to help clinicians gather relevant information, identify abnormalities, and promote effective communication with patients. The tool was developed by an infectious disease physician and modeled on existing tools used to take a sexual history. A team of infectious disease physicians, nurses and a gastroenterologist reviewed the tool and provided feedback. See Table 1. The tool was introduced as a pilot program at a 180-bed academically affiliated Veterans Affairs Hospital. Micro educational sessions were held to explain the importance of a bowel habit history, introduce the tool and teach its use in clinical care. The teaching sessions were led by an infectious disease physician and a nurse infection preventionist. An anonymous pre- and post-survey employing a 5-point Likert scale was administered to participants. All participation was voluntary. This project was reviewed and approved as a Quality Improvement project by the VA Research Office, Eastern Colorado Health Care System. Results: Twenty-nine healthcare personnel participated in the pilot.
Participants included nurses (13), resident physicians (13), medical students (2) and nursing assistants (1). On the pre-survey, 59% of participants strongly agreed with the statement “I am comfortable discussing patient’s bowel habits” (Question 1). This increased to 73% after the BHAT educational session. The mean difference between pre- and post-survey responses for Question 1 was 0.45 (CI 0.08761 to 0.8089, p = 0.0167). All participants found the BHAT related to their work and useful, with 41% strongly agreeing and 52% somewhat agreeing that the BHAT was useful. See Figure 1: Survey Responses. Conclusions: The effectiveness of a bowel habit assessment tool was demonstrated using a pre- and post-survey. The BHAT improved clinicians’ comfort discussing patients’ bowel habits.
Objective: To describe whether detecting plasma microbial cell-free DNA by next-generation sequencing (NGS) provided additional information compared to routine cultures or led to changes in antimicrobial management. Design and setting: This is a retrospective cohort study evaluating NGS tests performed on patients admitted to an 11-hospital acute care health system in the greater Houston area between May 2022 and May 2023. Repeat tests on the same patient encounter were included if >7 days from the previous test. Routine microbiology data were compared if collected within 7 days before or after NGS testing. Results: During the study period there were 135 unique patient encounters identified with an NGS order. Of these, 74.1% were ≥18 years of age and 46.7% were immunocompromised. A total of 143 NGS tests were ordered, with 4 not run due to quality control issues. Of 139 NGS tests completed, 76 (54.7%) were positive for at least one organism. When compared to routine testing, NGS alone was positive in 29 (20.9%) instances, routine testing alone was positive in 17 (12.2%) instances, both were positive in 44 (31.7%) instances, and both were negative in 49 (35.3%) instances. In the 44 instances in which both NGS and routine testing were positive, 15 (34.1%) were concordant for all organisms. In total, NGS identified 92 more organisms (69 bacterial, 8 fungal, and 15 viral) compared to routine testing, and routine testing identified 42 more organisms (28 bacterial, 6 fungal, 11 viral, and 1 parasite) compared to NGS. Fifty-six changes to antibiotic therapy were made within 48 hours of the NGS test resulting, with 16 of these changes directly attributed to the NGS test; nine were escalations and seven were de-escalations. Conclusion: NGS may aid in guiding further testing, detecting pathogens earlier, and detecting pathogens missed by routine testing, resulting in direct changes to antimicrobial therapy.
However, results need to be interpreted with caution. NGS can miss pertinent pathogens and is difficult to interpret when commensal organisms are detected, both of which can lead to unnecessary testing or treatment. There is no clear algorithm for the use of NGS testing, and the test carries a high price and unclear utility. Further studies are needed to determine which patients may benefit most from NGS testing.
Background: Hospitals are increasingly consolidating into networks and integrating infection prevention (IP) into system infection prevention programs (SIPPs). Very little has been published about these programs. This survey sheds light on the current state of SIPPs. Methods: We built the questionnaire with the survey generator Alchemer.com and tested a beta version among peers. The final version was sent to SHEA Research Network participants in August 2023. Raw data were compiled and analyzed. Results: Forty institutions responded (40/104, 38%), of which 25 (63%) had SIPPs. These SIPPs reported health systems with a median of 4.5 acute care hospitals (range, 1-33); 16 SIPPs reported a median of 2 critical access hospitals (range, 1-8); 4 SIPPs reported 1-3 LTACHs, and 6 SIPPs reported a median of 1.5 nursing homes. All except 3 (88%) contained an academic center; 48% (11/23) of the U.S.-based programs operate in multiple states. Four programs have been in place >20 years, four <2 years, and the remainder a median of 8 years (range, 2-18). Physician directors also have clinical (20/25, 80%), teaching (19/25, 76%), research (15/25, 60%), antimicrobial stewardship (8/25, 32%), quality (8/25, 32%), and/or patient safety (5/25, 20%) roles. Seventeen (68%) report having a written job description. Nineteen (76%) report having an infection preventionist in a system IP director role; only 7/25 (28%) have a dedicated system IP team that operates independent of individual hospitals. Sixteen (64%) report administrative support, 10/25 (40%) have a data manager/analyst, and 4/25 (16%) include IT expert or programmer support. 15/25 (60%) report having done a formal system-wide IP needs assessment. While 16/25 (64%) have some automation in HAI surveillance (predominantly using Bugsy [Epic] or Theradoc [Premier]), only 5/25 (20%) run fully automated surveillance. 10/25 (40%) have implemented centralized surveillance.
12/25 (48%) have “system IP policies” that are hierarchically above individual site policies. The biggest challenges appear to be gaps in 1) clear governing structure, 2) communication, 3) consistent staffing, 4) data management support, and 5) dedicated, empowered IP expert FTEs. Conclusions: To our knowledge, this is the first U.S. survey to explore present-day system infection prevention. In this sample of hospital networks, we found heterogeneity in the structure, staffing and resources for system IP, with significant opportunities for improvement. In this era of healthcare consolidation, our findings highlight the urgent need to more clearly delineate and support system IP programs to enhance their functionality.
Background: Detection of outbreaks traditionally relies on passive surveillance, which often misidentifies or misses outbreaks. Whole genome sequencing (WGS) surveillance has emerged as a proactive measure, enabling early detection of outbreaks and facilitating rapid intervention strategies. WGS surveillance has not been widely studied due to infrastructure, cost, and evidence barriers regarding its impact on reducing healthcare-associated infections (HAIs). This study reports findings from two years of a real-time WGS surveillance program called the Enhanced Detection System for Healthcare-associated Transmission (EDS-HAT). Methods: The study was conducted at UPMC Presbyterian hospital, a 694-bed tertiary care center. Patient isolates of select bacterial pathogens were collected and underwent WGS weekly from November 2021 to November 2023. Potential transmission was defined using single-nucleotide polymorphism thresholds (≤15 for all organisms except Clostridioides difficile). Genetically related clusters were reviewed weekly for epidemiological linkages (unit, personnel, or procedural commonalities), and appropriate interventions were initiated by the infection prevention and control team. We described the frequency of genetic relatedness and the nature of epidemiological linkages. Results: Of 7,051 eligible unique patient organism isolates, 4,723 were deemed healthcare-associated and underwent WGS. EDS-HAT identified 478 (12.2%) isolates genetically related to ≥1 other isolate across 173 clusters. Epidemiological links were found for 278 (58.2%) isolates in 114 clusters, with the majority being unit-based (205 isolates, 71.9%); other epidemiological links included equipment or healthcare workers (32 isolates, 11.5%), external facilities (24 isolates, 8.6%), and shared endoscopes (17 isolates, 6.1%); all endoscope outbreaks were effectively contained at two patients. No epidemiological links could be identified for 200 (41.8%) isolates.
Infection prevention initiated 134 interventions in 114 clusters, including 74 (55.2%) general staff notification and education, 25 (18.7%) enhanced cleaning efforts, 23 (17.2%) hand hygiene/personal-protective equipment compliance observations, 9 (6.7%) environmental cultures, and 3 (2.2%) enhanced microbiological surveillance. Following the detection of an epidemiological link and intervention, 94/101 (94.1%) outbreaks were effectively halted on the intervened route (Figure). Conclusion: This study demonstrates the feasibility and efficacy of EDS-HAT as an infection prevention tool. Early detection of and intervention in outbreaks significantly enhance the capability of healthcare facilities to control and prevent the spread of HAIs. Investment in infrastructure and implementation costs can reduce pathogen transmission and improve patient safety in acute care settings.
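The clustering step described above, grouping isolates whose pairwise distance falls at or under the SNP threshold, amounts to a connected-components computation. A minimal sketch follows; the isolate names and distance map are hypothetical, and this is not the actual EDS-HAT pipeline, only an illustration of the thresholding logic:

```python
def snp_clusters(pairwise_snps, threshold=15):
    """Group isolates into genetically related clusters: any pair within
    `threshold` single-nucleotide polymorphisms is linked, and clusters
    are the connected components (union-find with path halving).

    `pairwise_snps` maps (isolate_a, isolate_b) -> SNP distance.
    Returns only clusters with more than one isolate.
    """
    parent = {}

    def find(x):
        parent.setdefault(x, x)
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    def union(x, y):
        rx, ry = find(x), find(y)
        if rx != ry:
            parent[rx] = ry

    for (a, b), dist in pairwise_snps.items():
        find(a)
        find(b)  # register both isolates even if they stay unlinked
        if dist <= threshold:
            union(a, b)

    clusters = {}
    for isolate in parent:
        clusters.setdefault(find(isolate), set()).add(isolate)
    return [c for c in clusters.values() if len(c) > 1]

# Hypothetical distances: A-B and B-C are within threshold, C-D is not.
related = snp_clusters({("A", "B"): 3, ("B", "C"): 10, ("C", "D"): 40})
```

In this example `related` contains the single cluster {A, B, C}, with D left out as genetically unrelated.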
Disclosure: Alexander Sundermann: Honoraria - Opgen
An animal welfare education community of practice (AWECoP) for those teaching animal welfare science, applied ethology, and/or animal ethics was created to develop a dialogue amongst educators within the field of animal welfare science. The purpose of this paper is to describe the history, objectives, and members’ experiences within this community. AWECoP hosts 6–8 meetings annually for members to discuss topics relevant to our community and exchange teaching resources; within its first two years, the community has grown to 121 members representing approximately 70 institutions across six continents. A 12-question, mixed-method survey was distributed to capture member demographics, engagement with AWECoP, motivations for joining, and self-evaluation of AWECoP’s impacts. Quantitative data from the survey are presented descriptively, while reflexive thematic analysis was applied to the qualitative data. Survey respondents (n = 54) felt that AWECoP is a vital community and safe space for members to share their ideas and receive feedback, inspiration, information, and resources regarding subject-specific and broader pedagogical topics. As a result, a majority experienced professional (e.g. development of new contacts) and personal (e.g. increased feeling of belonging in their field) benefits, as well as impacts realised in their teaching practice. We conclude with an examination of challenges faced in ensuring AWECoP remains accessible to a growing membership and offer recommendations for facilitating similar communities to support fellowship and training in the teaching of animal welfare and related disciplines.
Background: At a comprehensive cancer center, hundreds of patients are screened daily for infections requiring the implementation of isolation precautions. Discontinuation of precautions is determined by negative testing, resolution of infection, or other criteria. Determining appropriate discontinuation of isolation precautions is labor intensive for Infection Preventionists (IPs). An unintended consequence of manual discontinuation is that numerous patients remain on isolation indefinitely. This was amplified during the COVID-19 pandemic when thousands of patients were placed on isolation precautions. Using electronic health record (EHR) tools, opportunities for process improvements were developed. Our goal was to establish an automated method to resolve isolation precautions. We aimed to decrease the number of manually resolved precautions by 25% each fiscal year (FY), compared to our baseline of activity in FY 2019 (FY19). Our secondary aim was to automate adding and resolving precautions when testing is initiated for suspected transmissible conditions (rule-out testing features).
Methods: Infection Control (IC) collaborated with EHR analysts to build tools to automate a process for appropriate isolation discontinuation. We reviewed our internal data in conjunction with evidence-based guidelines and started with acute, short-term infections that do not require repeat testing or cultures. Expiration dates were established for these infections to resolve automatically after meeting criteria. A secondary review determined that additional infections could be added safely to this process. The secondary aim of establishing rule-out testing was implemented for respiratory viral panels (including SARS-CoV-2) and C. difficile testing. When testing was ordered for these conditions, a suspect-infection status and alert for precautions were automatically added to patients’ EHR banners. If the assay returned negative, the suspect-infection status was automatically removed from the chart. Results: Our baseline of active infections in FY19 was approximately 2,700 cases. From FY19 through FY23, 123,115 infections were added to patient records, and 128,422 infections were resolved. In the first year of implementation, there was a 58% decrease in the number of manually resolved cases. From the initiation of our project through the end of FY23, manual discontinuation of precautions has decreased by 88%. Conclusions: We successfully implemented a process improvement project to appropriately and automatically remove patients from isolation precautions using EHR tools, reducing labor for our IPs and patient time spent on isolation restrictions. Additional benefits from this process improvement extend to decreasing unnecessary costs to the patient and the organization, better stewardship of supplies/resources, and improving patient satisfaction.
Background: Antimicrobial resistance (AMR) is a global public health concern, necessitating close and timely monitoring of antibiotic consumption (AMC). In Belgium, AMC surveillance traditionally relies on reimbursement data, excluding over-the-counter, non-reimbursed or imported products and involving a time lag. This study investigates disparities in AMC between reimbursement data and retail data, providing insights into AMC variations. Additionally, this study seeks to critically evaluate the validity and representativeness of the reimbursement data in accurately reflecting the true extent of AMC in the country. Method: Utilizing reimbursement data from the National Institute for Health and Disability Insurance (NIHDI) and retail data (IQVIA sales data; www.iqvia.com) for systemic antibacterials (ATC group J01), outpatient consumption was estimated for the period 2013-2022. Volume of antimicrobials was measured in Defined Daily Doses (DDDs - WHO ATC/DDD Index 2023), while population data were extracted from Eurostat. Relative differences (RDs) in DDDs per 1000 inhabitants per day (DID) were computed and validated through correlation analysis (Pearson’s r) and Bland–Altman plots. Result: J01 antibacterial sales declined from 23.10 DID (2013) to 20.85 (2022). The decline was non-linear, with a notable drop during the COVID-19 pandemic (21.54 DID in 2019 to 16.69 in 2020) followed by a rebound to pre-pandemic quantities in 2022 (Figure 1). Reimbursement NIHDI data slightly underestimated IQVIA sales, with RDs ranging from 2% (2013) to 9% (2022). Notable differences, especially in recent years, were attributed to quinolone reimbursement criteria changes implemented by law in Belgium in 2018, reducing the reimbursed proportion from 99% (2017) to 35% (2022). ATC-3 level analysis revealed disparities in low-DID groups (J01B, J01E and J01G).
Notably, only a small proportion of amphenicols (J01B) were reimbursed (<10%), with a congestion-relieving combination product of thiamphenicol (+ N-acetylcysteine; Fluimucil®) frequently purchased but remaining unreimbursed. Overall and across ATC-3 groups, the correlation between NIHDI and IQVIA estimates was almost perfect across years, and the Bland–Altman plots showed high agreement. Conclusion: Reimbursement data are reliable for outpatient AMC monitoring, with slightly lower estimates than retail data across most categories. The 2018 quinolone reimbursement criteria change highlights the necessity of incorporating retail data for accurate assessments in this specific category. The synergistic use of reimbursement and retail datasets is crucial for a comprehensive understanding of consumption patterns, supporting effective AMR mitigation strategies in Belgium.
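The consumption metric used above, DDDs per 1,000 inhabitants per day (DID), and the relative difference between the two data sources follow directly from their definitions. A minimal sketch (the DDD total and population in the example are illustrative round numbers, not the study's actual figures):

```python
def did(total_ddd: float, population: int, days: int = 365) -> float:
    """DDDs per 1,000 inhabitants per day: total DDDs dispensed over
    the period, normalised by person-days and scaled to 1,000."""
    return total_ddd / (population * days) * 1000.0

def relative_difference_pct(reimbursed: float, retail: float) -> float:
    """Relative difference (%) of the reimbursement estimate vs retail."""
    return (retail - reimbursed) / retail * 100.0

# Illustrative: ~84 million DDDs in a year for a population of 11.5 million
example_did = did(83_950_000, 11_500_000)  # -> 20.0 DID
```

A relative difference of, say, 9% then means the reimbursement-based DID sits 9% below the retail-based DID for that year.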
Background: The COVID-19 pandemic disrupted routine health services worldwide, including systems to detect antimicrobial resistance (AR). AR is a mounting global health threat, with some studies showing the highest mortality rate from AR infection is in Sub-Saharan Africa (SSA). Antibiotic use is a major contributor to AR. We sought to characterize COVID-19-related changes to antibiotic use and AR detection capacity in two countries in SSA from 2019 to 2020. Methods: Health facilities (HFs) in South Africa and Uganda were surveyed as part of a larger study assessing disruptions to essential health services in SSA in the context of COVID-19. Modified stratified random sampling of HFs by facility level was conducted in regions with high COVID-19 cumulative prevalence. Hospital pharmacists were surveyed to identify perceived changes in antibiotic use. Among facilities with the capacity to detect AR, surveys were conducted with AR laboratory managers to identify perceived changes in staff, equipment, training, and supplies. Descriptive data analysis was conducted using frequencies and proportions. Results: A total of 39 HFs in South Africa and 45 HFs in Uganda responded to the antibiotic use survey. Increases in total antibiotic use from 2019 to 2020 were reported by 82% (23/28) of HFs in South Africa and 68% (27/40) in Uganda. Increased use of antibiotics for multi-drug resistant bacteria (per World Health Organization Reserve classification) was reported by 36% (9/25) and 38% (8/21) of HFs in South Africa and Uganda, respectively. Nineteen HFs in South Africa and 12 HFs in Uganda responded to the AR detection capacity survey. HFs in both countries reported decreases in laboratory staff responsible for AR (33% [13/40] in South Africa and 31% [11/35] in Uganda).
Decreased availability of reagents and consumables for bacteriology and antimicrobial susceptibility testing was reported by 50% (8/16) and 33% (4/12) of HFs, and decreased availability of specimen collection supplies for bacterial cultures was reported by 41% (7/17) and 42% (5/12) of HFs in South Africa and Uganda, respectively. Diversion of laboratory supplies was reported in both countries (32% [6/19] in South Africa and 25% [3/12] of HFs in Uganda). Conclusions: HFs in South Africa and Uganda reported increases in antibiotic prescribing, a risk factor for increased AR, concurrently with disruptions in AR detection capacity during the early phases of the COVID-19 pandemic. These findings emphasize the importance of investing in bacteriology and AR testing in SSA and maintaining support during infectious disease pandemics.
Background: Healthcare-associated central line-associated bloodstream infection (HA-CLABSI) surveillance is important for monitoring healthcare-associated infections (HAIs) and evaluating the effectiveness of infection prevention (IP) measures. However, it is laborious and time-consuming to implement, and an exclusive focus on central lines neglects HAI risk due to peripheral vascular catheters. This study aimed to assess whether HA-CLABSI incidence could be inferred from HA-bloodstream infection (BSI) trends and to explore a shift to HA-BSI surveillance. Methods: The study was performed in a Singaporean tertiary care hospital. Electronic medical records review was performed to determine whether positive blood cultures met Centers for Disease Control/National Healthcare Safety Network (CDC/NHSN) definitions for HA-CLABSI and HA-BSI. Incident episodes of HA-BSI were included (excluding positive cultures repeated within 14 days). Incident organisms were explored to identify common causative pathogens (excluding the same organisms isolated from cultures repeated within 14 days). CLABSI and BSI occurring ≥72 hours after admission were considered healthcare-associated. Patients under the oncology or hematology service were considered immunocompromised. Incidence rates (IR) per 10,000 patient-days, patient characteristics and causative pathogens were compared between both indicators. Results: From January 2022 to October 2023, the mean IR for HA-CLABSI was 0.63 (n=68) and for HA-BSI was 10.06 (n=1094). Median age of patients with HA-CLABSI was 66 years and with HA-BSI was 68 years. HA-CLABSI and HA-BSI were more common in males (60.86% & 58.68%). Median duration from admission to HA-CLABSI was 20 days and to HA-BSI was 12 days. Median duration from central line insertion to HA-CLABSI was 16 days.
Of 1094 patients, 631 (57.7%) had vascular catheter(s) (i.e., IV cannula, port-a-cath, peripherally-inserted central catheter or central line) inserted at the time of HA-BSI diagnosis, of whom 46 (7.3%) had CLABSI within ±2 days of the positive blood culture. There was no significant correlation between monthly aggregate data from these indicators (Spearman’s correlation coefficient = 0.36, p = 0.1). Predominant organisms causing HA-CLABSI and HA-BSI were gram-negative bacteria (GNB, 40% & 57.21%), gram-positive bacteria (24.71% & 22.23%), and fungi. Common GNB in CLABSI patients were Pseudomonas spp. and Stenotrophomonas maltophilia (8.24%), followed by Serratia marcescens and Klebsiella pneumoniae (5.88%). The most frequent GNB in HA-BSI patients were Escherichia coli (15.4%), Klebsiella pneumoniae (12.68%), and Pseudomonas spp. (6.69%). Common multi-drug resistant organisms were vancomycin-resistant Enterococcus faecium (10.59% & 3.69%) and methicillin-resistant Staphylococcus aureus (10.59% & 3.07%). Conclusion: HA-BSI did not correlate with HA-CLABSI. HA-BSI reflects heterogeneous population outcomes. For use as a surveillance indicator, further assessment of exclusion criteria is required to improve specificity.
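The incidence rates above are simple event counts normalised to patient-days. As a sketch (the patient-day denominator below is hypothetical, chosen only so the arithmetic reproduces a rate near the reported 0.63):

```python
def incidence_rate(events: int, patient_days: float, per: float = 10_000) -> float:
    """Incidence rate per `per` patient-days (default 10,000)."""
    return events / patient_days * per

# Illustrative: 68 HA-CLABSI events over a hypothetical 1,080,000
# patient-days gives ~0.63 per 10,000 patient-days.
ir_clabsi = incidence_rate(68, 1_080_000)
```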
Cervical cancer screening rates in the USA fall behind national targets, requiring innovation to circumvent screening barriers. Cervical cancer screening in which human papillomavirus (HPV) testing is performed on vaginal samples collected by patients themselves (self-sampling) is effective and acceptable, and patient-operated rapid HPV tests (self-testing) are currently under development. It is unclear why there is ambivalence toward HPV self-sampling and self-testing among clinicians, an important stakeholder group. We conducted a convergent mixed-methods (quantitative and qualitative) study to identify the factors influencing clinicians’ attitudes toward self-sampling and self-testing.
Methods:
A survey of Midwest clinicians distributed by professional group media and a market research firm between May and November 2021 was analyzed (n = 248) alongside in-depth interviews with Midwest clinicians from professional groups (n = 23). Logistic regression models examined willingness to support self-sampling and self-testing across respondent characteristics.
Results:
We report that family practice physicians and those in rural areas were more willing to adopt HPV self-sampling (adjusted OR (aOR) = 3.16 [1.43–6.99]; aOR = 2.17 [1.01–4.68]). Clinician willingness to support self-testing was positively associated with current use of self-testing for other conditions and negatively associated with performing 10 or more monthly cervical cancer screenings (aOR = 2.02 [1.03–3.95], aOR = 0.42 [0.23–0.78]). Qualitative data contextualize how clinical specialty and experience with self-sampling and self-testing for other conditions inform clinician perspectives.
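The adjusted odds ratios above come from logistic regression coefficients. The conversion from a fitted coefficient and its standard error to an aOR with a Wald 95% CI is standard; in the sketch below the beta and SE are hypothetical values chosen to roughly reproduce the reported aOR = 3.16 [1.43–6.99], not numbers taken from the study:

```python
from math import exp

def odds_ratio_with_ci(beta: float, se: float, z: float = 1.96):
    """Odds ratio and Wald 95% CI from a logistic-regression
    coefficient `beta` and its standard error `se`:
    OR = exp(beta), CI = exp(beta -/+ z * se)."""
    return exp(beta), exp(beta - z * se), exp(beta + z * se)

# Hypothetical coefficient for family practice vs other specialties
aor, lo, hi = odds_ratio_with_ci(1.15, 0.405)  # roughly 3.16 (1.43-6.99)
```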
Conclusion:
These data identify the clinician populations most accepting of initiatives to implement self-sampling and self-testing for cervical cancer screening, and highlight that experience with other forms of self-testing could facilitate more widespread adoption for cervical cancer screening.
Background: Health equity is a critical consideration in public health research, emphasizing the importance of fair and just access to healthcare resources. This study explores the impact of health equity factors on the incidence rates of Central Line-Associated Bloodstream Infections (CLABSI) and Methicillin-Resistant Staphylococcus aureus (MRSA) across diverse healthcare facilities in Louisiana. Methods: We conducted a comprehensive analysis utilizing 2022 data from the National Healthcare Safety Network (NHSN). Fourteen healthcare facilities were randomly selected from nine regions in Louisiana, with guidance from the 2022 NHSN external validation toolkit. Key health equity factors from the Health Resources and Services Administration (HRSA) were assessed, including urbanicity, medically underserved area/population (MUA/P) status, and primary care health professional shortage area (HPSA_Primary Care) status. Risk ratios were calculated to quantify the association between these health equity factors and the incidence rates of CLABSI and MRSA. Results: The findings reveal several insights into the relationship between health equity factors and infection rates. In urban settings, the risk of CLABSI was lower (Risk Ratio: 0.634, 95% CI: 0.2442–1.646), contrasting with a significantly higher risk of MRSA (Risk Ratio: 1.7, 95% CI: 1.119–2.582). This suggests a complex interplay between urbanicity and the specific infection types. For MUA/P, no significant impact on CLABSI rates was observed (Risk Ratio: 0.963, 95% CI: 0.4225–2.195), but an increased risk of MRSA emerged (Risk Ratio: 1.652, 95% CI: 1.029–2.652). In health professional shortage areas for primary care (HPSA_Primary Care), both CLABSI (Risk Ratio: 1.37, 95% CI: 0.5854–3.204) and MRSA (Risk Ratio: 2.098, 95% CI: 1.305–3.372) exhibited elevated risks, though only the MRSA risk was statistically significant. Conclusions: This research underscores the nuanced relationship between health equity factors and infection rates in healthcare facilities.
Urban settings may contribute to a lower risk of CLABSI but a higher risk of MRSA, emphasizing the need for tailored preventive strategies. Living in medically underserved areas appears to heighten the risk of MRSA, warranting targeted interventions. Additionally, healthcare professional shortage areas for primary care demonstrate potential associations with increased risks for both CLABSI and MRSA. These findings provide valuable insights for public health practitioners, policymakers, and healthcare administrators aiming to address health disparities and enhance infection control measures in diverse healthcare settings. Further research is encouraged to unravel the multifaceted dynamics influencing infection rates and to inform targeted interventions for improved health outcomes.
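Risk ratios of the kind reported above are conventionally computed from 2×2 counts with a log-normal Wald confidence interval. A minimal sketch; the counts below are hypothetical for illustration, since the abstract does not report the underlying numerators and denominators:

```python
import math

def risk_ratio(a, n1, c, n2, z=1.96):
    """Risk ratio of exposed (a/n1) vs unexposed (c/n2) with a
    log-normal Wald confidence interval."""
    rr = (a / n1) / (c / n2)
    # standard error of log(RR)
    se = math.sqrt(1/a - 1/n1 + 1/c - 1/n2)
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, lo, hi

# Hypothetical counts, for illustration only: 30 infections among
# 1000 line-days exposed vs 20 among 1100 unexposed
rr, lo, hi = risk_ratio(30, 1000, 20, 1100)
print(f"RR = {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

A CI that spans 1 (as here) mirrors the abstract's non-significant findings; a CI entirely above 1 mirrors the significant MRSA associations.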
Background: Intravenous vancomycin is commonly used as initial empiric coverage for pneumonia but is often unnecessary. MRSA nasal surveillance cultures (MRSA nasal swabs) have been highlighted in recent literature and the 2019 IDSA Pneumonia Treatment Guidelines as a tool to avoid unnecessary MRSA coverage in pneumonia. A negative MRSA nasal swab can be utilized by clinicians to de-escalate anti-MRSA therapies for pneumonia. The purpose of this study is to determine if implementing stewardship pharmacist-driven MRSA nasal surveillance increases use of the test and reduces the inappropriate use of vancomycin for MRSA coverage in patients with pneumonia. Methods: This study was a retrospective chart review and was approved by the Trinity Health of New England Institutional Review Board. For this initiative, a stewardship pharmacist evaluated all patients receiving vancomycin for anti-MRSA therapy at Saint Francis Hospital and Medical Center in Hartford, CT. If the patient's indication was pneumonia and an MRSA nasal swab had not been ordered, the pharmacist contacted the patient's provider and requested an order for it. Upon receipt of a negative MRSA nasal swab result, the pharmacist recommended discontinuation of vancomycin to the provider if appropriate. Outcomes from the first four weeks of the pharmacist-driven initiative (April 10, 2023 to May 5, 2023) were compared to the four weeks prior to the initiative (March 13, 2023 to April 7, 2023) as a control group. The primary outcome of this study was the percentage of patients who received an MRSA nasal swab. Secondary outcomes included the percentage of patients who had vancomycin appropriately de-escalated based on MRSA nasal swab results and mean length of vancomycin therapy. Results: 116 patients met inclusion criteria: 61 in the control group and 55 in the intervention group.
The percentage of swabs ordered increased from 36.1% (22/61) without pharmacist intervention to 80.0% (44/55) with pharmacist intervention (p < 0.0001). There were also increased rates of vancomycin de-escalation in patients with pharmacist intervention, with 58.2% (32/55) of patients in the intervention group having vancomycin discontinued following a negative MRSA swab compared to 19.7% (12/61) in the control group (p < 0.0001). Conclusion: The results suggest that implementing a pharmacist-driven MRSA nasal surveillance program into practice could increase the number of MRSA nasal swabs ordered and in turn promote more timely de-escalation of vancomycin in patients with pneumonia. The results from this study can be used to support the widespread use of pharmacist-driven MRSA nasal surveillance protocols at other institutions.
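The swab-ordering comparison above can be checked with a standard pooled two-proportion z-test (the abstract does not state which test the authors used; this is one common choice), using the reported 22/61 vs 44/55 counts:

```python
import math

def two_proportion_z(x1, n1, x2, n2):
    """Two-sided pooled z-test for a difference in proportions."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1/n1 + 1/n2))
    z = (p2 - p1) / se
    # two-sided p-value from the standard normal tail
    pval = math.erfc(abs(z) / math.sqrt(2))
    return z, pval

# Swabs ordered: 22/61 without vs 44/55 with pharmacist intervention
z, p = two_proportion_z(22, 61, 44, 55)
print(f"z = {z:.2f}, p = {p:.1e}")
```

The resulting z ≈ 4.8 gives a two-sided p well below 0.0001, consistent with the significance reported in the abstract.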
Synthetic Aperture Radar (SAR) has been used extensively to determine the surface ice flow velocity of tidewater glaciers and investigate changes in seasonal or annual ice dynamics at medium spatial resolution (⩾100 m). However, assessing tidewater glacier behaviour at these resolutions risks missing key details of glacier dynamics, which is particularly important for determination of strain rates that relate to crevasse formation, depth, and ice damage. Here we present surface ice velocity and strain maps with a 16 m posting derived from high-resolution (1 m) PAZ Ciencia spotlight mode SAR imagery for Narsap Sermia, SW Greenland, for October 2019 to February 2021. Results reveal fine details in strain rate, including an area of compression proximal to the terminus, with an upstream shift of strains through time. The velocity evolution of Narsap Sermia shows distinct seasonal changes starting in summer 2020, which are largely modulated by the subglacial drainage system. Comparison of our results with medium-resolution velocity products shows that while these can capture general strain and velocity patterns, our high-resolution data reveal considerably larger ranges of strain values. This is likely to have implications for tuning strain-rate-dependent calving and ice damage parameterisations within numerical models.
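Strain rates are derived from gridded velocity by spatial differentiation. A minimal central-difference sketch; the grid and velocity values below are illustrative toy inputs, not the PAZ-derived fields, with only the 16 m posting taken from the abstract:

```python
def strain_rates(u, v, d):
    """Longitudinal (e_xx), transverse (e_yy) and shear (e_xy) strain
    rates on the interior of regular grids of velocity components
    u, v (m/yr) with spacing d (m), via central differences."""
    ny, nx = len(u), len(u[0])
    exx, eyy, exy = [], [], []
    for j in range(1, ny - 1):
        row_xx, row_yy, row_xy = [], [], []
        for i in range(1, nx - 1):
            dudx = (u[j][i+1] - u[j][i-1]) / (2 * d)
            dvdy = (v[j+1][i] - v[j-1][i]) / (2 * d)
            dudy = (u[j+1][i] - u[j-1][i]) / (2 * d)
            dvdx = (v[j][i+1] - v[j][i-1]) / (2 * d)
            row_xx.append(dudx)
            row_yy.append(dvdy)
            row_xy.append(0.5 * (dudy + dvdx))
        exx.append(row_xx); eyy.append(row_yy); exy.append(row_xy)
    return exx, eyy, exy

# Toy example: flow accelerating in x gives extensional (positive) e_xx
u = [[100 + 10 * i for i in range(4)] for _ in range(4)]
v = [[0] * 4 for _ in range(4)]
exx, eyy, exy = strain_rates(u, v, d=16.0)  # 16 m posting, as in the study
print(exx[0][0])
```

Positive e_xx indicates extension (crevasse-favouring), negative values compression such as the near-terminus zone described above.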
Background: During the COVID-19 pandemic, rates of central line-associated bloodstream infections (CLABSI) increased nationally. Studies pre-pandemic showed improved CLABSI rates with implementation of a standardized vascular access team (VAT). Varying VAT resources and coverage existed in our 10 acute care facilities (ACF) prior to and during the pandemic. VAT scope also varied in 1) process for line selection during initial placement, 2) ability to place a peripherally inserted central catheter (PICC), midline or ultrasound-guided peripheral IV in patients with difficult vascular access, 3) ownership of daily assessment of central line (CL) necessity, and 4) routine CL dressing changes. We aimed to define and implement the ideal VAT structure and evaluate the impact on CLABSI standardized infection ratios (SIR) and rates prior to and during the pandemic. Methods: A multidisciplinary workgroup including representatives from nursing, infection prevention, and vascular access was formed to understand the current state of VAT responsibilities across all ACFs. The group identified key responsibilities a VAT should conduct to aid in CLABSI prevention. Complete VAT coverage was defined as the ability to conduct the identified responsibilities daily. We compared the SIR and CLABSI rates between hospitals that had complete VAT (CVAT) coverage and hospitals with incomplete VAT (IVAT) coverage. Given this work occurred during the pandemic, we further stratified our analysis based on a time frame prior to the pandemic (1/2015–12/2019) and intra-pandemic (1/2020–12/2022). Results: The multidisciplinary team identified 6 key components of complete VAT coverage: assessment for appropriate line selection prior to insertion, ability to insert PICC and midlines, daily CL and midline care and maintenance assessments, daily assessment of necessity for CL, and weekly dressing changes for CL and midlines.
A crosswalk of VAT scope (Figure 1) was performed in October 2022, which revealed two facilities (A and E) that met CVAT criteria. Pre-pandemic, while IVAT CLABSI rates and SIR were higher than in CVAT units, the difference was not statistically significant. During the pandemic, however, CLABSI rates and SIR were 40-50% higher in IVAT compared to CVAT facilities (Incidence Rate Ratio 1.5, 95% CI 1.1-2.0 and SIR Relative Ratio 1.4, 95% CI 1.1-1.9, respectively) (Table 1). Conclusions: CLABSI rates were lower in facilities with complete VAT coverage prior to and during the COVID-19 pandemic, suggesting a highly functioning VAT can aid in preventing CLABSIs, especially when a healthcare system is stressed and resources are limited.
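Incidence rate ratios such as the one reported above are computed from event counts and exposure time, with a Wald confidence interval on the log scale. A sketch with hypothetical counts chosen only to illustrate (the true counts and central-line days behind the reported 1.5 are not given in the abstract):

```python
import math

def incidence_rate_ratio(c1, t1, c2, t2, z=1.96):
    """IRR of group 1 vs group 2 from event counts c and exposure
    time t, with a log-normal Wald 95% confidence interval."""
    irr = (c1 / t1) / (c2 / t2)
    se = math.sqrt(1/c1 + 1/c2)  # SE of log(IRR)
    lo = math.exp(math.log(irr) - z * se)
    hi = math.exp(math.log(irr) + z * se)
    return irr, lo, hi

# Hypothetical CLABSI counts over central-line days, illustration only:
irr, lo, hi = incidence_rate_ratio(90, 60000, 60, 60000)
print(f"IRR = {irr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

With these illustrative counts the interval excludes 1, which is the sense in which the pandemic-period IVAT-vs-CVAT difference above is statistically significant.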
Background: Community-Acquired Pneumonia (CAP) is the most common reason for antibiotic treatment in hospitalized adults. Some prior studies have found treatment differences by race/ethnicity, but research on the topic is limited, results are mixed, and it is unclear if clinical outcomes are affected. We sought to examine whether guideline-concordant CAP care and patient outcomes varied by race/ethnicity. Methods: Using the Vizient clinical database, we conducted a cross-sectional analysis of all hospitalized patients ≥18 years of age with a primary diagnosis of pneumonia (ICD-10 codes: J12-J18) from 2018-2021. Univariate and bivariate analyses examined the distribution of demographic, clinical and hospital characteristics across race/ethnicity. The primary outcome was receipt of therapy concordant with the ATS/IDSA Clinical Practice Guideline for CAP. Final models included only patients with bacterial pneumonia and examined the relationship between race/ethnicity and guideline-concordant antibiotic treatment. Secondary analysis examined the interaction between race/ethnicity and concordant antibiotic treatment with length of stay >7 days, 30-day hospital readmission, and adverse events or complications in separate models. We used hierarchical multivariable regression models accounting for clustering within patients and among patients hospitalized at the same facility. Due to sample size, significance was assessed with an OR ≥ 1.2 and p ≤ 0.05. All analyses used SAS (v.9.4, SAS Institute Inc., Cary, NC). Results: There were 1,277,770 admissions with a primary diagnosis of bacterial CAP. Sixty-nine percent of the sample was White, 18% Black, 8% Hispanic, 2% Asian and 3% identified as other. Fifty-six percent of the sample received concordant care. In adjusted models, Black patients had greater odds of overall concordant care (OR 1.22), lower odds of length of stay >7 days (OR 0.67, p < .0001) and of complication or adverse event (OR 0.75, p < .0001), but no difference in readmission within 30 days.
Conclusion: We observed differences between Black and White patients in the receipt of concordant treatment. Hospital bed size, case mix index (CMI) and region played an important role in both antibiotic treatment decisions and clinical outcomes, indicating that hospital and regional prescribing cultures may play a role in treatment inequities.
Background: Invasive Staphylococcus aureus infections cause significant morbidity and mortality in neonatal intensive care unit (NICU) infants [1]. Colonization (asymptomatic carriage in the nose, skin, or gut) is a risk factor for subsequent invasive infection (e.g., pneumonia, bone infections, bloodstream infections). Active surveillance and decolonization measures for S. aureus-colonized infants have been associated with decreased invasive infection rates [2-4]. Methods: A methicillin-resistant S. aureus (MRSA) surveillance and decolonization program, consisting of admission and weekly MRSA nasal cultures followed by intranasal mupirocin plus chlorhexidine baths for colonized infants, was implemented in 2006 in our 150-bed level IV NICU [5]. Due to poor compliance with decolonization protocols [5], existing practices were reviewed and multiple interventions to increase compliance were implemented in 2018. These renewed efforts included revision of the existing MRSA decolonization protocol, updating the associated electronic medical record order set, re-education of unit staff, and weekly review by the Infection Prevention (IP) and NICU leadership teams to ensure the decolonization protocol was followed for newly colonized infants. Mean MRSA bloodstream infection (BSI) rates were calculated quarterly pre- (January 2014-December 2017) and post- (January 2018-December 2023) implementation of renewed efforts and compared via unpaired t-test. In July 2020, a similar methicillin-susceptible S. aureus (MSSA) surveillance and decolonization program was implemented with an associated revision of existing documents, education campaign, and weekly review of infants with new MSSA colonization. Mean MSSA BSI rates pre- (July 2018-June 2020) and post- (July 2020-December 2023) implementation were compared via unpaired t-test.
Results: Renewed implementation of MRSA surveillance and decolonization was associated with a sustained decrease in the mean MRSA BSI rate (Figure 1): 0.10 per 1000 patient-days pre-implementation, 0.03 post-implementation (p=0.02). Following implementation of MSSA surveillance and decolonization, there was no statistically significant change in the mean MSSA BSI rate (Figure 2): 0.20 per 1000 patient-days pre-implementation, 0.15 post-implementation (p=0.32). Conclusions: Implementation of a robust MRSA surveillance and decolonization program in the NICU was associated with a sustained decrease in invasive MRSA infections. No change in invasive MSSA infection rates was observed following implementation of a similar protocol for MSSA. Additional research is needed to better understand the role of MSSA surveillance and decolonization in the NICU.
References: 1. Ericson, J.E., et al., JAMA Pediatr, 2015. 2. Popoola, V.O., et al., ICHE, 2016. 3. Kotloff, K.L., et al., Pediatrics, 2019. 4. Voskertchian, A., et al., ICHE, 2018. 5. Reich, P.J., et al., Clin Microbiol Infect, 2016.
Background: Catheter-associated urinary tract infections (CAUTIs) constitute forty percent of nosocomial infections, yet their direct link with mortality remains debated. In 2009, the National Healthcare Safety Network (NHSN) estimated the economic burden of CAUTIs in the U.S. to be over $340 million. Limited data exist on inter-physician concordance in diagnosing CAUTIs, especially in patients with abnormal genitourinary (GU) anatomy. Our study assessed inter-provider variability in diagnosing CAUTI in 50 such patients, including those meeting NHSN criteria. Methods: We included a random set of 50 adults (18+) with abnormal GU anatomy admitted to the University of Miami hospitals from January 2018 to November 2021 who had a urinary Foley catheter and at least one positive urine culture during their hospitalization. Three Infectious Disease fellows and three board-certified Infectious Disease physicians independently reviewed each patient's chart, classifying them as having or not having a CAUTI. Inter-physician reliability was assessed using kappa statistics. Results: Our findings highlight substantial variation in clinician-determined CAUTI incidence among the 50 patients with abnormal GU anatomy, ranging from 8% to 32% (Figures 2,3). Inter-rater agreement on CAUTI diagnosis was generally poor. References: Hollenbeak CS, et al. The attributable cost of catheter-associated urinary tract infections in the United States: A systematic review. Am J Infect Control. 2018 Jul;46(7):751-757. Trautner BW, et al. Development and validation of an algorithm to recalibrate mental models and reduce diagnostic errors associated with catheter-associated bacteriuria. BMC Med Inform Decis Mak. 2013 Apr 15;13:48. Gafary M, et al. Catheter Associated Urinary Tract Infections (CAUTI) in Bladder Cancer Patients Post Cystectomy With a Neobladder. Open Forum Infectious Diseases. 2015;2(suppl_1):293.
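Cohen's kappa, the pairwise form of the kappa statistics used for inter-physician reliability above, corrects observed agreement for agreement expected by chance. A minimal two-rater sketch; the reviewer calls below are hypothetical, not the study's actual chart classifications:

```python
def cohens_kappa(r1, r2):
    """Cohen's kappa for two raters' binary CAUTI / no-CAUTI calls."""
    assert len(r1) == len(r2)
    n = len(r1)
    po = sum(a == b for a, b in zip(r1, r2)) / n  # observed agreement
    pe = 0.0                                      # chance agreement
    for cat in set(r1) | set(r2):
        pe += (r1.count(cat) / n) * (r2.count(cat) / n)
    return (po - pe) / (1 - pe)

# Hypothetical calls by two reviewers on 10 charts (1 = CAUTI):
a = [1, 0, 0, 1, 0, 0, 0, 1, 0, 0]
b = [1, 1, 0, 0, 0, 0, 1, 1, 0, 0]
print(round(cohens_kappa(a, b), 2))
```

Raw agreement of 70% here shrinks to a kappa of about 0.35 once chance agreement is removed, illustrating how seemingly similar reviewers can still show poor chance-corrected concordance.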
Background: Patients without urinary tract infection (UTI) symptoms but with a positive urine culture are considered to have asymptomatic bacteriuria (ASB). This often represents colonization, and treatment is not recommended or clinically beneficial. Treatment of ASB can promote antimicrobial resistance and increased rates of Clostridioides difficile infection. Many cases of ASB are incorrectly assigned as catheter-associated urinary tract infections (CAUTIs) due to over-culturing practices. We hypothesized that a urine culture algorithm, embedded within a best practice alert (BPA) in the electronic medical record (EMR), would reduce urine culturing practices for ASB. Methods: From Feb 2022 through May 2023, a multidisciplinary team implemented an Inpatient Urine Culturing Stewardship Guideline. A BPA fired when a provider placed a urinalysis with reflex to culture (UACC) or urine culture (UC) order for patients who met criteria (Image 1). The BPA directed providers to remove the order, select the appropriate pathway from the guideline, or provide a rationale for placing the order. The intervention was piloted on three intensive care units and two progressive care units, containing both medical and surgical patients. Monthly ordering practices, CAUTI rates, and gram-negative rod (GNR) bacteremia rates from a 13-month pre-intervention baseline period were compared to a 16-month intervention period. Over the same time periods, we also assessed changes in ordering practices for comparison units which did not implement the intervention. Pre- and post-intervention cohorts were analyzed using median two-sample tests and the exact Poisson method, as appropriate. Results: On intervention units there was a 41.0% reduction in the median number of UACC and UC orders per 1000 patient days from 16.31 during the baseline period to 9.62 in the intervention period (p=0.0036). Pan cultures per 1000 patient days in which one of the orders was a UACC or UC fell by 42.2% from a median of 10.20 per 1000 patient days to 5.90 (p=0.0008).
The comparison units saw no significant reductions in UACC and UC orders (p=0.21) or pan cultures (p=1.0). On the intervention units, the CAUTI rate for the baseline period was 1.31 per 1000 catheter days versus 0.79 in the intervention period (IRR = 1.65; p=0.44). GNR bacteremias remained stable on the intervention units between the baseline and intervention periods (p=0.82). Conclusion: This multidisciplinary intervention, leveraging EMR clinical decision support, reduced urine and pan culturing practices while demonstrating a trend towards a reduced CAUTI rate. The prevalence of GNR bacteremias remained consistent with baseline levels, suggesting the intervention did not cause harm.
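The exact Poisson method named in the Methods above is commonly implemented by conditioning on the total event count, which turns the two-rate comparison into an exact binomial test. A sketch with hypothetical counts (the study's raw order counts and patient-day denominators are not reported here):

```python
import math

def exact_rate_test(c1, t1, c2, t2):
    """Exact two-sided test of equal Poisson rates: conditional on the
    total count, c1 ~ Binomial(c1 + c2, t1 / (t1 + t2)) under H0."""
    n, p0 = c1 + c2, t1 / (t1 + t2)
    pmf = [math.comb(n, k) * p0**k * (1 - p0)**(n - k) for k in range(n + 1)]
    # two-sided p: sum over all outcomes no more likely than the observed one
    return min(1.0, sum(q for q in pmf if q <= pmf[c1] + 1e-12))

# Hypothetical event counts over equal patient-day denominators,
# for illustration only:
p = exact_rate_test(16, 1000, 10, 1000)
print(f"p = {p:.3f}")
```

With small event counts like these the exact test is preferable to a normal approximation, which is presumably why it was chosen for the CAUTI and bacteremia rate comparisons.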