Quality improvement programmes (QIPs) are designed to enhance patient outcomes by systematically introducing evidence-based clinical practices. The CONQUEST QIP focuses on improving the identification and management of patients with COPD in primary care. The process of developing CONQUEST, recruiting, preparing systems for participation, and implementing the QIP across three integrated healthcare systems (IHSs) is examined to identify and share lessons learned.
Approach and development:
This review is organized into three stages: 1) development, 2) preparing IHSs for implementation, and 3) implementation. In each stage, key steps are described with the lessons learned and how they can inform others interested in developing QIPs designed to improve the care of patients with chronic conditions in primary care.
Stage 1 was establishing and working with steering committees to develop the QIP Quality Standards, define the target patient population, assess current management practices, and create a global operational protocol. Additionally, potential IHSs were assessed for feasibility of QIP integration into primary care practices. Factors assessed included a review of technological infrastructure, QI experience, and capacity for effective implementation.
Stage 2 was preparation for implementation. Key steps included enlisting clinical champions to advocate for the QIP, secure participation in primary care, and establish effective communication channels. Preparation for implementation also required obtaining IHS approvals, ensuring Health Insurance Portability and Accountability Act compliance, and devising operational strategies for patient outreach and clinical decision support delivery.
Stage 3 was developing three IHS implementation models. With insight into the local context from local clinicians, implementation models were adapted to work with the resources and capacity of the IHSs while ensuring the delivery of essential elements of the programme.
Conclusion:
Developing and launching a QIP across primary care practices requires extensive groundwork, preparation, and committed local champions to help build an adaptable environment that encourages open communication and is receptive to feedback.
When conducting overviews of reviews, investigators must measure and describe the extent to which the included systematic reviews (SRs) contain the same primary studies. The corrected covered area (CCA) quantifies overlap by counting how often primary studies are included across a set of SRs. In this article, we introduce a modification to the CCA, the weighted CCA (wCCA), which accounts for differences in the information contributed by primary studies. The wCCA adjusts the original CCA by weighting studies by the square roots of their sample sizes. By weighting primary studies according to their precision, the wCCA provides a useful and complementary representation of overlap in evidence syntheses.
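As an illustrative sketch (not the authors' published formulation), the CCA can be computed as a weighted sum over primary studies in which every study has weight 1, and the wCCA substitutes the square root of each study's sample size. The function and variable names below are assumptions for illustration only.

```python
import math

def cca(inclusion, sample_sizes=None):
    """Corrected covered area over a set of systematic reviews.

    inclusion    : dict mapping study id -> set of reviews that include it
    sample_sizes : optional dict mapping study id -> sample size; when
                   given, each study is weighted by the square root of
                   its sample size (the wCCA modification)
    """
    n_reviews = len({rev for revs in inclusion.values() for rev in revs})
    overlap = capacity = 0.0
    for study, revs in inclusion.items():
        w = math.sqrt(sample_sizes[study]) if sample_sizes else 1.0
        overlap += w * (len(revs) - 1)    # repeat inclusions of this study
        capacity += w * (n_reviews - 1)   # maximum possible repeat inclusions
    return overlap / capacity
```

With unit weights this reduces to the classic CCA = (N - r)/(rc - r), where N is the total number of inclusions, r the number of unique primary studies, and c the number of reviews; with sample-size weights, larger (more precise) studies contribute more to the overlap estimate.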
Many post-acute and long-term care settings (PALTCs) struggle to measure antibiotic use via the standard metric, days of therapy (DOT) per 1000 days of care (DOC). Our objective was to develop antibiotic use metrics more tailored to PALTCs.
Design:
Retrospective cohort study with a validation cohort.
Setting:
PALTC settings within the same network.
Methods:
We obtained census data and pharmacy dispensing data for 13 community PALTCs (January 2020–December 2023). We calculated antibiotic DOT/1000 DOC, DOT per unique residents, and antibiotic starts per unique residents, at monthly intervals for community PALTCs. The validation cohort was 135 Veterans Affairs Community Living Centers (VA CLCs). For community PALTCs only, we determined the DOT and antibiotic starts per unique residents cared for by individual prescribers.
Results:
For community PALTCs, the correlation between facility-level antibiotic DOT/1000 DOC and antibiotic DOT/unique residents and antibiotic courses/unique residents was 0.97 (P < 0.0001) and 0.84 (P < 0.0001), respectively. For VA CLCs, those values were 0.96 (P < 0.0001) and 0.85 (P < 0.0001), respectively. At community PALTCs, both novel metrics permitted assessment and comparison of antibiotic prescribing among practitioners.
Conclusion:
At the facility level, the novel metric antibiotic DOT/unique residents demonstrated strong correlation with the standard metric. In addition to supporting tracking and reporting of antibiotic use among PALTCs, antibiotic DOT/unique residents permits visualization of the antibiotic prescribing rates among individual practitioners, and thus peer comparison, which in turn can lead to actionable feedback that helps improve antibiotic use in the care of PALTC residents.
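The facility-level metrics compared in this study are simple ratios over a reporting interval; a minimal sketch, with function and argument names that are illustrative rather than drawn from the study protocol:

```python
def antibiotic_use_metrics(dot, days_of_care, unique_residents, starts):
    """Standard and novel antibiotic use metrics for one facility-month.

    dot              : total antibiotic days of therapy (DOT)
    days_of_care     : total resident days of care (DOC)
    unique_residents : number of unique residents cared for
    starts           : number of antibiotic courses started
    """
    return {
        "dot_per_1000_doc": 1000 * dot / days_of_care,     # standard metric
        "dot_per_resident": dot / unique_residents,        # novel metric
        "starts_per_resident": starts / unique_residents,  # novel metric
    }
```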
Background: Infectious Diseases Society of America guidelines recommend antibiotic prescribing for urinary tract infections (UTIs) when there is a positive culture and signs and symptoms of infection. Despite these guidelines, prescribing for asymptomatic bacteriuria remains prevalent. We conducted a chart review of UTI outpatient encounters to determine the prevalence of antibiotic prescribing as well as the patient and provider factors associated with inappropriate prescribing for UTIs. Methods: Patients who were seen at any Department of Veterans Affairs (VA) outpatient clinic with a positive urine culture from 1/1/2019-12/31/2022 were evaluated for inclusion. Exclusion criteria were pregnancy, neutropenia, neurogenic bladder, spinal cord injury/disorder, chronic kidney disease stage III and above, and urologic surgical procedures within 7 days. Inappropriate prescribing was defined as an antibiotic prescription given for UTI treatment when no signs or symptoms of infection were recorded during the patient encounter. Chi-square, Fisher’s exact, and t-tests were used to evaluate the association between patient and provider characteristics and antibiotic prescribing. Results: Among 341 visits, most patients were male (70%), White (40%), older (mean age 65.8 ± 15.9 years), and treated at an urban facility (57%). Antibiotics were prescribed at 67% (229/341) of visits. Of the 229 antibiotic courses prescribed, 119 (52%) were appropriate, issued to patients with ≥1 sign or symptom consistent with a urinary tract infection. The most common symptom recorded was dysuria, followed by frequency, urgency, and hematuria (Figure 1). The remaining 110 (48%) antibiotic prescriptions were inappropriate, given to patients without documented UTI-related signs or symptoms. The proportion of inappropriate prescribing was higher among advanced practice practitioners (39/56; 69%) than among physicians (68/113; 60%; P < 0.0001).
Prescribing of an antibiotic did not differ by gender (p = 0.3779), race (p = 0.3972), age (p = 0.7461), or urban versus rural geography (p = 0.3647). Discussion: In outpatient clinics, nearly half of the antibiotics prescribed to patients with a positive urine culture were given in the absence of documented signs or symptoms of a UTI. These results suggest that interventions to improve antibiotic use for UTI-related concerns in the outpatient setting should address UTI-related signs and symptoms as well as asymptomatic bacteriuria. Advanced practice practitioners were more likely than physicians to prescribe without documentation of relevant signs or symptoms. Improving meaningful documentation of the presence or absence of signs and symptoms of a UTI may help reduce inappropriate antibiotic prescriptions in the outpatient setting.
Disclosure: Robin Jump: Research support to my institution from Merck and Pfizer; Advisory boards for Pfizer
Background: In rural areas, antimicrobial stewardship programs often have limited access to infectious disease (ID) expertise. Videoconference Antimicrobial Stewardship Teams (VASTs) pair rural Veterans Affairs (VA) medical centers with an ID expert to discuss the treatment of patients with concerns for infection. In a pilot study, VASTs were effective at improving antimicrobial use. Here, we evaluated 12-month staffing and operating costs for 3 VASTs. Methods: We used the following data to describe 12 months of clinical encounters for 3 VASTs operating from January 2022 – March 2023: the number of VAST sessions completed and clinical encounters; Current Procedural Terminology (CPT) codes associated with clinical encounters; and session attendees (by role) and the time spent (percent effort) on VAST-related activities. The annual operating cost was based on the annual salaries and percent effort of VAST attendees. We combined these characteristics with private-sector and Medicare reimbursements to evaluate the cost of implementation and the number of clinical encounters needed to offset those costs (breakeven) for each site. Results: Three VASTs recorded 229 clinical encounters during 117 sessions (Table 1). Based on CPT codes, the approximate revenue per patient was $516.46. Site A, the only site to break even, had the most sessions and clinical encounters as well as the lowest operating costs. For Site B, a slight increase in clinical encounters, which might be achieved by 3 additional VAST sessions, would achieve breakeven. For Site C, increasing the number of clinical encounters to 3-4 per session would have allowed the VAST to break even without requiring a decrease in operating costs. Conclusions: The frequency of VAST sessions, the volume of clinical encounters, and low operating costs all contributed to the VAST at Site A achieving a financial break-even point within 12 months.
Consideration of the potential number of clinical encounters and sessions will help other VASTs achieve financial sustainability, independent of cost savings related to potential decreases in expenditures for antibiotics and antibiotic-related adverse events. These results also provide insight into the possible adoption and diffusion of VAST-like programs in the Medicare hospital setting.
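The breakeven logic described above amounts to dividing annual operating cost by the average revenue per encounter; a hedged sketch, where the function name and the assumption of a flat $516.46 per encounter are ours:

```python
import math

def breakeven_encounters(annual_operating_cost, revenue_per_encounter=516.46):
    """Minimum number of clinical encounters for reimbursement revenue to
    offset a VAST's annual operating cost, assuming a flat average revenue
    per encounter (defaulting to the reported CPT-based $516.46)."""
    return math.ceil(annual_operating_cost / revenue_per_encounter)
```

For example, a site with $100,000 in annual operating costs would need roughly 194 clinical encounters per year to break even under this assumption.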
Background: Clostridioides difficile infection (CDI) is associated with 500,000 infections and 30,000 deaths per year. Inappropriate testing and treatment of patients with asymptomatic colonization occurs frequently (between 15% and 41%). The VA CDI guidelines emphasize avoiding CDI testing in patients with laxative use within the previous 48 hours because of the high likelihood of non-infectious diarrhea. The objective of this study was to assess laxative administration among inpatients tested for CDI in VA hospitals and to identify factors associated with guideline discordance. Methods: Adults hospitalized in Illinois, Wisconsin, and Michigan VA Medical Centers from January 2019-December 2022 with a CDI test performed during the admission were included. CDI tests included toxin B gene polymerase chain reaction or toxin enzyme immunoassay. Tests were defined as positive, negative, or cancelled according to the diagnostic protocols of the VA testing laboratories. Laxative use, patient demographics, admission data, and comorbidities were collected from the VA Corporate Data Warehouse. Guideline-discordant testing was defined as a diagnostic test for CDI ordered within 48 hours of a recorded laxative dose. Factors associated with discordant testing were analyzed using clustered binomial logistic regression models. Analyses were completed using SAS 9.4. Results: There were 7,326 tests ordered for 4,888 patients during the study. Patients were predominantly White (61.8%), male (95.6%), and elderly (mean age = 70.0, standard deviation = 12.1). Most (59.0%) patients had received at least one dose of a laxative in the 48 hours preceding their CDI test. Being Black (odds ratio (OR) = 0.86, 95% confidence interval (CI) 0.76-0.98) or Hispanic (OR = 0.62, 95% CI 0.48-0.82), vs White, was associated with a decreased likelihood of inappropriate testing due to recent laxative use.
Being tested at a rural facility (OR = 1.23, 95% CI 1.07-1.41, vs urban), within a long-term care (LTC) unit (OR = 1.67, 95% CI 1.41-1.97, vs inpatient), or within an intensive care unit (ICU) (OR = 1.40, 95% CI 1.24-1.59) was associated with an increased likelihood of being inappropriately tested. Guideline-discordant tests were more likely to have negative results (OR = 1.25, 95% CI 1.05-1.49) than guideline-concordant tests. Discussion: Laxative administration in the 48 hours preceding CDI testing was common among hospitalized Veterans and was associated with a lower likelihood of positive results. This echoes non-VA studies, in which laxative use was reported at 44%. The increased likelihood of guideline-discordant testing in ICU and LTC settings suggests the need for greater diagnostic stewardship interventions in those settings. Additionally, further work to determine the negative outcomes associated with inappropriate testing is needed.
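The guideline-discordance definition used in this study reduces to a 48-hour look-back window before the test order; a minimal sketch of that check (function and argument names are illustrative):

```python
from datetime import datetime, timedelta

def is_guideline_discordant(test_time, laxative_dose_times, window_hours=48):
    """True when a CDI test was ordered within `window_hours` after any
    recorded laxative dose, i.e. guideline-discordant testing as defined
    in this study."""
    window = timedelta(hours=window_hours)
    return any(timedelta(0) <= test_time - dose <= window
               for dose in laxative_dose_times)
```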
Decreasing the time to contact precautions (CP) is critical to carbapenem-resistant Enterobacterales (CRE) prevention. Identifying factors associated with delayed CP can decrease the spread from patients with CRE. In this study, a shorter length of stay was associated with being placed in CP within 3 days.
Clinical outcomes of repetitive transcranial magnetic stimulation (rTMS) for treatment-resistant depression (TRD) vary widely, and no mood rating scale is standard for assessing rTMS outcome. It remains unclear whether rTMS is as efficacious in older adults with late-life depression (LLD) as in younger adults with major depressive disorder (MDD). This study examined the effect of age on outcomes of rTMS treatment of adults with TRD. Self-report and observer mood ratings were measured weekly in 687 subjects ages 16–100 years undergoing rTMS treatment using the Inventory of Depressive Symptomatology 30-item Self-Report (IDS-SR), the Patient Health Questionnaire 9-item (PHQ), the Profile of Mood States 30-item, and the Hamilton Depression Rating Scale 17-item (HDRS). All rating scales detected significant improvement with treatment; response and remission rates varied by scale but not by age (response/remission ≥ 60 years: 38%–57%/25%–33%; < 60 years: 32%–49%/18%–25%). Proportional hazards models showed that early improvement predicted later improvement across ages, though early improvements in PHQ and HDRS were more predictive of remission in those < 60 years (relative to those ≥ 60), and greater baseline IDS symptom burden was more predictive of non-remission in those ≥ 60 years (relative to those < 60). These results indicate no significant effect of age on rTMS treatment outcomes for TRD, though rating instruments may differ in their assessment of symptom burden between younger and older adults during treatment.
To describe antimicrobial therapy used for multidrug-resistant (MDR) Acinetobacter spp. bacteremia in Veterans and impacts on mortality.
Methods:
This was a retrospective cohort study of hospitalized Veterans Affairs patients from 2012 to 2018 with a positive MDR Acinetobacter spp. blood culture who received antimicrobial treatment 2 days prior to through 5 days after the culture date. Only the first culture per patient was used. The association between treatment and patient characteristics was assessed using bivariate analyses. Multivariable logistic regression models examined the relationship between antibiotic regimen and in-hospital, 30-day, and 1-year mortality. Generalized linear models were used to assess cost outcomes.
Results:
MDR Acinetobacter spp. was identified in 184 patients. Most cultures identified were Acinetobacter baumannii (90%); 3% were Acinetobacter lwoffii, and 7% were other Acinetobacter species. Penicillin/β-lactamase inhibitor combinations (51.1%) and carbapenems (51.6%) were the most prescribed antibiotics. In unadjusted analysis, extended-spectrum cephalosporins and penicillin/β-lactamase inhibitor combinations were associated with decreased odds of 30-day mortality, but the associations were not significant after adjustment (adjusted odds ratio (aOR) = 0.47, 95% CI 0.21–1.05 and aOR = 0.75, 95% CI 0.37–1.53, respectively). There was no association between combination therapy vs monotherapy and 30-day mortality (aOR = 1.55, 95% CI 0.72–3.32).
Conclusion:
In hospitalized Veterans with MDR Acinetobacter spp. bacteremia, none of the treatments was shown to be associated with in-hospital, 30-day, or 1-year mortality. Combination therapy was not associated with decreased mortality.
This project surveyed Veterans’ COVID-19 vaccination beliefs and status. 1,080 (30.8%) Veterans responded. Factors associated with being unvaccinated, identified using binomial logistic regression, included negative feelings about vaccines (OR = 3.88, 95%CI = 1.52, 9.90) and logistical difficulties such as finding transportation (OR = 1.95, 95%CI = 1.01, 3.45). This highlights the need for education about and access to vaccination.
Background: Antimicrobial stewardship programs (ASPs) seek to reduce the prevalence of antimicrobial-resistant and healthcare-associated infections. There are limited infectious disease (ID) physicians and pharmacists to support these ASPs, particularly in rural areas. The Veterans Health Administration has a robust telehealth program in place. Our previous work has demonstrated the feasibility of using telehealth modalities to support ASPs at rural Veterans Affairs medical centers (VAMCs) by pairing them with an ID expert from a larger, geographically distant, VAMC. This program, dubbed the Videoconference Antimicrobial Stewardship Team (VAST), emphasizes discussion of patients undergoing treatment for an active infection and additional relevant clinical topics with a multidisciplinary team at the rural VA. VAST implementation is ongoing at VAMCs. To understand and compare the qualitative differences in implementation, we used process maps to describe the VAST at 3 VAMC dyads. Methods: Team members from each dyad participated in interviews at 3, 6, and 9 months after beginning their VAST sessions. Questions addressed several aspects of VAST implementation and included identifying cases and topics to discuss; advance preparation for meetings; the frequency and general structure of VAST meetings; and documentation including workload capture. The research team used the responses to develop process maps to permit visual display and comparison of VAST implementation. Results: The first dyad began in January 2022 and the third in March 2022. The sessions had 3 phases: preparation, team meeting, and documentation of experts’ recommendations. Tasks were shared between VAST champions at the rural VAMC and the ID experts (Fig. 1). The preparation phase showed the most variation among the 3 dyads. In general, champions at the rural VA identified cases and topics for discussion that were sent to the ID expert for review. 
The approaches used to find cases and the type of preparatory work by the ID expert differed. Team meetings differed in both frequency and participation by professionals from the rural site. Documentation processes for expert recommendations appeared similar across the dyads. Discussion: Each of the 3 dyads implemented VAST differently. These results suggest that the overall structure of the VAST is readily adaptable and that each site tailored the VAST to suit the clinical needs, workflow, and culture of its partner facility. Future work will seek to determine which aspects of the preparation, team meeting, and documentation phases are associated with successful ASPs, including assessment of quantitative and qualitative outcomes.
Background: The NHSN Antimicrobial Resistance (AR) Option is an important avenue for acute-care hospitals to electronically report facilitywide antibiogram data. The NEDSS Base System (NBS) is the statewide surveillance system for mandatory reporting of all carbapenem-resistant Enterobacteriaceae (CRE) cases. The state health department (SHD) validated CRE case data reported through the AR Option to assess completeness and accuracy. Methods: NHSN AR Option data from July 2020–December 2021 for 24 facilities were validated by comparing reported CRE and susceptibility results to CRE isolates reported via the NBS. Isolates were matched based on specimen date, sex, birth month and day, pathogen, and specimen source. NHSN susceptibility results were dichotomized as “not resistant” and “resistant” to match the NBS results. Susceptibility discordance (differing proportions of resistant isolates) among matched pairs was evaluated using the McNemar exact test in SAS version 9.4 software. Results: The SHD identified 270 CRE cases from the NHSN and 1,254 unique CRE isolates from the NBS. Of the NHSN events, 72 (26.67%) were matched to the NBS. Among matched isolates, discordance was significant for doripenem (0 resistant isolates in the NHSN vs 13 in the NBS; P < .001) and imipenem (5 resistant isolates in the NHSN vs 23 in the NBS; P < .0001). Discordance was not significant for either ertapenem or meropenem. Sensitivity analyses maximized the match rate at 30.74% (83 matches) when NBS isolates from unknown sources were included and the matching factors were specimen date, date of birth ± 1 day, and pathogen. Among all 6,325 CRE isolates in the NBS, 290 (4.58%) did not have a specimen source provided. Of all 47,348 NHSN events, 7,624 (16.10%) had impossible patient birth dates. Conclusions: Many NHSN isolates could not be matched to the NBS, owing either to isolates missing from the NBS or to data differences across the systems.
This mismatch highlights the need for data validation and standardization at the point of entry for both systems. Discordant susceptibility results raise concerns about using the NHSN as a source of facility and regional antibiogram data.
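The matching procedure described in the methods amounts to joining the two surveillance systems on a composite key; a simplified sketch, with field names that are our assumptions rather than the actual schemas:

```python
def match_key(record):
    # Composite key: specimen date, sex, birth month and day, pathogen,
    # and specimen source, per the matching factors described above.
    return (record["specimen_date"], record["sex"],
            record["birth_month_day"], record["pathogen"],
            record["source"])

def match_events(nhsn_events, nbs_isolates):
    """Return the NHSN events whose composite key also appears in the NBS."""
    nbs_keys = {match_key(r) for r in nbs_isolates}
    return [e for e in nhsn_events if match_key(e) in nbs_keys]
```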
Background: Healthcare settings without access to infectious diseases experts may struggle to implement effective antibiotic stewardship programs. We previously described a successful pilot project using the Veterans Affairs (VA) telehealth system to form a Videoconference Antimicrobial Stewardship Team (VAST) that connected multidisciplinary teams from rural VA medical centers (VAMCs) with infectious diseases experts at geographically distant locations. VASTs discussed patients from the rural VAMC, with the overarching goal of supporting antibiotic stewardship. This project is currently ongoing. Here, we describe preliminary outcomes from 4 VASTs: the cases discussed, the recommendations made, and the acceptance of those recommendations. Methods: Cases discussed at any of the 4 participating intervention sites were independently reviewed by study staff, noting the infectious disease diagnoses, the recommendations made by infectious diseases experts and, when applicable, the acceptance of those recommendations at the rural VAMC within 1 week. Discrepancies between independent reviewers were discussed and, when consensus could not be reached, were resolved with an infectious diseases clinician. Results: The VASTs serving 4 different rural VAMCs discussed 96 cases involving 92 patients. Overall, infection of the respiratory tract was the most common syndrome discussed by VASTs (Fig. 1). The most common specific diagnoses among discussed cases were cellulitis (n = 11), acute cystitis (n = 11), wounds (n = 11), and osteomyelitis (n = 10). Of 172 recommendations, 41 (24%) related to diagnostic imaging or laboratory results and 38 (22%) were to change the antibiotic agent, dose, or duration (Fig. 2). Of the 151 recommendations that could be assessed via chart review, 122 (81%) were accepted within 1 week. Conclusions: These findings indicate successful implementation of telehealth to connect clinicians at rural VAMCs with an offsite infectious diseases expert.
The cases represented an array of common infectious syndromes. The most frequent recommendations pertained to getting additional diagnostic information and to adjusting, but not stopping, antibiotic therapy. These results suggest that many of the cases discussed warrant antibiotics and that VASTs may use the results of diagnostic studies to tailor that therapy. The high rate of acceptance suggests that the VASTs are affecting patient care. Future work will describe VAST implementation at 4 additional VAMCs, and we will assess whether using telehealth to disseminate infectious diseases expertise to rural VAMCs supports changes in antibiotic use that align with principles of antimicrobial stewardship.
Background: Respiratory tract infections (RTIs) in long-term care facilities (LTCFs) are particularly burdensome among residents; the COVID-19 pandemic highlighted the devastating consequences of RTIs in LTCFs. This situation has prompted the need for LTCFs to have a robust, active surveillance system to assist with RTI identification. Such a system could support faster implementation of appropriate antimicrobial therapy and critical infection prevention and control measures. The TN Emerging Infections Program worked with the CDC EIP to implement a pilot project testing the feasibility of performing RTI surveillance to inform future changes to the NHSN. Methods: We recruited 6 LTCFs to perform prospective RTI surveillance for 6 consecutive months, from October 2021 through March 2022. Data were collected for all residents meeting the RTI surveillance definitions: pneumonia, lower respiratory tract infection, influenza-like illness (including influenza), and COVID-19. These data were entered by facility workers into a REDCap database using a prospective RTI LTCF event form. Monthly data collection summaries were submitted using a designated denominator form. Descriptive statistics were used to analyze the RTI data, and analyses were performed using SAS version 9.4 software. Results: In total, 6 facilities participated in the pilot project during the capture period. The total number of RTI cases across all facilities was 195. December had the most cases (n = 50). The most common first triggers were new RTI signs or symptoms (67.69%), laboratory results (17.44%), clinician-diagnosed RTI (8.21%), and imaging findings (6.67%). The most reported symptom was new or increased cough (57.44%). Chest radiographs were performed for 50.77% of patients. Positive viral laboratory test results were documented 29.74% of the time. Antibiotic treatments were given to 70.77% of residents.
The most commonly prescribed antibiotics were cephalosporins (22.56%), macrolides (17.95%), fluoroquinolones (12.31%), and doxycycline (9.23%). Also, 17.4% of cases with antibiotic regimens had cephalosporins as monotherapy. Vaccine documentation was as follows: influenza 2020–2021 (40.51%), influenza 2021–2022 (64.1%), complete COVID-19 vaccine series (82.56%), PPSV-23 vaccine (33.85%), and PCV-13 (23.59%). Conclusions: RTI surveillance was incorporated smoothly into the daily workflow for facilities; the biggest barrier to effective implementation was staff turnover. A scheduled weekly time to collect data and fill out forms proved most effective. A high percentage of cases was treated with cephalosporins as monotherapy, which, based on the latest guidelines, may be suboptimal. Individual reports were sent back to facilities with a comparison to the aggregated data. These data will be used to evaluate antibiotic appropriateness and to guide future RTI surveillance efforts in the LTCF setting.
Objectives: To address the importation of multidrug-resistant organisms (MDROs) when a colonized or infected patient is transferred from another VA facility, the Veterans Health Administration (VHA) launched the Inpatient Pathogen Tracker (IPT) in 2020. IPT tracks MDRO-infected/colonized patients and alerts MDRO Program Coordinators (MPCs) and Infection Preventionists (IPs) when such patients are admitted to their facility, facilitating rapid identification and isolation of infected/colonized patients. IPT usage was low during the initial rollout (32.5%). The VHA and the CARRIAGE QUERI Program developed targeted implementation strategies to increase utilization of IPT’s second iteration, VA Bug Alert (VABA). Methods: Familiarity with IPT was assessed via a pre-education survey (3/2022). All sites received standard VABA implementation, including: 1) adaptation of VABA features based on end-user feedback (completed 4/2022), 2) development and delivery of an educational module regarding the revised tool (completed 4/2022), and 3) internal facilitation from the VHA MDRO Program Office (ongoing) (see Figure for all key timepoints). Intent to register for VABA was assessed via a post-education survey (4-5/2022). Sites (125 eligible) not registered for VABA by 6/1/2022 were randomly assigned to receive one of two conditions from 6/2022–8/2022: continued standard implementation alone or enhanced implementation. Enhanced implementation added the following to standard implementation: 1) audit and feedback reports and 2) external facilitation, including interviews and education about VABA. We compared the number of sites with ≥1 MPC/IP registered for VABA to date between implementation conditions. Results: Pre-education survey: 168 MPC/IPs across 117 sites responded (94% of eligible sites). Among respondents, 25% had used IPT, 35.1% were familiar with but had not used IPT, and 39.9% were unfamiliar with IPT. Post-education survey:
93 MPC/IPs across 80 sites responded (59% of eligible sites). Of these, 81.7% said they planned to register for VABA, 4.3% said they would not register, and 14.0% said they were unsure. Post-6/1/2022 registrations: By 6/1/2022, 71% of sites had ≥1 registered VABA user. Of the 28 unregistered sites eligible for enhanced implementation, 13 were assigned to receive enhanced implementation and 15 were assigned to receive continued standard implementation. Eight sites in the enhanced implementation condition (61.5%) registered for VABA, and 7 standard-implementation-only sites (46.7%) registered. The number of registered sites did not differ significantly by implementation condition (Fisher’s exact p = 0.476). Conclusions: Standard and enhanced implementation were equally effective at encouraging VABA registration, suggesting that allocating resources to enhanced implementation may not be necessary.
The recent World Health Organization (WHO) blueprint for dementia research and the Lancet Commission on ending stigma and discrimination in mental health have identified a gap around dementia-related measures of stigma and discrimination that can be used in different cultural, language and regional contexts.
Aims
We aimed to characterise experiences of discrimination, and report initial psychometric properties of a new tool to capture these experiences, among a global sample of people living with dementia.
Method
We analysed data from 704 people living with dementia in 33 countries and territories who took part in a global survey. Psychometric properties were examined, including internal consistency and construct validity.
Results
A total of 83% of participants reported discrimination in one or more areas of life, and this proportion was similar across WHO Regions. Exploratory factor analysis factor loadings and the scree plot supported a unidimensional structure for the Discrimination and Stigma Scale Ultra Short for People Living with Dementia (DISCUS-Dementia). The instrument demonstrated excellent internal consistency; most construct validity hypotheses were confirmed, and qualitative responses demonstrated face validity.
Conclusions
Our analyses suggest that the DISCUS-Dementia performs well with a global sample of people living with dementia. This scale can be integrated into large-scale studies to understand factors associated with stigma and discrimination. It can also provide an opportunity for a structured discussion around stigma and discrimination experiences important to people living with dementia, as well as planning psychosocial services and initiatives to reduce stigma and discrimination.
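Internal consistency for a scale such as the DISCUS-Dementia is commonly quantified with Cronbach’s alpha, although the abstract does not name the exact statistic used. A minimal sketch of the calculation, using only the Python standard library and hypothetical item scores (illustrative, not study data):

```python
from statistics import pvariance

def cronbach_alpha(items):
    """Cronbach's alpha for a list of item-score columns.

    items: list of k sequences, each holding one item's scores
    across the same n respondents.
    """
    k = len(items)
    item_var = sum(pvariance(col) for col in items)
    totals = [sum(scores) for scores in zip(*items)]  # per-respondent total
    return k / (k - 1) * (1 - item_var / pvariance(totals))

# Hypothetical 4-item scale scored by 6 respondents (made-up data).
items = [
    [3, 2, 4, 1, 3, 2],
    [3, 1, 4, 2, 3, 2],
    [2, 2, 4, 1, 3, 1],
    [3, 2, 3, 1, 2, 2],
]
print(round(cronbach_alpha(items), 2))  # → 0.92
```

Alpha rises toward 1 as items covary more strongly; values around 0.9 or higher are conventionally described as excellent internal consistency.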
To describe the genomic analysis and epidemiologic response related to a slow and prolonged methicillin-resistant Staphylococcus aureus (MRSA) outbreak.
Design:
Prospective observational study.
Setting:
Neonatal intensive care unit (NICU).
Methods:
We conducted an epidemiologic investigation of a NICU MRSA outbreak involving serial baby and staff screening to identify opportunities for decolonization. Whole-genome sequencing was performed on MRSA isolates.
Results:
A NICU with excellent hand hygiene compliance and longstanding minimal healthcare-associated infections experienced an MRSA outbreak involving 15 babies and 6 healthcare personnel (HCP). In total, 12 cases occurred slowly over a 1-year period (mean, 30.7 days apart) followed by 3 additional cases 7 months later. Multiple progressive infection prevention interventions were implemented, including contact precautions and cohorting of MRSA-positive babies, hand hygiene observers, enhanced environmental cleaning, screening of babies and staff, and decolonization of carriers. Only decolonization of HCP found to be persistent carriers of MRSA was successful in stopping transmission and ending the outbreak. Genomic analyses identified bidirectional transmission between babies and HCP during the outbreak.
Conclusions:
In comparison to fast outbreaks, “slow and sustained” outbreaks may be more common in units with strong existing infection prevention practices, where a series of breaches must align to produce a case. We identified a slow outbreak that persisted among staff and babies and was stopped only by identifying and decolonizing persistent MRSA carriage among staff. A repeated decolonization regimen was successful in allowing previously persistent carriers to safely continue work duties.
To describe national trends in testing and detection of carbapenemases produced by carbapenem-resistant Enterobacterales (CRE) and associate testing with culture and facility characteristics.
Design:
Retrospective cohort study.
Setting:
Department of Veterans’ Affairs medical centers (VAMCs).
Participants:
Patients seen at VAMCs between 2013 and 2018 with cultures positive for CRE, defined by national VA guidelines.
Interventions:
Microbiology and clinical data were extracted from national VA data sets. Carbapenemase testing was summarized using descriptive statistics. Characteristics associated with carbapenemase testing were assessed with bivariate analyses.
Results:
Of 5,778 standard cultures that grew CRE, 1,905 (33.0%) had evidence of molecular or phenotypic carbapenemase testing, and 1,603 (84.1%) of these had carbapenemases detected. Among these cultures confirmed as carbapenemase-producing CRE, 1,053 (65.7%) had molecular testing for ≥1 gene. Almost all testing included KPC (n = 1,047, 99.4%), with KPC detected in 914 of 1,047 (87.3%) cultures. Testing and detection of other enzymes was less frequent. Carbapenemase testing increased over the study period, from 23.5% of CRE cultures in 2013 to 58.9% in 2018. The South US Census region (38.6%) and the Northeast region (37.2%) had the highest proportions of CRE cultures with carbapenemase testing. High-complexity (vs low) and urban (vs rural) facilities were significantly associated with carbapenemase testing (P < .0001).
Conclusions:
Between 2013 and 2018, carbapenemase testing and detection increased in the VA, largely reflecting increased testing and detection of KPC. Surveillance of other carbapenemases is important due to global spread and increasing antibiotic resistance. Efforts supporting the expansion of carbapenemase testing to low-complexity, rural healthcare facilities and standardization of reporting of carbapenemase testing are needed.
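As a quick arithmetic check, the proportions reported in the Results follow directly from the stated counts. An illustrative sketch (variable names are ours; counts are taken from the abstract):

```python
# Counts reported in the Results section above.
cre_cultures = 5778    # standard cultures that grew CRE
tested = 1905          # cultures with carbapenemase testing
detected = 1603        # tested cultures with a carbapenemase detected
molecular = 1053       # confirmed CP-CRE cultures with molecular testing
kpc_tested = 1047      # cultures with molecular testing that included KPC
kpc_detected = 914     # of those, cultures with KPC detected

for num, den in [(tested, cre_cultures), (detected, tested),
                 (molecular, detected), (kpc_detected, kpc_tested)]:
    print(f"{num}/{den} = {100 * num / den:.1f}%")
```

The printed percentages match the reported 33.0%, 84.1%, 65.7%, and 87.3%.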