Rice agriculture was brought to Japan during the first millennium BC by migrant communities of farmers from the Korean peninsula. Substantial geographic variation is observed in the uptake of this new subsistence economy, reflecting different forms of interaction between farmers and foragers. Here, the authors analyse a combination of settlement and radiocarbon data to determine the extent to which these different forms of interaction led to regional variations in population growth rate. Their results confirm the presence of different trajectories of growth, providing new insights into the diversity of demographic processes during the earliest stages of farming in Japan.
Background: Invasive candidiasis, including candidemia, is a significant cause of morbidity and mortality in medically complex and immunocompromised children. Understanding the epidemiology and antifungal susceptibility patterns of Candida infections could help guide empiric antifungal therapy. Methods: This fungal antibiogram was created at a large quaternary children’s health system in Georgia. Blood isolates positive for Candida spp. from 2019 through 2023 were included. The number and percentage of isolates for each Candida spp. was recorded by year and then as the combined 5-year total. The Clinical and Laboratory Standards Institute (CLSI) antifungal interpretive criteria were used, and we only included one unique Candida spp. isolate per patient. Due to the limited number of isolates, the combined 5 years of isolates were used to create the fungal antibiogram. Data are shown as percent susceptible using CLSI interpretive criteria and number of isolates. Results: Between 2019 and 2023, there were 124 unique blood isolates of Candida spp. identified. The most common isolates were C. albicans (33%), C. parapsilosis (27%), C. glabrata (14%) and C. tropicalis (11%). Over the 5 years of the study, the percentage of C. albicans isolates decreased from 47% to 21%. The change in epidemiology was not driven by a single Candida species but varied from year to year. For C. albicans, susceptibility was 100% for fluconazole and micafungin. For C. parapsilosis, susceptibility to fluconazole and micafungin was 97% and 94%, respectively. Fluconazole susceptibility was lowest for C. glabrata (88%) and C. krusei (0%). Using CLSI epidemiological cutoff values (ECVs) to evaluate the amphotericin B results, none of the isolates had results greater than the CLSI ECVs. Comparing 2019 and 2023, the percentage of Candida blood isolates resistant to fluconazole increased from 5% to 18.5%. Conclusion: C. albicans was the most frequently identified cause of candidemia in children, but there was a gradual increase in fungemia caused by other Candida spp. over the 5-year study period, including isolates with fluconazole resistance. Overall, our findings demonstrate high susceptibility rates to fluconazole and echinocandins in Candida spp. blood isolates. Further research is needed to identify risk factors for antifungal-resistant candidemia in pediatric patients.
Disclosure: Mark Gonzalez: Honoraria for a one-time consultation with NaviDx consulting in May of 2022. Honoraria from the American Society for Microbiology for writing a chapter in the Clinical Microbiology Procedures Handbook.
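A fungal antibiogram of this kind reduces to counting, per species, the share of isolates interpreted as susceptible. A minimal Python sketch, using hypothetical isolate records (not the study's data) and an assumed "S"/"R" interpretation field:

```python
from collections import defaultdict

# Hypothetical isolate records: one unique Candida isolate per patient,
# with the fluconazole interpretation already assigned per CLSI criteria.
isolates = [
    {"species": "C. albicans",     "fluconazole": "S"},
    {"species": "C. albicans",     "fluconazole": "S"},
    {"species": "C. parapsilosis", "fluconazole": "S"},
    {"species": "C. parapsilosis", "fluconazole": "R"},
    {"species": "C. glabrata",     "fluconazole": "S"},
]

def percent_susceptible(records, drug):
    """Percent of isolates interpreted 'S' for a drug, tallied per species."""
    counts = defaultdict(lambda: [0, 0])  # species -> [susceptible, total]
    for rec in records:
        counts[rec["species"]][1] += 1
        if rec[drug] == "S":
            counts[rec["species"]][0] += 1
    return {sp: round(100 * s / n, 1) for sp, (s, n) in counts.items()}

print(percent_susceptible(isolates, "fluconazole"))
```

The same tally extends to micafungin or any other drug column in the records.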
Background: Catheter-associated urinary tract infection (CAUTI) is among the most prevalent healthcare-associated infections. Clinical diagnosis of CAUTI and National Healthcare Safety Network (NHSN) definitions do not always align. Most patients with indwelling urinary catheters ultimately develop asymptomatic bacteriuria (ASB) due to bacterial colonization and may be misattributed as CAUTI. Urine cultures ordered on patients with ASB may lead to reporting of non-clinically significant CAUTI to NHSN. We sought to examine factors associated with ordering inappropriate urine cultures in patients with urinary catheters. Methods: All CAUTIs that were reported to the NHSN at a large academic medical center in Eastern North Carolina were evaluated from October 2021 to July 2023. A logistic regression model was fit for patients treated for urinary tract infection (UTI) with the following covariates: age, sex, time of urine culture order, provider type, and days that the urinary catheter was in place. All data analysis was performed in SAS 9.4 (SAS Institute Inc., Cary, NC). Results: Table 1 demonstrates patient characteristics stratified by treatment for UTI. The analysis suggests that abnormal results from urine cultures ordered overnight were less likely to be treated with antibiotics, and this result was statistically significant in both the adjusted and unadjusted analyses (see Tables 2 and 3). The model also suggests that abnormal results from urine cultures ordered by housestaff and those from older patients were more likely to be treated for UTI, but these results were not statistically significant (see Table 3). Finally, the longer a catheter was in place, the less likely an abnormal urine culture was to be treated, and this finding was statistically significant (see Table 3). Conclusion: Cultures that did not prompt antimicrobial treatment did not impact patient care decisions and could be considered as inappropriate orders.
This can result in CAUTIs reported to NHSN that were not clinically significant. Abnormal results from cultures that were ordered by the overnight team were less likely to be treated for clinical UTI and this may represent an important target for diagnostic stewardship interventions.
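The adjusted model cannot be reproduced from the abstract, but the unadjusted comparison of overnight versus daytime culture orders amounts to an odds ratio from a 2x2 table. A sketch with invented counts (the abstract's tables are not reproduced here):

```python
import math

# Hypothetical 2x2 counts: treatment for UTI vs. overnight culture order.
# These numbers are illustrative only, not taken from the study's tables.
treated_overnight, untreated_overnight = 10, 40
treated_day, untreated_day = 60, 90

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Unadjusted odds ratio with a 95% Wald confidence interval."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo, hi = or_ * math.exp(-z * se), or_ * math.exp(z * se)
    return or_, (lo, hi)

or_, (lo, hi) = odds_ratio_ci(treated_overnight, untreated_overnight,
                              treated_day, untreated_day)
print(f"OR = {or_:.2f}, 95% CI ({lo:.2f}, {hi:.2f})")
```

An OR below 1 with a CI excluding 1 would correspond to the abstract's finding that overnight orders were significantly less likely to be treated.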
In recent years, anti-refugee hate crimes have soared across Europe. We know this violence has spread fear among refugees, but we know less about its effects on the non-refugee population. This is an oversight, as research suggests political violence often has effects on the broader population. Those effects can range from increased solidarity with the targets of the violence to reduced pro-social behavior and less support for the targets of the violence. In this research note, we examine the effects of exposure to anti-refugee hate crimes in Germany. Our results suggest no direct effect of exposure to anti-refugee hate crimes on support for refugees. These results have several implications for our understanding of political divides over refugees in Europe.
Background: Candida auris is an emerging threat to hospitalized patients, and invasive disease is associated with high mortality. This study describes the clinical and microbiological characteristics of nine patients identified with C. auris at Ohio State Wexner Medical Center through active surveillance or clinical investigation and uses whole genome sequencing (WGS) to compare isolates. Methods: In November 2022, an active C. auris surveillance program was implemented to screen patients admitted to high-risk units (intensive care units and progressive care units). Bilateral axilla and groin swabs were obtained upon unit admission and, if positive, were submitted for C. auris polymerase chain reaction (PCR) with culture and sensitivity testing. Patients with a positive screening or clinical isolate from November 2022 to November 2023 underwent chart review for clinical characteristics, microbiologic data, and index admission information. For each isolate, DNA was extracted and WGS was performed. Core single nucleotide polymorphism (SNP) variation identified from the sequence data was used to infer genetic relationships among the isolates. Results: Nine patients were identified between November 2022 and November 2023. The clinical and microbiologic characteristics are summarized in Table 1. All patients were hospitalized at various acute care facilities across the state at least once in the preceding 12 months. C. auris was determined to be present on admission for 6 patients. For 5 of these patients, it was their first interaction with our healthcare system. Three patients were not in contact isolation for >3 days before C. auris was identified. Unit-wide point-prevalence screening was completed in these cases and no evidence of transmission was found. WGS showed eight of the nine isolates were related, with 28 or fewer core SNP differences between isolates (Figure 1). One isolate (8) was genetically distinct, with >45,000 core SNP differences.
Five isolates were highly related with a range of 4-15 SNP differences. No temporal or spatial overlap at our institution was identified among these five patients. Conclusions: The active surveillance program identified several patients colonized with C. auris in addition to those found through clinical testing. Multiple risk factors for C. auris were identified, and patient mortality was high (67%). The majority of the isolates were closely related without association with a known outbreak or epidemiologic link, suggesting a possible diffuse common reservoir. Next steps with surveillance in acute care and long-term care facilities will be critical for early detection to halt transmission of this organism.
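The relatedness calls above follow a common pattern: threshold pairwise core-SNP distances, then group isolates by single linkage. A sketch with hypothetical distances and an assumed 30-SNP cutoff (the abstract reports related isolates at 28 or fewer SNPs; both the distances and the cutoff here are illustrative):

```python
# Hypothetical pairwise core-SNP distances between isolates A-E.
distances = {
    ("A", "B"): 4, ("A", "C"): 15, ("B", "C"): 12,
    ("A", "D"): 28, ("B", "D"): 25, ("C", "D"): 22,
    ("A", "E"): 45000, ("B", "E"): 45010, ("C", "E"): 44990, ("D", "E"): 45005,
}

def cluster(pairs, threshold):
    """Single-linkage clustering: union isolates whose distance <= threshold."""
    parent = {}
    def find(x):
        parent.setdefault(x, x)
        while parent[x] != x:          # path-halving union-find
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x
    for (a, b), d in pairs.items():
        if d <= threshold:
            parent[find(a)] = find(b)
    groups = {}
    for node in {n for pair in pairs for n in pair}:
        groups.setdefault(find(node), set()).add(node)
    return sorted(map(sorted, groups.values()))

print(cluster(distances, threshold=30))
```

With these toy numbers, A through D form one related cluster and the distant isolate E stands alone, mirroring the structure reported in the abstract.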
Background: Automated sepsis alerts have become a widely implemented screening tool aimed at early detection of clinically unstable patients. Prior research has shown mixed results depending on the type of screening tools used and the patient population studied. This study aimed to evaluate the predictive value of an alert system created for identifying patients with sepsis to determine utility in clinical practice prior to implementation. Additionally, clinical management of those with and without sepsis was compared to measure the potential added benefit of this system in clinical decision making. Methods: A TheraDoc® software sepsis alert was generated for non-ICU patients meeting >2 SIRS criteria within a 24-hour time period (temperature >38°C or <36°C, heart rate >90, respiratory rate >20 or partial pressure CO2 <32 mmHg, white blood cell count >12,000 or <4,000, or >10% bands/immature cells) during March 2023. Alerts were excluded if they were duplicates (using identical criteria or a second alert within 24 hours), triggered by labs collected >48 hours prior, or death or discharge occurred before the time of alert. The primary outcome was positive predictive value (PPV) of sepsis identification, confirmed by ICD-10 codes and diagnostic studies (cultures, imaging). Secondary outcomes included clinical management (antibiotic utilization [AU] and choice, infectious disease [ID] consultations, and culture collection). Antibiotics were categorized as broad-spectrum using National Healthcare Safety Network (NHSN) criteria. Secondary outcomes were compared between the sepsis and SIRS-without-infection (SIRS) groups by chi-square analysis. Results: After applying exclusion criteria, 116 of 166 alerts were analyzed; 55 of 116 alerts had confirmed sepsis (PPV 47.4%). Patients with sepsis were more likely to have an ID consult (16% [9/55] vs 7% [4/61]) and cultures collected (70.9% [39/55] vs 39.3% [24/61]) compared to SIRS patients; however, these differences were not statistically significant.
AU was higher with confirmed infections compared to SIRS patients (94.5% [52/55] vs 32.8% [20/61], p < 0.05), as was use of broad-spectrum antibiotics (73% [38/52] vs 40% [8/20], p < 0.05). Conclusions: While automated alerts may enable early identification of sepsis, use of SIRS criteria alone has poor specificity, which was borne out by the low PPV in this study. Our study found that management of sepsis patients (as measured by AU and culture ordering) was better than expected; this, combined with the low PPV of this alert system, resulted in our team rejecting widespread adoption of SIRS-based sepsis alerts.
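The headline PPV is simple arithmetic on the counts reported above: of 116 analyzable alerts, 55 had confirmed sepsis.

```python
# Counts from the abstract: 116 analyzable alerts, 55 with confirmed sepsis.
true_positives = 55
total_alerts = 116
false_positives = total_alerts - true_positives  # 61 SIRS-without-infection alerts

ppv = true_positives / total_alerts  # positive predictive value
print(f"PPV = {ppv:.1%}")  # 47.4%
```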
Carbapenemase-producing carbapenem-resistant Enterobacterales (CP-CRE) are an urgent public health threat for healthcare facilities. Solid organ transplant (SOT) recipients carry an increased risk for CRE infection and colonization due to prolonged exposure to antimicrobials, healthcare facilities, and immunosuppression. CRE infection in SOT patients is associated with increased morbidity and mortality. Here, we describe a hospital outbreak investigation of three cases of New Delhi metallo-beta-lactamase (NDM)-producing CRE that led to novel findings with implications for further interdisciplinary investigations. An NDM-CRE infection in a critically ill patient was identified during passive surveillance and prompted an investigation. Previous CP-CRE passive surveillance cases were reviewed. Rectal screening was performed for potentially exposed patients; 403 rectal swabs were tested for carbapenemase genes in active surveillance. Patients identified as having a new NDM-CRE isolate on active or passive surveillance were considered cases and underwent in-depth chart review, including possible patient-to-patient exposures, hospital locations, procedures, devices, and consultations. NDM-CRE isolates were sent to the Minnesota Department of Health (MDH) for whole genome sequencing (WGS) to assess relatedness. Five NDM-CRE cases were identified, with all isolates harboring blaNDM, including three NDM-Klebsiella pneumoniae (NDM-KP) cases (Figure 1). The first NDM-KP case, patient 1, developed a mediastinal infection following lung transplantation. Review of United Network for Organ Sharing records revealed that respiratory specimens from patient 1’s donor grew NDM-KP and that a bronchial wash at the time of transplant yielded NDM-KP. The second NDM-KP case (patient 3) developed ventilator-associated pneumonia and was found to have sequentially used the same ventilator as patient 1.
The third NDM-KP case (patient 4) was detected via rectal swab during active surveillance and shared wound care personnel with patients 1 and 3 (Figure 2). WGS demonstrated only two single nucleotide polymorphism (SNP) differences among the three isolates, strongly suggesting relatedness (Figure 3). Best practices for infection prevention were reviewed with wound care personnel. To date, no further NDM-KP isolates have been identified. The investigation was facilitated by in-depth chart review and WGS via the Central Region Antimicrobial Resistance Laboratory Network at MDH. The NDM-KP detected in a lung donor specimen appears genetically linked to clinical isolates from other patients, raising the possibility of a donor-derived hospital outbreak. This investigation is the first to describe a donor-derived NDM outbreak in a healthcare facility. Communication between organ procurement agencies, transplant centers, and infection prevention must be optimized to prevent CRE-associated morbidity in SOT recipients and CRE hospital outbreaks.
The minimum diameter of the patent ductus arteriosus measured in the lateral angiographic view is usually used to determine the device size. Sometimes the device can be easily removed from the patent ductus arteriosus, even if it appears to be the optimum size.
Methods:
From 2016 to 2021, 29 patients who underwent contrast-enhanced CT prior to patent ductus arteriosus closure were included. Morphological evaluation of the narrowest part of the patent ductus arteriosus was performed on contrast-enhanced CT. We also examined whether there were differences in morphology depending on Krichenko classification, age, and the diameter of the narrowest portion of the patent ductus arteriosus.
Results:
At the time of treatment, the median age was 4.8 (range, 1–52) months, and the median weight was 5.0 (2.5–12.7) kg. The median minimum vertical diameter of the patent ductus arteriosus was 2.9 (1.6–6.6) mm. The narrowest part of the patent ductus arteriosus on contrast-enhanced CT showed horizontal-to-vertical diameter ratios in the range of 1.0–1.7, with no case in which the vertical diameter was larger than the horizontal diameter. The median horizontal-to-vertical diameter ratio by Krichenko type was: A, 1.22; C, 1.29; E, 1.62 (p = 0.017). When classifying the patients into a group aged under six months (n = 21) and a group aged six months or older (n = 8), the respective median horizontal-to-vertical diameter ratios were 1.34 and 1.15 (p = 0.027). The vertical patent ductus arteriosus diameter was not correlated with the elliptical shape.
Conclusions:
In this study, most patent ductus arteriosus cases had a horizontally oriented elliptical shape. This characteristic showed high reproducibility and is important information that cannot be evaluated by angiography.
Background: Current epidemiological methods have limitations in identifying transmission of bacteria causing healthcare-associated infections (HAIs). Recent whole genome sequencing (WGS) studies found that genetically related strains can cause HAIs without meeting standard epidemiologic definitions, but these results could not provide data in the timely fashion needed for intervention. Given recent advances in Oxford Nanopore Technologies (ONT) sequencing, we sought to establish a validated ONT pipeline capable of providing accurate WGS-based comparisons of clinical pathogens within a short time frame that would allow for infection control interventions. Methods: Using electronic medical record data, we identified potential healthcare acquisition of methicillin-resistant Staphylococcus aureus (MRSA), vancomycin-resistant enterococci (VRE), and carbapenem-resistant gram-negative rods. Bacterial genomic DNA was directly extracted from clinical microbiology lab plates. Sequencing was conducted with the ONT MinION sequencer and R10.4.1 flow cell. MINTyper was used for single nucleotide polymorphism (SNP) calling and Ridom SeqSphere+ for core genome MLST to determine genetic relatedness. The main outcome was time from pathogen identification to completed genetic analysis. Results: The weekly workflow, from genomic DNA extraction to complete data analysis, averaged 2.6 days (standard deviation 1.3 days; range, 1 to 6 days). Starting in August 2023, we have sequenced a total of 177 bacterial isolates from 156 unique patients. Isolates came from blood (38%), tissue/wound/body fluid (24%), urinary tract (20%), respiratory tract (16%), and rectal swab (2%). To date, six genetically related clusters have been identified. Three clusters involved ST117 vancomycin-resistant Enterococcus faecium (VREfm), comprising a total of 13 unique patients distributed as 2, 3, and 8 patients per group, with pairwise SNP differences of 20, 11, and 14.
Patients within the same clusters showed epidemiological links through overlapping admissions and temporally shared ICU stays. Additionally, another cluster consisted of five genetically related ST633 Pseudomonas aeruginosa isolates, with a pairwise SNP difference of 57.5. Each patient in this cluster had potential epidemiological links through overlapping admission times, despite the absence of identified shared spaces. The last two clusters involved Klebsiella pneumoniae and Escherichia coli (two cases each), with pairwise SNP differences of 18 and 9, respectively. In both cases, each patient showed potential epidemiological links through overlapping admission times. Conclusion: Our stand-alone ONT pipeline was able to rapidly and accurately detect genetically related AMR pathogens, aligning closely with epidemiological data. Our approach has the potential to assist in the efficient detection and deployment of preventative measures against healthcare-associated infection transmission.
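At its core, SNP-based relatedness is a count of mismatches between aligned genomes. Real pipelines such as MINTyper align reads against a reference before calling SNPs; the short sequences below are toy data that only illustrate the distance calculation itself:

```python
from itertools import combinations

# Toy aligned core-genome sequences (illustrative only; real core genomes
# span millions of positions).
genomes = {
    "iso1": "ACGTACGTAC",
    "iso2": "ACGTACGTAT",  # 1 difference vs iso1
    "iso3": "ACGAACGTAT",  # 2 differences vs iso1, 1 vs iso2
}

def snp_distance(a, b):
    """Count positions where two equal-length aligned sequences differ."""
    assert len(a) == len(b), "sequences must be aligned to equal length"
    return sum(x != y for x, y in zip(a, b))

for (n1, s1), (n2, s2) in combinations(genomes.items(), 2):
    print(n1, n2, snp_distance(s1, s2))
```

Pairwise distances computed this way are the input to the cluster-calling step (for example, grouping isolates under a chosen SNP threshold).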
The aim of this study is to track the evolution in the use of the markers nenny, non + verb (non fait ‘no, it doesn’t’) and non in its absolute use between the middle of the 15th century and the end of the 18th. In Middle French, non already covers all the uses of the old markers nenny and non fait, but it remains in the minority. In Pre-Classical French (1550–1650), the frequency of nenny and non fait decreases considerably and, in Classical French (1650–1789), they become archaic. In the mid-17th century, non definitively assumes the functions of the medieval markers, which disappeared. The analysis of the temporal distribution of these markers helps to date the transition from ancient to modern uses. Several studies of phonetic, morphological and syntactic phenomena have also aimed to date the turning point between the medieval and the “classical” language, which occurs during the so-called “pre-classical” period. This research also seeks to contribute to the debate on the position of the boundary between Pre-Classical and Classical French on the basis of pragmatic criteria. The results support placing this boundary within the decade 1620–1630, as other studies did for morphosyntactic phenomena.
Background: Machine-learning (ML) models, such as neural networks (NNs), have been proposed to predict antimicrobial susceptibility at the patient level while incorporating patient-level information from electronic medical record (EMR) systems. However, NNs often do not perform well in predicting rare outcomes, such as carbapenem resistance. We aimed to apply a novel multitask NN to create personalized antibiograms for individual patients with Escherichia coli clinical isolates to predict antimicrobial resistance (AMR) for four major antimicrobial classes simultaneously with improved accuracy for carbapenem resistance by using shared hidden layers (Figure 1). Methods: We analyzed all E. coli clinical isolates from the US Veterans Health Administration’s network from January 1, 2017, to December 31, 2019, focusing on AMR profiles of aminopenicillins, narrow-spectrum (NS) cephalosporins, extended-spectrum (ES) cephalosporins, and carbapenems. Patient-level clinical data (demographics, antimicrobial exposure history, previous isolates (if any), comorbidities, and recent procedures) were extracted from EMR. Antibiograms for all hospitals were generated using standard methods for the preceding calendar years. We employed logistic regression to evaluate the efficacy of conventional antibiograms in predicting AMR profiles. We adopted the ML approach using conventional NNs and novel multitask NNs on all extracted clinical data and hospital antibiograms. The models were trained with data from 2017 and 2018 and then tested on 2019 data, assessing their performance using the area under the receiver-operating curve (AUC). Results: The study included 257,968 E. coli isolates, split into 171,391 for training and 86,577 for validation. The prevalence of AMR in the test data from 2019 was 49.8% for aminopenicillins, 28.4% for NS cephalosporins, 10.7% for ES cephalosporins, and 0.2% for carbapenems, respectively. 
Conventional hospital antibiograms showed low prediction accuracy, with AUC scores of 0.56 for aminopenicillins, 0.67 for NS cephalosporins, 0.61 for ES cephalosporins, and 0.67 for carbapenems. AUC scores from preliminary models for conventional and multitask NNs were 0.78/0.78 for aminopenicillins, 0.83/0.82 for NS cephalosporins, 0.84/0.85 for ES cephalosporins, and 0.68/0.75 for carbapenems. While producing improved accuracy for carbapenems and comparable accuracies for the three other classes, multitask NNs took approximately 66% less time for model training than conventional NNs. Conclusions: Integrating EMR data with NNs improved their predictive accuracy, potentially leading to a decision-support tool for better empirical antimicrobial therapy guidance in the window between species identification and confirmed susceptibilities. Multitask NNs can potentially improve the prediction accuracy of uncommon AMRs while maintaining comparable prediction accuracies for common AMRs and optimizing the efficiency of model training.
Disclosure: Michi Goto: Contracted Research - Merck
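AUC, the evaluation metric used above, can be computed without any ML library via its rank interpretation: the probability that a randomly chosen resistant isolate receives a higher predicted score than a randomly chosen susceptible one. A sketch with toy scores (not the study's models or data):

```python
def auc(labels, scores):
    """AUC via the Mann-Whitney U relationship: fraction of (positive,
    negative) pairs where the positive scores higher; ties count half."""
    pos = [s for l, s in zip(labels, scores) if l == 1]
    neg = [s for l, s in zip(labels, scores) if l == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Toy predicted resistance probabilities vs. observed resistance (1 = resistant).
labels = [1, 0, 1, 0, 0]
scores = [0.9, 0.3, 0.35, 0.4, 0.2]
print(auc(labels, scores))
```

This pairwise definition is equivalent to the area under the ROC curve, which is why it is a natural metric for rare outcomes such as carbapenem resistance.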
Background: Bloodstream infections (BSIs) are an important cause of morbidity and mortality in severely ill patients, contributing to increased length of hospital stay and higher cost of care. Alberta Health Services Infection Prevention and Control (IPC) conducts inpatient surveillance of new episodes of BSIs with methicillin-resistant Staphylococcus aureus (MRSA), vancomycin-resistant enterococci (VRE) or carbapenemase-producing organisms (CPO) in 112 acute care facilities. A case-finding process was undertaken to verify the accuracy of BSI data entry. Methods: All positive MRSA, VRE or CPO blood cultures in 2021 were linked to the Inpatient Discharge Abstract Database (DAD) and the National Ambulatory Care Reporting System (NACRS) to identify new cases during acute care admissions. The results were then compared to surveillance records captured by infection control professionals (ICPs). Cases with unmatched culture dates and/or encounter dates and cases not identified by ICPs were screened by the study team, with the final decision made by ICPs. Results were analyzed by ARO and by percentage increase in the number of surveillance records. Results: The laboratory linkage identified 286 new cases. Compared to surveillance records (n = 248) captured by ICPs, 137 (57.3%) had matching collection dates and encounter dates, 85 (35.6%) had close matches on collection dates and encounter dates, 17 (7.1%) records had either matching collection dates or encounter dates, and 1 (0.4%) record did not have any matches on dates. There were 46 records identified in the laboratory data that were not in the surveillance system and 8 records that were in the surveillance system but not matched to the laboratory data. After review, 22 surveillance records had data entry errors (1 CPO BSI, 20 MRSA BSI, and 1 VRE BSI), and 14 BSI records were found to be missing (13 MRSA BSI, 1 VRE BSI).
This represents a 6% increase in MRSA BSI and a 3% increase in VRE BSI identified in 2021 and no increase in CPO BSI. Conclusions: A laboratory validation to determine if BSIs with an ARO were missed during routine IPC surveillance identified a small proportion of missed bloodstream infections. The most common reason for the miss was admission through the emergency department with multiple blood cultures collected during a single admission. These results will be shared with the Infection Control program to facilitate correct BSI capture.
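The case-finding logic above (exact matches, close matches, and missed records) can be sketched as a set comparison keyed on patient and collection date; the records and the 2-day "close match" tolerance below are hypothetical illustrations, not the study's definitions:

```python
from datetime import date

# Hypothetical case-finding: match lab-confirmed cultures to surveillance
# records by (patient, collection date).
lab = {("pt1", date(2021, 3, 1)), ("pt2", date(2021, 5, 10)), ("pt3", date(2021, 7, 4))}
surveillance = {("pt1", date(2021, 3, 1)), ("pt2", date(2021, 5, 12))}

def classify(lab_cases, surv_cases, tolerance_days=2):
    """Split lab cases into exact matches, close matches, and missed records."""
    exact = lab_cases & surv_cases
    surv_by_pt = {}
    for pt, d in surv_cases - exact:
        surv_by_pt.setdefault(pt, []).append(d)
    close, missing = set(), set()
    for pt, d in lab_cases - exact:
        if any(abs((d - s).days) <= tolerance_days for s in surv_by_pt.get(pt, [])):
            close.add((pt, d))
        else:
            missing.add((pt, d))  # candidate missed BSI for ICP review
    return exact, close, missing

exact, close, missing = classify(lab, surveillance)
print(len(exact), len(close), len(missing))
```

In the study's workflow, the "missing" set would then be screened manually, with the final decision made by ICPs.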
Background: Medical tape is one of the most ubiquitous resources in the hospital. Although tape is advertised by manufacturers as a single patient-use item, half-used rolls are a common sight in hospitals. Tape is often manipulated by unsanitized and ungloved hands and comes in close contact with patient skin. Medical tape has the potential to be a source of hospital-acquired infection, as it has been documented to be colonized with pathogens ranging from MRSA to Rhizopus. Despite this infection risk, currently the only clinical guidance on tape use is outlined in the Centers for Medicare & Medicaid Services guidance for hemodialysis patients issued in 2008, which requires that “tape should be dedicated to a single patient and discarded after use,” as hemodialysis patients are at higher risk of infection. However, there is a lack of standards in the practice of tape use across hospital systems. Methods: To understand the current practices and beliefs about tape use at our institution, we developed a standardized survey to query individuals in various roles (RNs, physicians, patient care technicians, respiratory therapists, phlebotomists) across all patient care areas at a 746-bed academic, tertiary care center. Results: 52 units were surveyed, including 225 employees. Qualitative analysis revealed a wide variety of uses for medical tape in patient care, with venipuncture, securing IVs, and wound dressings being the most common. Only 1.4% of individuals reported single use of tape rolls. 54% of individuals reported tape use behaviors that carry an elevated risk for inoculation of pathogens. 70% of individuals reported that tape was discarded after the patient was discharged from their respective area. These practices did not change across procedure-heavy areas such as the Emergency Department or the Operating Rooms; in fact, only 22% of individuals surveyed reported single use of tape in these areas.
Beliefs about tape use varied: 95% of individuals agreed that a roll of tape could be used multiple times on a single patient, and 52% of individuals agreed that a roll of tape could be used on multiple patients. Conclusions: Tape use practices varied across hospital units, indicating the need for standardized policies for tape use and storage. Beliefs about tape not being a single-use item were consistent across the hospital and suggest that education and culture change efforts are needed to decrease the risk for hospital-acquired infections from improper medical tape use.
Background: Inequities in healthcare-associated infection (HAI) incidence and prevention measures are critically important to understand (Chen, 2021). While evaluations are beginning to characterize these disparities by infection type (Gettler, 2023), our work expands this by characterizing disparities by prevention strategy. By better understanding how evidence-based prevention strategies are implemented at the patient level, infection preventionists and hospital epidemiologists can better design strategies that provide equitable care to all patients. Methods: Beginning January 2023, gender, race, ethnicity, spoken language, and age group fields were added to daily chlorhexidine gluconate (CHG) treatment and C. difficile test order compliance data captured via the electronic medical record. In July 2023, fields on recorded race, ethnicity, and gender were added to well-established foley and vascular access real-time peer audit tools that are used by infection preventionists (IPs). Each prevention strategy variable was summarized by demographic variables, and differences in compliance were measured using chi-square tests. Results: 899 vascular audits and 420 foley audits were completed by IPs between July and December 2023. In 2023, there were 114,066 opportunities for CHG treatment and 1,991 C. difficile test orders. Missing data varied by metric but ranged from 0-60%. Statistically significant differences by race were found in 3 of 8 components (i.e., intact seal, secured catheter and absence of dependent loop) in the foley audit (p < 0.01) and in compliance with C. difficile test ordering (p < 0.01). No differences by race were found in vascular access audits or CHG treatment. No differences by gender or ethnicity were noted in foley or vascular access audits, CHG treatment compliance, or C. difficile testing. Differences by gender and age were found in CHG treatment compliance (p < 0.001).
Conclusions: By focusing more on patient-level process measures rather than only presenting stratified outcome data, we can identify targeted opportunities for improvement in health equity before our patients develop an HAI. Further evaluations should also focus on assessing the clinical relevance of statistical findings to better inform intervention strategies. Separately, efforts are needed to improve the completeness and integrity of demographic data in the electronic medical record.
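The compliance comparisons above rely on the chi-square test of independence. For a 2x2 table it reduces to a few lines; the counts below are invented for illustration:

```python
# Chi-square test of independence for a 2x2 compliance table
# (compliant vs. non-compliant, across two demographic groups).
# These counts are hypothetical, not the study's data.
table = [[90, 10],   # group A: compliant, non-compliant
         [70, 30]]   # group B

def chi_square_2x2(t):
    """Pearson chi-square statistic (no continuity correction)."""
    (a, b), (c, d) = t
    n = a + b + c + d
    expected = [[(a + b) * (a + c) / n, (a + b) * (b + d) / n],
                [(c + d) * (a + c) / n, (c + d) * (b + d) / n]]
    return sum((t[i][j] - expected[i][j]) ** 2 / expected[i][j]
               for i in range(2) for j in range(2))

stat = chi_square_2x2(table)
# With 1 degree of freedom, stat > 3.84 corresponds to p < 0.05.
print(round(stat, 2), stat > 3.84)
```

A statistics package would add the p-value and a continuity correction; the statistic itself is the comparison of observed counts to the expectations under independence.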
Introduction: Many central line-associated bloodstream infections are identified in patients nearing the end of life. Stanford Health Care recently introduced the General Inpatient Hospice program. This program offers inpatient hospice care for patients who, due to uncontrolled symptoms, cannot be discharged to a hospice facility or receive home hospice care. We investigated whether this program would impact blood culture practices near the time of death. Methods: We performed a retrospective cohort study at Stanford Health Care using records of blood culture events from May 2019 to October 2023. We defined near-death blood cultures as those collected within 2 days before the date of death. We performed an interrupted time series linear regression before and after the implementation of the General Inpatient Hospice program on July 1, 2022, to assess blood culture intensity near death. Blood culture intensity was defined as the proportion of cultures collected near death in relation to the total number of blood cultures. Additionally, we calculated the blood culture positivity rate, which was defined as the proportion of positive blood cultures among all those collected during our study period. Results: Out of 220,269 blood cultures from 24,955 unique patients, a total of 6,147 cultures (9%) were obtained near the time of death. Among these subjects, the median age was 65 years (range 20–102), with 43% identifying as being of White race-ethnicity and 57% as male. Of these cultures, 3,044 were positive (49.5%), with Escherichia coli (618, 24%), Klebsiella pneumoniae (341, 13%), and Staphylococcus aureus (166, 10%) being the most common organisms. After the implementation of the General Inpatient Hospice program, the median enrollment was 12 patients (range 3–18) and the median mortality rate was 2.3% (range 2–3%). The blood culture intensity near death decreased by 0.81%, a change that was not statistically significant (95% CI -2.4% to 0.8%, p = 0.32; Figure 1).
Subsequently, the blood culture intensity showed a non-significant increasing trend of 0.05% (95% CI -0.1% to 0.2%, p=.53). The blood culture positivity rate near the time of death increased by 16% following the intervention, but this increase was not statistically significant (95% CI -11.8% to 43.3%, p=.26; Figure 2), and it was followed by a non-significant decreasing trend of 1.9% (95% CI -3.9% to 1.4%, p=.36). Conclusion: We found no significant association between the implementation of an inpatient hospice program and blood culture practices near the time of death, likely due to low patient enrollment.
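The interrupted time series analysis described above is typically fitted as a segmented linear regression with terms for baseline trend, level change at the intervention, and post-intervention slope change. A minimal sketch of that model, using simulated monthly data and hypothetical variable names (not the study's actual series):

```python
def fit_ols(X, y):
    """Ordinary least squares via the normal equations (X'X)b = X'y,
    solved with Gaussian elimination. Adequate for tiny design matrices."""
    n, k = len(X), len(X[0])
    xtx = [[sum(X[i][a] * X[i][b] for i in range(n)) for b in range(k)] for a in range(k)]
    xty = [sum(X[i][a] * y[i] for i in range(n)) for a in range(k)]
    for col in range(k):  # forward elimination with partial pivoting
        piv = max(range(col, k), key=lambda r: abs(xtx[r][col]))
        xtx[col], xtx[piv] = xtx[piv], xtx[col]
        xty[col], xty[piv] = xty[piv], xty[col]
        for r in range(col + 1, k):
            f = xtx[r][col] / xtx[col][col]
            for c in range(col, k):
                xtx[r][c] -= f * xtx[col][c]
            xty[r] -= f * xty[col]
    beta = [0.0] * k  # back substitution
    for r in range(k - 1, -1, -1):
        beta[r] = (xty[r] - sum(xtx[r][c] * beta[c] for c in range(r + 1, k))) / xtx[r][r]
    return beta

# Simulated monthly "blood culture intensity" series (% of all cultures
# drawn near death), with an intervention at month 24 and a -0.8 step.
months = list(range(48))
intervention = 24
y = [9.0 + 0.02 * t - (0.8 if t >= intervention else 0.0) for t in months]

# Segmented design: intercept, baseline trend, level change, slope change.
X = [[1.0, float(t), 1.0 if t >= intervention else 0.0,
      float(max(0, t - intervention))] for t in months]
b0, b_time, b_level, b_slope = fit_ols(X, y)
print(f"level change at intervention: {b_level:+.2f} percentage points")
```

With noiseless simulated data the fit recovers the built-in step exactly; real analyses would add confidence intervals and, often, autocorrelation-robust errors.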
What are the consequences of including a “don't know” (DK) response option to attitudinal survey questions? Existing research, based on traditional survey modes, argues that it reduces the effective sample size without improving the quality of responses. We contend that it can have important effects not only on estimates of aggregate public opinion, but also on estimates of opinion differences between subgroups of the population who have different levels of political information. Through a pre-registered online survey experiment conducted in the United States, we find that the DK response option has consequences for opinion estimates in the present day, where most organizations rely on online panels, but mainly for respondents with low levels of political information and on low salience issues. These findings imply that the exclusion of a DK option can matter, with implications for assessments of preference differences and our understanding of their impacts on politics and policy.
Although resilient youth provide an important model of successful adaptation to adversity, we know relatively little about the origins of their positive outcomes, particularly the role of biological mechanisms. The current study employed a series of methylome-wide association studies to identify methylomic biomarkers of resilience in a unique sample of 276 twins within 141 families residing in disadvantaged neighborhoods. Results revealed methylome-wide significant differentially methylated probes (DMPs) for social and academic resilience and suggestive DMPs for psychological resilience and resilience across domains. Pathway analyses informed our understanding of the biological underpinnings of the significant DMPs. Monozygotic twin difference analyses were then employed to home in on DMPs that were specifically environmental in origin. Our findings suggest that alterations in the DNA methylome may be implicated in youth resilience to neighborhood adversity and that some of the suggestive DMPs may be environmentally engendered. Importantly, our ability to replicate our findings in a well-powered sample was hindered by the scarcity of twin samples with youth exposed to moderate to substantial levels of adversity. Thus, although preliminary, the present study is the first to identify DNA methylation biomarkers of academic and social resilience.
Background: Contact precautions (CP) to prevent transmission of multidrug-resistant gram-negative (MDRGN) Enterobacteriaceae are recommended, although studies of discontinuation of CP (DcCP) have found no change in healthcare-associated infections (HAI) due to extended-spectrum beta-lactamase (ESBL) producing Enterobacteriaceae. Limited data exist on DcCP for MDRGN in a large health system. Methods: We performed a retrospective observational study analyzing the relationship between use of CP and HAI due to two definitions of MDRGN Enterobacteriaceae: ESBL production and non-susceptibility to ≥3 drug classes (3DC-GNR), with carbapenem-resistant Enterobacteriaceae (CRE) serving as control. The study included all inpatient admissions from 2/2017 through 9/2022 at 21 acute care hospitals. Hospitals had latitude to determine CP practices based on local risk assessment, but in 2/2018, system-wide transmission-based precautions guidance was updated to recommend DcCP for MDRGN Enterobacteriaceae and in 12/2019 was updated to clarify DcCP specifically for ESBL and 3DC-GNR while continuing CP for carbapenem-resistant organism carriage. We interviewed infection preventionists to define when CP were used for CRE, ESBL, and 3DC-GNR Enterobacteriaceae. HAI were defined using National Healthcare Safety Network criteria including all HAI categories. We compared the incidence rate of HAI attributable to the two MDRGN types in hospital months with and without use of CP, with HAI due to CRE as a comparison group since all hospitals used CP for CRE throughout the study period. Results: The periods of CP use, by hospital, are shown in Figure 1. Throughout the study period, there were 987 HAI attributed to ESBL Enterobacteriaceae, 579 due to 3DC-GNR Enterobacteriaceae, and 329 due to CRE. Figure 2 shows the unadjusted aggregate rate of HAI for each of the three MDRGN types, including among hospitals with and without CP in each month, for ESBL and 3DC-GNR.
In months with and without CP, the rate of HAI was 1.482/10,000 and 1.093/10,000 patient days (incidence rate ratio [IRR], 1.356 [95% confidence interval, 1.195-1.540]) for ESBL Enterobacteriaceae. In months with and without CP, the rate of HAI was 1.071/10,000 and 0.493/10,000 patient days (IRR, 2.173 [95% confidence interval, 1.838-2.569]) for 3DC-GNR Enterobacteriaceae. Conclusion: DcCP was not associated with an increase in HAI due to ESBL and 3DC-GNR Enterobacteriaceae in aggregated facilities that self-selected for DcCP. Facilities that used CP had significantly higher rates of ESBL and 3DC-GNR Enterobacteriaceae HAI, a relationship that did not change as hospitals discontinued CP for these MDRGN. Further analyses are necessary to assess for a causal relationship.
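The IRRs above compare HAI incidence between hospital-months with and without CP. As a generic sketch of how such a ratio and its Wald confidence interval are computed (with illustrative counts and patient-day denominators, since the abstract reports only the resulting rates):

```python
import math

def incidence_rate_ratio(cases_a, persondays_a, cases_b, persondays_b, z=1.96):
    """Incidence rate ratio of group A vs group B with a Wald 95% CI.
    For Poisson counts, SE(log IRR) ~ sqrt(1/cases_a + 1/cases_b)."""
    irr = (cases_a / persondays_a) / (cases_b / persondays_b)
    se = math.sqrt(1 / cases_a + 1 / cases_b)
    lo = math.exp(math.log(irr) - z * se)
    hi = math.exp(math.log(irr) + z * se)
    return irr, lo, hi

# Illustrative counts only, not the study's data: 148 vs 109 HAI over
# one million patient days each gives rates of 1.48 and 1.09 per 10,000.
irr, lo, hi = incidence_rate_ratio(148, 1_000_000, 109, 1_000_000)
print(f"IRR {irr:.3f} (95% CI {lo:.3f}-{hi:.3f})")
```

The CI is computed on the log scale and exponentiated, which is why the reported intervals around IRRs are asymmetric.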
Background: Only a few studies have assessed the relationship between deprivation and excessive antibiotic use. In Texas, antimicrobial prescribing rates are particularly high compared with the rest of the US. This study analyzed the association between local area socioeconomic deprivation and providers’ fluoroquinolone claim rates among beneficiaries 65 years and older in Texas. Methods: This ecological study utilized provider- and area-level data from Medicare Part D Prescribers and the Social Deprivation Index (SDI) repositories. To identify geographic patterns and autocorrelation in and between SDI and fluoroquinolone claims, spatial dependence of these two variables was assessed by bivariate Local Indicators of Spatial Association (LISA) cluster mapping along with global and local Moran’s I analyses. Negative binomial regression models were employed to evaluate the relationship between provider- and area-level characteristics (prescriber’s gender, specialty, rural-urban community area, beneficiaries' demographics, area-level population, and normalized SDI) and fluoroquinolone claim rates per 1,000 beneficiaries. Results: A total of 11,996 providers were included. There was no spatial dependence between SDI and rates of fluoroquinolone claims in Texas (global Moran’s I = 0.01, P = 0.618). Bivariate LISA maps showed 85 high-high and 38 low-low spatial clusters. Higher SDI (incidence rate ratio (IRR) 0.98, 95% confidence interval (CI) 0.97-0.99 per 1-unit increment) and male providers (IRR 0.96, 95% CI 0.94-0.99) were associated with lower claim rates. In contrast, several factors were associated with higher claim rates, including non-metropolitan areas (IRR 1.04, 95% CI 1.00-1.09), and practices with a high proportion of male patients (IRR 1.12, 95% CI 1.10-1.14), Black patients (IRR 1.05, 95% CI 1.03-1.07), or Medicaid beneficiaries (IRR 1.15, 95% CI 1.12-1.17).
Effect modification was observed between SDI and rurality, with higher SDI in non-metropolitan areas associated with higher claim rates, whereas SDI in metropolitan areas was inversely related to claim rates. Conclusion: This study showed that the distribution of high and low SDI and rates of fluoroquinolone claims were more geographically clustered than expected by random chance alone. Lower fluoroquinolone claim rates among Texas Medicare providers were seen in metropolitan areas with higher SDI, indicating potential barriers to care. Conversely, higher claim rates were observed in rural areas with higher SDI, signifying a possible knowledge or attitude gap towards fluoroquinolone use. These findings provide opportunities for public health professionals to explore gaps in the knowledge and attitudes of patients and providers related to antimicrobial use, particularly in rural regions, and investigate barriers to healthcare access in metropolitan areas.
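The global Moran's I statistic used above summarizes whether similar values (here, SDI and claim rates) cluster in space more than chance would predict. A minimal sketch of the univariate statistic with binary contiguity weights, on a toy ring of four areas (not the study's Texas geography):

```python
def morans_i(values, neighbors):
    """Global Moran's I with binary (contiguity) weights.
    `neighbors[i]` lists the indices of areas adjacent to area i.
    I near +1: similar values cluster; near -1: values alternate."""
    n = len(values)
    mean = sum(values) / n
    dev = [v - mean for v in values]
    w_total = sum(len(nb) for nb in neighbors)          # sum of all weights
    cross = sum(dev[i] * dev[j]                         # neighbor cross-products
                for i, nb in enumerate(neighbors) for j in nb)
    variance = sum(d * d for d in dev)
    return (n / w_total) * (cross / variance)

# Four areas arranged in a ring; alternating high/low values yield
# strong negative spatial autocorrelation (I = -1), whereas values
# clustered side by side would push I toward +1.
ring = [[3, 1], [0, 2], [1, 3], [2, 0]]
print(morans_i([1.0, -1.0, 1.0, -1.0], ring))
```

The bivariate LISA analysis in the abstract extends this idea by relating each area's SDI to the spatially lagged claim rate of its neighbors, producing the high-high and low-low cluster labels reported.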