We present the rediscovery of G278.94+1.35a as possibly one of the largest known Galactic supernova remnants (SNRs), which we name Diprotodon. While previously established as a Galactic SNR, Diprotodon is visible in our new Evolutionary Map of the Universe (EMU) and GaLactic and Extragalactic All-sky MWA (GLEAM) radio continuum images at an angular size of $3{.\!^\circ}33\times3{.\!^\circ}23$, much larger than previously measured. At the previously suggested distance of 2.7 kpc, this implies a diameter of 157$\times$152 pc. This size would qualify Diprotodon as the largest known SNR and pushes estimates of SNR sizes to their upper limits. We investigate the environment in which the SNR is located and examine various scenarios that might explain such a large and relatively bright SNR. We find that Diprotodon is most likely at a much closer distance of $\sim$1 kpc, implying a diameter of 58$\times$56 pc and placing it in the radiative evolutionary phase. We also present a new Fermi-LAT data analysis that confirms the angular extent of the SNR in gamma rays. The origin of the high-energy emission remains somewhat puzzling, and the scenarios we explore reveal new puzzles, given this unexpected and unique observation of a seemingly evolved SNR with a hard GeV spectrum and no spectral breaks. We explore both leptonic and hadronic scenarios, as well as the possibility that the high-energy emission arises from the leftover particle population of a historic pulsar wind nebula.
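The quoted physical sizes follow from the small-angle relation between angular size $\theta$, distance $d$, and linear diameter $D$:

$$D \simeq d\,\theta_{\mathrm{rad}}, \qquad \theta = 3{.\!^\circ}33 = 0.0581\ \mathrm{rad},$$
$$D \simeq 2700\ \mathrm{pc} \times 0.0581 \approx 157\ \mathrm{pc}\ \ (d = 2.7\ \mathrm{kpc}), \qquad D \approx 58\ \mathrm{pc}\ \ (d \sim 1\ \mathrm{kpc}).$$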
Control of carbapenem-resistant Acinetobacter baumannii and Pseudomonas aeruginosa spread in healthcare settings begins with timely and accurate laboratory testing practices. Survey results show most Veterans Affairs facilities are performing recommended tests to identify these organisms. Most facilities report sufficient resources to perform testing, though medium-complexity facilities report some perceived barriers.
The number of test translations and adaptations has risen exponentially over the last two decades, and these processes are now common practice. The International Test Commission (ITC) Guidelines for Translating and Adapting Tests (Second Edition, 2017) offer principles and practices to ensure the quality of translated and adapted tests. However, they are not specific to the cognitive processes examined with clinical neuropsychological measures. The aim of this publication is to provide a specialized set of recommendations for guiding neuropsychological test translation and adaptation procedures.
Methods:
The International Neuropsychological Society’s Cultural Neuropsychology Special Interest Group established a working group tasked with extending the ITC guidelines to offer specialized recommendations for translating/adapting neuropsychological tests. The neuropsychological application of the ITC guidelines was formulated by authors representing over ten nations, drawing upon the literature on neuropsychological test translation, adaptation, and development, as well as their own expertise, and in consultation with colleagues experienced in this field.
Results:
A summary of neuropsychological-specific commentary regarding the ITC test translation and adaptation guidelines is presented. Additionally, examples of applying these recommendations across a broad range of criteria are provided to aid test developers in attaining valid and reliable outcomes.
Conclusions:
Establishing specific neuropsychological test translation and adaptation guidelines is critical to ensure that such processes produce reliable and valid psychometric measures. Given the rapid global growth experienced in neuropsychology over the last two decades, the recommendations may assist researchers and practitioners in carrying out such endeavors.
Background: Carbapenem-resistant Acinetobacter baumannii (CRAB) and Pseudomonas aeruginosa (CRPA) are drug-resistant pathogens causing high mortality rates with limited treatment options. Understanding the incidence of these organisms and laboratory knowledge of testing protocols is important for controlling their spread in healthcare settings. This project assessed how often Veterans Affairs (VA) healthcare facilities identify CRAB and CRPA and the testing practices used. Method: An electronic survey was distributed to 126 VA acute care facilities during September-October 2023. The survey focused on CRAB and CRPA incidence, testing and identification, and availability of testing resources. Responses were analyzed by complexity of patients treated at VA facilities (high, medium, low) using Fisher’s exact tests. Result: 77 (61.1%) facilities responded, most in urban settings (85.4%). Most respondents were lead or supervisory laboratory technologists (84.2%) from high-complexity facilities (69.0%). Few facilities detected CRAB ≥ once/month (4.4%), with most reporting that they have not seen CRAB at their facility (55.0%). CRPA was detected more frequently: 19% of facilities detected isolates ≥ once/month, 29.2% a few times per year, and 26.9% reported not having seen the organism. No differences in CRAB or CRPA incidence were found by facility complexity. Nearly all facilities, regardless of complexity, utilize the recommended methods of MIC or disk diffusion to identify CRAB or CRPA (91.9%), with the remaining facilities reporting that testing is done off-site (7.8%). More high-complexity facilities perform on-site testing compared to low-complexity facilities (32.0% vs 2.7%, p=0.04). 83% of laboratories test for carbapenemase production, with one-fourth using off-site reference labs. One-fourth of facilities perform additional antibiotic susceptibility testing for CRAB and CRPA isolates, most of which test for susceptibility to combination antibiotics; no differences between complexity levels were found. Agreement that sufficient laboratory and equipment resources were available was higher in high-complexity than in medium-complexity facilities (70.7% vs 33.3%, p=0.01), but not low-complexity facilities (43.8%). Conclusion: Having timely and accurate testing protocols for CRAB and CRPA is important to quickly control spread and reduce associated mortality. This study shows that most VA protocols follow recommended testing and identification guidelines. Interestingly, there was no difference in CRAB or CRPA incidence between facilities providing higher vs lower complexity of care. While high- and low-complexity facilities generally reported sufficient resources for CRAB and CRPA evaluation, some medium-complexity labs, which may feel more compelled than low-complexity labs to bring testing in house, reported that additional resources would be required.
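As a minimal sketch of the comparison described above, a Fisher's exact test on a 2×2 table of facility complexity versus on-site testing might look as follows. The counts are hypothetical, chosen only to reproduce the reported proportions (32.0% vs 2.7%); the abstract does not give denominators, so the resulting p-value will not match the reported p=0.04.

```python
# Hypothetical Fisher's exact test comparing on-site testing by facility
# complexity. Counts are illustrative only (chosen to mirror 32.0% vs 2.7%);
# they are NOT the study's data, so the p-value will differ from the
# reported p=0.04.
from scipy.stats import fisher_exact

#            on-site   not on-site
table = [[16, 34],   # high-complexity facilities (16/50 = 32.0%)
         [1, 36]]    # low-complexity facilities  (1/37  =  2.7%)

odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
print(f"odds ratio = {odds_ratio:.1f}, p = {p_value:.3g}")
```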
Background: Incorporating diversity, equity, inclusion, and justice into healthcare ensures equitable opportunity to achieve optimal health. Infectious diseases, antimicrobial stewardship, and infection prevention teams rely on consultative recommendations to improve patient care, which may be influenced by implicit and explicit biases of the recipient treatment teams. Little is known about how race, ethnicity, and other characteristics impact stewardship and infection control recommendations. Methods: A survey of infectious diseases, antimicrobial stewardship, and infection prevention practitioners was developed through the Society for Healthcare Epidemiology of America (SHEA) Antimicrobial Stewardship Committee. The survey was sent electronically to members of the SHEA Research Network and was promoted to attendees of two sessions at IDWeek 2022 and SHEA Spring 2023. Survey questions included demographics, awareness of (and participation in) unconscious bias and microaggression training at their institutions, observations of bias in antibiotic prescribing, and perceptions of how race, ethnicity, and other characteristics have influenced participants' antimicrobial stewardship and infection prevention recommendations. Descriptive statistics were performed using SAS v9.4. Results: Among 175 survey respondents, 75% (n=129) were White, 16% (n=27) were Asian, 4% (n=7) were Black, 85% (n=150) were non-Hispanic, 5% (n=8) were Hispanic, and 3% (n=5) reported their ethnicity as multiethnic. 76% of respondents identified as female, and 2% as non-binary or gender-fluid. 29% of respondents had a medical degree, 12% had a nursing degree, 7% had a pharmacy degree, and 52% had a degree listed as other (7% had a PhD, 23% had an MPH/MSPH degree, and 15% had an MS degree). 65% and 49% of respondents had participated in unconscious bias and microaggression training, respectively. 18% (n=22) of White respondents, 43% (n=3) of Black respondents, and 30% (n=8) of Asian respondents reported witnessing antimicrobial prescribing influenced by race, ethnicity, or other characteristics. 17% and 15% of respondents felt that their antimicrobial stewardship and infection prevention recommendations, respectively, had not been accepted due to their race, ethnicity, gender identity, or other personal identifiers. Conclusion: This survey showed the demographic characteristics of professionals working in infectious diseases and their perceptions of how certain aspects of their identity have influenced their recommendations. Differences between racial groups were observed in how frequently respondents witnessed inequities in antimicrobial prescribing, and many respondents felt their recommendations had not been accepted due to their identity. A limitation of this analysis is that few Black individuals completed the survey, which makes comparisons by race difficult; however, respondent demographics were consistent with SHEA membership demographics.
Background: Tracking antimicrobial use (AU) is a Core Element of Hospital Antimicrobial Stewardship Programs and important to help curb the public health threat of antimicrobial resistance. The National Healthcare Safety Network’s (NHSN) AU Option serves as a way for facilities, healthcare systems, and health departments to track and report AU rates within their jurisdictions. Many analyses at the state and national levels do not assess unit-level AU rates. This study investigates AU rates among patient care units in Tennessee reporting facilities from 2015 to 2023 and the most frequently used antimicrobial agents based on AU rates within select unit types. Methods: A retrospective analysis was conducted utilizing data obtained from the NHSN AU Option for inpatient units in Tennessee acute care hospitals. Units were defined as critical care (including neonatal), ward, oncology ward, stepdown, operating room (OR), and mixed acuity and specialty care areas, termed ‘other’. Unit types with fewer than five facilities represented were excluded. AU rates were calculated as antimicrobial days of therapy (DOT) per 1,000 days present (DP). Statistical analyses, including descriptive statistics and comparisons among unit types by ANOVA, were performed using SAS version 9.4. Results: Eighty-three facilities reported at least one month of data to the NHSN AU Option between 2015 and 2023. Among 70 facilities reporting inpatient units, the highest AU rate was observed in oncology ward units (n=12, 1114.6 DOT/1000 DP). A significant difference in AU rates was observed between oncology ward units and the remaining unit types (p < 0.001). Vancomycin, ceftriaxone, and piperacillin/tazobactam were the most used antimicrobials, with AU rates of 83, 76, and 65 DOT/1000 DP, respectively. Vancomycin AU rates were significantly higher in oncology ward units than in stepdown, ward, ‘other’, and OR units (p < 0.0001). The ceftriaxone AU rate was significantly higher in stepdown units than in oncology ward, ‘other’, and OR units (p < 0.0001). The piperacillin/tazobactam AU rate was significantly higher in critical care units than in the remaining unit types (p < 0.0001). Conclusion: During the study period, AU rates varied across hospital inpatient units in Tennessee, with the highest rate in oncology wards. This unit-specific approach is critical to address the diverse antimicrobial prescribing behaviors observed, indicating that interventions should be customized to each unit’s distinct antimicrobial usage patterns for successful stewardship efforts. Targeted strategies focused on individual units rather than facility-wide initiatives appear essential for effective reduction in antibiotic usage.
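A minimal sketch of the AU rate calculation defined in Methods (DOT per 1,000 days present), with hypothetical unit-level inputs:

```python
# AU rate = (antimicrobial days of therapy / days present) * 1000,
# the NHSN AU Option metric used in this analysis.

def au_rate(days_of_therapy: float, days_present: float) -> float:
    """Return antimicrobial use as DOT per 1,000 days present."""
    return days_of_therapy / days_present * 1000

# Hypothetical oncology-ward aggregate: 2,229 DOT over 2,000 days present
# gives a rate close to the 1,114.6 DOT/1000 DP reported above.
print(au_rate(2229, 2000))  # -> 1114.5
```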
Develop and implement a system in the Veterans Health Administration (VA) to alert local medical center personnel in real time when an acute- or long-term care patient/resident is admitted to their facility with a history of colonization or infection with a multidrug-resistant organism (MDRO) previously identified at any VA facility across the nation.
Methods:
An algorithm was developed to extract clinical microbiology and local facility census data from the VA Corporate Data Warehouse, initially targeting carbapenem-resistant Enterobacterales (CRE) and methicillin-resistant Staphylococcus aureus (MRSA). The algorithm was validated with chart review of CRE cases from 2010-2018, trialed and refined in 24 VA healthcare systems over two years, expanded to other MDROs, and implemented nationwide in April 2022 as “VA Bug Alert” (VABA). Use through August 2023 was assessed.
Results:
VABA performed well for CRE, with recall of 96.3%, precision of 99.8%, and an F1 score of 98.0%. At the 24 trial sites, feedback was recorded for 1,011 admissions with a history of CRE (130), MRSA (814), or both (67). Among Infection Preventionists and MDRO Prevention Coordinators, 338 (33%) reported being previously unaware of the patient’s MDRO history, and of these, 271 (80%) reported they would not otherwise have learned of it. By fourteen months after nationwide implementation, 113/130 (87%) of VA healthcare systems had at least one VABA subscriber.
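For reference, the reported F1 score is the harmonic mean of precision and recall and is consistent with the quoted values:

$$F_1 = \frac{2 \cdot \mathrm{precision} \cdot \mathrm{recall}}{\mathrm{precision} + \mathrm{recall}} = \frac{2 \times 0.998 \times 0.963}{0.998 + 0.963} \approx 0.980.$$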
Conclusions:
A national system for alerting facilities in real time of patients admitted with an MDRO history was successfully developed and implemented in the VA. Next steps include understanding facilitators of and barriers to use, and coordination with non-VA facilities nationwide.
We present radio observations of the galaxy cluster Abell S1136 at 888 MHz, using the Australian Square Kilometre Array Pathfinder radio telescope, as part of the Evolutionary Map of the Universe Early Science program. We compare these findings with data from the Murchison Widefield Array, XMM-Newton, the Wide-field Infrared Survey Explorer, the Digitised Sky Survey, and the Australia Telescope Compact Array. Our analysis shows that the X-ray and radio emission in Abell S1136 are closely aligned and centred on the Brightest Cluster Galaxy, while the X-ray temperature profile shows a relaxed cluster with no evidence of a cool core. We find that the diffuse radio emission in the centre of the cluster shows more structure than seen in previous low-resolution observations of this source, in which it appeared as an amorphous radio blob similar in appearance to a radio halo. Our observations show that the diffuse emission in the Abell S1136 galaxy cluster contains three narrow filamentary structures visible at 888 MHz, between $\sim$80 and 140 kpc in length; however, the properties of the diffuse emission do not fully match those of a radio (mini-)halo or (fossil) tailed radio source.
This study investigates the impact of primary care utilisation of a symptom-based head and neck cancer risk calculator (Head and Neck Cancer Risk Calculator version 2) in the post-coronavirus disease 2019 period on the number of primary care referrals and cancer diagnoses.
Methods
The number of referrals from April 2019 to August 2019 and from April 2020 to July 2020 (pre-calculator) was compared with the number from the period January 2021 to August 2022 (post-calculator) using the chi-square test. The patients’ characteristics, referral urgency, triage outcome, Head and Neck Cancer Risk Calculator version 2 score and cancer diagnosis were recorded.
Results
In total, 1110 referrals from the pre-calculator period were compared with 1559 from the post-calculator period. Patient characteristics were comparable between the cohorts. More patients were referred on the cancer pathway in the post-calculator cohort (51.1 per cent pre-calculator vs 64.0 per cent post-calculator). The cancer diagnosis rate increased from 2.7 per cent in the pre-calculator cohort to 3.3 per cent in the post-calculator cohort. The rate of cancer diagnosis in the non-cancer pathway was lower in the cohort managed using the Head and Neck Cancer Risk Calculator version 2 (10 per cent vs 23 per cent, p = 0.10).
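A sketch of the chi-square comparison described in Methods; the cell counts are back-calculated from the reported totals and rounded percentages, so they are approximate rather than the study's exact data.

```python
# Chi-square test comparing cancer-pathway referral proportions pre- vs
# post-calculator. Counts are back-calculated from the reported totals and
# rounded percentages (51.1% of 1110; 64.0% of 1559), so approximate.
from scipy.stats import chi2_contingency

#            cancer pathway   non-cancer pathway
table = [[567, 543],   # pre-calculator  (1110 referrals)
         [998, 561]]   # post-calculator (1559 referrals)

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.1f}, dof = {dof}, p = {p:.2g}")
```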
Conclusion
Head and Neck Cancer Risk Calculator version 2 demonstrated high sensitivity in cancer diagnosis. Further studies are required to improve the predictive strength of the calculator.
Background: Antimicrobial stewardship programs (ASPs) seek to reduce the prevalence of antimicrobial-resistant and healthcare-associated infections. There are limited infectious disease (ID) physicians and pharmacists to support these ASPs, particularly in rural areas. The Veterans Health Administration has a robust telehealth program in place. Our previous work has demonstrated the feasibility of using telehealth modalities to support ASPs at rural Veterans Affairs medical centers (VAMCs) by pairing them with an ID expert from a larger, geographically distant, VAMC. This program, dubbed the Videoconference Antimicrobial Stewardship Team (VAST), emphasizes discussion of patients undergoing treatment for an active infection and of additional relevant clinical topics with a multidisciplinary team at the rural VAMC. VAST implementation is ongoing at participating VAMCs. To understand and compare the qualitative differences in implementation, we used process maps to describe the VAST at 3 VAMC dyads. Methods: Team members from each dyad participated in interviews at 3, 6, and 9 months after beginning their VAST sessions. Questions addressed several aspects of VAST implementation, including identifying cases and topics to discuss; advance preparation for meetings; the frequency and general structure of VAST meetings; and documentation, including workload capture. The research team used the responses to develop process maps to permit visual display and comparison of VAST implementation. Results: The first dyad began in January 2022 and the third in March 2022. The sessions had 3 phases: preparation, team meeting, and documentation of experts’ recommendations. Tasks were shared between VAST champions at the rural VAMC and the ID experts (Fig. 1). The preparation phase showed the most variation among the 3 dyads. In general, champions at the rural VAMC identified cases and topics for discussion that were sent to the ID expert for review. The approaches used to find cases and the type of preparatory work by the ID expert differed. Team meetings differed in both frequency and participation by professionals from the rural site. Processes for documenting expert recommendations appeared similar across the dyads. Discussion: Each of the 3 dyads implemented VAST differently. These results suggest that the overall structure of the VAST is readily adaptable and that each site tailored VAST to suit the clinical needs, workflow, and culture of its partner facility. Future work will seek to determine which aspects of the preparation, team meeting, or documentation phases are associated with successful ASPs, including assessment of quantitative and qualitative outcomes.
Background: The NHSN Antimicrobial Resistance (AR) Option is an important avenue for acute-care hospitals to electronically report facility-wide antibiogram data. The NEDSS Base System (NBS) is the statewide surveillance system for mandatory reporting of all carbapenem-resistant Enterobacteriaceae (CRE) cases. The state health department (SHD) validated CRE case data reported through the AR Option to assess completeness and accuracy. Methods: NHSN AR Option data from July 2020–December 2021 for 24 facilities were validated by comparing reported CRE and susceptibility results to CRE isolates reported via the NBS. Isolates were matched based on specimen date, sex, birth month and day, pathogen, and specimen source. NHSN susceptibility results were dichotomized as “not resistant” and “resistant” to match the NBS results. Susceptibility discordance (differing proportions of resistant isolates) among matched pairs was evaluated using the McNemar exact test in SAS version 9.4 software. Results: The SHD identified 270 CRE cases from the NHSN and 1,254 unique CRE isolates from the NBS. Of the NHSN events, 72 (26.67%) were matched to the NBS. Among matched isolates, discordance was significant for doripenem (0 resistant isolates in the NHSN vs 13 in the NBS; P < .001) and imipenem (5 resistant isolates in the NHSN vs 23 in the NBS; P < .0001). Discordance was not significant for either ertapenem or meropenem. Sensitivity analyses maximized the match rate at 30.74% (83 matches) when NBS isolates from unknown sources were included and the matching factors were specimen date, date of birth ± 1 day, and pathogen. Among all 6,325 CRE isolates in the NBS, 290 (4.58%) did not have a specimen source provided. Of all 47,348 NHSN events, 7,624 (16.10%) had impossible patient birth dates. Conclusions: Many NHSN isolates could not be matched to the NBS, either because isolates were missing from the NBS or because of data differences across the systems. This mismatch highlights the need for data validation and standardization at the point of entry for both systems. Discordant susceptibility outcomes raise concerns about using the NHSN as a source of facility and regional antibiogram data.
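A sketch of the McNemar exact test described in Methods, applied to paired resistant/not-resistant calls for matched isolates. The paired table below is hypothetical, constructed only to be consistent with the reported imipenem marginals (5 resistant in the NHSN vs 23 in the NBS among 72 matched pairs); the true cell split is not given in the abstract.

```python
# Exact McNemar test on paired susceptibility calls for matched isolates.
# The table is hypothetical, consistent only with the reported imipenem
# marginals (5 resistant in NHSN, 23 in NBS, 72 pairs); the actual split
# of concordant/discordant pairs is not in the abstract.
from statsmodels.stats.contingency_tables import mcnemar

#                 NBS resistant   NBS not resistant
table = [[5, 0],    # NHSN resistant
         [18, 49]]  # NHSN not resistant

result = mcnemar(table, exact=True)
print(f"p = {result.pvalue:.2g}")  # driven by the 18 vs 0 discordant pairs
```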
Background: Healthcare settings without access to infectious diseases experts may struggle to implement effective antibiotic stewardship programs. We previously described a successful pilot project using the Veterans Affairs (VA) telehealth system to form a Videoconference Antimicrobial Stewardship Team (VAST) that connected multidisciplinary teams from rural VA medical centers (VAMCs) with infectious diseases experts at geographically distant locations. VASTs discussed patients from the rural VAMC, with the overarching goal of supporting antibiotic stewardship. This project is currently ongoing. Here, we report preliminary outcomes from 4 VASTs: the cases discussed, the recommendations made, and the acceptance of those recommendations. Methods: Cases discussed at any of the 4 participating intervention sites were independently reviewed by study staff, noting the infectious disease diagnoses, the recommendations made by infectious diseases experts, and, when applicable, the acceptance of those recommendations at the rural VAMC within 1 week. Discrepancies between independent reviewers were discussed and, when consensus could not be reached, were resolved with an infectious diseases clinician. Results: The VASTs serving 4 different rural VAMCs discussed 96 cases involving 92 patients. Overall, infection of the respiratory tract was the most common syndrome discussed by VASTs (Fig. 1). The most common specific diagnoses among discussed cases were cellulitis (n = 11), acute cystitis (n = 11), wounds (n = 11), and osteomyelitis (n = 10). Of 172 recommendations, 41 (24%) related to diagnostic imaging or laboratory results and 38 (22%) were to change the antibiotic agent, dose, or duration (Fig. 2). Of the 151 recommendations that could be assessed via chart review, 122 (81%) were accepted within 1 week. Conclusions: These findings indicate successful implementation of telehealth to connect clinicians at rural VAMCs with an offsite infectious diseases expert. The cases represented an array of common infectious syndromes. The most frequent recommendations pertained to obtaining additional diagnostic information and to adjusting, but not stopping, antibiotic therapy. These results suggest that many of the cases discussed warranted antibiotics and that VASTs may use the results of diagnostic studies to tailor that therapy. The high rate of acceptance suggests that the VASTs are affecting patient care. Future work will describe VAST implementation at 4 additional VAMCs, and we will assess whether using telehealth to disseminate infectious diseases expertise to rural VAMCs supports changes in antibiotic use that align with principles of antimicrobial stewardship.
Background: Respiratory tract infections (RTIs) in long-term care facilities (LTCFs) are particularly burdensome among residents; the COVID-19 pandemic highlighted the devastating consequences of RTIs in LTCFs. This situation has prompted the need for a robust, active surveillance system to assist LTCFs with RTI identification. Such a system could support faster implementation of appropriate antimicrobial therapy and of critical infection prevention and control measures. The TN Emerging Infections Program worked with the CDC EIP to implement a pilot project to test the feasibility of performing RTI surveillance to inform future changes to the NHSN. Methods: We recruited 6 LTCFs to perform prospective RTI surveillance for 6 consecutive months, from October 2021 through March 2022. Data were collected for all residents meeting the RTI surveillance definitions: pneumonia, lower respiratory tract infection, influenza-like illness (including influenza), and COVID-19. These data were entered by facility workers into a REDCap database using a prospective RTI LTCF event form. Monthly data collection summaries were submitted using a designated denominator form. Descriptive statistics were used to analyze RTI data, and analyses were performed using SAS version 9.4 software. Results: In total, 6 facilities participated in the pilot project during the capture period. The total number of RTI cases across all facilities was 195. December had the most cases (n = 50). The most common first triggers were new RTI signs or symptoms (67.69%), laboratory results (17.44%), clinician-diagnosed RTI (8.21%), and imaging findings (6.67%). The most reported symptom was new or increased cough (57.44%). Chest radiographs were performed for 50.77% of patients. Positive viral laboratory test results were documented 29.74% of the time. Antibiotic treatments were given to 70.77% of residents. The most commonly prescribed antibiotics were cephalosporins (22.56%), macrolides (17.95%), fluoroquinolones (12.31%), and doxycycline (9.23%). Also, 17.4% of cases with antibiotic regimens had cephalosporins as monotherapy. Vaccine documentation was as follows: influenza 2020–2021 (40.51%), influenza 2021–2022 (64.1%), complete COVID-19 vaccine series (82.56%), PPSV-23 (33.85%), and PCV-13 (23.59%). Conclusions: RTI surveillance was incorporated smoothly into facilities’ daily workflow; the biggest barrier to effective implementation was staff turnover. A scheduled weekly time to collect data and fill out forms proved most effective. A high percentage of cases was treated with cephalosporins as monotherapy, which, based on the latest guidelines, may be suboptimal. Individual reports were sent back to facilities with a comparison to the aggregated data. These data will be used to evaluate antibiotic appropriateness and to guide future RTI surveillance efforts in the LTCF setting.
Objectives: To address the importation of multidrug-resistant organisms (MDROs) when a colonized or infected patient is transferred from another VA facility, the Veterans Health Administration (VHA) launched the Inpatient Pathogen Tracker (IPT) in 2020. IPT tracks MDRO-infected/colonized patients and alerts MDRO Program Coordinators (MPCs) and Infection Preventionists (IPs) when such patients are admitted to their facility, facilitating rapid identification and isolation. IPT usage was low during the initial rollout (32.5%). The VHA and the CARRIAGE QUERI Program developed targeted implementation strategies to increase utilization of IPT’s second iteration, VA Bug Alert (VABA). Methods: Familiarity with IPT was assessed via a pre-education survey (3/2022). All sites received standard VABA implementation, including: 1) adaptation of VABA features based on end-user feedback (completed 4/2022), 2) development and delivery of an educational module regarding the revised tool (completed 4/2022), and 3) internal facilitation from the VHA MDRO Program Office (ongoing) (see Figure for all key timepoints). Intent to register for VABA was assessed via a post-education survey (4-5/2022). Sites (125 eligible) not registered for VABA by 6/1/2022 were randomly assigned to receive one of two conditions from 6/2022–8/2022: continued standard implementation alone or enhanced implementation. Enhanced implementation added the following to standard implementation: 1) audit and feedback reports and 2) external facilitation, including interviews and education about VABA. We compared the number of sites with ≥1 MPC/IP registered for VABA to date between implementation conditions. Results: Pre-education survey: 168 MPC/IPs across 117 sites responded (94% of eligible sites). Among respondents, 25% had used IPT, 35.1% were familiar with but had not used IPT, and 39.9% were unfamiliar with IPT. Post-education survey: 93 MPC/IPs across 80 sites responded (59% of eligible sites). Of these, 81.7% said they planned to register for VABA, 4.3% said they would not register, and 14.0% said they were unsure. Post-6/1/2022 registrations: By 6/1/2022, 71% of sites had ≥1 registered VABA user. Of the 28 unregistered sites eligible for enhanced implementation, 13 were assigned to receive enhanced implementation and 15 to receive continued standard implementation. Eight sites in the enhanced implementation condition (61.5%) registered for VABA; seven standard-implementation-only sites (46.7%) registered. The number of registered sites did not significantly differ by implementation condition (Fisher’s exact p=0.476). Conclusions: Standard and enhanced implementation were equally effective at encouraging VABA registration, suggesting that allocating resources to enhanced implementation may not be necessary.
Cyril Fox's publication The archaeology of the Cambridge region (1923) is celebrated as a milestone in the development of landscape archaeology. Its centenary invites reflection on Fox's approach to landscape and on the development of knowledge about the archaeology of the Cambridge region over the intervening years. Here, the authors compare the evidence available to Fox with the results of three decades of development-led archaeology. The latter have revealed very high numbers of sites, with dense ‘packing’ of settlements in all areas of the landscape; the transformation in knowledge of clayland areas is particularly striking. These high-density pasts have far-reaching implications for the understanding of later prehistoric and Roman-period land-use and social relations.
The 2019 UK General Election had seismic consequences for British politics. After three years of political turmoil following the 2016 referendum on Britain’s membership of the European Union (EU), the 2019 election marked a victory for the Leave side of the Brexit debate, putting to rest questions of a second referendum and any chance of Parliament blocking the Withdrawal Bill. The United Kingdom left the EU on January 31, 2020. Although there were clear consequences for Britain’s EU membership, there is debate about whether 2019 was a “Brexit election” (Prosser 2020)—even a critical election (Green 2021)—or the continuation of long-term realignments in British politics (Cutts et al. 2020; Jennings and Stoker 2017). By most accounts, Brexit dominated the 2019 election as a political issue, but whether this represents a key moment in a process of realignment of voters in Britain remains to be seen.
The true incidence of and risk factors for secondary bacterial infection in coronavirus disease 2019 (COVID-19) remain poorly understood. Knowledge of risk factors for secondary infections in hospitalized patients with COVID-19 is necessary to optimally guide selective use of empiric antimicrobial therapy.
Design:
Single-center retrospective cohort study of symptomatic inpatients admitted for COVID-19 from April 15, 2020, through June 30, 2021.
Setting:
Academic quaternary-care referral center in Portland, Oregon.
Patients:
The study included patients who were 18 years or older with a positive severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) PCR test up to 10 days prior to admission.
Methods:
Secondary infections were identified based on clinical, radiographic, and microbiologic data. Logistic regression was used to identify risk factors for secondary infection. We also assessed mortality, length of stay, and empiric antibiotics among those with and without secondary infections.
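A minimal sketch of the kind of logistic regression described, with hypothetical column names drawn from the risk factors reported in the Conclusions; the study's actual covariates and coding are not specified in the abstract.

```python
# Logistic regression for risk of secondary infection; column names are
# hypothetical placeholders, not the study's actual variables.
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("cohort.csv")  # hypothetical analysis file

X = sm.add_constant(df[["outside_transfer", "immunosuppressant", "steroids"]])
y = df["secondary_infection"]   # 1 = culture-proven or possible infection

model = sm.Logit(y, X).fit()
print(model.summary())  # exponentiate model.params for odds ratios
```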
Results:
We identified 118 hospitalized patients with COVID-19 for inclusion; 31 (26.3%) had either culture-proven or possible secondary infections. Mortality was higher among patients with secondary infections (35.5%) than among those without (4.6%). Empiric antibiotic use on admission was high in both the secondary-infection and no-secondary-infection groups, at 71.0% and 48.3%, respectively.
Conclusions:
The incidence of secondary bacterial infection was moderate among hospitalized patients with COVID-19. However, a high proportion of patients received empiric antibiotics regardless of whether a secondary infection was identifiable. Transfer from an outside hospital, baseline immunosuppressant use, and corticosteroid treatment were independent risk factors for secondary infection. Additional studies are needed to validate these risk factors and to best guide antimicrobial stewardship efforts.
Online experiments have become a valuable tool for researchers interested in the processes underlying cooperation. Typically, online experiments are asynchronous: participants complete an experiment individually and are matched with partners after data collection has been completed. We conducted a registered report to compare asynchronous and synchronous designs, in which participants interact and receive feedback in real time. We investigated how two features of synchronous designs, pre-decision matching and immediate feedback, influence cooperation in the prisoner's dilemma. We hypothesized that 1) pre-decision matching (assigning participants to specific interaction partners before they make decisions) would lead to decreased social distance and increased cooperation; 2) immediate feedback would reduce feelings of aversive uncertainty and lead to increased cooperation; and 3) individuals with prosocial Social Value Orientations would be more sensitive to the differences between synchronous and asynchronous designs. We found no support for these hypotheses. In our study (N = 1,238), pre-decision matching and immediate feedback had no significant effects on cooperative behavior or perceptions of the interaction, and their effects on cooperation were not significantly moderated by Social Value Orientation. The present results suggest that synchronous designs have little effect on cooperation in online social dilemma experiments.
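As an illustration of the asynchronous protocol described above, decisions can be collected individually and payoffs computed by matching participants after the fact; the payoff values below are standard illustrative prisoner's dilemma numbers, not those used in the study.

```python
# Post-hoc matching in an asynchronous prisoner's dilemma: choices are
# recorded individually, then paired after data collection to compute
# payoffs. Payoff values are illustrative, not the study's parameters.
import random

decisions = ["C", "D", "C", "C", "D", "C"]  # hypothetical recorded choices

# Row player's payoff for (own choice, partner's choice):
PAYOFF = {("C", "C"): 3, ("C", "D"): 0, ("D", "C"): 5, ("D", "D"): 1}

random.shuffle(decisions)                        # random post-hoc pairing
for a, b in zip(decisions[0::2], decisions[1::2]):
    print(f"{a} vs {b}: payoffs {PAYOFF[(a, b)]}, {PAYOFF[(b, a)]}")
```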