Antibiotic stewardship programs (ASPs) are crucial to prevent the emergence of antibiotic resistance and to improve outcomes for patients. A validated instrument rooted in a theoretically derived implementation science framework will increase our understanding of ASP implementation and enable comparisons across implementation sites.
Methods:
Antibiotic stewards (infectious disease pharmacists and physicians) were recruited from Veterans Affairs (VA) hospitals to complete a survey on stewardship implementation. We used the Consolidated Framework for Implementation Research (CFIR) to guide development of an ASP implementation survey assessing 22 potential determinants of implementation across five CFIR domains. We conducted confirmatory factor analyses (CFA) to assess the construct validity of eight construct measures and evaluated their internal consistency.
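Internal consistency in the Results below is reported as Cronbach’s alpha. As a reference for readers, here is a minimal Python sketch of that computation; the simulated 150 × 4 response matrix is purely illustrative and not the study’s data.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) response matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)        # per-item variances
    total_var = items.sum(axis=1).var(ddof=1)    # variance of the summed scale
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

# Hypothetical data: 150 stewards answering a 4-item Likert (1-5) subscale,
# simulated so the items share a common factor and alpha comes out high.
rng = np.random.default_rng(0)
latent = rng.normal(3.0, 1.0, size=(150, 1))
items = np.clip(np.rint(latent + rng.normal(0, 0.5, size=(150, 4))), 1, 5)
print(f"alpha = {cronbach_alpha(items):.2f}")
```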
Results:
A total of 150 stewards completed the survey from 110 VA hospitals. CFA for most CFIR constructs exhibited good fit. Internal consistency for CFIR construct subscales (Cronbach’s alpha) ranged from 0.54 to 0.96, indicating modest to strong internal consistency. Determinants rated as highly present at the sites (across-site means ≥ 4.0) included Self-Efficacy, Engaging, Evidence Strength and Quality, and Relative Advantage, indicating that stewards found ASP evidence compelling and felt their personal involvement was effective in engendering positive results for the ASP.
Conclusions:
Psychometric properties indicate validity of the first CFIR-based survey of determinants for ASP implementation outcomes. Clinical, quality improvement, and research teams can use this survey to identify contextual determinants of ASP implementation and use this information to guide selection of strategies and compare results across multiple sites.
Identifying optimal methods for sampling surfaces in the healthcare environment is critical for future research requiring the identification of multidrug-resistant organisms (MDROs) on surfaces.
Methods:
We compared two swabbing methods, a flocked swab versus a sponge-stick, for recovery of MDROs by both culture and recovery of bacterial DNA via quantitative 16S polymerase chain reaction (PCR). This comparison was conducted by assessing swab performance in a longitudinal survey of MDRO contamination in hospital rooms. In addition, a laboratory-prepared surface was used to compare the recovery of each swab type over a matched surface area.
Results:
Sponge-sticks were superior to flocked swabs for culture-based recovery of MDROs, with a sensitivity of 80% compared to 58%. Similarly, sponge-sticks demonstrated greater recovery of Staphylococcus aureus from laboratory-prepared surfaces, although the performance of flocked swabs improved when premoistened. In contrast, recovery of bacterial DNA via quantitative 16S PCR was greater with flocked swabs by an average of 3 log copies per specimen.
Conclusions:
The optimal swabbing method of environmental surfaces differs by method of analysis. Sponge-sticks were superior to flocked swabs for culture-based detection of bacteria but inferior for recovery of bacterial DNA.
Objectives/Goals:
The creatine (Cr) system is impaired in Alzheimer’s disease (AD). Data show that creatine monohydrate (CrM) supplementation may improve AD symptoms in AD mouse models, but no human studies have been reported. Thus, we investigated whether an eight-week CrM supplementation was feasible and associated with increased brain creatine in patients with AD.
Methods/Study Population:
Twenty participants with probable AD were allocated to an open-label, eight-week intervention of 20 g/day CrM. Fasting blood draws were taken at baseline, 4-, and 8-week visits to measure serum creatine (Quest Diagnostics). 1H magnetic resonance spectroscopy was performed at baseline and 8-week visits to measure brain Cr as a ratio to unsuppressed water. Self-reported compliance (with assistance from study partners) was assessed with daily CrM trackers. The mean compliance percentage across all participants was used to describe overall compliance with the intervention. We used paired t-tests to analyze the mean changes in serum Cr levels from baseline to 4- and 8-week visits and the mean change in brain Cr from baseline to 8-week visits. Statistical significance was set at p<0.05.
Results/Anticipated Results:
Participants were 65% male with a mean age of 73.1±6.3 years. All participants completed the study, with 19 out of 20 achieving the dose compliance target of ≥80%. The mean self-reported dose intake was 90%. Serum Cr levels were significantly increased at 4- and 8-week visits compared to baseline (0.6±0.4 mg/dL vs. 14.0±9.9 mg/dL and 15.0±13.6 mg/dL, respectively; p<0.001). Brain Cr levels also significantly increased (330.5±36.80 i.u. vs. 366.9±57.52 i.u., p<0.001).
Discussion/Significance of Impact:
We are the first to demonstrate that 20 g/day of CrM for eight weeks is feasible and associated with increased brain Cr in patients with AD. Our findings support further investigation of brain target engagement of CrM and its efficacy in AD. With AD cases expected to rise, CrM could serve as an effective, affordable therapeutic to slow AD progression.
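The serum comparisons above are paired t-tests. A minimal Python sketch of that analysis with scipy follows; the simulated values merely echo the reported means and are not study data.

```python
import numpy as np
from scipy import stats

# Hypothetical paired serum creatine values (mg/dL) for n = 20 participants,
# loosely mirroring the reported baseline (~0.6) and 8-week (~15.0) means.
rng = np.random.default_rng(1)
baseline = rng.normal(0.6, 0.4, 20).clip(min=0.1)
week8 = rng.normal(15.0, 13.6, 20).clip(min=0.5)

t_stat, p_value = stats.ttest_rel(week8, baseline)  # paired t-test
print(f"mean change = {np.mean(week8 - baseline):.1f} mg/dL, p = {p_value:.4f}")
```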
The recommended first-line treatment for insomnia is cognitive behavioral therapy for insomnia (CBTi), but access is limited. Telehealth- or internet-delivered CBTi are alternative ways to increase access. To date, these intervention modalities have never been compared within a single study. Further, few studies have examined (a) predictors of response to the different modalities, (b) whether successfully treating insomnia can result in improvement of health-related biomarkers, and (c) mechanisms of change in CBTi. This protocol was designed to compare the three CBTi modalities to each other and to a waitlist control for adults aged 50–65 years (N = 100). Participants are randomly assigned to one of four study arms: in-person- (n = 30), telehealth- (n = 30), or internet-delivered (n = 30) CBTi, or a 12-week waitlist control (n = 10). Outcomes include self-reported insomnia symptom severity, polysomnography, circadian rhythms of activity and core body temperature, blood- and sweat-based biomarkers, cognitive functioning, and magnetic resonance imaging.
The authors offer reflections and lessons learned from a single pediatric tertiary center’s experience during a pediatric mass casualty incident (MCI). The MCI occurred at a holiday parade, and the patients were brought to multiple community emergency departments for initial resuscitation prior to transfer to the pediatric Level 1 trauma center. In total, 18 children presented with severe blunt force trauma after a motor vehicle entered the parade route. Following initial triage in emergency departments, 10 of the 18 children injured during the incident were admitted to the Pediatric Intensive Care Unit, collectively representing a system-wide stressor on emergency medicine, critical care, and surgical services. Institutional characteristics, activation of personnel and supplies, and psychosocial support for families during an MCI are important considerations in children’s hospitals’ disaster preparedness planning.
The prevalence of schizophrenia is relatively low, yet increasing globally, and the disorder imparts a substantial burden of disease on both individuals and health systems. With regard to schizophrenia treatments, including long-acting injectable antipsychotics (LAIs), social media listening provides a unique source of insight into the experiences and perceptions of healthcare professionals (HCPs), patients, and caregivers who live with and manage this disorder daily.
Objective
To gain insight into HCP and patient/caregiver perceptions of LAIs for the treatment of schizophrenia.
Methods
Publicly available online conversations in global English about LAIs for schizophrenia from May 2, 2022, to May 2, 2023, were analyzed. Posts were collected using customized search strings from social media analysis tools, including Talkwalker and Meltwater. Online forums, such as Reddit, were the main source for patient/caregiver conversations. Conversations among HCPs were examined using publicly available posts from Twitter about schizophrenia/LAIs. Random samples of posts on forums (100) and Twitter (100) were coded for primary topic, author type (patient, caregiver, or HCP), sentiment toward LAIs, and signs of LAI hesitancy. Additional topics in posts, such as barriers and benefits to LAI use, were also examined.
Results
In the analyzed samples, some differences were observed between patients/caregivers (mostly patients) and HCPs (mostly psychiatrists) in lexicon, focus, and perspective. The most common terms for LAIs among patients/caregivers were “injection” or “shot,” while HCPs used the terms “LAIs” or “injectables.” The most frequent primary topic among patients/caregivers was treatment regimen, including impact of symptoms and side effects on quality of life. HCPs focused on drug efficacy, including broader health outcomes such as relapse, hospitalization, adherence, and mortality. Patients/caregivers expressed fewer positive sentiments (11% of posts) and more negative sentiments (35%) than HCPs (34% positive, 14% negative). Both groups noted reduced relapse and improved adherence among the top treatment benefits. Barriers to LAI use commonly cited by patients/caregivers included side effects and lack of effect on negative symptoms, while common barriers cited by HCPs included patient access/cost and limited knowledge around best prescribing practices. Treatment comparisons and/or switching were more commonly mentioned among patients/caregivers (51%) than HCPs (30%), suggesting a greater interest in optimizing treatment among patients. Patients/caregivers often compared individual LAIs with oral antipsychotics (OAs) or different LAIs, whereas it was more typical for HCPs to compare LAIs with OAs than to distinguish between different LAIs.
Conclusions
Based on social media posts, patients/caregivers and HCPs had different primary treatment goals/concerns and generally used different lexicons, which may affect communication. Overall, HCPs were more positive and less negative toward LAIs than patients/caregivers. Top benefits noted (relapse and adherence) were similar between groups, while top treatment barriers differed. These differences highlight the need to improve communication between patients/caregivers and HCPs in order to increase treatment satisfaction and potentially improve treatment outcomes.
In response to the COVID-19 pandemic, we rapidly implemented a plasma coordination center within two months to support transfusions for two outpatient randomized controlled trials. The center design was based on an investigational drug services model and a Food and Drug Administration-compliant database to manage blood product inventory and trial safety.
Methods:
A core investigational team adapted a cloud-based platform to randomize patient assignments and track inventory distribution of control plasma and high-titer COVID-19 convalescent plasma of different blood groups from 29 donor collection centers directly to blood banks serving 26 transfusion sites.
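The abstract does not publish the platform’s logic, but the core task it describes — randomizing an assignment and drawing blood-group-compatible plasma from a site’s digital inventory — can be sketched as below. The compatibility table reflects standard plasma (not red-cell) transfusion rules; the function, names, and inventory structure are all illustrative assumptions.

```python
import random

# Plasma compatibility: a donor unit is acceptable when its plasma carries no
# antibodies against the recipient's red-cell antigens (AB plasma is universal).
PLASMA_COMPATIBLE = {   # recipient ABO group -> donor plasma groups, preferred first
    "O":  ["O", "A", "B", "AB"],
    "A":  ["A", "AB"],       # exact-match group first conserves scarce AB units
    "B":  ["B", "AB"],
    "AB": ["AB"],
}

def assign_unit(recipient_group: str, inventory: dict, rng=random):
    """Randomize the study arm, then draw a compatible unit from site inventory."""
    arm = rng.choice(["convalescent", "control"])
    for donor_group in PLASMA_COMPATIBLE[recipient_group]:
        if inventory.get((arm, donor_group), 0) > 0:
            inventory[(arm, donor_group)] -= 1
            return arm, donor_group
    return arm, None  # none on hand: trigger a restock from the coordination center

inventory = {("convalescent", "AB"): 2, ("control", "A"): 1}
print(assign_unit("A", inventory))
```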
Results:
We performed 1,351 transfusions in 16 months. The transparency of the digital inventory at each site was critical to facilitate qualification, randomization, and overnight shipments of blood group-compatible plasma for transfusion into trial participants. While inventory challenges were heightened with COVID-19 convalescent plasma, the cloud-based system and the flexible approach of the plasma coordination center staff across the blood bank network enabled decentralized procurement and distribution of investigational products to maintain inventory thresholds and overcome local supply chain constraints at the sites.
Conclusion:
The rapid creation of a plasma coordination center for outpatient transfusions is infrequent in the academic setting. Distributing more than 3,100 plasma units to blood banks charged with managing investigational inventory across the U.S. in a decentralized manner posed operational and regulatory challenges while providing opportunities for the plasma coordination center to contribute to research of global importance. This program can serve as a template in subsequent public health emergencies.
Background:
Prior research has implicated contaminated surfaces in the transmission of Clostridioides difficile within the hospital. To reduce the risk of transmission, enhanced environmental hygiene is performed in rooms of patients with known C. difficile infection (CDI). We wished to evaluate the residual impact of environmental surfaces on hospital-onset CDI (HO-CDI) by comparing HO-CDI rates before and after the opening of a new 504-bed hospital building, HUP Pavilion (PAV). We hypothesized that we would observe a reduction in HO-CDI after the opening of PAV due to a reduced burden of C. difficile spores in the environment.
Methods:
We included NHSN-reported HO-CDI rates for 28 months before and 24 months after the opening of PAV. Upon opening, patients were divided between the old building (HUP) and PAV. We included all patient units before and after opening. We created hierarchical models of HO-CDI rates using Stan Hamiltonian Monte Carlo (HMC) version 2.30.1, via the “cmdstanr” and “brms” packages, with a GAM smooth function by month and intervention period and default, weakly informative priors.
Results:
At baseline, there was an average of approximately 20,100 patient days per month, subsequently divided between HUP and PAV (mean 10,100 and 12,100 patient days per month, respectively). After the opening of PAV, we observed a reduced HO-CDI rate (mean 0.21 vs 0.31 per 1000 patient days, P=0.01). When comparing the two buildings after the opening of PAV, a greater reduction was observed in the old building (HUP) than in the new building (PAV) (0.12 vs 0.29 per 1000 patient days) (Figure 1). The predicted contrast in HO-CDI rate (Figure 2) shows no immediate change in HO-CDI after opening but a sustained reduction, estimated at 0.1 HO-CDI events per 1000 patient days, for the duration of follow-up.
Conclusions:
We observed a reduction in HO-CDI rates after the opening of a new hospital building. The difference in HO-CDI rates between hospital buildings after the move is likely due to the concentration of high-risk patient cohorts within this building. Our findings suggest that there remains an opportunity to reduce HO-CDI through environmental hygiene. However, it is possible that factors beyond the surface environment contributed to the observed reduction in HO-CDI, including other concurrent infection control interventions focused on smaller populations within the hospital. In future work, we will investigate the durability of this observed effect with additional analyses, including patient-level risk for HO-CDI.
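The authors fit their model in R with “brms”/“cmdstanr” on Stan. Purely as a rough Python analogue of the idea — a simplified hierarchical Poisson rate model with an intervention-period effect, per-month effects standing in for the GAM smooth, and a log patient-days offset — a sketch in PyMC might look like this; the simulated counts echo the reported rates and nothing here is the study’s data or actual specification.

```python
import numpy as np
import pymc as pm

# Hypothetical monthly data: HO-CDI counts and patient days, 28 months pre / 24 post.
months = 52
patient_days = np.full(months, 20100.0)
period = (np.arange(months) >= 28).astype(int)            # 0 = pre, 1 = post opening
rate = np.where(period, 0.21, 0.31)                       # events per 1000 patient days
cases = np.random.default_rng(2).poisson(patient_days / 1000 * rate)

with pm.Model() as model:
    intercept = pm.Normal("intercept", 0, 2)              # weakly informative priors
    beta_period = pm.Normal("beta_period", 0, 1)
    month_sd = pm.HalfNormal("month_sd", 0.5)
    month_re = pm.Normal("month_re", 0, month_sd, shape=months)  # stand-in for GAM smooth
    log_rate = (intercept + beta_period * period + month_re
                + np.log(patient_days / 1000))            # offset: log patient days / 1000
    pm.Poisson("obs", mu=pm.math.exp(log_rate), observed=cases)
    idata = pm.sample(1000, tune=1000, target_accept=0.9)
```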
Military Servicemembers and Veterans are at elevated risk for suicide, but rarely self-identify to their leaders or clinicians regarding their experience of suicidal thoughts. We developed an algorithm to identify posts containing suicide-related content on a military-specific social media platform.
Methods
Publicly shared social media posts (n = 8449) from a military-specific social media platform were reviewed and labeled by our team for the presence or absence of suicidal thoughts and behaviors, then used to train several machine learning models to identify such posts.
Results
The best performing model was a deep learning (RoBERTa) model that incorporated post text and metadata and detected the presence of suicidal posts with relatively high sensitivity (0.85), specificity (0.96), precision (0.64), F1 score (0.73), and an area under the precision-recall curve of 0.84. Compared to non-suicidal posts, suicidal posts were more likely to contain explicit mentions of suicide, descriptions of risk factors (e.g. depression, PTSD) and help-seeking, and first-person singular pronouns.
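For readers unfamiliar with the pipeline, a minimal, hypothetical sketch follows: scoring a post with a generic RoBERTa sequence classifier and computing sensitivity/specificity from a confusion matrix. The checkpoint, labels, and metadata handling here are placeholders, not the authors’ trained model.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification
from sklearn.metrics import confusion_matrix

# Generic RoBERTa checkpoint with a fresh 2-class head (untrained: illustrative only).
tokenizer = AutoTokenizer.from_pretrained("roberta-base")
model = AutoModelForSequenceClassification.from_pretrained("roberta-base", num_labels=2)

post = "example post text (metadata such as post time would be added as extra features)"
inputs = tokenizer(post, truncation=True, return_tensors="pt")
with torch.no_grad():
    probs = torch.softmax(model(**inputs).logits, dim=-1)
print(f"P(suicide-related) = {probs[0, 1]:.3f}")

# Evaluation metrics of the kind reported above, from hypothetical labels/predictions.
y_true = [1, 1, 0, 0, 1, 0]
y_pred = [1, 0, 0, 0, 1, 1]
tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
print(f"sensitivity = {tp / (tp + fn):.2f}, specificity = {tn / (tn + fp):.2f}")
```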
Conclusions
Our results demonstrate the feasibility and potential promise of using social media posts to identify at-risk Servicemembers and Veterans. Future work will use this approach to deliver targeted interventions to social media users at risk for suicide.
Bacterial resistance is known to diminish the effectiveness of antibiotics for the treatment of urinary tract infections. Review of recent healthcare and antibiotic exposures, as well as prior culture results, is recommended to aid in the selection of empirical treatment. However, the optimal approach for assessing these data is unclear. We utilized data from the Veterans Health Administration to evaluate relationships between culture and treatment history and the subsequent probability of antibiotic-resistant bacteria identified in urine cultures, to further guide clinicians in understanding these risk factors.
Methods:
Using the XGBoost algorithm and a retrospective cohort of outpatients with urine culture results and antibiotic prescriptions from 2017 to 2022, we developed models for predicting resistance in urine-culture isolates to three classes of antibiotics: cephalosporins, fluoroquinolones, and trimethoprim/sulfamethoxazole (TMP/SMX). Model performance was assessed using the area under the receiver operating characteristic curve (AUC) and the precision-recall AUC (PRAUC).
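A minimal sketch of this modeling setup, using xgboost and scikit-learn on simulated data, is below; the feature names, hyperparameters, and data are illustrative assumptions, not the study’s cohort (average precision serves as the PRAUC estimate).

```python
import numpy as np
from xgboost import XGBClassifier
from sklearn.metrics import roc_auc_score, average_precision_score
from sklearn.model_selection import train_test_split

# Hypothetical features: prior resistance to the class, prior exposure to the
# class, days since last culture, etc. Column meanings are assumptions.
rng = np.random.default_rng(3)
X = rng.random((5000, 6))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 0.3, 5000) > 1.0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
model = XGBClassifier(n_estimators=300, max_depth=4, learning_rate=0.05,
                      eval_metric="logloss")
model.fit(X_tr, y_tr)

prob = model.predict_proba(X_te)[:, 1]
print(f"AUC = {roc_auc_score(y_te, prob):.2f}, "
      f"PRAUC = {average_precision_score(y_te, prob):.2f}")
```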
Results:
There were 392,647 prior urine cultures identified in 214,656 patients. A history of bacterial resistance to the specific treatment was the most important predictor of subsequent resistance for positive cultures, followed by a history of specific antibiotic exposure. The models performed better than previously established risk factors alone, especially for fluoroquinolone resistance, with an AUC of 0.84 and PRAUC of 0.70. Notably, the models’ performance improved markedly (AUC = 0.90, PRAUC = 0.87) when applied to cultures from patients with a known history of resistance to any of the antibiotic classes.
Conclusion:
These predictive models demonstrate potential in guiding antibiotic prescription and improving infection management.
To describe an outbreak of sequence type 2 (ST2) Clostridioides difficile infection (CDI) detected by a recently implemented multilocus sequence typing (MLST)-based prospective genomic surveillance system using Oxford Nanopore Technologies (ONT) sequencing.
Setting:
Hemato-oncology ward of a public tertiary referral centre.
Methods:
From February 2022, we began prospectively sequencing all C. difficile isolates from inpatients at our institution on the ONT MinION device, with the output being a sequence type. Bed-movement data are used to construct real-time ST-specific incidence charts based on ward exposures over the preceding three months.
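A toy sketch of the incidence-chart logic in pandas: count recent cases per ST among patients exposed to a given ward. The column names and example rows are assumptions, not the hospital’s actual data model.

```python
import pandas as pd

# Each case carries its typing result (ST) and the wards the patient was
# exposed to over the preceding 90 days (from bed-movement data).
cases = pd.DataFrame({
    "case_date": pd.to_datetime(["2022-05-10", "2022-06-02", "2022-06-20"]),
    "st": [2, 2, 42],
    "wards_exposed_90d": [["hemonc"], ["hemonc", "icu"], ["med"]],
})

ward = "hemonc"
exposed = cases[cases["wards_exposed_90d"].apply(lambda wards: ward in wards)]
incidence = exposed.groupby(["st", pd.Grouper(key="case_date", freq="M")]).size()
print(incidence)  # flag when one ST dominates a ward's recent cases
```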
Results:
Between February and October 2022, 76 of 118 (64.4%) CDI cases were successfully sequenced. There was wide ST variation across cases and the hospital, with only four different STs being seen in >4 patients. A clear predominance of ST2 CDI cases emerged among patients with exposure to our hemato-oncology ward between May and October 2022, which totalled ten patients. There was no detectable rise in overall CDI incidence for the ward or hospital due to the outbreak. Following a change in cleaning product to an accelerated hydrogen peroxide wipe and several other interventions, no further outbreak-associated ST2 cases were detected. A retrospective phylogenetic analysis using original sequence data showed clustering of the suspected outbreak cases, with the exception of two cases that were retrospectively excluded from the outbreak.
Conclusions:
Prospective genomic surveillance of C. difficile using ONT sequencing permitted the identification of an outbreak of ST2 CDI that would have otherwise gone undetected.
Airway management is a cornerstone in the prehospital care of critically ill or injured patients. Surgical cricothyrotomy offers a rapid and effective solution when oxygenation and ventilation fail using less-invasive techniques. However, the exact indications, incidence, and success of prehospital surgical cricothyrotomy are unknown, with variable rates reported in the literature. This study aimed to examine prehospital indications and success rates for surgical cricothyrotomy within a large, suburban, ground-based Emergency Medical Services (EMS) system.
Methods:
This is a retrospective analysis of 31 patients who underwent paramedic-performed surgical cricothyrotomy from 2012 through 2022. Key demographic parameters were analyzed, including the incidence of cardiac arrest, call type (trauma versus medical), initial airway management attempts, number of endotracheal intubation (ETI) attempts before surgical airway, and average time to the establishment of a surgical airway in relation to the number of ETI attempts. Surgical cricothyrotomy success was defined as the acquisition of a four-phase end-tidal capnography reading. The primary data sources were the EMS electronic medical records, and descriptive statistics were calculated.
Results:
A total of 31 patients were included in the final analysis. Of those who received a surgical cricothyrotomy, 42% (13/31) occurred in the trauma setting, while 58% (18/31) were medical calls. Across all patients who underwent surgical cricothyrotomy, the median time to the procedure was 17 minutes (IQR = 11-24). In trauma patients, the median time to surgical cricothyrotomy was 12 minutes (IQR = 9-19) versus 19 minutes (IQR = 14-33) in medical patients. End-tidal carbon dioxide (ETCO2) detection confirmed placement success in 94% (29/31) of patients. Endotracheal intubation was attempted in 55% (17/31) before subsequent surgical cricothyrotomy, with 29% (9/31) receiving more than one ETI attempt. The median time to surgical cricothyrotomy when multiple prior intubation attempts occurred was 33 minutes (IQR = 23-36), compared to 14.5 minutes (IQR = 6-19) in patients without a preceding intubation attempt.
Conclusion:
Prehospital surgical airway can be performed by paramedics with a high degree of success. Identification of the need for surgical cricothyrotomy should be determined as soon as possible to allow for rapid securement of the airway and to ensure adequate oxygenation and ventilation.
OBJECTIVES/GOALS:
Adoption of the Observational Medical Outcomes Partnership (OMOP) common data model promises to transform large-scale observational health research. However, coordinating centers throughout the US face diverse challenges in operationalizing OMOP in terms of interoperability and technical skills.
METHODS/STUDY POPULATION:
A team from the Critical Path Institute (C-Path) collaborated with informatics team members at Johns Hopkins to provide technical support to participating sites during the Extract, Transform, and Load (ETL) process linking existing concepts to OMOP concepts. Health systems met regularly via teleconference to review challenges and progress in the ETL process. Sites were responsible for performing the local ETL process with assistance and for securely provisioning de-identified data as part of the CURE ID program.
RESULTS/ANTICIPATED RESULTS:
More than twenty health systems participated in the CURE ID effort. Laboratory measures, basic demographics, disease diagnoses, and problem lists were more easily mapped to OMOP concepts by CURE ID partner institutions. Outcomes, social determinants of health, medical devices, and specific treatments were less easily characterized as part of the project. Concepts within the medical record presented very different technical challenges in terms of representation. There is a lack of standardization in OMOP implementation even among centers using the same electronic health record. Readiness to adopt OMOP varied across the participating institutions. Health systems achieved variable levels of coverage using OMOP medical concepts as part of the initiative.
DISCUSSION/SIGNIFICANCE:
Adoption of OMOP involves local stakeholder knowledge and implementation. Variable complexity of health concepts contributed to variable coverage. Documentation and support require extensive time and effort. Open-source software can be technically challenging. Interoperability of secure data systems presents unique problems.
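To make the ETL step concrete, here is a hypothetical sketch of mapping one local lab code into an OMOP MEASUREMENT row. The local code and concept_id are placeholders; a real pipeline would resolve mappings against the OMOP CONCEPT and CONCEPT_RELATIONSHIP vocabulary tables.

```python
# Placeholder source-to-standard mapping; the concept_id below is hypothetical
# and would normally come from an OMOP vocabulary lookup (e.g., via LOINC).
LOCAL_TO_OMOP = {
    "LAB:NA_SERUM": 3019550,   # hypothetical concept_id for serum sodium
}

def to_measurement_row(person_id: int, local_code: str, value: float, date: str) -> dict:
    """Transform one local lab result into an OMOP CDM MEASUREMENT record."""
    concept_id = LOCAL_TO_OMOP.get(local_code, 0)   # 0 = "No matching concept"
    return {
        "person_id": person_id,
        "measurement_concept_id": concept_id,
        "value_as_number": value,
        "measurement_date": date,
    }

print(to_measurement_row(42, "LAB:NA_SERUM", 139.0, "2023-01-15"))
```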
Patients with ventricular assist devices (VADs) represent a growing population presenting to Emergency Medical Services (EMS), but little is known about their prehospital care. This study aimed to characterize current EMS protocols in the United States for patients with VADs.
Methods:
States with state-wide EMS protocols were included. Protocols were obtained from each state’s EMS website; if not available, the office of the state medical director was contacted. For each state, protocols were analyzed for patient and VAD assessment and treatment variables.
Results:
Of 32 states with state-wide EMS protocols, 21 had VAD-specific protocols. Seventeen (81%) states noted that a pulse may not be palpable, and protocols recommended assessing alternate measures of perfusion and mean arterial pressure (MAP; 15 [71%]). Assessment of the VAD was advised through listening for pump hum (20 [95%]) and alarms (20 [95%]) and checking the power supply (15 [71%]). For treatment, EMS prehospital consultation was required to begin chest compressions in three (14%) states, and mechanical (device) chest compressions were not permitted in two (10%) states. Contact information for the VAD coordinator was listed in only five (24%) states. Transport of VAD equipment/backup bag was advised in 18 (86%) states.
Discussion:
This national analysis of EMS protocols found VAD-specific EMS protocols are not universally adopted in the United States and are variable when implemented, highlighting a need for VAD teams to partner with EMS agencies to inform standardized protocols that optimize these patients’ care.
Interventional clinical studies of convalescent plasma to treat COVID-19 were predominantly funded and led by public sector actors, including blood services operators. We aimed to analyze the processes of clinical studies of convalescent plasma to understand alternatives to pharmaceutical-industry-led biopharmaceutical research and development, particularly where public sector actors play a dominant role. We conducted a qualitative, critical case study of purposively sampled prominent and impactful clinical studies of convalescent plasma during 2020-2021.
Cognitive reserve and health-related fitness are associated with favorable cognitive aging, but Black/African American older adults are underrepresented in extant research. Our objective was to explore the relative contributions and predictive value of cognitive reserve and health-related fitness metrics on cognitive performance at baseline and cognitive status at a 4-year follow up in a large sample of Black/African American older adults.
Participants and Methods:
Participants aged 65 years and older from the Health and Retirement Study (HRS) who identified as Black/African American and completed baseline and follow-up interviews (including physical, health, and cognitive assessments) were included in the study. The final sample included 321 Black/African American older adults (mean age = 72.8; sd = 4.8; mean years of education = 12.3; sd = 2.9; mean body mass index (BMI) = 29.1; sd = 5.2; 60.4% identified as female). A cross-sectional analysis of relative importance – a measure of partitioned variance controlling for collinearity and model order – was first used to explore predictor variables and inform the hierarchical model order. Next, hierarchical multiple regression was used to examine cross-sectional relationships between cognitive reserve (years of education), health-related fitness variables (grip strength, lung capacity, gait speed, BMI), and global cognition. Multiple logistic regression was used to examine prospective relationships between predictors and longitudinal cognitive status (maintainers versus decliners). Control variables in all models included age, gender identity, and a chronic disease index score.
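A compact sketch of this blockwise strategy with statsmodels follows; the file and column names are assumptions standing in for the HRS variables, not the study’s code.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("hrs_subsample.csv")   # hypothetical extract with the columns below

# Hierarchical (blockwise) OLS: covariates, then education, then fitness metrics;
# delta R-squared between blocks is the added explained variance.
m1 = smf.ols("cognition ~ age + female + chronic_index", data=df).fit()
m2 = smf.ols("cognition ~ age + female + chronic_index + educ_years", data=df).fit()
m3 = smf.ols("cognition ~ age + female + chronic_index + educ_years"
             " + grip + lung_capacity + gait_speed + bmi", data=df).fit()
print(f"delta R2, education: {m2.rsquared - m1.rsquared:.3f}")
print(f"delta R2, fitness:   {m3.rsquared - m2.rsquared:.3f}")

# Prospective classification: maintainer (1) vs decliner (0) at follow-up.
logit = smf.logit("maintainer ~ age + female + chronic_index + educ_years"
                  " + gait_speed", data=df).fit()
print(np.exp(logit.params).round(2))    # odds ratios (e.g., OR per year of education)
```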
Results:
Cross-sectional relative importance analyses identified years of education and gait speed as important predictors of global cognition. The cross-sectional hierarchical regression model explained 33% of the variance in baseline global cognition. Education was the strongest predictor of cognitive performance (β = 0.48, p < 0.001). Holding all other variables constant, gait speed was significantly associated with baseline cognitive performance and accounted for a significant additional amount of explained variance (ΔR² = 0.01, p = 0.032). In a prospective analysis dividing the sample into cognitive maintainers and decliners, each additional year of formal education increased the odds of being classified as a cognitive maintainer (OR = 1.30, 95% CI = 1.17-1.45). There were no significant relationships between rate of change in health-related fitness and rate of change in cognition.
Conclusions:
Education, a proxy for cognitive reserve, was a robust predictor of global cognition at baseline and was associated with increased odds of maintaining cognitive ability at 4-year follow up in Black/African American older adults. Of the physical performance metrics, gait speed was associated with cognitive performance at baseline. The lack of observed association between other fitness variables and cognition may be attributable to the brief assessment procedures implemented in this large-scale study.
We tested 85 isolates of β-hemolytic Streptococcus spp. against trimethoprim/sulfamethoxazole (TMP/SMX), clindamycin, and doxycycline by broth microdilution (BMD) and BD Phoenix. Susceptibility rates via BMD for TMP/SMX, clindamycin, and doxycycline were 100%, 85.5%, and 56.6%, respectively. TMP/SMX is a potential monotherapy agent for β-hemolytic Streptococcus skin and soft tissue infections.
We recently reported on the radio-frequency attenuation length of cold polar ice at Summit Station, Greenland, based on bi-static radar measurements of radio-frequency bedrock echo strengths taken during the summer of 2021. Those data also allow studies of (a) the relative contributions of coherent (such as discrete internal conducting layers with sub-centimeter transverse scale) vs incoherent (e.g. bulk volumetric) scattering, (b) the magnitude of internal layer reflection coefficients, (c) limits on signal propagation velocity asymmetries (‘birefringence’) and (d) limits on signal dispersion in ice over a bandwidth of ~100 MHz. We find that (1) attenuation lengths approach 1 km in our band, (2) after averaging 10 000 echo triggers, reflected signals observable over the thermal floor (to depths of ~1500 m) are consistent with being entirely coherent, (3) internal layer reflectivities are ≈ −60 to −70 dB, (4) birefringent effects for vertically propagating signals are smaller by an order of magnitude relative to South Pole and (5) within our experimental limits, glacial ice is non-dispersive over the frequency band relevant for neutrino detection experiments.
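One quantitative step behind finding (2), offered here as our own gloss rather than a claim from the paper: coherent averaging of N waveform triggers grows a phase-stable signal amplitude linearly in N while incoherent noise amplitude grows only as the square root of N, so the voltage signal-to-noise ratio improves as

$$\mathrm{SNR}_{N} = \sqrt{N}\,\mathrm{SNR}_{1}, \qquad N = 10^{4} \;\Rightarrow\; 20\log_{10}\!\sqrt{N} = 40\ \mathrm{dB}.$$

That roughly 40 dB of averaging gain is what allows reflections well below the single-trigger thermal floor, here internal layers down to ~1500 m, to emerge as coherent signals.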