Identifying optimal methods for sampling surfaces in the healthcare environment is critical for future research requiring the identification of multidrug-resistant organisms (MDROs) on surfaces.
Methods:
We compared 2 swabbing methods, use of a flocked swab versus a sponge-stick, for recovery of MDROs by both culture and recovery of bacterial DNA via quantitative 16S polymerase chain reaction (PCR). This comparison was conducted by assessing swab performance in a longitudinal survey of MDRO contamination in hospital rooms. Additionally, a laboratory-prepared surface was used to compare the recovery of each swab type over a matched surface area.
Results:
Sponge-sticks were superior to flocked swabs for culture-based recovery of MDROs, with a sensitivity of 80% compared to 58%. Similarly, sponge-sticks demonstrated greater recovery of Staphylococcus aureus from laboratory-prepared surfaces, although the performance of flocked swabs improved when premoistened. In contrast, recovery of bacterial DNA via quantitative 16S PCR was greater with flocked swabs by an average of 3 log copies per specimen.
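For readers tallying how the sensitivities above are derived, the sketch below computes sensitivity against a composite reference (a surface positive by either swab); the counts are hypothetical placeholders chosen only to reproduce the reported percentages, not data from the study.

```python
# Hypothetical illustration of the sensitivity comparison described above.
# Counts are placeholders, not data from this study.

def sensitivity(true_positives: int, false_negatives: int) -> float:
    """Sensitivity = TP / (TP + FN)."""
    return true_positives / (true_positives + false_negatives)

# Composite reference: surfaces positive by either swab type.
reference_positive = 50      # hypothetical number of MDRO-positive surfaces
sponge_detected = 40         # hypothetical sponge-stick recoveries
flocked_detected = 29        # hypothetical flocked-swab recoveries

print(f"Sponge-stick sensitivity: {sensitivity(sponge_detected, reference_positive - sponge_detected):.0%}")
print(f"Flocked-swab sensitivity: {sensitivity(flocked_detected, reference_positive - flocked_detected):.0%}")
```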
Conclusions:
The optimal swabbing method of environmental surfaces differs by method of analysis. Sponge-sticks were superior to flocked swabs for culture-based detection of bacteria but inferior for recovery of bacterial DNA.
Objective:
To describe neutropenic fever management practices among healthcare institutions.
Design:
Survey.
Participants:
Members of the Society for Healthcare Epidemiology of America Research Network (SRN) representing healthcare institutions within the United States.
Methods:
An electronic survey was distributed to SRN representatives, with questions pertaining to demographics, antimicrobial prophylaxis, supportive care, and neutropenic fever management. The survey was distributed from fall 2022 through spring 2023.
Results:
In total, 40 complete responses were recorded (54.8% response rate), with respondent institutions accounting for approximately 15.7% of 2021 US hematologic malignancy hospitalizations and 14.9% of 2020 US bone marrow transplantations. Most entities had institutional guidelines for neutropenic fever management (35, 87.5%) and prophylaxis (31, 77.5%), and first-line treatment included IV antipseudomonal antibiotics (35, 87.5% cephalosporin; 5, 12.5% penicillin; 0, 0% carbapenem).
We observed significant heterogeneity in treatment course decisions: roughly half (18, 45.0%) of respondents continued antibiotics until neutrophil recovery, while the remainder had criteria for de-escalation prior to neutrophil recovery. Respondents were more willing to de-escalate prior to neutrophil recovery in patients with identified clinical (27, 67.5% with pneumonia) or microbiological (30, 75.0% with bacteremia) sources after dedicated treatment courses.
Conclusions:
We found substantial variation in the practice of de-escalation of empiric antibiotics relative to neutrophil recovery, highlighting a need for more robust evidence for and adoption of this practice. No respondents used carbapenems as first-line therapy, which compares favorably with prior survey studies conducted in other countries.
NASA’s all-sky survey mission, the Transiting Exoplanet Survey Satellite (TESS), is specifically engineered to detect exoplanets that transit bright stars. Thus far, TESS has successfully identified approximately 400 transiting exoplanets, in addition to roughly 6 000 candidate exoplanets pending confirmation. In this study, we present the results of our ongoing project, the Validation of Transiting Exoplanets using Statistical Tools (VaTEST). Our dedicated effort is focused on the confirmation and characterisation of new exoplanets through the application of statistical validation tools. Through a combination of ground-based telescope data, high-resolution imaging, and the statistical validation tool known as TRICERATOPS, we have discovered eight potential super-Earths. These planets bear the designations: TOI-238b (1.61$^{+0.09} _{-0.10}$ R$_\oplus$), TOI-771b (1.42$^{+0.11} _{-0.09}$ R$_\oplus$), TOI-871b (1.66$^{+0.11} _{-0.11}$ R$_\oplus$), TOI-1467b (1.83$^{+0.16} _{-0.15}$ R$_\oplus$), TOI-1739b (1.69$^{+0.10} _{-0.08}$ R$_\oplus$), TOI-2068b (1.82$^{+0.16} _{-0.15}$ R$_\oplus$), TOI-4559b (1.42$^{+0.13} _{-0.11}$ R$_\oplus$), and TOI-5799b (1.62$^{+0.19} _{-0.13}$ R$_\oplus$). Six of these planets fall within the region known as ‘keystone planets’, which makes them particularly interesting for study. Based on the location of TOI-771b and TOI-4559b below the radius valley, we characterised them as likely super-Earths, though radial velocity mass measurements for these planets will provide more detail about their characterisation. It is noteworthy that planets within the size range investigated herein are absent from our own solar system, making their study crucial for gaining insights into the evolutionary stages between Earth and Neptune.
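As a rough sense of scale for the transit signals involved for planets like these, the sketch below estimates the transit depth implied by the quoted radii; the assumption of a Sun-like host star is illustrative only, since the actual host radii differ.

```python
# Back-of-the-envelope transit depth for super-Earths like those listed above.
# Assumes a Sun-like host star; the actual hosts differ, so this is illustrative only.

R_EARTH_KM = 6371.0
R_SUN_KM = 695_700.0

def transit_depth_ppm(planet_radius_earth: float, star_radius_sun: float = 1.0) -> float:
    """Transit depth ~ (Rp / Rstar)^2, expressed in parts per million."""
    ratio = (planet_radius_earth * R_EARTH_KM) / (star_radius_sun * R_SUN_KM)
    return ratio ** 2 * 1e6

for name, radius in [("TOI-238b", 1.61), ("TOI-771b", 1.42), ("TOI-4559b", 1.42)]:
    print(f"{name}: ~{transit_depth_ppm(radius):.0f} ppm for a Sun-like host")
```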
Hard-to-treat childhood cancers are those where standard treatment options do not exist and the prognosis is poor. Healthcare professionals (HCPs) are responsible for communicating with families about prognosis and complex experimental treatments. We aimed to identify HCPs’ key challenges and skills required when communicating with families about hard-to-treat cancers and their perceptions of communication-related training.
Methods
We interviewed Australian HCPs who had direct responsibilities in managing children/adolescents with hard-to-treat cancer within the past 24 months. Interviews were analyzed using qualitative content analysis.
Results
We interviewed 10 oncologists, 7 nurses, and 3 social workers. HCPs identified several challenges for communication with families, including balancing information provision with maintaining realistic hope; managing their own uncertainty; and the underutilization of nurses and social workers during conversations with families, despite widespread preferences for multidisciplinary teamwork. HCPs perceived that making themselves available to families, empowering them to ask questions, and repeating information helped to establish and maintain trusting relationships with families. Half the HCPs reported receiving no formal training in communicating prognosis and treatment options to families of children with hard-to-treat cancers. Nurses, social workers, and less experienced oncologists supported the development of communication training resources more strongly than did more experienced oncologists.
Significance of results
Resources are needed that support HCPs in communicating with families of children with hard-to-treat cancers. Such resources may be particularly beneficial for junior oncologists and other HCPs during their training, and should aim to prepare them for common challenges and foster greater multidisciplinary collaboration.
Background:
Neutropenic fever management decisions are complex and result in prolonged duration of broad-spectrum antibiotics. Strategies for antibiotic stewardship in this context have been studied, including de-escalation of antibiotics prior to resolution of neutropenia, with unclear implementation. Here, we present the first survey study to describe real-world neutropenic fever management practices in US healthcare institutions, with particular emphasis on de-escalation strategies after initiation of broad-spectrum antibiotics.
Methods:
Using REDCap, we conducted a survey of US healthcare institutions through the SHEA Research Network (SRN). Questions pertained to antimicrobial prophylaxis and supportive care in the management of oncology patients and to neutropenic fever management (including specific antimicrobial choices and clinical scenarios). Hematologic malignancy hospitalization (2020) and bone-marrow transplantation (2016–2020) volumes were obtained from CMS and Health Resources & Services Administration databases, respectively.
Results:
Overall, 23 complete responses were recorded (response rate, 35.4%). Collectively, these entities account for ~11.0% of hematologic malignancy hospitalizations and 13.3% of bone-marrow transplantations nationwide. Of 23 facilities, 19 had institutional guidelines for neutropenic fever management and 18 had institutional guidelines for prophylaxis, with similar definitions for neutropenic fever. First-line treatment universally utilized antipseudomonal broad-spectrum IV antibiotics (20 of 23 use a cephalosporin, 3 of 23 use a penicillin agent, and no respondents use a carbapenem). Fluoroquinolone prophylaxis was common for leukemia induction patients (18 of 23) but was mixed for bone-marrow transplantation (10 of 23). We observed significant heterogeneity in treatment decisions. For stable neutropenic fever patients with no clinical source of infection identified, 13 of 23 respondents continued IV antibiotics until ANC (absolute neutrophil count) recovery. The remainder had criteria for de-escalation back to prophylaxis prior to this (eg, a fever-free period). Respondents were more willing to de-escalate prior to ANC recovery in patients with identified clinical sources (14 of 23 de-escalations in patients with pneumonia) or microbiological sources (15 of 23 de-escalations in patients with bacteremia) after dedicated treatment courses. In free-text responses, several respondents described opportunities for more systematic de-escalation for antimicrobial stewardship in these scenarios.
Conclusions:
Our results illustrate the real-world management of neutropenic fever in US hospitals, including initiation of therapy, prophylaxis, and treatment duration. We found significant heterogeneity in de-escalation of empiric antibiotics relative to ANC recovery, highlighting a need for more robust evidence for and adoption of this practice.
Ordering Clostridioides difficile diagnostics without appropriate clinical indications can result in inappropriate antibiotic prescribing and misdiagnosis of hospital-onset C. difficile infection. Manual processes such as provider review of order appropriateness may detract from other infection control or antibiotic stewardship activities.
Methods:
We developed an evidence-based clinical algorithm that defined appropriateness criteria for testing for C. difficile infection. We then implemented an electronic medical record–based order-entry tool that utilized discrete branches within the clinical algorithm, including history of prior C. difficile test results, laxative or stool-softener administration, and documentation of unformed bowel movements. Testing guidance was then dynamically displayed with supporting patient data. We compared the rate of completed C. difficile tests after implementation of this intervention at 5 hospitals to a historic baseline in which a best-practice advisory was used.
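The branching described above can be illustrated with a simple rule-based sketch; the field names, look-back windows, and guidance messages are assumptions for illustration, not the actual EMR build.

```python
# Illustrative sketch of the kind of branching used by the order-entry tool
# described above. Field names, look-back windows, and messages are assumptions.
from dataclasses import dataclass
from typing import Optional

@dataclass
class PatientContext:
    days_since_last_negative_test: Optional[int]   # None if never tested
    positive_test_within_14_days: bool
    laxative_or_stool_softener_within_48h: bool
    unformed_stools_documented_24h: int

def cdiff_testing_guidance(ctx: PatientContext) -> str:
    if ctx.positive_test_within_14_days:
        return "Repeat testing discouraged: recent positive result; treat clinically."
    if ctx.days_since_last_negative_test is not None and ctx.days_since_last_negative_test < 7:
        return "Repeat testing discouraged: negative result within the past 7 days."
    if ctx.laxative_or_stool_softener_within_48h:
        return "Testing discouraged: recent laxative/stool softener may explain symptoms."
    if ctx.unformed_stools_documented_24h < 3:
        return "Testing discouraged: fewer than 3 documented unformed stools in 24 h."
    return "Testing appears appropriate per algorithm; order may proceed."

print(cdiff_testing_guidance(PatientContext(None, False, False, 4)))
```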
Results:
Using mixed-effects Poisson regression, we found that the intervention was associated with a reduction in the incidence rate of both C. difficile ordering (incidence rate ratio [IRR], 0.74; 95% confidence interval [CI], 0.63–0.88; P = .001) and C. difficile–positive tests (IRR, 0.83; 95% CI, 0.76–0.91; P < .001). On segmented regression analysis, we identified a sustained reduction in orders over time among academic hospitals and a new reduction in orders over time among community hospitals.
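As a simplified illustration of how such incidence rate ratios are estimated, the sketch below fits a plain Poisson GLM with a patient-days offset and exponentiates the intervention coefficient; the study itself used mixed-effects and segmented regression, and the data frame here is hypothetical.

```python
# Simplified sketch of an incidence-rate-ratio estimate for test orders.
# The actual analysis used mixed-effects Poisson and segmented regression;
# this plain Poisson GLM with an exposure offset is illustrative only,
# and the data frame below is hypothetical.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "orders":       [120, 110, 130, 95, 90, 88],       # hypothetical monthly C. difficile orders
    "patient_days": [9000, 8800, 9100, 9050, 8900, 9000],
    "intervention": [0, 0, 0, 1, 1, 1],                 # 0 = best-practice advisory, 1 = order panel
})

model = smf.glm(
    "orders ~ intervention",
    data=df,
    family=sm.families.Poisson(),
    offset=np.log(df["patient_days"]),
).fit()

irr = np.exp(model.params["intervention"])
ci_low, ci_high = np.exp(model.conf_int().loc["intervention"])
print(f"IRR {irr:.2f} (95% CI {ci_low:.2f}-{ci_high:.2f})")
```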
Conclusions:
An evidence-based dynamic order panel, integrated within the electronic medical record, was associated with a reduction in both C. difficile ordering and positive tests in comparison to a best practice advisory, although the impact varied between academic and community facilities.
The spatial and temporal extent of severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) environmental contamination has not been precisely defined. We sought to elucidate contamination of different surface types and how contamination changes over time.
Methods:
We sampled surfaces longitudinally within COVID-19 patient rooms, performed quantitative RT-PCR for the detection of SARS-CoV-2 RNA, and modeled the effects of distance, time, and severity of illness on the probability of detecting SARS-CoV-2 using a mixed-effects binomial model.
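A minimal sketch of how surface-level predictors can be related to detection probability, assuming an ordinary logistic regression on simulated data; the study itself used a Bayesian mixed-effects binomial model with room-level effects, so this is only an illustration of the modeling idea.

```python
# Minimal sketch relating surface-level predictors to the probability of
# detecting SARS-CoV-2 RNA. The study used a Bayesian mixed-effects binomial
# model; this ordinary logistic regression on simulated data is illustrative.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 400
df = pd.DataFrame({
    "floor":      rng.integers(0, 2, n),   # 1 = floor sample
    "high_touch": rng.integers(0, 2, n),   # 1 = elevated high-touch surface
    "severe":     rng.integers(0, 2, n),   # 1 = high-flow O2 / PAP / ventilation
    "hosp_day":   rng.integers(1, 21, n),  # days since admission
})
logit = (-2.0 + 2.5 * df["floor"] + 1.2 * df["high_touch"]
         + 0.5 * df["severe"] - 0.05 * df["hosp_day"])
df["detected"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

fit = smf.logit("detected ~ floor + high_touch + severe + hosp_day", data=df).fit(disp=0)
print(np.exp(fit.params))        # odds ratios for each predictor
print(np.exp(fit.conf_int()))    # 95% confidence intervals on the OR scale
```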
Results:
The probability of detecting SARS-CoV-2 RNA in a patient room did not vary with distance. However, we found that surface type predicted probability of detection, with floors and high-touch surfaces having the highest probability of detection: floors (odds ratio [OR], 67.8; 95% credible interval [CrI], 36.3–131) and high-touch elevated surfaces (OR, 7.39; 95% CrI, 4.31–13.1). Increased surface contamination was observed in rooms where patients required high-flow oxygen, positive airway pressure, or mechanical ventilation (OR, 1.6; 95% CrI, 1.03–2.53). The probability of elevated surface contamination decayed with prolonged hospitalization, but the probability of floor detection increased with the duration of the local pandemic wave.
Conclusions:
Distance from a patient’s bed did not predict SARS-CoV-2 RNA deposition in patient rooms, but surface type, severity of illness, and time from local pandemic wave predicted surface deposition.
We prospectively surveyed SARS-CoV-2 RNA contamination in staff common areas within an acute-care hospital. An increasing prevalence of surface contamination was detected over time. After adjustment for patient census or community incidence of coronavirus disease 2019 (COVID-19), the proportion of contaminated surfaces did not predict healthcare worker COVID-19 infection on study units.
Multidrug-resistant organisms (MDROs) colonizing the healthcare environment have been shown to contribute to risk for healthcare-associated infections (HAIs), with adverse effects on patient morbidity and mortality. We sought to determine how bacterial contamination and persistent MDRO colonization of the healthcare environment are related to the position of patients and wastewater sites.
Methods:
We performed a prospective cohort study, enrolling 51 hospital rooms at the time of admission of a patient with an eligible MDRO identified within the prior 30 days. We performed systematic sampling and MDRO culture of rooms, as well as 16S rRNA sequencing to define the environmental microbiome in a subset of samples.
Results:
The probability of detecting resistant gram-negative organisms, including Enterobacterales, Acinetobacter spp, and Pseudomonas spp, increased with distance from the patient. In contrast, Clostridioides difficile and methicillin-resistant Staphylococcus aureus were more likely to be detected close to the patient. Resistant Pseudomonas spp and S. aureus were enriched in these hot spots despite broad deposition of 16S rRNA gene sequences assigned to the same genera, suggesting modifiable factors that permit the persistence of these MDROs.
Conclusions:
MDRO hot spots can be defined by distance from the patient and from wastewater reservoirs. Evaluating how MDROs are enriched relative to bacterial DNA deposition helps to identify healthcare micro-environments and suggests how targeted environmental cleaning or design approaches could prevent MDRO persistence and reduce infection risk.
This SHEA white paper identifies knowledge gaps and challenges in healthcare epidemiology research related to coronavirus disease 2019 (COVID-19), with a focus on core principles of healthcare epidemiology. These gaps, revealed during the worst phases of the COVID-19 pandemic, are described in 10 sections: epidemiology, outbreak investigation, surveillance, isolation precaution practices, personal protective equipment (PPE), environmental contamination and disinfection, drug and supply shortages, antimicrobial stewardship, healthcare personnel (HCP) occupational safety, and return-to-work policies. Each section highlights three critical healthcare epidemiology research questions, with detailed descriptions provided in the supplementary materials. This research agenda calls for translational studies, from laboratory-based basic science research to well-designed, large-scale studies and health outcomes research. Research gaps and challenges related to nursing homes and social disparities are included. Collaboration across disciplines, areas of expertise, and geographic locations will be critical.
Background:
Clostridioides difficile infection (CDI) is a major contributor to morbidity and mortality in patients with hematologic malignancy. Due to both immunosuppression and frequent antibiotic exposures, up to one-third of inpatients receiving chemotherapy or stem-cell transplant develop CDI. Transmission of C. difficile in healthcare facilities occurs due to environmental surface contamination and hand carriage by healthcare workers from colonized and infected patients. We investigated the effectiveness of enhanced room cleaning in collaboration with environmental services (EVS) staff to prevent CDI transmission and infection.
Methods:
From April 1, 2018, to September 30, 2018, a multimodal enhanced cleaning intervention was implemented on 2 oncology units at the Hospital of the University of Pennsylvania. This intervention included real-time feedback to EVS staff following ATP bioluminescence monitoring. Additionally, all rooms on the intervention units underwent UV disinfection after terminal cleaning. We performed a system-level cohort study, comparing rates of CDI on the 2 study units to historic and 2 concurrent control units. Historic and concurrent control units received UV disinfection only for rooms with prior occupants with MRSA or CDI. All units during the intervention period received education on the importance of environmental cleaning for infection prevention. Mixed-effects Poisson regression was used to adjust for system-level confounders.
Results:
A median of 1.34 CDI cases per 1,000 patient days (IQR, 1.20–3.62) occurred during the 12-month baseline period. There was a trend toward a reduced rate of CDI across all units during the intervention period (median, 1.19; IQR, 0.00–2.47; P = .13) compared with all units during the historical period. Using mixed-effects Poisson regression, accounting for the random effects of study units, the intervention was associated with an incidence rate ratio for C. difficile of 0.72 compared to control units (95% CI, 0.53–0.97; P = .03). Average room turnaround time (TAT) increased across all units during the study period, from 78 minutes (IQR, 74–81) to 92 minutes (IQR, 85–96; P < .001). Within the intervention period, TAT was higher on intervention units (median, 94 minutes; IQR, 92–98) compared to concurrent control units (median, 85; IQR, 80–92; P = .005).
Conclusions:
Enhanced environmental cleaning, including UV disinfection of all patient rooms and ATP bioluminescent monitoring with real-time feedback, was associated with a reduction in the incidence of CDI.
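As a small aside on units, the "CDI cases per 1,000 patient days" figures reported in the results above are computed as shown below; the counts are hypothetical, not unit-level data from the study.

```python
# How the "cases per 1,000 patient days" rates quoted above are computed.
# The counts here are hypothetical, not unit-level data from the study.
def cdi_rate_per_1000_patient_days(cases: int, patient_days: int) -> float:
    return 1000 * cases / patient_days

print(cdi_rate_per_1000_patient_days(4, 2990))   # ~1.34 per 1,000 patient days
```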
We implemented a guideline for appropriate acid suppressant use in hematology-oncology patients. This intervention resulted in a sustained reduction in proton pump inhibitor (PPI) use without an increase in rates of gastrointestinal bleeding. Practice guidelines are effective in reducing PPI use, which is associated with risk of Clostridioides difficile infection.
BACKGROUND: IGTS is a rare phenomenon of paradoxical germ cell tumor (GCT) growth during or following treatment despite normalization of tumor markers. We sought to evaluate the frequency, clinical characteristics and outcome of IGTS in patients in 21 North-American and Australian institutions. METHODS: Patients with IGTS diagnosed from 2000-2017 were retrospectively evaluated. RESULTS: Out of 739 GCT diagnoses, IGTS was identified in 33 patients (4.5%). IGTS occurred in 9/191 (4.7%) mixed-malignant GCTs, 4/22 (18.2%) immature teratomas (ITs), 3/472 (0.6%) germinomas/germinomas with mature teratoma, and in 17 secreting non-biopsied tumours. Median age at GCT diagnosis was 10.9 years (range 1.8-19.4). Male gender (84%) and pineal location (88%) predominated. Of 27 patients with elevated markers, median serum AFP and Beta-HCG were 70 ng/mL (range 9.2-932) and 44 IU/L (range 4.2-493), respectively. IGTS occurred at a median time of 2 months (range 0.5-32) from diagnosis, during chemotherapy in 85%, radiation in 3%, and after treatment completion in 12%. Surgical resection was attempted in all, leading to gross total resection in 76%. Most patients (79%) resumed GCT chemotherapy/radiation after surgery. At a median follow-up of 5.3 years (range 0.3-12), all but 2 patients are alive (1 succumbed to progressive disease, 1 to malignant transformation of GCT). CONCLUSION: IGTS occurred in less than 5% of patients with GCT and most commonly after initiation of chemotherapy. IGTS was more common in patients with IT-only on biopsy than with mixed-malignant GCT. Surgical resection is a principal treatment modality. Survival outcomes for patients who developed IGTS are favourable.
In a cohort of inpatients with hematologic malignancy and positive enzyme immunoassay (EIA) or polymerase chain reaction (PCR) Clostridium difficile tests, we found that clinical characteristics and outcomes were similar between these groups. The method of testing is unlikely to predict infection in this population, and PCR-positive results should be treated with concern.
Hypovitaminosis D may be associated with diabetes, hypertension and CHD. However, because studies examining the associations of all three chronic conditions with circulating 25-hydroxyvitamin D (25(OH)D) and 1,25-dihydroxyvitamin D (1,25(OH)2D) are limited, we examined these associations in the US Prostate, Lung, Colorectal and Ovarian (PLCO) Cancer Screening Trial (n 2465). Caucasian PLCO participants selected as controls in previous nested case–control studies of 25(OH)D and 1,25(OH)2D were included in this analysis. Diabetes, CHD and hypertension prevalence, risk factors for these conditions and intake of vitamin D and Ca were collected from a baseline questionnaire. Results indicated that serum levels of 25(OH)D were low ( < 50 nmol/l) in 29 % and very low ( < 37 nmol/l) in 11 % of subjects. The prevalence of diabetes, hypertension and CHD was 7, 30 and 10 %, respectively. After adjustment for confounding by sex, geographical location, educational level, smoking history, BMI, physical activity, total dietary energy and vitamin D and Ca intake, only diabetes was significantly associated with lower 25(OH)D and 1,25(OH)2D levels. Caucasians who had 25(OH)D ≥ 80 nmol/l were half as likely to have diabetes (OR 0·5 (95 % CI 0·3, 0·9)) compared with those who had 25(OH)D < 37 nmol/l. Those in the highest quartile of 1,25(OH)2D ( ≥ 103 pmol/l) were less than half as likely to have diabetes (OR 0·3 (95 % CI 0·1, 0·7)) than those in the lowest quartile ( < 72 pmol/l). In conclusion, the independent associations of 25(OH)D and 1,25(OH)2D with diabetes prevalence in a large population are new findings, and thus warrant confirmation in larger, prospective studies.
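For readers less familiar with the odds-ratio scale used above, the hypothetical 2×2 calculation below shows how an OR near 0.5 arises; the counts are illustrative only, and the published estimates are additionally adjusted for the listed confounders.

```python
# Illustration of the odds-ratio scale used above, from a hypothetical 2x2 table.
# a/b: diabetes yes/no among participants with 25(OH)D >= 80 nmol/l
# c/d: diabetes yes/no among participants with 25(OH)D < 37 nmol/l
a, b = 20, 480     # hypothetical counts, higher-vitamin-D group
c, d = 38, 462     # hypothetical counts, lower-vitamin-D group

crude_or = (a / b) / (c / d)
print(f"Crude OR ~ {crude_or:.2f}")   # ~0.5; the published estimate is confounder-adjusted
```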
Silicon oxide has been widely used to encapsulate biomolecules to preserve their activity in less than ideal environments. However, there are other inorganic oxides with inherent properties that would be advantageous in creating a multifunctional material. Titanium oxide exhibits properties that have applications in areas such as electronics, energy conversion, and decontamination. Herein is reported the formation of titania coatings fabricated on polymer beads using a biomimetic approach and characterized with scanning electron microscopy and energy dispersive x-ray spectroscopy. The approach involves the use of functionalized polymer beads, which initiate oxide formation from a water-soluble titanium complex. The method was used to encapsulate the enzyme diisopropylfluorophosphatase, in situ, within the oxide matrix under buffered aqueous conditions while retaining its enzymatic activity against diisopropylfluorophosphate. In addition, the biomimetically produced titania was shown to exhibit UV-assisted degradation activity against an ethidium bromide dye, upon liberation from the coating template.
Lenses and other transparent optical materials suffer rapid damage when
subjected to blowing abrasive particulates. The time-scale of these impact
events falls between typical scratch tests (less than 1 m/s) and ballistic
tests (100s of m/s) and has not been studied in depth to date. Polymeric
lens materials like polycarbonate are usually treated with a
scratch-resistant coating, which is commonly silica-based. The coating
provides some protection, yet is not sufficient at resisting abrasion from
blown sand in most commercial products. We demonstrate that silicone
elastomeric coatings are superior to polycarbonate and silica glass at
resisting damage by blown sand particles. Sand abrasion tests were conducted
using a custom-built test apparatus that exposes the sample to 400 micron
diameter quartz silica moving at 16.5 m/s (approx. 38 mph). Scanning
electron microscopy revealed the presence of small cracks and pits in
polycarbonate, coated polycarbonate, and silica glass after sand exposure.
No such damage was observed in the silicone-coated samples after an
identical exposure.
We speculate that the elastic tensile strain at the surface is an important
predictor of the material response at the time-scale of the impact. A simple
mathematical model was developed using a momentum balance pre- and
post-impact, and was used to approximate the maximum deformation and impact
time-scale. A semispherical interaction volume was used in the model with a
radius of 1.5x the particle diameter, determined through profilometry
experiments. The material’s resistance to deformation was measured
experimentally through a static mechanical test using a spherical indenter
to represent the particle. Tensile tests were performed on both materials to
identify the maximum elastic strain. Additionally, dynamic mechanical tests
were performed to confirm that the mechanical behavior at long time-scales
was valid at shorter time-scales of the impacts. DMA curves were shifted
using the WLF equation. Profilometry and scanning electron microscopy (SEM)
imaging were used to confirm the presence or absence of blown-sand induced
damage.
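A rough numerical sketch of the momentum/energy-balance estimate and the WLF shift described above, using the particle size and speed quoted in the text; the quartz density, the average resistance force, and the WLF constants are assumed placeholder values, not measurements from the study.

```python
# Rough sketch of the pre-/post-impact balance described above.
# Particle size and speed are taken from the text; the quartz density,
# the average resistance force, and the WLF constants are assumptions.
import math

d_particle = 400e-6     # particle diameter, m (from text)
v_impact = 16.5         # impact speed, m/s (from text)
rho_quartz = 2650.0     # kg/m^3, typical quartz density (assumed)
F_avg = 0.5             # N, average resistance force from static indentation (placeholder)

m = rho_quartz * (math.pi / 6) * d_particle ** 3          # particle mass, kg

# Energy balance: (1/2) m v^2 ~ F_avg * delta_max  ->  maximum deformation
delta_max = 0.5 * m * v_impact ** 2 / F_avg

# Momentum balance: m v ~ F_avg * dt  ->  impact time-scale
dt_impact = m * v_impact / F_avg

print(f"particle mass ~ {m * 1e9:.1f} micrograms")
print(f"max deformation ~ {delta_max * 1e6:.1f} micrometres")
print(f"impact time-scale ~ {dt_impact * 1e6:.2f} microseconds")

# WLF time-temperature shift used to extrapolate DMA data toward impact rates:
# log10(a_T) = -C1 (T - Tref) / (C2 + T - Tref). C1 and C2 below are the common
# "universal" values, used here only as placeholders.
def wlf_shift_factor(T, T_ref, C1=17.44, C2=51.6):
    return 10 ** (-C1 * (T - T_ref) / (C2 + (T - T_ref)))
```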
Cheetahs (Acinonyx jubatus) held ex situ can provide an important resource for obtaining new biological information that usually cannot be gleaned from free-living individuals. However, consistent captive propagation of the cheetah, a prerequisite for establishing a self-sustaining population, has not been accomplished so far. This study examined the effect of a husbandry regimen commonly used in ex situ facilities on female cheetahs. Although generally solitary in the wild, zoos frequently house cheetahs in pairs or groups. Using non-invasive hormone monitoring and quantitative behavioural observations, we studied the impact of such enforced social conditions on behaviour and ovarian/adrenal activity. Eight female cheetahs were evaluated for two consecutive 6-month periods, first while maintained in pairs and then as individuals. Subsequently four females were regrouped into two new pairs and monitored for another 6 months. Females in five of six pairings demonstrated prolonged anoestrus and displayed agonistic behaviours. After pair separation all females rapidly resumed oestrous cyclicity. Females in the sixth pair continued cycling throughout the year while consistently displaying affiliative grooming and no agonistic behaviours. Faecal corticoid patterns varied significantly among individuals, but appeared unrelated to behavioural or ovarian hormone patterns. Thus, data appear to indicate that same-sex pair-maintenance of behaviourally incompatible female cheetahs may lead to suppressed ovarian cyclicity. This suppression appears linked to agonistic behaviours but not to any particular adrenal hormone excretion pattern. Results clearly demonstrate the value of applying knowledge about in situ social behaviour to ex situ management practices. Conversely, however, non-invasive hormone monitoring conducted ex situ may help us to identify physiological phenomena of potential relevance for future in situ studies.