To validate a public health model identifying patients at high risk for carbapenem-resistant Enterobacterales (CRE) on admission and to evaluate its performance across a healthcare network.
Design:
Retrospective case-control studies
Participants:
Adults hospitalized with a clinical CRE culture within 3 days of admission (cases) and those hospitalized without a CRE culture (controls).
Methods:
Using public health data from Atlanta, GA (1/1/2016–9/1/2019), we validated a CRE prediction model created in Chicago. We then closely replicated this model using clinical data from a healthcare network in Atlanta (1/1/2015–12/31/2021) (“Public Health Model”) and optimized performance by adding variables from the healthcare system (“Healthcare System Model”). We frequency-matched cases and controls based on year and facility. We evaluated model performance in validation datasets using area under the curve (AUC).
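To illustrate the validation step described above, the following minimal sketch (Python; hypothetical data frame and column names, not the study's analysis code) fits a logistic regression on the four Public Health Model variables and computes the AUC.

```python
# Minimal illustrative sketch, assuming a hypothetical admissions table with
# one row per matched case/control and a binary outcome `cre_case`
# (clinical CRE culture within 3 days of admission).
import pandas as pd
import statsmodels.formula.api as smf
from sklearn.metrics import roc_auc_score

df = pd.read_csv("matched_admissions.csv")  # hypothetical extract

# Public Health Model: age, prior infection diagnosis, number of acute-care
# hospitalizations (ACH) in the prior year, and mean length of those stays.
fit = smf.logit(
    "cre_case ~ age + prior_infection_dx + n_prior_ach + mean_prior_los",
    data=df,
).fit()

# Discrimination assessed as area under the ROC curve (AUC).
auc = roc_auc_score(df["cre_case"], fit.predict(df))
print(f"AUC = {auc:.2f}")
```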
Results:
Using public health data, we matched 181 cases to 764,408 controls, and the Chicago model performed well (AUC 0.85). Using clinical data, we matched 91 cases to 384,013 controls. The Public Health Model included age, prior infection diagnosis, and the number and mean length of stay of acute care hospitalizations (ACH) in the prior year. The final Healthcare System Model added Elixhauser score, antibiotic days of therapy in the prior year, diabetes, and admission to the intensive care unit in the prior year, and removed the prior number of ACH. The AUC increased from 0.68 to 0.73.
Conclusions:
A CRE risk prediction model using prior healthcare exposures performed well in a geographically distinct area and in an academic healthcare network. Adding variables from healthcare networks improved model performance.
Background: Prompt identification of patients colonized or infected with carbapenem-resistant Enterobacterales (CRE) upon admission can help ensure rapid initiation of infection prevention measures and may reduce intrafacility transmission of CRE. The Chicago CDC Prevention Epicenters Program previously created a CRE prediction model using state-wide public health data (doi: 10.1093/ofid/ofz483). We evaluated how well a similar model performed using data from a single academic healthcare system in Atlanta, Georgia, and we sought to determine whether including additional variables improved performance. Methods: We performed a case–control study using electronic medical record data. We defined cases as adult encounters to acute-care hospitals in a 4-hospital academic healthcare system from January 1, 2014, to December 31, 2021, with CRE identified from a clinical culture within the first 3 hospital days. Only the first qualifying encounter per patient was included. We frequency matched cases to control admissions (no CRE identified) from the same hospital and year. Using multivariable logistic regression, we compared 2 models. The “public health model” included 4 variables from the Chicago Epicenters model (age, number of hospitalizations in the prior 365 days, mean length of stay in hospitalizations in the prior 365 days, and hospital admission with an infection diagnosis in the prior 365 days). The “healthcare system model” added 4 additional variables (admission to the ICU in the prior 365 days, malignancy diagnosis, Elixhauser score and inpatient antibiotic days of therapy in the prior 365 days) to the public health model. We used billing codes to determine Elixhauser score, malignancy status, and recent infection diagnoses. We compared model performance using the area under the receiver operating curve (AUC). Results: We identified 105 cases and 441,460 controls (Table 1). CRE was most frequently identified in urine cultures (46%). All 4 variables included in the public health model and the 4 additional variables in the healthcare system model were all significantly associated with being a case in unadjusted analyses (Table 1). The AUC for the public health model was 0.76, and the AUC for the healthcare system model was 0.79 (Table 2; Fig. 1). In both models, a prior admission with an infection diagnosis was the most significant risk factor. Conclusions: A modified CRE prediction model developed using public health data and focused on prior healthcare exposures performed reasonably well when applied to a different academic healthcare system. The addition of variables accessible in large healthcare networks did not meaningfully improve model discrimination.
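The frequency-matching step described in the Methods above can be illustrated with a short sketch (hypothetical table and column names, not the authors' code); one common implementation simply restricts controls to the hospital–year strata in which cases occurred:

```python
# Illustrative frequency matching of controls to cases by hospital and year.
import pandas as pd

encounters = pd.read_csv("encounters.csv")  # hypothetical: one row per adult encounter
cases = encounters[encounters["is_case"] == 1]
controls = encounters[encounters["is_case"] == 0]

# Keep only controls admitted to the same hospital in the same year as a case.
case_strata = set(zip(cases["hospital"], cases["admit_year"]))
stratum_mask = [
    (h, y) in case_strata for h, y in zip(controls["hospital"], controls["admit_year"])
]
matched = pd.concat([cases, controls[stratum_mask]], ignore_index=True)
```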
Smutgrass is a non-native perennial weed that is problematic because of its poor palatability to cattle and the difficulty of controlling it once established. Limited literature exists on the effectiveness of herbicides other than hexazinone for smutgrass control and on the associated forage injury. This study aimed to evaluate seasonal applications of herbicides labeled for use on forage for maximum smutgrass control. The second objective was to evaluate preemergence herbicides and hexazinone for their ability to control smutgrass germinating from seed. Hexazinone, nicosulfuron + metsulfuron-methyl, glyphosate, and imazapic were applied postemergence to smutgrass in spring, summer, and fall. Hexazinone, nicosulfuron + metsulfuron-methyl, and glyphosate + imazapic were the most effective postemergence treatments, while quinclorac exhibited little activity on smutgrass. Common bermudagrass forage fully recovered from all treatments by 3 mo after treatment. Summer applications of hexazinone resulted in the greatest level of control, while spring treatments provided the least control; applications of hexazinone or glyphosate resulted in the most effective smutgrass control, whereas fall applications resulted in the least forage injury. Results of the preemergence study indicate that indaziflam and hexazinone provide adequate control of germinating smutgrass seedlings in the greenhouse at 0.25×, 0.5×, and 0.75× of the lowest labeled rate recommended for seedling grass control. Indaziflam treatments prevented the emergence of any visible smutgrass seedling tissue, hexazinone fully controlled the germinating seedlings by 21 d after treatment, and pendimethalin significantly reduced seedling numbers at the 0.5× and 0.75× rates.
To determine the incidence of severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) infection among healthcare personnel (HCP) and to assess occupational risks for SARS-CoV-2 infection.
Design:
Prospective cohort of healthcare personnel (HCP) followed for 6 months from May through December 2020.
Setting:
Large academic healthcare system including 4 hospitals and affiliated clinics in Atlanta, Georgia.
Participants:
HCP, including those with and without direct patient-care activities, working during the coronavirus disease 2019 (COVID-19) pandemic.
Methods:
Incident SARS-CoV-2 infections were determined through serologic testing for SARS-CoV-2 IgG at enrollment, at 3 months, and at 6 months. HCP completed monthly surveys regarding occupational activities. Multivariable logistic regression was used to identify occupational factors that increased the risk of SARS-CoV-2 infection.
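A minimal sketch of the multivariable analysis described above (hypothetical column names; not the study code) regresses seroconversion on occupational and demographic factors and reports odds ratios with 95% confidence intervals:

```python
# Illustrative multivariable logistic regression for SARS-CoV-2 seroconversion.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

hcp = pd.read_csv("hcp_cohort.csv")  # hypothetical: one row per evaluable participant
fit = smf.logit(
    "seroconverted ~ bedside_over_50pct + covid_unit + agp_exposure + C(race)",
    data=hcp,
).fit()

# Exponentiate coefficients and confidence limits to obtain odds ratios.
or_table = np.exp(pd.concat([fit.params, fit.conf_int()], axis=1))
or_table.columns = ["OR", "95% CI lower", "95% CI upper"]
print(or_table)
```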
Results:
Of the 304 evaluable HCP who were seronegative at enrollment, 26 (9%) seroconverted for SARS-CoV-2 IgG by 6 months. Overall, 219 participants (73%) self-identified as White race, 119 (40%) were nurses, and 121 (40%) worked on inpatient medical-surgical floors. In a multivariable analysis, HCP who identified as Black race were more likely to seroconvert than HCP who identified as White (odds ratio, 4.5; 95% confidence interval, 1.3–14.2). Increased risk for SARS-CoV-2 infection was not identified for any occupational activity, including spending >50% of a typical shift at a patient’s bedside, working in a COVID-19 unit, or performing or being present for aerosol-generating procedures (AGPs).
Conclusions:
In our study cohort of HCP working in an academic healthcare system, <10% had evidence of SARS-CoV-2 infection over 6 months. No specific occupational activities were identified as increasing risk for SARS-CoV-2 infection.
To determine the impact of an inpatient stewardship intervention targeting fluoroquinolone use on inpatient and postdischarge Clostridioides difficile infection (CDI).
Design:
We used an interrupted time series study design to evaluate the rate of hospital-onset CDI (HO-CDI), postdischarge CDI (PD-CDI) within 12 weeks, and inpatient fluoroquinolone use from 2 years prior to 1 year after a stewardship intervention.
Setting:
An academic healthcare system with 4 hospitals.
Patients:
All inpatients hospitalized between January 2017 and September 2020, excluding those discharged from locations caring for oncology, bone marrow transplant, or solid-organ transplant patients.
Intervention:
Introduction of electronic order sets designed to reduce inpatient fluoroquinolone prescribing.
Results:
Among 163,117 admissions, there were 683 cases of HO-CDI and 1,104 cases of PD-CDI. Fluoroquinolone days of therapy per 1,000 patient days were already declining by 2% per month in the preintervention period (P < .01) and fell by an additional 21% immediately after the intervention (level change, P < .05). HO-CDI rates were stable throughout the study period. In contrast, PD-CDI rates changed from a stable monthly rate in the preintervention period to a monthly decrease of 2.5% in the postintervention period (P < .01).
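The level and trend changes reported above come from an interrupted time series analysis; a minimal sketch of one such segmented model (hypothetical monthly table and column names, using a negative-binomial rate model with a patient-day offset) is shown below.

```python
# Illustrative segmented regression for an interrupted time series:
# pre-intervention trend, immediate level change, and post-intervention
# trend change, with log(patient days) as an offset.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

monthly = pd.read_csv("monthly_counts.csv")          # hypothetical monthly aggregates
monthly["time"] = range(len(monthly))                # months since study start
monthly["post"] = (monthly["month"] >= "2019-08").astype(int)   # intervention indicator
monthly["time_post"] = monthly["time"] * monthly["post"]        # post-intervention slope

fit = smf.glm(
    "hocdi_count ~ time + post + time_post",
    data=monthly,
    family=sm.families.NegativeBinomial(),
    offset=np.log(monthly["patient_days"]),
).fit()

# exp(coefficients) are rate ratios: `post` = level change, `time_post` = trend change.
print(np.exp(fit.params))
```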
Conclusions:
Our systemwide intervention immediately reduced inpatient fluoroquinolone use but did not reduce HO-CDI rates. However, PD-CDI rates trended downward after the intervention. Relying on outcome measures limited to the inpatient setting may not reflect the full impact of inpatient stewardship efforts.
Studying phenotypic and genetic characteristics of age at onset (AAO) and polarity at onset (PAO) in bipolar disorder can provide new insights into disease pathology and facilitate the development of screening tools.
Aims
To examine the genetic architecture of AAO and PAO and their association with bipolar disorder disease characteristics.
Method
Genome-wide association studies (GWASs) and polygenic score (PGS) analyses of AAO (n = 12 977) and PAO (n = 6773) were conducted in patients with bipolar disorder from 34 cohorts and a replication sample (n = 2237). The association of onset with disease characteristics was investigated in two of these cohorts.
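A minimal sketch of the polygenic score analysis described above (hypothetical column names and covariates; not the consortium pipeline) regresses age at onset on a standardized PGS, so the coefficient is interpretable in years per standard deviation of PGS:

```python
# Illustrative PGS association analysis for age at onset (AAO).
import pandas as pd
import statsmodels.formula.api as smf

pheno = pd.read_csv("bd_pheno_with_pgs.csv")  # hypothetical per-individual table
pheno["pgs_scz_std"] = (pheno["pgs_scz"] - pheno["pgs_scz"].mean()) / pheno["pgs_scz"].std()

# Linear regression of AAO on the standardized schizophrenia PGS,
# adjusting for sex and the first genetic principal components.
fit = smf.ols("age_at_onset ~ pgs_scz_std + sex + PC1 + PC2 + PC3 + PC4", data=pheno).fit()
print(fit.params["pgs_scz_std"], fit.bse["pgs_scz_std"])  # beta (years per s.d.) and s.e.
```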
Results
Earlier AAO was associated with a higher probability of psychotic symptoms, suicidality, lower educational attainment, not living together and fewer episodes. Depressive onset correlated with suicidality and manic onset correlated with delusions and manic episodes. Systematic differences in AAO between cohorts and continents of origin were observed. This was also reflected in single-nucleotide variant-based heritability estimates, with higher heritabilities for stricter onset definitions. Increased PGS for autism spectrum disorder (β = −0.34 years, s.e. = 0.08), major depression (β = −0.34 years, s.e. = 0.08), schizophrenia (β = −0.39 years, s.e. = 0.08), and educational attainment (β = −0.31 years, s.e. = 0.08) were associated with an earlier AAO. The AAO GWAS identified one significant locus, but this finding did not replicate. Neither GWAS nor PGS analyses yielded significant associations with PAO.
Conclusions
AAO and PAO are associated with indicators of bipolar disorder severity. Individuals with an earlier onset show an increased polygenic liability for a broad spectrum of psychiatric traits. Systematic differences in AAO across cohorts, continents and phenotype definitions introduce significant heterogeneity, affecting analyses.
Background: Effective inpatient stewardship initiatives can improve antibiotic prescribing, but impact on outcomes like Clostridioides difficile infections (CDIs) is less apparent. However, the effect of inpatient stewardship efforts may extend to the postdischarge setting. We evaluated whether an intervention targeting inpatient fluoroquinolone (FQ) use in a large healthcare system reduced incidence of postdischarge CDI. Methods: In August 2019, 4 acute-care hospitals in a large healthcare system replaced standalone FQ orders with order sets containing decision support. Order sets redirected prescribers to syndrome order sets that prioritize alternative antibiotics. Monthly patient days (PDs) and antibiotic days of therapy (DOT) administered for FQs and NHSN-defined broad-spectrum hospital-onset (BS-HO) antibiotics were calculated using patient encounter data for the 23 months before and 13 months after the intervention (COVID-19 admissions occurred during the final 7 months of the study period). We evaluated hospital-onset CDI (HO-CDI) per 1,000 PD (defined as any positive test after hospital day 3) and 12-week postdischarge CDI (PDC-CDI) per 100 discharges (any positive test within the healthcare system <12 weeks after discharge). Interrupted time-series analysis using generalized estimating equation models with a negative binomial link function was conducted; a sensitivity analysis with Medicare case-mix index (CMI) adjustment was also performed to control for differences after the start of the COVID-19 pandemic. Results: Among 163,117 admissions, there were 683 HO-CDIs and 1,009 PDC-CDIs. Overall, FQ DOT per 1,000 PD decreased by 21% immediately after the intervention (level change; P < .05) and decreased at a consistent rate throughout the entire study period (−2% per month; P < .01) (Fig. 1). There was a nonsignificant 5% increase in BS-HO antibiotic use immediately after the intervention and a continued increase in use thereafter (0.3% per month; P = .37). HO-CDI rates were stable throughout the study period, with a nonsignificant level-change decrease of 10% after the intervention. In contrast, there was a reversal in the trend in PDC-CDI rates from a 0.4% per month increase in the preintervention period to a 3% per month decrease in the postintervention period (P < .01). Sensitivity analysis with adjustment for facility-specific CMI produced similar results but with wider confidence intervals, as did an analysis with a distinct COVID-19 time point. Conclusion: Our systemwide intervention using order sets with decision support reduced inpatient FQ use by 21%. The intervention did not significantly reduce HO-CDI but significantly decreased the incidence of CDI within 12 weeks after discharge. Relying on outcome measures limited to the inpatient setting may not reflect the full impact of inpatient stewardship efforts, and incorporating postdischarge outcomes, such as CDI, should increasingly be considered.
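As a small illustration of the denominators used above, the sketch below (hypothetical tables and column names) computes monthly fluoroquinolone DOT per 1,000 patient days from encounter-level data:

```python
# Illustrative computation of monthly FQ days of therapy per 1,000 patient days.
import pandas as pd

abx = pd.read_csv("antibiotic_days.csv")    # hypothetical: one row per patient per drug per calendar day
census = pd.read_csv("daily_census.csv")    # hypothetical: one row per hospital per day with `patients`

fq = abx[abx["drug_class"] == "fluoroquinolone"]
monthly_dot = fq.groupby(fq["admin_date"].str[:7]).size()                      # DOT per month
monthly_pd = census.groupby(census["census_date"].str[:7])["patients"].sum()   # patient days per month

print(1000 * monthly_dot / monthly_pd)
```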
Among 353 healthcare personnel in a longitudinal cohort in 4 hospitals in Atlanta, Georgia (May–June 2020), 23 (6.5%) had severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) antibodies. Spending >50% of a typical shift at the bedside (OR, 3.4; 95% CI, 1.2–10.5) and Black race (OR, 8.4; 95% CI, 2.7–27.4) were associated with SARS-CoV-2 seropositivity.
Gravitational waves from coalescing neutron stars encode information about nuclear matter at extreme densities that is inaccessible to laboratory experiments. The late inspiral is influenced by tides, which depend on the neutron star equation of state. Neutron star mergers are expected to often produce rapidly rotating remnant neutron stars that emit gravitational waves, providing clues to the extremely hot post-merger environment. This signature of nuclear matter in gravitational waves contains most of its information in the 2–4 kHz frequency band, which lies outside the most sensitive band of current detectors. We present the design concept and science case for a Neutron Star Extreme Matter Observatory (NEMO): a gravitational-wave interferometer optimised to study nuclear physics with merging neutron stars. The concept uses high circulating laser power, quantum squeezing, and a detector topology specifically designed to achieve the high-frequency sensitivity necessary to probe nuclear matter using gravitational waves. Above 1 kHz, the proposed strain sensitivity is comparable to that of full third-generation detectors at a fraction of the cost. Such sensitivity changes the expected detection rate for post-merger remnants from approximately one every few decades with two A+ detectors to a few per year, and could potentially allow the first gravitational-wave observations of supernovae, isolated neutron stars, and other exotica.
The SPARC tokamak is a critical next step towards commercial fusion energy. SPARC is designed as a high-field ($B_0 = 12.2$ T), compact ($R_0 = 1.85$ m, $a = 0.57$ m), superconducting, D-T tokamak with the goal of producing fusion gain $Q>2$ from a magnetically confined fusion plasma for the first time. Currently under design, SPARC will continue the high-field path of the Alcator series of tokamaks, utilizing new magnets based on rare earth barium copper oxide high-temperature superconductors to achieve high performance in a compact device. The goal of $Q>2$ is achievable with conservative physics assumptions ($H_{98,y2} = 0.7$) and, with the nominal assumption of $H_{98,y2} = 1$, SPARC is projected to attain $Q \approx 11$ and $P_{\textrm {fusion}} \approx 140$ MW. SPARC will therefore constitute a unique platform for burning plasma physics research with high density ($\langle n_{e} \rangle \approx 3 \times 10^{20}\ \textrm {m}^{-3}$), high temperature ($\langle T_e \rangle \approx 7$ keV) and high power density ($P_{\textrm {fusion}}/V_{\textrm {plasma}} \approx 7\ \textrm {MW}\,\textrm {m}^{-3}$) relevant to fusion power plants. SPARC's place in the path to commercial fusion energy, its parameters and the current status of SPARC design work are presented. This work also describes the basis for global performance projections and summarizes some of the physics analysis that is presented in greater detail in the companion articles of this collection.
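As a rough consistency check using only the figures quoted above (the plasma volume is inferred here, not taken from the SPARC design documents), the nominal projections imply $V_{\textrm{plasma}} \approx P_{\textrm{fusion}} / (P_{\textrm{fusion}}/V_{\textrm{plasma}}) \approx 140\ \textrm{MW} / 7\ \textrm{MW}\,\textrm{m}^{-3} \approx 20\ \textrm{m}^{3}$, and, with the usual definition of gain as fusion power over external heating power, $Q \approx 11$ at $P_{\textrm{fusion}} \approx 140$ MW corresponds to roughly $13$ MW of external heating.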
To determine the effect of an electronic medical record (EMR) nudge on reducing total and inappropriate orders for hospital-onset Clostridioides difficile infection (HO-CDI) testing.
Design:
An interrupted time series analysis of HO-CDI orders 2 years before and 2 years after the implementation of an EMR intervention designed to reduce inappropriate HO-CDI testing. Orders for C. difficile testing were considered inappropriate if the patient had received a laxative or stool softener in the previous 24 hours.
Setting:
Four hospitals in an academic healthcare network.
Patients:
All patients with a C. difficile order after hospital day 3.
Intervention:
Orders for C. difficile testing in patients administered a laxative or stool softener within the previous 24 hours triggered an EMR alert defaulting to cancellation of the order (“nudge”).
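A simplified sketch of the rule logic (an illustration only, with example medication names; not the actual EMR build) is:

```python
# Illustrative decision-support rule: default to cancelling a C. difficile
# test order when a laxative or stool softener was given in the prior 24 hours.
from datetime import datetime, timedelta

LAXATIVES_AND_SOFTENERS = {"polyethylene glycol", "senna", "docusate", "lactulose"}  # example list

def cdiff_order_action(order_time: datetime,
                       med_administrations: list[tuple[str, datetime]]) -> str:
    """Return the default action for a C. difficile test order."""
    window_start = order_time - timedelta(hours=24)
    recent_laxative = any(
        med in LAXATIVES_AND_SOFTENERS and window_start <= given_at <= order_time
        for med, given_at in med_administrations
    )
    if recent_laxative:
        return "alert: default to cancel order (prescriber may override)"
    return "proceed with order"
```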
Results:
Of the 17,694 HO-CDI orders, 7% were inappropriate (8% preintervention vs 6% postintervention; P < .001). Monthly HO-CDI orders decreased by 21% postintervention (level-change rate ratio [RR], 0.79; 95% confidence interval [CI], 0.73–0.86), and the rate continued to decrease (postintervention trend change RR, 0.99; 95% CI, 0.98–1.00). The intervention was not associated with a level change in inappropriate HO-CDI orders (RR, 0.80; 95% CI, 0.61–1.05), but the postintervention inappropriate order rate decreased over time (RR, 0.95; 95% CI, 0.93–0.97).
Conclusion:
An EMR nudge to minimize inappropriate ordering for C. difficile was effective at reducing HO-CDI orders, and likely contributed to decreasing the inappropriate HO-CDI order rate after the intervention.
In 2017, the NYU Clinical and Translational Science Institute’s Recruitment and Retention Unit created a Patient Advisory Council for Research (PACR) to provide feedback on clinical trials and health research studies. We collaborated with our clinical research informatics team to generate a random sample of patients, based on International Classification of Diseases, Tenth Revision (ICD-10) codes and demographic factors, for invitation via the patient portal. This approach yielded a group that was diverse with regard to age, race/ethnicity, sex, and health conditions. This report highlights the benefits and limitations of using an electronic health record-based strategy to identify and recruit members for a PACR.
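A hypothetical sketch of the sampling strategy described above (illustrative table, column names, and diagnosis codes; not NYU's implementation) might filter portal-enrolled patients on ICD-10 codes of interest and then draw a demographically stratified random sample:

```python
# Illustrative stratified random sampling of patients for PACR invitations.
import pandas as pd

patients = pd.read_csv("patients.csv")  # hypothetical: one row per portal-enrolled patient
eligible = patients[patients["icd10_codes"].str.contains("E11|I10|J45", na=False)]  # example codes

# Sample up to 5 patients per age-group / sex / race-ethnicity stratum.
sample = (
    eligible.groupby(["age_group", "sex", "race_ethnicity"], group_keys=False)
    .apply(lambda g: g.sample(n=min(len(g), 5), random_state=0))
)
print(len(sample))
```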
Jaswal & Akhtar provide several quotes ostensibly from people with autism but obtained via the discredited techniques of Facilitated Communication and the Rapid Prompting Method, and they do not acknowledge the use of these techniques. As a result, their argument is substantially less convincing than they assert, and the article lacks transparency.
Tett, Hundley, and Christiansen (2017) argue that the concept of validity generalization in meta-analysis is a myth, as the variability of the effect size appears to decrease with increasing moderator specificity such that the level of precision needed to deem an estimate “generalizable” is actually reached at levels of situational specificity that are so high as to (paradoxically) refute an inference of generalizability. This notion highlights the need to move away from claiming that effects are either “generalizable” or “situationally specific” and instead look more critically and less dichotomously at degrees of generalizability, or effect size variability.
Recent acceleration and thinning of Thwaites Glacier, West Antarctica, motivates investigation of the controls upon, and stability of, its present ice-flow pattern. Its eastern shear margin separates Thwaites Glacier from slower-flowing ice and the southern tributaries of Pine Island Glacier. Troughs in Thwaites Glacier’s bed topography bound nearly all of its tributaries, except along this eastern shear margin, which has no clear relationship with regional bed topography along most of its length. Here we use airborne ice-penetrating radar data from the Airborne Geophysical Survey of the Amundsen Sea Embayment, Antarctica (AGASEA) to investigate the nature of the bed across this margin. Radar data reveal slightly higher and rougher bed topography on the slower-flowing side of the margin, along with lower bed reflectivity. However, the change in bed reflectivity across the margin is partially explained by a change in bed roughness. From these observations, we infer that the position of the eastern shear margin is not strongly controlled by local bed topography or other bed properties. Given the potential for future increases in ice flux farther downstream, the eastern shear margin may be vulnerable to migration. However, there is no evidence that this margin is migrating presently, despite ongoing changes farther downstream.
Introduction: Between June 15 and August 31, 2014, Canada’s Northwest Territories (population 44,000; Statistics Canada), a subarctic region that is more than 2°C warmer than it was in the 1950s, experienced an unprecedented number of forest fires, with 385 fires and approximately 3.4 million hectares of forest affected. This resulted in one of Canada’s most severe and prolonged urban smoke exposures for the capital city of Yellowknife and surrounding Aboriginal communities. Our objective was to obtain a big-picture sense of the health impact of the Summer of Smoke on the population of these communities through a mixture of quantitative and qualitative analysis. Methods: We analyzed PM2.5 levels, salbutamol dispensations, clinic and hospital cardiorespiratory variables, and in-depth video interviews with community members from Yellowknife, N’Dilo, Dettah, and Kakisa. Results: In 2014, 49% of days from June 15 to August 31 had a daily PM2.5 over 30 µg/m3, compared with 3% in 2012 and 9% in 2013 and 2015. The maximum daily PM2.5 in 2014 was 320.4 µg/m3. There was a 22% increase in outpatient salbutamol dispensations in 2014 compared to the average of 2012, 2013, and 2015. More cough, pneumonia, and asthma were seen in clinics compared to 2012–2015 (P < 0.001). There was a 42% increase in respiratory ER visits in 2014 compared to 2012–2013, but no change in cardiac variables. The respiratory effect was most pronounced in children aged 0–4 years (114% increase in ER visits). Qualitative analysis demonstrated themes of fear, isolation, lack of physical activity, alteration of traditional summertime activities for both Aboriginal and non-Aboriginal subjects, elements of resilience, and expectation of future smoky summers in the context of a changing climate. Conclusion: Prolonged wildfire seasons have a profound effect on overall wellbeing. Responses to help minimize mental and physical impacts, such as the creation of clean-air community shelters, recreation programming, initiatives to support community cohesion, and “go outside when it is not smoky” messaging, require further study.
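The exposure comparison above (share of summer days exceeding 30 µg/m3) can be reproduced in outline with a short sketch (hypothetical file and column names):

```python
# Illustrative calculation of the share of June 15-August 31 days with
# daily mean PM2.5 above 30 ug/m3, by year.
import pandas as pd

pm = pd.read_csv("yellowknife_pm25_daily.csv", parse_dates=["date"])
summer = pm[(pm["date"].dt.month.isin([7, 8])) |
            ((pm["date"].dt.month == 6) & (pm["date"].dt.day >= 15))]

share_over_30 = (summer["pm25_ugm3"] > 30).groupby(summer["date"].dt.year).mean()
print(share_over_30)  # e.g., ~0.49 in 2014 vs ~0.03 in 2012 per the results above
```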
Cultivated rice yield losses due to red rice infestation vary by cultivar, red rice density, and duration of interference. The competition effects of red rice could be influenced further by emergence characteristics, red rice biotype, and planting time of cultivated rice. We aimed to characterize the emergence of red rice biotypes at different planting dates and evaluate the effect of red rice biotype, rice cultivar, and planting date on cultivated rice yield loss. Field experiments were conducted at the Southeast Research and Extension Center, Rohwer, AR, and at the Arkansas Rice Research and Extension Center, Stuttgart, AR, in the summer of 2005 and 2006. The experimental design was a split-split plot with three or four replications. Planting time, Clearfield™ (CL) rice cultivar, and red rice biotype were the main plot, subplot, and sub-subplot factors, respectively. There were three planting times from mid-April to mid-May at 2-wk intervals. CL rice cultivars, CL161 and hybrid CLXL8, and 12 red rice biotypes were planted. The emergence rate and coefficient of uniformity of germination differed among some red rice biotypes within a planting time. Planting date affected the emergence characteristics of red rice biotypes. The mean emergence rate of red rice was 0.043 d⁻¹ in the mid-April planting and 0.058 d⁻¹ in the late April planting. For the mid-April planting, 50% of red rice biotypes emerged in 20 ± 2 d compared with 15 ± 2 d for CL rice cultivars. Yield losses due to red rice biotypes generally increased with later planting dates, up to 49%. Yield losses due to interference from red rice biotypes ranged from 14 to 45% and 6 to 35% in CL161 and CLXL8, respectively. Cultivated rice became less competitive with red rice in later plantings, resulting in higher yield losses.
Insulating silicon dioxide (SiO2) films can be produced by hydrolysis of the metal alkoxide tetraethylorthosilicate (TEOS) in the presence of an acid catalyst in supercritical fluid CO2 (sc-CO2). In this study, SiO2 films are formed on different substrates using TEOS as the source of silicon and acetic acid (HAc) as the catalyst. Water required for the hydrolysis reaction is generated in situ by esterification and condensation reactions involving HAc and the alcohol produced. The acid-catalyzed deposition reaction begins at room temperature but produces good-quality films in sc-CO2 at moderately elevated temperatures (e.g., 50 °C). Supercritical fluid CO2 is known to have near-zero surface tension and provides an ideal medium for fabrication of SiO2 films. Formation of SiO2 films via the hydrolysis reaction in sc-CO2 is more rapid than the traditional hydrolysis reaction at room temperature. In general, metal alkoxide hydrolysis reactions carried out in a closed sc-CO2 system are not affected by atmospheric moisture, unlike traditional open-air hydrolysis systems. Using sc-CO2 as a reaction medium can also eliminate the undesirable organic solvents used in traditional alkoxide hydrolysis reactions.
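In idealized overall form (a standard sol-gel description of this chemistry, not equations quoted from the paper), the in situ water generation and film-forming steps can be written as:

$$\mathrm{CH_3COOH + C_2H_5OH \rightarrow CH_3COOC_2H_5 + H_2O} \quad \text{(esterification generates water in situ)}$$
$$\mathrm{Si(OC_2H_5)_4 + 4\,H_2O \rightarrow Si(OH)_4 + 4\,C_2H_5OH} \quad \text{(hydrolysis of TEOS)}$$
$$\mathrm{Si(OH)_4 \rightarrow SiO_2 + 2\,H_2O} \quad \text{(condensation to the oxide network)}$$

The ethanol released by hydrolysis can feed back into the esterification step, which is consistent with the self-sustaining in situ water supply described above.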
X-ray diffraction (XRD) and electron diffraction (ED) measurements demonstrated that the SiO2 films produced are amorphous. Energy-dispersive spectroscopy (EDS), attenuated total reflectance-Fourier transform infrared (ATR-FTIR) spectroscopy, and X-ray photoelectron spectroscopy (XPS) show that the elemental composition of the films formed on the substrate surfaces is SiO2. Control of film thickness by adjusting the amount of catalyst is also discussed.