Preventing neonatal calf diarrhea (NCD) and bovine respiratory disease (BRD) in cow–calf herds is essential to optimizing calfhood health. Disease control can prevent morbidity and mortality; however, evidence concerning the effectiveness of practices to achieve this is limited. The objective of this systematic review was to assess and summarize the evidence on the effectiveness of management practices to prevent calf morbidity and mortality from NCD and BRD in beef cow–calf herds. The population of interest was preweaned beef calves. The outcomes were calf morbidity and mortality caused by NCD and BRD. Only studies reporting naturally occurring disease were included. Seventeen studies were deemed relevant: 6 were controlled trials or randomized controlled trials (RCTs), and 11 were observational studies. Most management practices had some evidence to support their use; however, the certainty of the findings was low to very low. Most of the practices were shown to affect both NCD and BRD. Yet differences in the consistency of the direction of the findings suggest that some outcomes are more strongly affected by certain practices than others. More well-designed RCTs and cohort studies are required to provide reliable estimates to support recommended practices for cow–calf herds.
Calves sold at weaning are the main source of income for cow–calf operations, and their survival should be a priority. Given this, the effective use of management practices for pregnant dams and calves to prevent calf mortality is essential; however, decision-makers often do not have access to information about the effectiveness of many management practices. A systematic review was conducted to summarize the evidence of the effectiveness of biosecurity, vaccination, colostrum management, breeding and calving season management, and nutritional management practices for preventing preweaned beef calf mortality. The population of interest was preweaned beef calves from birth until at least 3 months of age. The outcome of interest was general preweaning calf mortality, with stillbirths excluded. Eleven studies were deemed relevant. Ten were observational cross-sectional studies, and one was a randomized controlled trial (RCT). The practices statistically significantly associated with calf mortality were colostrum intervention when a calf had not nursed from its dam or was assisted at calving, the timing and length of the calving season, and injection of selenium and vitamin E at birth. More well-executed RCTs and cohort studies are needed to provide evidence of effectiveness and help support implementation of recommended practices in herds.
This chapter reviews research on a contemporary form of prejudice – aversive racism – and considers the important role of implicit bias in the subtle expressions of discrimination associated with aversive racism. Aversive racism characterizes the racial attitudes of a substantial portion of well-intentioned people who genuinely endorse egalitarian values and believe that they are not prejudiced but at the same time possess automatically activated, often nonconscious, negative feelings and beliefs about members of another group. Our focus in this chapter is on the bias of White Americans toward Black Americans, but we also discuss relevant findings in other intergroup contexts. We emphasize the importance of considering, jointly, both explicit and implicit biases for understanding subtle, and potentially unintentional, expressions of discrimination. The chapter concludes by discussing how research on aversive racism and implicit bias has been mutually informative and suggests specific promising directions for future work.
Inappropriate diagnosis and treatment of urinary tract infections (UTIs) contribute to antibiotic overuse. The Inappropriate Diagnosis of UTI (ID-UTI) measure uses a standard definition of asymptomatic bacteriuria (ASB) and was validated in large hospitals. Critical access hospitals (CAHs) have different resources, which may make ASB stewardship challenging. To address this inequity, we adapted the ID-UTI metric for use in CAHs and assessed the adapted measure’s feasibility, validity, and reliability.
Design:
Retrospective observational study.
Participants:
10 CAHs
Methods:
From October 2022 to July 2023, CAHs submitted clinical information for adults admitted or discharged from the emergency department who received antibiotics for a positive urine culture. Feasibility of case submission was assessed as the number of CAHs achieving the goal of 59 cases. Validity (sensitivity/specificity) and reliability of the ID-UTI definition were assessed by dual-physician review of a random sample of submitted cases.
Results:
Among 10 CAHs able to participate throughout the study period, only 40% (4/10) submitted >59 cases (goal); an additional 3 submitted >35 cases (secondary goal). Per the ID-UTI metric, 28% (16/58) of cases were ASB. Compared to physician review, the ID-UTI metric had 100% specificity (i.e., all cases called ASB were ASB on clinical review) but poor sensitivity (48.5%; i.e., it did not identify all ASB cases). Measure reliability was high (93% [54/58] agreement).
Conclusions:
Similar to measure performance in non-CAHs, the ID-UTI measure had high reliability and specificity—all cases identified as ASB were considered ASB—but poor sensitivity. Though the measure proved feasible for a subset of CAHs, barriers remain.
Weight loss results in obligatory reductions in energy expenditure (EE) due to loss of metabolically active fat-free mass (FFM). This is accompanied by adaptive reductions (i.e. adaptive thermogenesis) designed to restore energy balance while in an energy crisis. While the ‘3500-kcal rule’ is used to advise weight loss in clinical practice, the assumption that EE remains constant during energy restriction results in a large overestimation of weight loss. Thus, this work proposes a novel method of weight-loss prediction to more accurately account for the dynamic trajectory of EE. A mathematical model of weight loss was developed using ordinary differential equations relying on simple self-reported inputs of weight and energy intake to predict weight loss over a specified time. The model subdivides total daily EE into resting EE, physical activity EE, and diet-induced thermogenesis, modelling obligatory and adaptive changes in each compartment independently. The proposed model was tested and refined using commercial weight-loss data from participants enrolled on a very low-energy total-diet replacement programme (LighterLife UK, Essex). Mathematical modelling predicted post-intervention weight loss within 0.75% (1.07 kg) of that observed in females with overweight or obesity. Short-term weight loss was consistently underestimated, likely due to considerable FFM reductions reported at the onset of weight loss. The best model agreement was observed from 6 to 9 weeks where the predicted end-weight was within 0.35 kg of that observed. The proposed mathematical model simulated rapid weight loss with reasonable accuracy. Incorporated terms for energy partitioning and adaptive thermogenesis allow us to easily account for dynamic changes in EE, supporting the potential use of such a model in clinical practice.
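As a toy illustration of the general modelling approach described in this abstract (self-reported weight and intake as inputs, EE split into compartments, day-by-day integration of an ordinary differential equation), the sketch below integrates a bare-bones energy-balance equation. It is emphatically not the authors' model: it omits the adaptive-thermogenesis and energy-partitioning terms the paper adds, and every coefficient is an illustrative assumption.

```python
# Minimal energy-balance weight sketch. NOT the published model: no adaptive
# thermogenesis, no fat/fat-free mass partitioning. All coefficients are
# illustrative assumptions, not fitted values from the study.

ENERGY_DENSITY = 7700.0  # kcal per kg of body-weight change (common approximation)

def total_ee(weight_kg, intake_kcal):
    """Crude total daily EE split into the three compartments named above."""
    resting_ee = 22.0 * weight_kg    # assumed resting EE, kcal/day per kg
    activity_ee = 7.0 * weight_kg    # assumed light physical activity
    dit = 0.10 * intake_kcal         # diet-induced thermogenesis, ~10% of intake
    return resting_ee + activity_ee + dit

def simulate_weight(start_kg, intake_kcal, days):
    """Euler integration of dW/dt = (intake - EE(W)) / rho with 1-day steps."""
    weight = start_kg
    for _ in range(days):
        weight += (intake_kcal - total_ee(weight, intake_kcal)) / ENERGY_DENSITY
    return weight
```

A run such as `simulate_weight(90.0, 800.0, 56)` (a hypothetical 90 kg person on an 800 kcal/day very-low-energy diet for 8 weeks) shows the expected slowing of weight loss as EE falls with weight; a clinically useful model would, as the paper argues, also let EE adapt beyond this obligatory weight-driven change.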
In response to the COVID-19 pandemic, we rapidly implemented a plasma coordination center within two months to support transfusion for two outpatient randomized controlled trials. The center design was based on an investigational drug services model and a Food and Drug Administration-compliant database to manage blood product inventory and trial safety.
Methods:
A core investigational team adapted a cloud-based platform to randomize patient assignments and track inventory distribution of control plasma and high-titer COVID-19 convalescent plasma of different blood groups from 29 donor collection centers directly to blood banks serving 26 transfusion sites.
Results:
We performed 1,351 transfusions in 16 months. The transparency of the digital inventory at each site was critical to facilitate qualification, randomization, and overnight shipments of blood group-compatible plasma for transfusions into trial participants. While inventory challenges were heightened with COVID-19 convalescent plasma, the cloud-based system and the flexible approach of the plasma coordination center staff across the blood bank network enabled decentralized procurement and distribution of investigational products to maintain inventory thresholds and overcome local supply chain restraints at the sites.
Conclusion:
The rapid creation of a plasma coordination center for outpatient transfusions is infrequent in the academic setting. Distributing more than 3,100 plasma units to blood banks charged with managing investigational inventory across the U.S. in a decentralized manner posed operational and regulatory challenges while providing opportunities for the plasma coordination center to contribute to research of global importance. This program can serve as a template in subsequent public health emergencies.
Asymptomatic bacteriuria (ASB) treatment is a common form of antibiotic overuse and diagnostic error. Antibiotic stewardship using the inappropriate diagnosis of urinary tract infection (ID-UTI) measure has reduced ASB treatment in diverse hospitals. However, critical access hospitals (CAHs) have differing resources that could impede stewardship. We aimed to determine if stewardship including the ID-UTI measure could reduce ASB treatment in CAHs.
Methods:
From October 2022 to July 2023, ten CAHs participated in an Intensive Quality Improvement Cohort (IQIC) program including 3 interventions to reduce ASB treatment: 1) learning labs (i.e., didactics with shared learning), 2) mentoring, and 3) data-driven performance reports including hospital peer comparison based on the ID-UTI measure. To assess effectiveness of the IQIC program, change in the ID-UTI measure (i.e., percentage of patients treated for a UTI who had ASB) was compared to two non-equivalent control outcomes (antibiotic duration and unjustified fluoroquinolone use).
Results:
Ten CAHs abstracted a total of 608 positive urine culture cases. Over the cohort period, the percentage of patients treated for a UTI who had ASB declined (aOR per month = 0.935, 95% CI: 0.873, 1.001, P = 0.055) from 28.4% (range across hospitals, 0%-63%) in the first to 18.6% (range, 0%-33%) in the final month. In contrast, antibiotic duration and unjustified fluoroquinolone use were unchanged (P = 0.768 and 0.567, respectively).
Conclusions:
The IQIC intervention, including learning labs, mentoring, and performance reports using the ID-UTI measure, was associated with a non-significant decrease in treatment of ASB, while control outcomes (duration and unjustified fluoroquinolone use) did not change.
Depression is an independent risk factor for cardiovascular disease (CVD), but it is unknown if successful depression treatment reduces CVD risk.
Methods
Using eIMPACT trial data, we examined the effect of modernized collaborative care for depression on indicators of CVD risk. A total of 216 primary care patients with depression and elevated CVD risk were randomized to 12 months of the eIMPACT intervention (internet cognitive-behavioral therapy [CBT], telephonic CBT, and select antidepressant medications) or usual primary care. CVD-relevant health behaviors (self-reported CVD prevention medication adherence, sedentary behavior, and sleep quality) and traditional CVD risk factors (blood pressure and lipid fractions) were assessed over 12 months. Incident CVD events were tracked over four years using a statewide health information exchange.
Results
The intervention group exhibited greater improvement in depressive symptoms (p < 0.01) and sleep quality (p < 0.01) than the usual care group, but there was no intervention effect on systolic blood pressure (p = 0.36), low-density lipoprotein cholesterol (p = 0.38), high-density lipoprotein cholesterol (p = 0.79), triglycerides (p = 0.76), CVD prevention medication adherence (p = 0.64), or sedentary behavior (p = 0.57). There was an intervention effect on diastolic blood pressure that favored the usual care group (p = 0.02). The likelihood of an incident CVD event did not differ between the intervention (13/107, 12.1%) and usual care (9/109, 8.3%) groups (p = 0.39).
Conclusions
Successful depression treatment alone is not sufficient to lower the heightened CVD risk of people with depression. Alternative approaches are needed.
Medical researchers are increasingly prioritizing the inclusion of underserved communities in clinical studies. However, mere inclusion is not enough. People from underserved communities frequently experience chronic stress that may lead to accelerated biological aging and early morbidity and mortality. It is our hope and intent that the medical community come together to engineer improved health outcomes for vulnerable populations. Here, we introduce Health Equity Engineering (HEE), a comprehensive scientific framework to guide research on the development of tools to identify individuals at risk of poor health outcomes due to chronic stress, the integration of these tools within existing healthcare system infrastructures, and a robust assessment of their effectiveness and sustainability. HEE is anchored in the premise that strategic intervention at the individual level, tailored to the needs of the most at-risk people, can pave the way for achieving equitable health standards at a broader population level. HEE provides a scientific framework guiding health equity research to equip the medical community with a robust set of tools to enhance health equity for current and future generations.
A version of the classical Buffon problem in the plane naturally extends to the setting of any Riemannian surface with constant Gaussian curvature. The Buffon probability determines a Buffon deficit. The relationship between Gaussian curvature and the Buffon deficit is similar to the relationship that the Bertrand–Diguet–Puiseux theorem establishes between Gaussian curvature and both circumference and area deficits.
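For orientation, the two classical results framing this abstract can be written out explicitly; both are standard textbook facts, not findings of this work. In the plane, a needle of length $\ell$ dropped on parallel lines spaced $d \ge \ell$ apart crosses a line with probability $2\ell/(\pi d)$, and the Bertrand–Diguet–Puiseux theorem recovers the Gaussian curvature $K$ from the circumference $C(r)$ of small geodesic circles:

```latex
% Classical planar Buffon probability (needle length \ell, line spacing d \ge \ell):
P_{\text{plane}} = \frac{2\ell}{\pi d}.

% Bertrand--Diguet--Puiseux: Gaussian curvature from the circumference deficit
% of a geodesic circle of radius r:
K = \lim_{r \to 0} \frac{3}{\pi}\, \frac{2\pi r - C(r)}{r^{3}}.
```

The Buffon deficit of the abstract measures the deviation of the curved-surface crossing probability from $P_{\text{plane}}$, in analogy with the circumference deficit $2\pi r - C(r)$ above.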
With the current interest in the use of transition metal-exchanged phyllosilicates as catalysts for novel organic syntheses, investigations into the factors which affect the movement of reactant, product, and solvent molecules into and out of their interlamellar region are of considerable importance. Mixed organic-water intercalates of a Wyoming montmorillonite, exemplified by the Na-montmorillonite–pyridine–water system which can form four different intercalates exhibiting basal spacings of 29.3, 23.3, 19.4, and 14.8 Å depending on the pyridine:water ratio, have been used as a model system. X-ray and neutron diffraction and quasielastic neutron scattering data relating to the interconversion of interlayer species indicate that access to and exit from the interlayer space is hindered at high partial pressures of water by a water-film diffusion barrier in the interparticulate voids which exist between the aggregated silicate layers. At lower water vapor pressures the rate-limiting step for interconversion from one intercalate to another is the rate of transport of reagents and products to and from the clay particles. Under conditions where these rates are fixed, the rate-limiting step is the rate of diffusion of the pyridine molecule in the lower-spacing intercalate. Processes which involve a change in basal spacing do not necessarily proceed via a single discrete step, but are also affected by the amount of water made available to the system. In organic reactions catalyzed in the interlamellar space of various cation-exchanged montmorillonites (e.g., the conversion of alk-1-enes to di-2,2’-alkyl ethers and the reaction of alcohols to form ethers), rate-determining steps similar to those found above are likely to be operative. In particular, for reactions carried out in the liquid phase, where mass transport is facile and where phase-transfer problems are avoided, such reactions are likely to be diffusion controlled.
When 1-hexene was adsorbed by Cr3+-montmorillonite at room temperature, all evidence of C=C(str) vibrations was lost. Protonation of the alkene occurred, and the secondary carbocation formed was bound at a site on the primary coordination sphere of the interlayer cation. Some of the hydrogen atoms of the water molecules in this primary sphere were involved in strong hydrogen bonds to the silicate sheets, whereas others did not form such bonds but were free and directed into the interlayer space. These latter hydrogen atoms were labile and protonated the alkene molecules.
Helium or neopentane can be used as surrogate gas fills for deuterium (D2) or deuterium-tritium (DT) in laser-plasma interaction studies. Surrogates are convenient to avoid flammability hazards or the integration of cryogenics in an experiment. To test the degree of equivalency between deuterium and helium, experiments were conducted in the Pecos target chamber at Sandia National Laboratories. Observables such as laser propagation and signatures of laser-plasma instabilities (LPI) were recorded for multiple laser and target configurations. It was found that some observables can differ significantly despite the apparent similarity of the gases with respect to molecular charge and weight. While the qualitative behaviour of the interaction may well be studied by finding a suitable compromise of laser absorption, electron density, and LPI cross sections, a quantitative investigation of expected values for deuterium fills at high laser intensities is not likely to succeed with surrogate gases.
Identifying youths most at risk to COVID-19-related mental illness is essential for the development of effective targeted interventions.
Aims
To compare trajectories of mental health throughout the pandemic in youth with and without prior mental illness and identify those most at risk of COVID-19-related mental illness.
Method
Data were collected from individuals aged 18–26 years (N = 669) from two existing cohorts: IMAGEN, a population-based cohort; and ESTRA/STRATIFY, clinical cohorts of individuals with pre-existing diagnoses of mental disorders. Repeated COVID-19 surveys and standardised mental health assessments were used to compare trajectories of mental health symptoms from before the pandemic through to the second lockdown.
Results
Mental health trajectories differed significantly between cohorts. In the population cohort, depression and eating disorder symptoms increased by 33.9% (95% CI 31.78–36.57) and 15.6% (95% CI 15.39–15.68) during the pandemic, respectively. By contrast, these remained high over time in the clinical cohort. Conversely, trajectories of alcohol misuse were similar in both cohorts, decreasing continuously (a 15.2% decrease) during the pandemic. Pre-pandemic symptom severity predicted the observed mental health trajectories in the population cohort. Surprisingly, being relatively healthy predicted increases in depression and eating disorder symptoms and in body mass index. By contrast, those initially at higher risk for depression or eating disorders reported a lasting decrease.
Conclusions
Healthier young people may be at greater risk of developing depressive or eating disorder symptoms during the COVID-19 pandemic. Targeted mental health interventions considering prior diagnostic risk may be warranted to help young people cope with the challenges of psychosocial stress and reduce the associated healthcare burden.
OBJECTIVES/GOALS: Glioblastomas (GBMs) are heterogeneous, treatment-resistant tumors that are driven by populations of cancer stem cells (CSCs). In this study, we perform an epigenetic-focused functional genomics screen in GBM organoids and identify WDR5 as an essential epigenetic regulator in the SOX2-enriched, therapy-resistant cancer stem cell niche. METHODS/STUDY POPULATION: Despite their importance for tumor growth, few molecular mechanisms critical for CSC population maintenance have been exploited for therapeutic development. We developed a spatially resolved loss-of-function screen in GBM patient-derived organoids to identify essential epigenetic regulators in the SOX2-enriched, therapy-resistant niche. Our niche-specific screens identified WDR5, an H3K4 histone methyltransferase responsible for activating specific gene expression, as indispensable for GBM CSC growth and survival. RESULTS/ANTICIPATED RESULTS: In GBM CSC models, WDR5 inhibitors blocked WRAD complex assembly and reduced H3K4 trimethylation and expression of genes involved in CSC-relevant oncogenic pathways. H3K4me3 peaks lost with WDR5 inhibitor treatment occurred disproportionately on POU transcription factor motifs, required for stem cell maintenance and including the POU5F1(OCT4)::SOX2 motif. We incorporated a SOX2/OCT4 motif-driven GFP reporter system into our CSC cell models and found that WDR5 inhibitor treatment resulted in dose-dependent silencing of stem cell reporter activity. Further, WDR5 inhibitor treatment altered the stem cell state, disrupting CSC in vitro growth and self-renewal as well as in vivo tumor growth. DISCUSSION/SIGNIFICANCE: Our results unveiled the role of WDR5 in maintaining the CSC state in GBM and provide a rationale for therapeutic development of WDR5 inhibitors for GBM and other advanced cancers.
This conceptual and experimental framework can be applied to many cancers and can unmask unique microenvironmental biology and inform rationally designed combination therapies.
The United Nations (UN) established an umbrella of organizations to manage distinct clusters of humanitarian aid. The World Health Organization (WHO) oversees the health cluster, giving it responsibility for global, national, and local medical responses to natural disasters. However, this centralized structure insufficiently engages local players, impeding robust local implementation. The Gorkha earthquake struck Nepal on April 25, 2015, becoming Nepal’s most severe natural disaster since the 1934 Nepal-Bihar earthquake. In coordinated response, 2 organizations, Empower Nepali Girls and International Neurosurgical Children’s Association, used a hybrid approach integrating continuous communication with local recipients. Each organization mobilized its principal resource strengths—material medical supplies or human capital—thereby efficiently deploying resources to maximize the impact of the medical response. In addition to efficient resource use, this approach facilitates dynamic medical responses from highly mobile organizations. Importantly, beyond future earthquakes in Nepal, this medical response strategy is readily scalable to other natural disaster contexts and other medical relief organizations. Preemptively identifying partner organizations with complementary strengths, continuous engagement with recipient populations, and creating disaster- and region-specific response teams may represent viable variations of the WHO cluster model with greater efficacy in local implementation of treatment in acute disaster scenarios.
Objective:
To evaluate variables that affect risk of contamination for endoscopic retrograde cholangiopancreatography and endoscopic ultrasound endoscopes.
Design:
Observational, quality improvement study.
Setting:
University medical center with a gastrointestinal endoscopy service performing ∼1,000 endoscopic retrograde cholangiopancreatography and ∼1,000 endoscopic ultrasound endoscope procedures annually.
Methods:
Duodenoscope and linear echoendoscope sampling (from the elevator mechanism and instrument channel) was performed from June 2020 through September 2021. Operational changes during this period included a switch from standard reprocessing with high-level disinfection plus ethylene oxide gas sterilization (HLD-ETO) to double high-level disinfection (dHLD) (June 16, 2020–July 15, 2020), and a change of duodenoscopes to a disposable-tip model (March 2021). The frequency of contamination for the co-primary outcomes was characterized by calculated risk ratios.
Results:
The overall pathogenic contamination rate was 4.72% (6 of 127). Compared to duodenoscopes, linear echoendoscopes had a contamination risk ratio of 3.64 (95% confidence interval [CI], 0.69–19.1). Reprocessing using HLD-ETO was associated with a contamination risk ratio of 0.29 (95% CI, 0.06–1.54). Linear echoendoscopes undergoing dHLD had the highest risk of contamination (2 of 18, 11.1%), and duodenoscopes undergoing HLD-ETO had the lowest risk of contamination (0 of 53, 0%). Duodenoscopes with a disposable tip had a 0% contamination rate (0 of 27).
Conclusions:
We did not detect a significant reduction in endoscope contamination using HLD-ETO versus dHLD reprocessing. Linear echoendoscopes have a risk of contamination similar to that of duodenoscopes. Disposable tips may reduce the risk of duodenoscope contamination.
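The risk ratios reported in this abstract follow the standard epidemiological recipe: divide the two event proportions and attach a Wald confidence interval computed on the log scale. A minimal sketch follows; the counts in the example are illustrative only, since the abstract does not report the full 2×2 table.

```python
import math

def risk_ratio_ci(events1, n1, events2, n2, z=1.96):
    """Risk ratio of group 1 (events1/n1) vs group 2 (events2/n2) with a
    log-scale Wald 95% CI.

    Note: the Wald interval is undefined when an event count is zero (as
    with the study's 0/53 duodenoscopes), where exact or continuity-
    corrected methods are needed instead.
    """
    rr = (events1 / n1) / (events2 / n2)
    se_log_rr = math.sqrt(1/events1 - 1/n1 + 1/events2 - 1/n2)
    lower = rr * math.exp(-z * se_log_rr)
    upper = rr * math.exp(z * se_log_rr)
    return rr, lower, upper

# Illustrative counts only (hypothetical): 4/40 contaminated vs 2/80
rr, lower, upper = risk_ratio_ci(4, 40, 2, 80)
```

With event counts this small the interval is very wide and straddles 1, mirroring the study's 3.64 (95% CI, 0.69–19.1) estimate for echoendoscopes versus duodenoscopes.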