In response to the COVID-19 pandemic, we implemented a plasma coordination center within two months to support transfusion for two outpatient randomized controlled trials. The center's design was based on an investigational drug services model and a Food and Drug Administration-compliant database to manage blood product inventory and trial safety.
Methods:
A core investigational team adapted a cloud-based platform to randomize patient assignments and track inventory distribution of control plasma and high-titer COVID-19 convalescent plasma of different blood groups from 29 donor collection centers directly to blood banks serving 26 transfusion sites.
Results:
We performed 1,351 transfusions in 16 months. The transparency of the digital inventory at each site was critical to facilitate qualification, randomization, and overnight shipments of blood group-compatible plasma for transfusions into trial participants. While inventory challenges were heightened with COVID-19 convalescent plasma, the cloud-based system and the flexible approach of the plasma coordination center staff across the blood bank network enabled decentralized procurement and distribution of investigational products to maintain inventory thresholds and overcome local supply chain restraints at the sites.
Conclusion:
The rapid creation of a plasma coordination center for outpatient transfusions is infrequent in the academic setting. Distributing more than 3,100 plasma units to blood banks charged with managing investigational inventory across the U.S. in a decentralized manner posed operational and regulatory challenges while providing opportunities for the plasma coordination center to contribute to research of global importance. This program can serve as a template in subsequent public health emergencies.
OBJECTIVES/GOALS: Contingency management (CM) procedures yield measurable reductions in cocaine use. This poster describes a trial aimed at using CM as a vehicle to show the biopsychosocial health benefits of reduced use, rather than total abstinence, the currently accepted metric for treatment efficacy. METHODS/STUDY POPULATION: In this 12-week, randomized controlled trial, CM was used to reduce cocaine use and evaluate associated improvements in cardiovascular, immune, and psychosocial well-being. Adults aged 18 and older who sought treatment for cocaine use (N=127) were randomized into three groups in a 1:1:1 ratio: High Value ($55) or Low Value ($13) CM incentives for cocaine-negative urine samples or a non-contingent control group. They completed outpatient sessions three days per week across the 12-week intervention period, totaling 36 clinic visits and four post-treatment follow-up visits. During each visit, participants provided observed urine samples and completed several assays of biopsychosocial health. RESULTS/ANTICIPATED RESULTS: Preliminary findings from generalized linear mixed effect modeling demonstrate the feasibility of the CM platform. Abstinence rates from cocaine use were significantly greater in the High Value group (47% negative; OR = 2.80; p = 0.01) relative to the Low Value (23% negative) and Control groups (24% negative). In the planned primary analysis, the level of cocaine use reduction based on cocaine-negative urine samples will serve as the primary predictor of cardiovascular (e.g., endothelin-1 levels), immune (e.g., IL-10 levels), and psychosocial (e.g., Addiction Severity Index) outcomes using results from the fitted models. DISCUSSION/SIGNIFICANCE: This research will advance the field by prospectively and comprehensively demonstrating the beneficial effects of reduced cocaine use. These outcomes can, in turn, support the adoption of reduced cocaine use as a viable alternative endpoint in cocaine treatment trials.
We recently reported on the radio-frequency attenuation length of cold polar ice at Summit Station, Greenland, based on bi-static radar measurements of radio-frequency bedrock echo strengths taken during the summer of 2021. Those data also allow studies of (a) the relative contributions of coherent (such as discrete internal conducting layers with sub-centimeter transverse scale) vs incoherent (e.g. bulk volumetric) scattering, (b) the magnitude of internal layer reflection coefficients, (c) limits on signal propagation velocity asymmetries (‘birefringence’) and (d) limits on signal dispersion in-ice over a bandwidth of ~100 MHz. We find that (1) attenuation lengths approach 1 km in our band, (2) after averaging 10 000 echo triggers, reflected signals observable over the thermal floor (to depths of ~1500 m) are consistent with being entirely coherent, (3) internal layer reflectivities are ≈ −60 to −70 dB, (4) birefringent effects for vertically propagating signals are smaller by an order of magnitude relative to South Pole and (5) within our experimental limits, glacial ice is non-dispersive over the frequency band relevant for neutrino detection experiments.
The U.S. Department of Agriculture–Agricultural Research Service (USDA-ARS) has been a leader in weed science research covering topics ranging from the development and use of integrated weed management (IWM) tactics to basic mechanistic studies, including biotic resistance of desirable plant communities and herbicide resistance. ARS weed scientists have worked in agricultural and natural ecosystems, including agronomic and horticultural crops, pastures, forests, wild lands, aquatic habitats, wetlands, and riparian areas. Through strong partnerships with academia, state agencies, private industry, and numerous federal programs, ARS weed scientists have made contributions to discoveries in the newest fields of robotics and genetics, as well as the traditional and fundamental subjects of weed–crop competition and physiology and integration of weed control tactics and practices. Weed science at ARS is often overshadowed by other research topics; thus, few are aware of the long history of ARS weed science and its important contributions. This review is the result of a symposium held at the Weed Science Society of America’s 62nd Annual Meeting in 2022 that included 10 separate presentations in a virtual Weed Science Webinar Series. The overarching themes of management tactics (IWM, biological control, and automation), basic mechanisms (competition, invasive plant genetics, and herbicide resistance), and ecosystem impacts (invasive plant spread, climate change, conservation, and restoration) represent core ARS weed science research that is dynamic and efficacious and has been a significant component of the agency’s national and international efforts. This review highlights current studies and future directions that exemplify the science and collaborative relationships both within and outside ARS. 
Given the constraints of weeds and invasive plants on all aspects of food, feed, and fiber systems, there is an acknowledged need to face new challenges, including agriculture and natural resources sustainability, economic resilience and reliability, and societal health and well-being.
Precision Medicine is an emerging approach for disease treatment and prevention that takes into account individual variability in genes, environment, and lifestyle. Autoimmune diseases are those in which the body’s natural defense system loses discriminating power between its own cells and foreign cells, causing the body to mistakenly attack healthy tissues. These conditions are very heterogeneous in their presentation and therefore difficult to diagnose and treat. Achieving precision medicine in autoimmune diseases has been challenging due to the complex etiologies of these conditions, involving an interplay between genetic, epigenetic, and environmental factors. However, recent technological and computational advances in molecular profiling have helped identify patient subtypes and molecular pathways which can be used to improve diagnostics and therapeutics. This review discusses the current understanding of the disease mechanisms, heterogeneity, and pathogenic autoantigens in autoimmune diseases gained from genomic and transcriptomic studies and highlights how these findings can be applied to better understand disease heterogeneity in the context of disease diagnostics and therapeutics.
To provide comprehensive population-level estimates of the burden of healthcare-associated influenza.
Design:
Retrospective cross-sectional study.
Setting:
US Influenza Hospitalization Surveillance Network (FluSurv-NET) during 2012–2013 through 2018–2019 influenza seasons.
Patients:
Laboratory-confirmed influenza-related hospitalizations in an 8-county catchment area in Tennessee.
Methods:
The incidence of healthcare-associated influenza was determined using the traditional definition (ie, positive influenza test after hospital day 3) in addition to often underrecognized cases associated with recent post-acute care facility admission or a recent acute care hospitalization for a noninfluenza illness in the preceding 7 days.
Results:
Among the 5,904 laboratory-confirmed influenza-related hospitalizations, 147 (2.5%) had traditionally defined healthcare-associated influenza. When we included patients with a positive influenza test obtained in the first 3 days of hospitalization and who were either transferred to the hospital directly from a post-acute care facility or who were recently discharged from an acute care facility for a noninfluenza illness in the preceding 7 days, we identified an additional 1,031 cases (17.5% of all influenza-related hospitalizations).
Conclusions:
Including influenza cases associated with preadmission healthcare exposures alongside traditionally defined cases resulted in an 8-fold higher incidence of healthcare-associated influenza (1,178 vs 147 cases). These results emphasize the importance of capturing other healthcare exposures that may serve as the initial site of viral transmission to provide more comprehensive estimates of the burden of healthcare-associated influenza and to inform improved infection prevention strategies.
This paper used data from the Apathy in Dementia Methylphenidate Trial 2 (NCT02346201) to conduct a planned cost consequence analysis to investigate whether treatment of apathy with methylphenidate is economically attractive.
Methods:
A total of 167 patients with clinically significant apathy randomized to either methylphenidate or placebo were included. The Resource Utilization in Dementia Lite instrument assessed resource utilization for the past 30 days and the EuroQol five dimension five level questionnaire assessed health utility at baseline, 3 months, and 6 months. Resources were converted to costs using standard sources and reported in 2021 USD. A repeated measures analysis of variance compared change in costs and utility over time between the treatment and placebo groups. A binary logistic regression was used to assess cost predictors.
Results:
Costs were not significantly different between groups whether the cost of methylphenidate was excluded (F(2,330) = 0.626, ηp² = 0.004, p = 0.535) or included (F(2,330) = 0.629, ηp² = 0.004, p = 0.534). Utility improved with methylphenidate treatment as there was a group by time interaction (F(2,330) = 7.525, ηp² = 0.044, p < 0.001).
Discussion:
Results from this study indicated that there was no evidence for a difference in resource utilization costs between methylphenidate and placebo treatment. However, utility improved significantly over the 6-month follow-up period. These results can aid in decision-making to improve quality of life in patients with Alzheimer’s disease while considering the burden on the healthcare system.
Anorexia nervosa (AN) is a psychiatric disorder associated with marked morbidity. Whilst AN genetic studies could identify novel treatment targets, integrating functional genomics data, including transcriptomics and proteomics, would help disentangle correlated signals and reveal causally associated genes.
Methods
We used models of genetically imputed expression and splicing from 14 tissues, leveraging mRNA, protein, and mRNA alternative splicing weights to identify genes, proteins, and transcripts, respectively, associated with AN risk. This was accomplished through transcriptome, proteome, and spliceosome-wide association studies, followed by conditional analysis and finemapping to prioritise candidate causal genes.
Results
We uncovered 134 genes for which genetically predicted mRNA expression was associated with AN after multiple-testing correction, as well as four proteins and 16 alternatively spliced transcripts. Conditional analysis of these significantly associated genes on other proximal association signals resulted in 97 genes independently associated with AN. Moreover, probabilistic finemapping further refined these associations and prioritised putative causal genes. The gene WDR6, for which increased genetically predicted mRNA expression was correlated with AN, was strongly supported by both conditional analyses and finemapping. Pathway analysis of genes revealed by finemapping identified the pathway regulation of immune system process (overlapping genes = MST1, TREX1, PRKAR2A, PROS1) as statistically overrepresented.
Conclusions
We leveraged multiomic datasets to genetically prioritise novel risk genes for AN. Multiple lines of evidence support that WDR6 is associated with AN, whilst other prioritised genes were enriched within immune-related pathways, further supporting the role of the immune system in AN.
We describe the association between job roles and coronavirus disease 2019 (COVID-19) among healthcare personnel. A wide range of hazard ratios were observed across job roles. Medical assistants had higher hazard ratios than nurses, while attending physicians, food service workers, laboratory technicians, pharmacists, residents and fellows, and temporary workers had lower hazard ratios.
We describe COVID-19 cases among nonphysician healthcare personnel (HCP) by work location. The proportion of HCP with coronavirus disease 2019 (COVID-19) was highest in the emergency department and lowest among those working remotely. COVID-19 and non–COVID-19 units had similar proportions of HCP with COVID-19 (13%). Cases decreased across all work locations following COVID-19 vaccination.
Adverse drug reactions (ADRs) are associated with increased morbidity, mortality, and resource utilization. Drug interactions (DDIs) are among the most common causes of ADRs, and estimates have cited that up to 22% of patients take interacting medications. DDIs are often due to the propensity for agents to induce or inhibit enzymes responsible for the metabolism of concomitantly administered drugs. However, this phenomenon is further complicated by genetic variants of such enzymes. The aim of this study is to quantify and describe potential drug-drug, drug-gene, and drug-drug-gene interactions in a community-based patient population.
Methods
A regional pharmacy with retail outlets in Arkansas provided deidentified prescription data from March 2020 for 4761 individuals. Drug-drug and drug-drug-gene interactions were assessed utilizing the logic incorporated into GenMedPro, a commercially available digital gene-drug interaction software program that incorporates variants of 9 pharmacokinetic (PK) and 2 pharmacodynamic (PD) genes to evaluate DDIs and drug-gene interactions. The data were first assessed for composite drug-drug interaction risk, and each individual was stratified to a risk category using the logic incorporated in GenMedPro. To calculate the frequency of potential drug-gene interactions, genotypes were imputed and allocated to the cohort according to each gene’s frequency in the general population. Potential genotypes were randomly allocated to the population 100 times in a Monte Carlo simulation. Potential drug-drug, gene-drug, or gene-drug-drug interaction risk was characterized as minor, moderate, or major.
Results
Based on prescription data only, the probability of a DDI of any impact (mild, moderate, or major) was 26% [95% CI: 0.248-0.272] in the population. This probability increased to 49.6% [95% CI: 0.484-0.507] when simulated genetic polymorphisms were additionally assessed. When assessing only major impact interactions, there was a 7.8% [95% CI: 0.070-0.085] probability of drug-drug interactions and 10.1% [95% CI: 0.095-0.108] probability with the addition of genetic contributions. The probability of drug-drug-gene interactions of any impact was correlated with the number of prescribed medications, with an approximate probability of 77%, 85%, and 94% in patients prescribed 5, 6, or 7+ medications, respectively. When stratified by specific drug class, antidepressants (19.5%), antiemetics (21.4%), analgesics (16%), antipsychotics (15.6%), and antiparasitics (49.7%) had the highest probability of major drug-drug-gene interaction.
Conclusions
In a community-based population of outpatients, the probability of drug-drug interaction risk increases when genetic polymorphisms are attributed to the population. These data suggest that pharmacogenetic testing may be useful in predicting drug interactions, drug-gene interactions, and severity of interactions when proactively evaluating patient medication profiles.
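The genotype-allocation step described in the methods can be sketched in Python. This is a hedged, minimal illustration, not the GenMedPro logic: the gene, phenotype frequencies, and substrate drug names below are hypothetical placeholders, and the real study used frequencies for 9 pharmacokinetic and 2 pharmacodynamic genes.

```python
import random

# Hypothetical metabolizer-phenotype frequencies for one illustrative gene
# (placeholder values, not the frequencies used in the study)
PHENOTYPE_FREQS = {"normal": 0.75, "intermediate": 0.15, "poor": 0.07, "ultrarapid": 0.03}

# Illustrative substrate drugs assumed to interact with a non-normal phenotype
SUBSTRATES = {"codeine", "tramadol", "ondansetron"}


def assign_genotypes(n_patients, freqs, rng):
    """Randomly allocate a metabolizer phenotype to each patient
    according to its frequency in the general population."""
    return rng.choices(list(freqs), weights=list(freqs.values()), k=n_patients)


def simulate_interaction_probability(patients, freqs, n_iterations=100, seed=0):
    """Monte Carlo estimate of the fraction of patients with a potential
    drug-gene interaction, averaged over repeated random genotype allocations.

    `patients` is a list of per-patient drug lists; a patient is flagged when
    they take an assumed substrate drug and draw a non-normal phenotype."""
    rng = random.Random(seed)
    fractions = []
    for _ in range(n_iterations):
        genotypes = assign_genotypes(len(patients), freqs, rng)
        flagged = sum(
            1
            for drugs, phenotype in zip(patients, genotypes)
            if phenotype != "normal" and SUBSTRATES & set(drugs)
        )
        fractions.append(flagged / len(patients))
    return sum(fractions) / len(fractions)
```

With 100 allocations (as in the study), the averaged flagged fraction stabilizes around the product of the substrate-exposure rate and the non-normal phenotype frequency.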
Disruptive behavior disorders (DBD) are heterogeneous at the clinical and the biological level. Therefore, the aims were to dissect the heterogeneous neurodevelopmental deviations of the affective brain circuitry and provide an integration of these differences across modalities.
Methods
We combined two novel approaches. First, normative modeling to map individual-level deviations from the typical age-related pattern in (i) activity during emotion matching and (ii) anatomical images, derived from DBD cases (n = 77) and controls (n = 52) aged 8–18 years from the EU-funded Aggressotype and MATRICS consortia. Second, linked independent component analysis to integrate subject-specific deviations from both modalities.
Results
While cases exhibited on average a higher activity than would be expected for their age during face processing in regions such as the amygdala when compared to controls, these positive deviations were widespread at the individual level. A multimodal integration of all functional and anatomical deviations explained 23% of the variance in the clinical DBD phenotype. Most notably, the top marker, encompassing the default mode network (DMN) and subcortical regions such as the amygdala and the striatum, was related to aggression across the whole sample.
Conclusions
Overall increased age-related deviations in the amygdala in DBD suggest a maturational delay, which has to be further validated in future studies. Further, the integration of individual deviation patterns from multiple imaging modalities allowed us to dissect some of the heterogeneity of DBD and identified the DMN, the striatum, and the amygdala as neural signatures associated with aggression.
We analyzed blood-culture practices to characterize the utilization of the Infectious Diseases Society of America (IDSA) recommendations related to catheter-related bloodstream infection (CRBSI) blood cultures. Most patients with a central line had only peripheral blood cultures. Increasing the utilization of CRBSI guidelines may improve clinical care, but may also affect other quality metrics.
To determine the incidence of severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) infection among healthcare personnel (HCP) and to assess occupational risks for SARS-CoV-2 infection.
Design:
Prospective cohort of healthcare personnel (HCP) followed for 6 months from May through December 2020.
Setting:
Large academic healthcare system including 4 hospitals and affiliated clinics in Atlanta, Georgia.
Participants:
HCP, including those with and without direct patient-care activities, working during the coronavirus disease 2019 (COVID-19) pandemic.
Methods:
Incident SARS-CoV-2 infections were determined through serologic testing for SARS-CoV-2 IgG at enrollment, at 3 months, and at 6 months. HCP completed monthly surveys regarding occupational activities. Multivariable logistic regression was used to identify occupational factors that increased the risk of SARS-CoV-2 infection.
Results:
Of the 304 evaluable HCP who were seronegative at enrollment, 26 (9%) seroconverted for SARS-CoV-2 IgG by 6 months. Overall, 219 participants (73%) self-identified as White race, 119 (40%) were nurses, and 121 (40%) worked on inpatient medical-surgical floors. In a multivariable analysis, HCP who identified as Black race were more likely to seroconvert than HCP who identified as White (odds ratio, 4.5; 95% confidence interval, 1.3–14.2). Increased risk for SARS-CoV-2 infection was not identified for any occupational activity, including spending >50% of a typical shift at a patient’s bedside, working in a COVID-19 unit, or performing or being present for aerosol-generating procedures (AGPs).
Conclusions:
In our study cohort of HCP working in an academic healthcare system, <10% had evidence of SARS-CoV-2 infection over 6 months. No specific occupational activities were identified as increasing risk for SARS-CoV-2 infection.
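For readers unfamiliar with how an odds ratio and its confidence interval (as reported above) are derived, the following is a minimal sketch. It computes a crude odds ratio with a Wald interval from a 2×2 exposure table; this is an illustration only, since the study's reported estimate came from a multivariable logistic regression that adjusts for covariates, and the counts below are made up.

```python
import math


def odds_ratio_wald_ci(exposed_cases, exposed_noncases,
                       unexposed_cases, unexposed_noncases):
    """Crude odds ratio with a 95% Wald confidence interval
    from a 2x2 table of exposure by outcome."""
    a, b = exposed_cases, exposed_noncases
    c, d = unexposed_cases, unexposed_noncases
    odds_ratio = (a * d) / (b * c)
    # Standard error of log(OR) from the cell counts
    se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    z = 1.96  # ~97.5th percentile of the standard normal distribution
    lower = math.exp(math.log(odds_ratio) - z * se_log_or)
    upper = math.exp(math.log(odds_ratio) + z * se_log_or)
    return odds_ratio, (lower, upper)
```

A wide interval, such as the study's 1.3–14.2, typically reflects small cell counts (here, only 26 seroconversions), which inflate the standard error of the log odds ratio.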
We described the epidemiology of bat intrusions into a hospital and subsequent management of exposures during 2018–2020. Most intrusions occurred in older buildings during the summer and fall months. Hospitals need bat intrusion surveillance systems and protocols for bat handling, exposure management, and intrusion mitigation.
We opened this volume with sobering stories of the dire global challenges before us. Indeed, one would not be hard pressed to find stories of the urgency of our various environmental and social crises. While we wrote this book, the COVID-19 pandemic raged, towns in the Arctic reached unprecedented temperatures, countless hectares of forests fell while fossil fuels continued to be violently extracted from the earth, and Black, Indigenous and people of colour continued to be exploited and oppressed. Yet, despite all this, or rather because of it, we wish to begin our conclusion with hope and determination. Drawing on Solnit (2016), we believe that there is a spaciousness in the uncertainties posed by the challenges before us in that they offer new possibilities for being, thinking and acting – for renewal and purposeful redirection in our trajectory – and it is through a reawakened awareness of our rich and dynamic relationships to place that we can find a better way forward.