The global food system puts enormous pressure on the environment. Managing these pressures requires understanding not only where they occur (i.e., where food is produced), but also who drives them (i.e., where food is consumed). However, the size and complexity of global supply chains make it difficult to trace food production to consumption. Here, we provide the most comprehensive dataset of bilateral trade flows of environmental pressures stemming from food production from producing to consuming nations. The dataset provides environmental pressures for greenhouse gas emissions, water use, nitrogen and phosphorus pollution, and the area of land/water occupancy of food production for crops and animals from land, freshwater, and ocean systems. To produce these data, we improved upon reported food trade and production data to identify producing and consuming nations for each food item, allowing us to match food flows with appropriate environmental pressure data. These data provide a resource for research on sustainable global food consumption and the drivers of environmental impact.
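For readers who want to see the accounting logic, here is a minimal sketch (in Python) of how production-side environmental pressures can be allocated to consuming nations in proportion to bilateral food flows. All countries, flows, and intensities below are hypothetical illustrations, not values or methods from the dataset itself.

```python
# Minimal sketch of consumption-based accounting: pressure generated where
# food is produced is allocated to the nations that consume that food.
import numpy as np

producers = ["A", "B"]
consumers = ["A", "B", "C"]

# Tonnes of a food item flowing from producer (rows) to consumer (columns),
# including domestic consumption. Hypothetical numbers.
flows = np.array([[50.0, 30.0, 20.0],   # producer A
                  [10.0, 60.0, 30.0]])  # producer B

# Pressure intensity per tonne produced (e.g., kg CO2-eq per tonne), assumed.
intensity = np.array([2.0, 5.0])

# Pressure embodied in each bilateral flow: intensity[i] * flows[i, j].
embodied = intensity[:, None] * flows

# Consumption-based footprint of each nation = column sums.
footprint = embodied.sum(axis=0)
for c, f in zip(consumers, footprint):
    print(f"nation {c}: {f:.0f} kg CO2-eq embodied in consumption")
```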
Multicenter clinical trials are essential for evaluating interventions but often face significant challenges in study design, site coordination, participant recruitment, and regulatory compliance. To address these issues, the National Institutes of Health’s National Center for Advancing Translational Sciences established the Trial Innovation Network (TIN). The TIN offers a scientific consultation process that provides access to clinical trial and disease experts, who offer input and recommendations throughout the trial’s duration at no cost to investigators. This approach aims to improve trial design, accelerate implementation, foster interdisciplinary teamwork, and spur innovations that enhance multicenter trial quality and efficiency. The TIN leverages resources of the Clinical and Translational Science Awards (CTSA) program, complementing local capabilities at the investigator’s institution. The Initial Consultation process focuses on the study’s scientific premise, design, site development, recruitment and retention strategies, funding feasibility, and other support areas. As of June 1, 2024, the TIN has provided 431 Initial Consultations to increase efficiency and accelerate trial implementation by delivering customized support and tailored recommendations. Across a range of clinical trials, the TIN has developed standardized, streamlined, and adaptable processes. We describe these processes, provide operational metrics, and include a set of lessons learned for consideration by other trial support and innovation networks.
The First Large Absorption Survey in H i (FLASH) is a large-area radio survey for neutral hydrogen in and around galaxies in the intermediate redshift range $0.4\lt z\lt1.0$, using the 21-cm H i absorption line as a probe of cold neutral gas. The survey uses the ASKAP radio telescope and will cover 24,000 deg$^2$ of sky over the next five years. FLASH breaks new ground in two ways – it is the first large H i absorption survey to be carried out without any optical preselection of targets, and we use an automated Bayesian line-finding tool to search through large datasets and assign a statistical significance to potential line detections. Two Pilot Surveys, covering around 3,000 deg$^2$ of sky, were carried out in 2019–22 to test and verify the strategy for the full FLASH survey. The processed data products from these Pilot Surveys (spectral-line cubes, continuum images, and catalogues) are public and available online. In this paper, we describe the FLASH spectral-line and continuum data products and discuss the quality of the H i spectra and the completeness of our automated line search. Finally, we present a set of 30 new H i absorption lines that were robustly detected in the Pilot Surveys, almost doubling the number of known H i absorption systems at $0.4\lt z\lt1$. The detected lines span a wide range in H i optical depth, including three lines with a peak optical depth $\tau\gt1$, and appear to be a mixture of intervening and associated systems. Interestingly, around two-thirds of the lines found in this untargeted sample are detected against sources with a peaked-spectrum radio continuum, which are only a minor (5–20%) fraction of the overall radio-source population. The detection rate for H i absorption lines in the Pilot Surveys (0.3 to 0.5 lines per 40 deg$^2$ ASKAP field) is a factor of two below the expected value. One possible reason for this is the presence of a range of spectral-line artefacts in the Pilot Survey data that have now been mitigated and are not expected to recur in the full FLASH survey. A future paper in this series will discuss the host galaxies of the H i absorption systems identified here.
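As context for the optical depths quoted above, the sketch below applies the standard 21-cm absorption relations: the optical depth per channel follows from the fractional line depth against the continuum, and the H i column density scales with the velocity-integrated optical depth and an assumed spin temperature. The spectrum values, covering factor, and spin temperature here are hypothetical, and this is a textbook illustration rather than the FLASH pipeline.

```python
# Standard 21-cm absorption relations on a hypothetical spectrum.
import numpy as np

s_cont = 120.0                                     # continuum flux density (mJy), assumed
delta_s = np.array([5.0, 40.0, 80.0, 30.0, 4.0])   # line depth per channel (mJy), assumed
dv = 5.0                                           # channel width (km/s), assumed
c_f = 1.0                                          # source covering factor, assumed

# Optical depth per channel: tau = -ln(1 - dS / (c_f * S_cont)).
tau = -np.log(1.0 - delta_s / (c_f * s_cont))
print(f"peak optical depth: {tau.max():.2f}")      # > 1 for the deepest channel here

# Column density for an assumed spin temperature T_s (K):
# N_HI = 1.823e18 * (T_s / c_f) * integral(tau dv)   [cm^-2, with dv in km/s]
t_spin = 100.0
n_hi = 1.823e18 * (t_spin / c_f) * tau.sum() * dv
print(f"N_HI ~ {n_hi:.2e} cm^-2")
```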
Objectives/Goals: We aimed to discover treatment candidates for uterine fibroids, a common benign tumor with adverse impacts on quality of life. Repurposing already approved medications for fibroids can expedite treatment option expansion. Using genetic proxies, we identified novel fibroid drug candidates and estimated their effect on risk of fibroid diagnosis. Methods/Study Population: We performed a genetically predicted gene expression (GPGE) analysis using S-PrediXcan and GTEx tissue models with multi-ancestry genome-wide association study (GWAS) summary statistics of fibroids (cases = 74,294, controls = 465). There were 81 genes significantly associated with fibroid risk. Querying drug–gene interaction databases identified 56 approved medications that target these genes, including two antihypertensives, hydralazine, and spironolactone. Using independent multi-ancestry GWAS summary statistics (N = 635,969) for systolic (SBP) and diastolic blood pressure (DBP), we conducted GPGE analyses. Blood pressure (exposure) and fibroids (outcome) GPGE summary statistics in the same tissues were used for two-sample Mendelian randomization (MR) analyses to proxy medication effects. Results/Anticipated Results: GPGE analyses identified hydralazine/tumor protein P53 (TP53) activity and spironolactone/thyroid hormone receptor beta (THRB) activity as drug–gene candidate pairs. Both drugs increase gene activity of their paired gene. Increased TP53 expression was associated with SBP in four tissues (exposure). The MR results indicated hydralazine use, proxied by increased TP53 expression, may reduce fibroid risk by 42% per standard deviation of gene expression (odds ratio [OR] = 0.58, p = 1.43E-12). Increased THRB expression was associated with DBP in eight tissues and was included in the MR (exposure). The MR results suggest spironolactone use, proxied by increased THRB expression, may reduce fibroid risk by 23% per standard deviation of gene expression (OR = 0.77, p = 5.94E-6). Discussion/Significance of Impact: We provide biologically plausible evidence for repurposing hydralazine and spironolactone for reducing risk of fibroid diagnosis. Repurposing these hypertension medications could provide novel preventative treatments for fibroids, particularly for individuals disproportionately affected by both conditions.
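The two-sample MR step can be illustrated with a minimal inverse-variance-weighted (IVW) calculation: each genetic instrument yields a Wald ratio (outcome effect divided by exposure effect), and the ratios are combined with weights derived from the outcome standard errors. The effect sizes below are hypothetical stand-ins, not the study’s summary statistics.

```python
# Minimal two-sample MR: Wald ratios combined by inverse-variance weighting.
import numpy as np

beta_exp = np.array([0.12, 0.08, 0.15])     # instrument -> exposure effects (assumed)
beta_out = np.array([-0.06, -0.05, -0.08])  # instrument -> outcome log-odds (assumed)
se_out = np.array([0.02, 0.02, 0.03])       # standard errors of outcome effects

wald = beta_out / beta_exp                  # per-instrument causal estimates
weights = (beta_exp / se_out) ** 2          # IVW weights = 1 / SE(wald)^2
ivw = np.sum(wald * weights) / np.sum(weights)
ivw_se = 1.0 / np.sqrt(np.sum(weights))

print(f"IVW log-OR per SD of expression: {ivw:.3f} (SE {ivw_se:.3f})")
print(f"OR: {np.exp(ivw):.2f}")
```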
Objective:
To evaluate the impact of a mobile-app-based central line-associated bloodstream infection (CLABSI) prevention program in oncology clinic patients with peripherally inserted central catheters (PICCs).
Design:
Pre-post prospective cohort study with baseline (July 2015–December 2016), phase-in (January 2017–April 2017), and intervention (May 2017–November 2018) periods. Generalized linear mixed models compared intervention with baseline frequency of localized inflammation/infection and dressing peeling. Cox proportional hazards models compared days to removal of lines with localized inflammation/infection. A chi-square test compared bacteremia rates before and after the intervention. (A simplified sketch of these analyses follows this abstract.)
Setting:
Oncology clinic at a large medical center.
Patients:
Oncology clinic adult patients with PICCs.
Intervention:
CLABSI prevention program consisting of an actionable scoring system for identifying insertion-site infection/inflammation, coupled with a mobile app enabling photo assessments and automated physician alerting for remote response.
Results:
We completed 5,343 assessments of 569 PICCs in 401 patients (baseline: 2,924 assessments, 300 PICCs, 216 patients; intervention: 2,419 assessments, 269 PICCs, 185 patients). The intervention was associated with a 92% lower likelihood of a dressing with peeling (OR 0.08, 95% CI 0.04–0.17, P < 0.001), a 53% lower likelihood of local inflammation/infection (OR 0.47, 95% CI 0.27–0.84, P = 0.011), and a 24% (non-significant) reduction in CLABSI rates (P = 0.63). Mobile-app alerting and physician response were associated with an 80% lower risk of lines remaining in place after inflammation/infection was identified (HR 0.20, 95% CI 0.14–0.30, P < 0.001) and with 85% faster removal of infected lines, from a mean (SD) of 11.1 (9.7) to 1.7 (2.4) days.
Conclusions:
A mobile-app-based CLABSI prevention program decreased the frequency of inflamed/infected central-line insertion sites and sped the removal of lines when inflammation/infection was found.
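As referenced in Design, here is a simplified sketch of the three comparisons on hypothetical data. Note that the study used generalized linear mixed models to account for repeated assessments of the same lines; the plain logistic fit below omits the random effects for brevity.

```python
# Simplified sketch of the Design comparisons on simulated data.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from lifelines import CoxPHFitter
from scipy.stats import chi2_contingency

rng = np.random.default_rng(0)
n = 400
df = pd.DataFrame({"intervention": rng.integers(0, 2, n)})  # 0 = baseline period
# Simulate lower odds of local inflammation/infection under the intervention.
df["inflamed"] = rng.binomial(1, np.where(df["intervention"] == 1, 0.05, 0.10))

# Frequency outcome: logistic regression (the study used a mixed model).
X = sm.add_constant(df[["intervention"]])
print(sm.Logit(df["inflamed"], X).fit(disp=0).params)

# Days to removal of inflamed lines: Cox proportional hazards.
surv = pd.DataFrame({
    "days": rng.exponential(np.where(df["intervention"] == 1, 2.0, 11.0)),
    "removed": 1,  # treat all lines as eventually removed (no censoring)
    "intervention": df["intervention"],
})
print(CoxPHFitter().fit(surv, duration_col="days", event_col="removed").summary)

# Bacteremia before vs. after: chi-square test on a 2x2 count table.
chi2, p, _, _ = chi2_contingency([[10, 290], [5, 264]])  # hypothetical counts
print(f"chi-square p = {p:.2f}")
```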
Resilience of the healthcare system has been described as the ability to absorb, adapt, and respond to stress while maintaining the provision of safe patient care. We quantified the impact that stressors associated with the COVID-19 pandemic had on patient safety, as measured by central line-associated bloodstream infections (CLABSIs) reported to the Centers for Disease Control and Prevention’s National Healthcare Safety Network.
Design:
Acute care hospitals were mandated to report markers of resource availability (staffing and hospital occupancy with COVID-19 inpatients) to the federal government between July 2020 and June 2021. These data, together with community levels of COVID-19, were used to develop a statistical model assessing factors that influenced CLABSI rates among inpatients during the pandemic (a sketch of this type of rate model follows this abstract).
Results:
After risk adjustment for hospital characteristics, measured stressors were associated with increased CLABSIs. Staff shortages for more than 10% of days per month were associated with a statistically significant increase of 2 CLABSIs per 10,000 central line days versus hospitals reporting staff shortages of less than 10% of days per month. CLABSIs increased with a higher inpatient COVID-19 occupancy rate; when COVID-19 occupancy was 20% or more, there were 5 more CLABSIs per 10,000 central line days versus the referent (less than 5%).
Conclusions:
Reporting of data pertaining to hospital operations during the COVID-19 pandemic afforded an opportunity to evaluate resilience of US hospitals. We demonstrate how the stressors of staffing shortages and high numbers of patients with COVID-19 negatively impacted patient safety, demonstrating poor resilience. Understanding stress in hospitals may allow for the development of policies that support resilience and drive safe care.
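As referenced above, a natural way to model infections per 10,000 central-line days is a Poisson regression with line-days as an exposure offset, so that exponentiated coefficients are rate ratios. The sketch below uses simulated hospital-month data and hypothetical variable names; the actual NHSN risk-adjustment model is more elaborate.

```python
# Hedged sketch: CLABSI counts vs. stressors with a line-days offset.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 500
d = pd.DataFrame({
    "staff_shortage_gt10pct": rng.integers(0, 2, n),   # hypothetical indicator
    "covid_occupancy_ge20pct": rng.integers(0, 2, n),  # hypothetical indicator
    "line_days": rng.integers(500, 5000, n),
})
# Simulate higher infection rates under both stressors.
rate = 1e-4 * np.exp(0.5 * d["staff_shortage_gt10pct"]
                     + 0.8 * d["covid_occupancy_ge20pct"])
d["clabsi"] = rng.poisson(rate * d["line_days"])

model = smf.glm(
    "clabsi ~ staff_shortage_gt10pct + covid_occupancy_ge20pct",
    data=d,
    family=sm.families.Poisson(),
    offset=np.log(d["line_days"]),  # exposure: central-line days
).fit()
print(np.exp(model.params))  # rate ratios for each stressor
```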
Older age significantly increases risk for cognitive decline. A growing number of older adults (≥ 65 years) experience cognitive decline that compromises immediate and/or long-term health. Interventions to mitigate cognitive decline are greatly needed. Intermittent fasting aligned with innate circadian rhythms is associated with health benefits and improved circadian rhythms; here, we explore impacts on cognition and cardiometabolic outcomes.
Methods:
We conducted a single-group, pre-/post-pilot study to explore an 8-week prolonged nightly fasting intervention (14 h fasting/night) among adults 65+ years with self-reported memory decline. We explored changes in cognitive function, insomnia, and cardiometabolic risk factors. Intervention engagement/adherence were assessed. The intervention was delivered fully remotely; participants completed their fasting protocol at home and were not required to come into the lab.
Results:
In total, 20 individuals signed consent and 18 participants completed the study. Participants had a mean age of 69.7 years and were predominantly non-Hispanic White (89%) and female (95%); 50% were married and 65% were employed. Paired t-tests (sketched after this abstract) indicated an improvement in cognitive function (Memory and Attention Phone Screener; p = 0.02) with a medium effect size (Cohen’s d = 0.58) and a decrease in insomnia (Insomnia Severity Index; p = 0.04) with a medium effect size (Cohen’s d = 0.52). No changes in BMI or diet quality were observed. Engagement (66%–77%) and adherence (70%–100%) were high.
Conclusion:
These pilot findings suggest that prolonged nightly fasting, targeted to align food intake with circadian rhythms, may improve cognitive function and sleep among older adults. Fully powered, randomized controlled trials to test the efficacy of this non-pharmacological, low cost-to-burden ratio intervention are needed.
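As referenced in Results, the core pre/post analysis is a paired t-test with Cohen’s d for paired samples (mean change divided by the SD of the change scores). The scores below are hypothetical, not study data.

```python
# Paired t-test and paired-samples Cohen's d on hypothetical pre/post scores.
import numpy as np
from scipy import stats

pre = np.array([22, 25, 19, 30, 27, 24, 21, 26], dtype=float)
post = np.array([25, 27, 22, 31, 30, 25, 24, 28], dtype=float)

t, p = stats.ttest_rel(post, pre)      # paired t-test
diff = post - pre
cohens_d = diff.mean() / diff.std(ddof=1)  # d = mean change / SD of change
print(f"t = {t:.2f}, p = {p:.3f}, d = {cohens_d:.2f}")
```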
Migraine and post-traumatic stress disorder (PTSD) are both twice as common in women as men. Cross-sectional studies have shown associations between migraine and several psychiatric conditions, including PTSD. PTSD is disproportionally common among patients in headache clinics, and individuals with migraine and PTSD report greater disability from migraines and more frequent medication use. To further clarify the nature of the relationship between PTSD and migraine, we conducted bidirectional analyses of the association between (1) migraine and incident PTSD and (2) PTSD and incident migraine.
Methods
We used longitudinal data from 1989–2020 for the 33,327 Nurses’ Health Study II respondents to the 2018 stress questionnaire. We used log-binomial models (sketched after this abstract) to estimate the relative risk of developing PTSD among women with migraine and the relative risk of developing migraine among individuals with PTSD, trauma-exposed individuals without PTSD, and individuals unexposed to trauma, adjusting for race, education, marital status, high blood pressure, high cholesterol, alcohol intake, smoking, and body mass index.
Results
Overall, 48% of respondents reported ever experiencing migraine, 82% reported experiencing trauma and 9% met the Diagnostic and Statistical Manual of Mental Disorders-5 criteria for PTSD. Of those reporting migraine and trauma, 67% reported trauma before migraine onset, 2% reported trauma and migraine onset in the same year and 31% reported trauma after migraine onset. We found that migraine was associated with incident PTSD (adjusted relative risk [RR]: 1.26, 95% confidence interval [CI]: 1.14–1.39). PTSD, but not trauma without PTSD, was associated with incident migraine (adjusted RR: 1.20, 95% CI: 1.14–1.27). Findings were consistently stronger in both directions among those experiencing migraine with aura.
Conclusions
Our study provides further evidence that migraine and PTSD are strongly comorbid; we found associations of similar magnitude between migraine and incident PTSD and between PTSD and incident migraine.
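As referenced in Methods, a log-binomial model is a binomial GLM with a log link, so exponentiated coefficients are relative risks rather than odds ratios. The sketch below fits one on simulated data; the exposure and outcome names mirror the study, but all values are hypothetical.

```python
# Hedged sketch: log-binomial model whose exp(coefficient) is a relative risk.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 2000
d = pd.DataFrame({"migraine": rng.integers(0, 2, n)})
# Simulate a modestly higher PTSD risk among those with migraine.
d["ptsd"] = rng.binomial(1, np.where(d["migraine"] == 1, 0.11, 0.09))

fit = smf.glm(
    "ptsd ~ migraine",
    data=d,
    family=sm.families.Binomial(link=sm.families.links.Log()),  # log link -> RR
).fit()
print(np.exp(fit.params["migraine"]))  # relative risk of incident PTSD
```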
Cognitive decline is intricately linked to various factors such as obesity, stress, poor sleep, and circadian rhythm misalignment, which are interrelated in their impact on cognitive health. Irregular food-intake timing further compounds these issues. The practice of prolonged nightly fasting (PNF) may help synchronize food intake with circadian rhythms, potentially mitigating adverse effects of cognitive decline and associated factors.
Methods:
A pilot nationwide, remotely delivered, 2-arm randomized controlled trial was conducted to assess the 8-week outcomes of cognition, stress, and sleep, after a PNF intervention (14-hr nightly fast, 6 nights/week, no calories after 8 pm) compared to a health education (HED) control condition. Participants were living with memory decline, stress, and obesity and had weekly check-in calls to report fasting times (PNF) or content feedback (HED).
Results:
Participants were enrolled from 37 states in the US; N = 58, 86% women, 71% white, 93% non-Latinx, mean (SD) age 50.1 (5.1) years and BMI 35.6 (3.6) kg/m². No group differences existed at baseline. Linear mixed-effects models (sketched after this abstract) were used to compare outcome change differences between groups. Compared to the HED control, the PNF intervention was associated with improved sleep quality (B = −2.52; SE = 0.90; 95% CI −4.30 to −0.74; p = 0.006). Perceived stress and everyday cognition changed significantly over time (p < 0.02), without significant differences by group.
Discussion:
Changing food intake timing to exclude nighttime eating and promote a fasting period may help individuals living with obesity, memory decline, and stress to improve their sleep. Improved sleep quality may lead to additional health benefits.
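As referenced in Results, the group-by-time comparison can be expressed as a linear mixed-effects model with random intercepts per participant, where the time-by-group coefficient plays the role of the B reported above. The data below are simulated, not trial data.

```python
# Hedged sketch: mixed model for a 2-arm, pre/post design on simulated data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 58
subj = np.repeat(np.arange(n), 2)
time = np.tile([0, 1], n)                    # 0 = baseline, 1 = week 8
group = np.repeat(rng.integers(0, 2, n), 2)  # 0 = HED control, 1 = PNF
# Simulate sleep-quality scores that improve (decrease) more in the PNF arm.
score = (10
         - 2.5 * time * group                 # extra improvement in PNF arm
         - 0.5 * time                         # secular change in both arms
         + np.repeat(rng.normal(0, 1, n), 2)  # per-participant intercepts
         + rng.normal(0, 2, 2 * n))           # residual noise

d = pd.DataFrame({"subj": subj, "time": time, "group": group, "score": score})
fit = smf.mixedlm("score ~ time * group", d, groups=d["subj"]).fit()
print(fit.params["time:group"])  # group difference in change (the "B" above)
```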
OBJECTIVES/GOALS: Cognitive decline is associated with obesity, stress, poor sleep, and circadian rhythm misalignment, which are themselves functionally intertwined. Irregular food-intake timing exacerbates all of these. Prolonged nightly fasting (PNF) aligns food intake with innate circadian rhythms. METHODS/STUDY POPULATION: A nationwide, remotely delivered, 2-arm randomized controlled trial was conducted to assess feasibility and 8-week outcomes of cognition, stress, sleep, eating behaviors, and general eating habits after a PNF intervention (14-hr nightly fast, 6 nights/week, no calories after 8 pm) compared to a health education control (HEC) condition. Eligible participants were living with obesity, stress (Perceived Stress Scale-4 (PSS-4) total score ≥5), and memory “not as good as it used to be.” Data were collected via Zoom meetings with participants and trained staff and entered into REDCap. All participants had weekly staff check-in calls to report fasting times (PNF group only) and feedback. RESULTS/ANTICIPATED RESULTS: Eligible participants were enrolled from 37 of 50 US states; N = 58, 86% women, 71% white, 93% non-Latinx, mean (SD) age 50.1 (5.1) years, BMI 35.6 (3.6) kg/m². No group differences existed at baseline. Linear mixed-effects models were used to compare group differences across all outcome changes. Compared to the HEC condition, the PNF intervention was associated with improved sleep quality (Pittsburgh Sleep Quality Index; B = −2.52; SE = 0.90; 95% CI −4.30 to −0.74; p = 0.006). Stress, everyday cognition, and emotional eating behavior changed significantly over time (p < 0.02), but there were no group differences. Analyses of feasibility outcomes are ongoing. DISCUSSION/SIGNIFICANCE: Changing food-intake timing 6 days per week to exclude nighttime eating, without mandating changes in food quality or quantity, may help many individuals living with obesity, stress, and memory decline improve their sleep. Improved sleep quality may lead to more health benefits over time.
The focus on social determinants of health (SDOH) and their impact on health outcomes is evident in U.S. federal actions by the Centers for Medicare & Medicaid Services and the Office of the National Coordinator for Health Information Technology. The disproportionate impact of COVID-19 on minorities and communities of color heightened awareness of health inequities and the need for more robust SDOH data collection. Four Clinical and Translational Science Award (CTSA) hubs comprising the Texas Regional CTSA Consortium (TRCC) undertook an inventory to understand which contextual-level SDOH datasets are offered centrally and which individual-level SDOH are collected in structured fields, potentially for all patients, in each electronic health record (EHR) system.
Methods:
Hub teams identified American Community Survey (ACS) datasets available via their enterprise data warehouses for research. Each hub’s EHR analyst team identified structured fields available in their EHR for SDOH using a collection instrument based on a 2021 PCORnet survey and conducted an SDOH field completion rate analysis (sketched after this abstract).
Results:
One hub offered ACS datasets centrally. All hubs collected eleven SDOH elements in structured EHR fields; two hubs also collected Homeless and Veteran statuses. Across the four hubs, completion rates were 80%–98% for Ethnicity and Race but below 10% for Education, Financial Strain, Food Insecurity, Housing Security/Stability, Interpersonal Violence, Social Isolation, Stress, and Transportation.
Conclusion:
Completeness levels for SDOH data in EHR at TRCC hubs varied and were low for most measures. Multiple system-level discussions may be necessary to increase standardized SDOH EHR-based data collection and harmonization to drive effective value-based care, health disparities research, translational interventions, and evidence-based policy.
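As referenced in Methods, a field completion rate analysis reduces to the share of patient records with a non-missing value in each structured SDOH field. A minimal sketch on hypothetical records:

```python
# Completion rate per structured field = share of records with a value.
import pandas as pd

# Hypothetical EHR extract: None marks an unpopulated structured field.
ehr = pd.DataFrame({
    "ethnicity": ["H", "NH", None, "NH"],
    "education": [None, None, "HS", None],
    "food_insecurity": [None, None, None, None],
})
completion = ehr.notna().mean().mul(100).round(1)
print(completion)  # percent of records with each field populated
```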
Alterations in cerebral blood flow (CBF) are associated with risk of cognitive decline and Alzheimer’s disease (AD). Although apolipoprotein E (APOE) ε4 and greater vascular risk burden have both been linked to reduced CBF in older adults, less is known about how APOE ε4 status and vascular risk may interact to influence CBF. We aimed to determine whether the effect of vascular risk on CBF varies by gene dose of APOE ε4 alleles (i.e., number of ε4 alleles) in older adults without dementia.
Participants and Methods:
144 older adults without dementia from the Alzheimer’s Disease Neuroimaging Initiative (ADNI) underwent arterial spin labeling (ASL) and T1-weighted MRI, APOE genotyping, fluorodeoxyglucose positron emission tomography (FDG-PET), lumbar puncture, and blood pressure assessment. Vascular risk was assessed using pulse pressure (systolic minus diastolic blood pressure), which is thought to be a proxy for arterial stiffening. Participants were classified by number of APOE ε4 alleles (0 alleles: n = 87; 1 allele: n = 46; 2 alleles: n = 11). CBF was examined in six FreeSurfer-derived a priori regions of interest (ROIs) vulnerable to AD: entorhinal cortex, hippocampus, inferior temporal cortex, inferior parietal cortex, rostral middle frontal gyrus, and medial orbitofrontal cortex. Linear regression models (sketched after this abstract) tested the interaction between categorical APOE ε4 dose (0, 1, or 2 alleles) and continuous pulse pressure on CBF in each ROI, adjusting for age, sex, cognitive diagnosis (cognitively unimpaired vs. mild cognitive impairment), antihypertensive medication use, cerebral metabolism (FDG-PET composite), reference CBF region (precentral gyrus), and AD biomarker positivity defined using the ADNI-optimized phosphorylated tau/β-amyloid ratio cut-off of > 0.0251 pg/ml.
Results:
A significant pulse pressure × APOE ε4 dose interaction was found on CBF in the entorhinal cortex, hippocampus, and inferior parietal cortex (ps < .005). Among participants with two ε4 alleles, higher pulse pressure was significantly associated with lower CBF (ps < .001). However, among participants with zero or one ε4 allele, there was no significant association between pulse pressure and CBF (ps > .234). No significant pulse pressure × APOE ε4 dose interaction was found in the inferior temporal cortex, rostral middle frontal gyrus, or medial orbitofrontal cortex (ps > .109). Results remained unchanged when additionally controlling for general vascular risk assessed via the modified Hachinski Ischemic Scale.
Conclusions:
These findings demonstrate that the cross-sectional association between pulse pressure and region-specific CBF differs by APOE ε4 dose. In particular, a detrimental effect of elevated pulse pressure on CBF in AD-vulnerable regions was found only among participants with the ε4/ε4 genotype. Our findings suggest that pulse pressure may play a mechanistic role in neurovascular unit dysregulation for those genetically at greater risk for AD. Given that pulse pressure is just one of many potentially modifiable vascular risk factors for AD, future studies should seek to examine how these other factors (e.g., diabetes, high cholesterol) may interact with APOE genotype to affect cerebrovascular dysfunction.
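As referenced in Methods, the analysis is a linear regression with a continuous-by-categorical interaction; the covariates listed above are omitted here for brevity. The sketch below uses simulated data, not ADNI values.

```python
# Hedged sketch: CBF ~ pulse pressure x APOE e4 dose on simulated data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
n = 144
d = pd.DataFrame({
    "pulse_pressure": rng.normal(55, 12, n),
    "e4_dose": rng.choice([0, 1, 2], size=n, p=[0.6, 0.32, 0.08]),
})
# Simulate lower CBF with higher pulse pressure only at two e4 alleles.
d["cbf"] = (50
            - 0.3 * d["pulse_pressure"] * (d["e4_dose"] == 2)
            + rng.normal(0, 5, n))

fit = smf.ols("cbf ~ pulse_pressure * C(e4_dose)", data=d).fit()
print(fit.params.filter(like="pulse_pressure"))  # main effect + interactions
```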
We conducted a post hoc analysis of an antibiotic stewardship intervention implemented across our health system’s urgent-care network to determine whether there was a differential impact among patient groups. Respiratory urgent-care antibiotic prescribing decreased for all racial, ethnic, and preferred language groups, but disparities in antibiotic prescribing persisted.
We identify a set of essential recent advances in climate change research with high policy relevance, across natural and social sciences: (1) looming inevitability and implications of overshooting the 1.5°C warming limit, (2) urgent need for a rapid and managed fossil fuel phase-out, (3) challenges for scaling carbon dioxide removal, (4) uncertainties regarding the future contribution of natural carbon sinks, (5) intertwinedness of the crises of biodiversity loss and climate change, (6) compound events, (7) mountain glacier loss, (8) human immobility in the face of climate risks, (9) adaptation justice, and (10) just transitions in food systems.
Technical summary
The Intergovernmental Panel on Climate Change Assessment Reports provide the scientific foundation for international climate negotiations and constitute an unmatched resource for researchers. However, the assessment cycles take multiple years. As a contribution to cross- and interdisciplinary understanding of climate change across diverse research communities, we have streamlined an annual process to identify and synthesize significant research advances. We collected input from experts in various fields using an online questionnaire and prioritized a set of 10 key research insights with high policy relevance. This year, we focus on: (1) the looming overshoot of the 1.5°C warming limit, (2) the urgency of fossil fuel phase-out, (3) challenges to scale up carbon dioxide removal, (4) uncertainties regarding future natural carbon sinks, (5) the need for joint governance of biodiversity loss and climate change, (6) advances in understanding compound events, (7) accelerated mountain glacier loss, (8) human immobility amidst climate risks, (9) adaptation justice, and (10) just transitions in food systems. We present a succinct account of these insights, reflect on their policy implications, and offer an integrated set of policy-relevant messages. This science synthesis and science communication effort also forms the basis for a policy report that helps elevate climate science each year in time for the United Nations Climate Change Conference.
Social media summary
We highlight recent and policy-relevant advances in climate change research – with input from more than 200 experts.
Improving the quality and conduct of multi-center clinical trials is essential to the generation of generalizable knowledge about the safety and efficacy of healthcare treatments. Despite significant effort and expense, many clinical trials are unsuccessful. The National Center for Advancing Translational Science launched the Trial Innovation Network to address critical roadblocks in multi-center trials by leveraging existing infrastructure and developing operational innovations. We provide an overview of the roadblocks that led to opportunities for operational innovation, our work to develop, define, and map innovations across the network, and how we implemented and disseminated mature innovations.
Since the initial publication of A Compendium of Strategies to Prevent Healthcare-Associated Infections in Acute Care Hospitals in 2008, the prevention of healthcare-associated infections (HAIs) has continued to be a national priority. Progress in healthcare epidemiology, infection prevention, antimicrobial stewardship, and implementation science research has led to improvements in our understanding of effective strategies for HAI prevention. Despite these advances, HAIs continue to affect ∼1 of every 31 hospitalized patients,1 leading to substantial morbidity, mortality, and excess healthcare expenditures,1 and persistent gaps remain between what is recommended and what is practiced.
The widespread impact of the coronavirus disease 2019 (COVID-19) pandemic on HAI outcomes2 in acute-care hospitals has further highlighted the essential role of infection prevention programs and the critical importance of prioritizing efforts that can be sustained even in the face of resource requirements from COVID-19 and future infectious diseases crises.3
The Compendium: 2022 Updates document provides acute-care hospitals with up-to-date, practical expert guidance to assist in prioritizing and implementing HAI prevention efforts. It is the product of a highly collaborative effort led by the Society for Healthcare Epidemiology of America (SHEA), the Infectious Disease Society of America (IDSA), the Association for Professionals in Infection Control and Epidemiology (APIC), the American Hospital Association (AHA), and The Joint Commission, with major contributions from representatives of organizations and societies with content expertise, including the Centers for Disease Control and Prevention (CDC), the Pediatric Infectious Disease Society (PIDS), the Society for Critical Care Medicine (SCCM), the Society for Hospital Medicine (SHM), the Surgical Infection Society (SIS), and others.
Clinical trials face many challenges with meeting projected enrollment and retention goals. A study’s recruitment materials and messaging convey necessary key information and therefore serve as a critical first impression with potential participants. Yet study teams often lack the resources and skills needed to develop engaging, culturally tailored, and professional-looking recruitment materials. To address this gap, the Recruitment Innovation Center recently developed a Recruitment & Retention Materials Content and Design Toolkit, which offers research teams guidance, actionable tips, resources, and customizable templates for creating trial-specific study materials. This paper seeks to describe the creation and contents of this new toolkit.