Objectives/Goals: The Wake Forest Clinical and Translational Science Institute (CTSI) has integrated academic goals of T0-T4 translation, scholarship, and education into our Academic Learning Health System (aLHS) framework. Our Translation Research Academy (TRA) provides rigorous training for outstanding and diverse K12 and early-career faculty to develop LHS core competencies. Methods/Study Population: The TRA Forum is the main vehicle for delivering an aLHS-oriented curriculum. Currently, the program includes six K12 scholars and 18 other early-career research faculty with facilitated access to CTSI resources. The TRA Forum is a 2-year seminar series that meets twice a month to discuss topics relevant to the aLHS, leadership, and career development. Inclusion of first- and second-year scholars facilitates peer mentorship, allowing Year 2 scholars to share insights with new scholars. Forum sessions are developed around adult learning theory: each participant is asked to contribute their experience to discussions, and sessions focus on real-world examples. Results/Anticipated Results: Scholar and faculty commitment is very high. For the first 30 minutes of each session, scholars present their work in small groups. This broadens the range of disciplines to which scholars are exposed (64% of TRA graduates found this very helpful) and promotes the translational traits of boundary crosser, team player, and systems thinker. Participants view the TRA as an opportunity to form internal peer networks, promote peer mentoring, and establish new collaborations. The remaining 60 minutes are used for education. Sessions include nominated topics and those providing a solid foundation in core aLHS competencies and characteristics of translational scientists. Ninety-seven percent of educational sessions were rated as helpful or very helpful. Discussion/Significance of Impact: TRA scholars receive rigorous training in a highly supportive environment to produce aLHS researchers with the skills to transcend boundaries, innovate systems, create new knowledge, and rigorously evaluate results.
Objectives/Goals: Diamond Blackfan anemia (DBA) is caused by loss of ribosomal proteins leading to death of red blood cell progenitors. We identified a novel heterozygous variant (c.167+769C>T) in RPL30 in a patient with DBA. We hypothesized that this variant, in a gene not previously studied in DBA, would demonstrate a DBA phenotype and reveal early drivers of disease. Methods/Study Population: To study the role of our novel variant, we developed an induced pluripotent stem cell (iPSC) model, including wild type (WT) and CRISPR-edited RPL30 mutant clones. We differentiated the iPSCs into hematopoietic stem cells, identified cell populations with flow cytometry, and applied single-cell RNA sequencing. We identified erythroid clusters for differential gene expression analysis, using DESeq in RStudio followed by Gene Ontology (GO) enrichment analysis. We are differentiating cells into red blood cells for further comparison with flow cytometry, bulk RNA sequencing, protein analysis, and hemoglobin staining. Our approach has relied on multidisciplinary expertise in clinical hematology and genetics, basic science study of ribosomes, computational biology, stem cell biology, and hematopoietic biology. Results/Anticipated Results: Compared to WT hematopoietic stem cells, RPL30 mutant cells had significantly decreased expression of RPL30. Analysis of top differentially expressed genes revealed downregulation of HSPA1A, which encodes heat shock protein 70 (HSP70), a chaperone of a critical red blood cell transcription factor. Loss of HSP70 protein has previously been implicated in RPL-mutated red blood cells as a potential modulator of severe DBA phenotype. Upon GO enrichment analysis of downregulated genes, the biological process terms GO:0042254 ribosome biogenesis, GO:1903708 positive regulation of hemopoiesis, and GO:0045646 regulation of erythrocyte differentiation were all highlighted as driver terms. We expect further differentiation to reveal early death of RPL30 mutant cells with associated downregulated HSP70. Discussion/Significance of Impact: Our results support our hypothesis that the RPL30 variant downregulates erythropoiesis, with a potential early role of HSP70 protein. Upon completion of our study, we will demonstrate the role of RPL30 in DBA pathogenesis as well as provide understanding of its drivers, which is critical for improved management of this disease.
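As an illustration of the kind of cluster-level WT-versus-mutant comparison described above (not the authors' pipeline), the sketch below runs a Wilcoxon rank test with scanpy on simulated counts; the object and column names are hypothetical, and the DESeq-based workflow used in the study would be the closer analogue.

```python
# Minimal sketch (simulated counts, hypothetical labels): differential
# expression between wild-type and RPL30-mutant cells using scanpy's
# Wilcoxon rank-sum test as a simplified stand-in for a DESeq analysis.
import numpy as np
import pandas as pd
import anndata as ad
import scanpy as sc

rng = np.random.default_rng(0)
n_cells, n_genes = 200, 50
counts = rng.poisson(5, size=(n_cells, n_genes)).astype(float)
adata = ad.AnnData(
    X=counts,
    obs=pd.DataFrame({"genotype": ["WT"] * 100 + ["RPL30_mut"] * 100}),
    var=pd.DataFrame(index=[f"gene_{i}" for i in range(n_genes)]),
)

sc.pp.normalize_total(adata, target_sum=1e4)   # library-size normalization
sc.pp.log1p(adata)                             # log-transform
sc.tl.rank_genes_groups(adata, groupby="genotype", groups=["RPL30_mut"],
                        reference="WT", method="wilcoxon")
de = sc.get.rank_genes_groups_df(adata, group="RPL30_mut")
print(de.head())   # top differentially expressed genes (e.g. HSPA1A in the study)
```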
The Magellanic Stream (MS), a tail of diffuse gas formed from tidal and ram pressure interactions between the Small and Large Magellanic Clouds (SMC and LMC) and the halo of the Milky Way, is primarily composed of neutral atomic hydrogen (HI). The deficiency of dust and the diffuse nature of the gas make molecule formation rare and difficult, but where molecules do form they could trace regions potentially suitable for star formation, allowing us to probe conditions of star formation similar to those at high redshifts. We search for $\text{HCO}^{+}$, HCN, HNC, and C$_2$H using the highest-sensitivity molecular absorption observations from the Atacama Large Millimeter Array (ALMA) to trace these regions, and we compare with archival HI data from the Galactic Arecibo L-Band Feed Array (GALFA) HI Survey and the Galactic All Sky Survey (GASS) to relate these MS environments to the HI column density threshold for molecular formation in the Milky Way. We also compare the line-of-sight locations with confirmed locations of stars, molecular hydrogen, and OI detections, though those detections were made at higher sensitivities than the observations presented here.
We find no detections at 3$\sigma$ significance, despite four sightlines having column densities surpassing the threshold for molecular formation in the diffuse regions of the Milky Way. Here we present our calculations of the upper limits on the column densities of each of these molecular absorption lines, which range from $3 \times 10^{10}$ to $1 \times 10^{13}$ cm$^{-2}$. The non-detection of $\text{HCO}^{+}$ suggests that at least one of the following is true: (i) $X_{\text{HCO}^{+}{}, \mathrm{MS}}$ is significantly lower than the Milky Way value; (ii) the widespread diffuse molecular gas observed by Rybarczyk (2022b, ApJ, 928, 79) in the Milky Way’s diffuse interstellar medium (ISM) does not have a direct analogue in the MS; (iii) the HI-to-$\text{H}_{2}$ transition occurs in the MS at a higher surface density than in the LMC or SMC; or (iv) molecular gas exists in the MS, but only in small, dense clumps.
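For context, the sketch below shows how a 3$\sigma$ upper limit on an optically thin absorption-line column density can be formed from a spectrum's noise level; the rms, channel width, line width, and conversion coefficient are illustrative assumptions, not values from this work.

```python
# Minimal sketch (illustrative values, not the survey's numbers): a 3-sigma
# upper limit on an optically thin molecular column density from the noise
# level of a continuum-normalized absorption spectrum.
import numpy as np

def column_density_upper_limit(rms_norm, chan_width_kms, line_fwhm_kms,
                               coeff_cm2_per_kms):
    """3-sigma upper limit on N for a non-detection.

    rms_norm          : per-channel rms of the normalized spectrum
                        (approximately the optical-depth rms for weak lines)
    chan_width_kms    : channel width in km/s
    line_fwhm_kms     : assumed line width in km/s
    coeff_cm2_per_kms : N per unit integrated optical depth for the transition
    """
    n_chan = max(line_fwhm_kms / chan_width_kms, 1.0)
    tau_int_limit = 3.0 * rms_norm * chan_width_kms * np.sqrt(n_chan)
    return coeff_cm2_per_kms * tau_int_limit

# Example with assumed numbers: rms of 0.3% of the continuum, 0.4 km/s
# channels, a 3 km/s line, and ~1e12 cm^-2 (km/s)^-1 for an HCO+-like line.
print(f"N < {column_density_upper_limit(3e-3, 0.4, 3.0, 1.1e12):.2e} cm^-2")
```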
When kaolinite undergoes percussive grinding, pronounced changes take place in its i.r. absorption spectrum even in the earliest stages of the grinding when the lattice is not yet destroyed. In this report, attention is directed to the change in the stretching bands of the hydroxyl ions. A remarkably rapid effect on the band of the intralayer hydroxyl ions has been observed and is attributed to a permanent removal of the protons from these ions. Auxiliary measurements of X-ray diffraction, thermal water loss, and DTA were used to corroborate the spectroscopic evidence for this ready prototropy.
This study identified 26 late invasive primary surgical site infections (IP-SSI) within 4–12 months of transplantation among 2073 solid organ transplant (SOT) recipients at Duke University Hospital over the period 2015–2019. Thoracic organ transplants accounted for 25 of the late IP-SSI. Surveillance for late IP-SSI should be maintained for at least one year following transplant.
Although the link between alcohol involvement and behavioral phenotypes (e.g. impulsivity, negative affect, executive function [EF]) is well-established, the directionality of these associations, their specificity to stages of alcohol involvement, and the extent of shared genetic liability remain unclear. We estimate longitudinal associations among transitions between alcohol milestones, behavioral phenotypes, and indices of genetic risk.
Methods
Data came from the Collaborative Study on the Genetics of Alcoholism (n = 3681; ages 11–36). Alcohol transitions (first: drink, intoxication, alcohol use disorder [AUD] symptom, AUD diagnosis), internalizing, and externalizing phenotypes came from the Semi-Structured Assessment for the Genetics of Alcoholism. EF was measured with the Tower of London and Visual Span Tasks. Polygenic scores (PGS) were computed for alcohol-related and behavioral phenotypes. Cox models estimated associations among PGS, behavior, and alcohol milestones.
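As a rough illustration of this modeling approach (not the study's code), the sketch below fits a Cox proportional hazards model with lifelines on simulated data, relating a hypothetical drinks-per-week PGS and conduct symptom count to the hazard of an alcohol milestone.

```python
# Minimal sketch (simulated data, hypothetical variables): Cox proportional
# hazards model for time to an alcohol milestone.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 300
dpw_pgs = rng.normal(size=n)               # drinks-per-week polygenic score
conduct = rng.poisson(1.0, size=n)         # conduct disorder symptom count
# Hypothetical milestone ages whose hazard increases with both predictors.
age_event = 14 + rng.exponential(10.0, size=n) * np.exp(-0.10 * dpw_pgs
                                                        - 0.15 * conduct)
censor_age = rng.uniform(18, 36, size=n)   # age at last assessment
event = (age_event <= censor_age).astype(int)
duration = np.minimum(age_event, censor_age)

df = pd.DataFrame({"duration": duration, "event": event,
                   "dpw_pgs": dpw_pgs, "conduct_symptoms": conduct})

cph = CoxPHFitter()
cph.fit(df, duration_col="duration", event_col="event")
cph.print_summary()   # exp(coef) column gives the hazard ratios
```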
Results
Externalizing phenotypes (e.g. conduct disorder symptoms) were associated with future initiation and drinking problems (hazard ratio (HR)⩾1.16). Internalizing phenotypes (e.g. social anxiety) were associated with increased hazards for progression from first drink to severe AUD (HR⩾1.55). Initiation and AUD were associated with increased hazards for later depressive symptoms and suicidal ideation (HR⩾1.38), and initiation was associated with increased hazards for future conduct symptoms (HR = 1.60). EF was not associated with alcohol transitions. The drinks-per-week PGS was associated with increased hazards for alcohol transitions (HR⩾1.06), and the problematic alcohol use PGS with increased hazards for suicidal ideation (HR = 1.20).
Conclusions
Behavioral markers of addiction vulnerability precede and follow alcohol transitions, highlighting dynamic, bidirectional relationships between behavior and emerging addiction.
Comprehensive studies examining longitudinal predictors of dietary change during the coronavirus disease 2019 pandemic are lacking. Based on an ecological framework, this study used longitudinal data to test whether individual, social and environmental factors predicted change in dietary intake during the peak of the coronavirus disease 2019 pandemic in Los Angeles County, and examined interactions among the multilevel predictors.
Design:
We analysed two survey waves (baseline and follow-up) of the Understanding America Study, administered online to the same participants 3 months apart. The surveys assessed dietary intake and individual, social, and neighbourhood factors potentially associated with diet. Lagged multilevel regression models were used to predict change from baseline to follow-up in daily servings of fruits, vegetables and sugar-sweetened beverages.
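A minimal sketch (hypothetical column names, simulated data) of a lagged multilevel model of this general kind: follow-up fruit servings regressed on baseline servings and individual/social predictors, with a random intercept for neighbourhood.

```python
# Minimal sketch (simulated data): lagged multilevel regression with a
# neighbourhood random intercept, fitted with statsmodels.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 400
df = pd.DataFrame({
    "fruit_t2":       rng.normal(2.0, 1.0, n),   # follow-up daily servings
    "fruit_t1":       rng.normal(2.0, 1.0, n),   # baseline daily servings
    "depression":     rng.integers(0, 2, n),
    "social_network": rng.poisson(4, n),
    "neighbourhood":  rng.integers(0, 40, n),
})

model = smf.mixedlm("fruit_t2 ~ fruit_t1 + depression + social_network",
                    data=df, groups=df["neighbourhood"])
result = model.fit()
print(result.summary())   # fixed effects estimate change conditional on baseline
```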
Setting:
Data were collected in October 2020 and January 2021, during the peak of the coronavirus disease 2019 pandemic in Los Angeles County.
Participants:
903 adults representative of Los Angeles County households.
Results:
Individuals who had depression and less education or who identified as non-Hispanic Black or Hispanic reported unhealthy dietary changes over the study period. Individuals with smaller social networks, especially low-income individuals with smaller networks, also reported unhealthy dietary changes. After accounting for individual and social factors, neighbourhood factors were generally not associated with dietary change.
Conclusions:
Given poor diets are a leading cause of death in the USA, addressing ecological risk factors that put some segments of the community at risk for unhealthy dietary changes during a crisis should be a priority for health interventions and policy.
Hemodynamic collapse in multi-trauma patients with severe traumatic brain injury (TBI) poses both a diagnostic and therapeutic challenge for prehospital clinicians. Brain injury associated shock (BIAS), likely resulting from catecholamine storm, can cause both ventricular dysfunction and vasoplegia but may present clinically in a manner similar to hemorrhagic shock. Despite different treatment strategies, few studies exist describing this phenomenon in the early post-injury phase. This retrospective observational study aimed to describe the frequency of shock in isolated TBI in prehospital trauma patients and to compare their clinical characteristics to those patients with hemorrhagic shock and TBI without shock.
Methods:
All prehospital trauma patients intubated by prehospital medical teams from New South Wales Ambulance Aeromedical Operations (NSWA-AO) with an initial Glasgow Coma Scale (GCS) of 12 or less were investigated. Shock was defined as a pre-intubation systolic blood pressure under 90 mmHg and the administration of blood products or vasopressors. Injuries were classified from in-hospital computed tomography (CT) reports. From this, three study groups were derived: BIAS, hemorrhagic shock, and isolated TBI without shock. Descriptive statistics were then produced for clinical and treatment variables.
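A minimal sketch (illustrative only, not the study's actual coding rules) of how the three study groups could be derived from the shock definition and CT-based injury classification described above:

```python
# Minimal sketch: group assignment from the stated shock definition and a
# simplified CT-based flag for isolated TBI (the study's classification of
# hemorrhagic injuries is more detailed than this).
def classify_patient(sbp_pre_intubation, blood_or_vasopressors, isolated_tbi_on_ct):
    shock = sbp_pre_intubation < 90 and blood_or_vasopressors
    if shock and isolated_tbi_on_ct:
        return "BIAS"                        # brain injury associated shock
    if shock:
        return "hemorrhagic shock"
    if isolated_tbi_on_ct:
        return "isolated TBI without shock"
    return "not in a study group"

print(classify_patient(82, True, True))      # -> BIAS
print(classify_patient(85, True, False))     # -> hemorrhagic shock
print(classify_patient(130, False, True))    # -> isolated TBI without shock
```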
Results:
Of 1,292 intubated patients, 423 had an initial GCS of 12 or less; 24 patients (5.7% of this cohort) had shock with an isolated TBI, and 39 patients had hemorrhagic shock. Hemodynamic parameters were similar across these groups, including the degrees of tachycardia, hypotension, and shock index elevation. Prehospital clinical interventions, including blood transfusion and total fluids administered, were also similar, suggesting the groups were indistinguishable to prehospital clinicians.
Conclusions:
Hemodynamic compromise in the setting of isolated severe TBI is a rare clinical entity. The prehospital physiological data currently available to clinicians do not allow these patients to be easily distinguished from those with hemorrhagic shock.
Observations of radiocarbon (14C) in Earth’s atmosphere and other carbon reservoirs are important to quantify exchanges of CO2 between reservoirs. The amount of 14C is commonly reported in the so-called Delta notation, i.e., Δ14C, the decay- and fractionation-corrected departure of the ratio of 14C to total C from that ratio in an absolute international standard; this Delta notation permits direct comparison of 14C/C ratios in the several reservoirs. However, as Δ14C of atmospheric CO2, denoted Δ14CO2, is based on the ratio of 14CO2 to total atmospheric CO2, its value can and does change not just because of change in the amount of atmospheric 14CO2 but also because of change in the amount of total atmospheric CO2, complicating ascription of change in Δ14CO2 to change in one or the other quantity. Here we suggest that presentation of the atmospheric 14CO2 amount as a mole fraction relative to dry air (moles of 14CO2 per mole of dry air in Earth’s atmosphere), or as moles or molecules of 14CO2 in Earth’s atmosphere, all readily calculated from Δ14CO2 and the amount of atmospheric CO2 (with slight dependence on δ13CO2), complements presentation only as Δ14CO2 and can provide valuable insight into the evolving budget and distribution of atmospheric 14CO2.
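As a sketch of the conversion described above (simplified: the fractionation and decay corrections are folded into the standard definition of $\Delta^{14}\mathrm{C}$, and the absolute standard ratio is quoted only approximately),
\[
\frac{^{14}\mathrm{C}}{\mathrm{C}} \;=\; A_{\mathrm{abs}}\left(1 + \frac{\Delta^{14}\mathrm{C}}{1000}\right),
\qquad
x_{^{14}\mathrm{CO}_2} \;\approx\; A_{\mathrm{abs}}\left(1 + \frac{\Delta^{14}\mathrm{C}}{1000}\right) x_{\mathrm{CO}_2},
\]
with $\Delta^{14}\mathrm{C}$ expressed in per mil, $A_{\mathrm{abs}} \approx 1.2 \times 10^{-12}$ the absolute $^{14}\mathrm{C}/\mathrm{C}$ ratio of the standard, and $x_{\mathrm{CO}_2}$ the CO$_2$ dry-air mole fraction; the number of moles (or molecules) of $^{14}$CO$_2$ in the atmosphere then follows by multiplying $x_{^{14}\mathrm{CO}_2}$ by the number of moles of dry air in the atmosphere (and by Avogadro's number).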
‘Going Green: The BIALL Sustainability Working Group’ was a parallel session presented by Christine Baird during the BIALL Annual Conference ‘Gaining the edge: investing in our skillset’, held in Belfast in June 2023. The session aimed to provide an introduction to the newly formed BIALL Sustainability Working Group. The session, and this subsequent article, outlines the group's aims, projects and structure, as well as the wider contexts which motivated the group's establishment. It discusses the ways in which information professionals can develop more sustainable working practices, both individually and by leveraging the power of our professional networks. This paper explores the opportunities for legal librarians to engage in community activism, and the importance of both this concept and of ‘cathedral thinking’ in responding to the climate crisis.
As the US population ages, the prevalence of Alzheimer’s disease and related dementias (AD/RD) is on the rise. This is especially true in rural America, where mortality rates due to AD/RD are rising faster than in metropolitan areas. To date, however, people living in rural communities are severely underrepresented in aging research. The Nevada Exploratory Alzheimer’s Disease Research Center (NVeADRC) seeks to address this gap. Here, we present preliminary cognitive data from our rural-dwelling cohort, as well as relevant demographic and clinical characteristics.
Participants and Methods:
Individuals with normal cognition (NC), mild cognitive impairment (MCI), and dementia due to Alzheimer’s disease (AD) living in rural communities, defined as a rural-urban commuting area (RUCA) code of 4 or higher, were enrolled through either clinic or community outreach. Eligibility for the observational cohort required: age >55 years, primarily English-speaking, primary residence in a rural community, and availability of a study partner. Measures included the Uniform Data Set (v3), blood-based biomarkers, structural brain MRI, and portions of the PhenX Social Determinants of Health toolkit. Participants are seen at baseline and followed annually, with interim remote visits every 6 months. A multidisciplinary consensus diagnosis is rendered after each visit. Where feasible, a harmonized urban cohort followed by the Nevada Center for Neurodegeneration and Translational Neuroscience (CNTN) was used for comparison.
Results:
Fifty-six rural-dwelling (age=70.4±7.1 years; edu=15.2±2.6 years; 61% female) and 148 urban-dwelling (age=72.9±6.8 years; edu=15.8±2.7 years; 46% female) older adults were included; age significantly differed between cohorts but education did not. The rural cohort was 46% NC (MoCA=26.8±2.3; CDRsob=0.3±0.6), 32% MCI (MoCA=22.8±3.1; CDRsob=1.2±1.0), and 22% AD (MoCA=16.9±5.5; CDRsob=5.2±3.0). The urban cohort was 39% NC (MoCA=26.4±2.6; CDRsob=0.3±0.8), 44% MCI (MoCA=22.3±3.1; CDRsob=2.0±1.5), and 17% AD (MoCA=18.6±3.9; CDRsob=4.7±2.3). Rural communities were significantly more disadvantaged, as measured by the Area Deprivation Index (ADI), than urban communities (rural ADI=6.3±2.6; urban ADI=3.4±2.3; p<.001). Fifty percent of the rural cohort lives in a moderately to severely disadvantaged neighborhood (ADI decile>7), compared to 12% of the urban cohort, and 11% of individuals in the rural cohort reported living more than 30 miles from the nearest medical facility. Across the combined cohort, education was significantly correlated with ADI decile (r=-.30, p<.001), with people in the areas of highest disadvantage having the lowest education. Verbal memory was also inversely associated with ADI. There were no differences in clinical diagnosis as a function of ADI rank.
Conclusions:
Living in a rural community conveys a multifaceted array of risks and benefits, some of which differ from those of urban settings. The literature to date suggests that older adults living in rural communities are at significantly increased risk of morbidity and mortality due to AD/RD, though it is unclear why. Preliminary data from the NVeADRC show that increasing levels of neighborhood disadvantage were associated with lower levels of education and worse verbal memory in this convenience sample. The combined effect of low education and increased disadvantage may account for some of the urban-rural differences in mortality that have been reported, though additional research on representative samples from this underrepresented population is critical.
Individuals living with HIV may experience cognitive difficulties or marked declines known as HIV-Associated Neurocognitive Disorder (HAND). Cognitive difficulties have been associated with worse outcomes for people living with HIV; therefore, accurate cognitive screening and identification is critical. One potentially sensitive but underutilized marker of cognitive impairment is intra-individual variability (IIV). Cognitive IIV is the dispersion of scores across tasks in neuropsychological assessment. In individuals living with HIV, greater cognitive IIV has been associated with cortical atrophy, poorer cognitive functioning, more rapid cognitive decline, and greater difficulties in daily functioning. Studies examining the use of IIV in clinical neuropsychological testing are limited, and few have examined IIV in the context of a single neuropsychological battery designed for culturally diverse or at-risk populations. To address these gaps, this study aimed to examine IIV profiles of individuals living with HIV and/or who inject drugs, utilizing the Neuropsi, a standardized neuropsychological instrument for Spanish-speaking populations.
Participants and Methods:
Spanish-speaking adults residing in Puerto Rico (n=90) who are HIV positive and who inject drugs (HIV+I), HIV negative and who inject drugs (HIV-I), HIV positive and do not inject drugs (HIV+), or healthy controls (HC) completed the Neuropsi battery as part of a larger research protocol. The Neuropsi produces 3 index scores representing the cognitive domains of memory, attention/memory, and attention/executive functioning. Total battery and within-index IIV were calculated by dividing the standard deviation of T-scores by mean performance, resulting in a coefficient of variance (CoV). Group differences on overall test battery mean CoV (OTBMCoV) were investigated. To examine unique profiles of index-specific IIV, a cluster analysis was performed for each group.
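A minimal sketch (made-up scores, hypothetical grouping of subtests) of the CoV computation described above: the standard deviation of a participant's T-scores divided by their mean, within each index and across the whole battery.

```python
# Minimal sketch: coefficient of variance (CoV) per index and the overall
# test battery mean CoV (OTBMCoV) for one participant.
import numpy as np

def cov(t_scores):
    """Intra-individual variability: SD of T-scores divided by their mean."""
    t = np.asarray(t_scores, dtype=float)
    return t.std(ddof=1) / t.mean()

# One participant's T-scores grouped by Neuropsi index (values are made up).
indices = {
    "memory":              [42, 55, 38, 47],
    "attention_memory":    [50, 52, 49],
    "attention_executive": [44, 60, 41, 53],
}

index_cov = {name: round(cov(scores), 2) for name, scores in indices.items()}
otbm_cov = cov([s for scores in indices.values() for s in scores])
print(index_cov)
print(f"OTBMCoV = {otbm_cov:.2f}")
```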
Results:
Results of a one-way ANOVA indicated significant between group differences on OTBMCoV (F[3,86]=6.54, p<.001). Post-hoc analyses revealed that HIV+I (M=.55, SE=.07, p=.003), HIV-I (M=.50, SE=.03, p=.001), and HIV+ (M=.48, SE=.02, p=.002) had greater OTBMCoV than the HC group (M=.30, SE=.02). To better understand sources of IIV within each group, cluster analysis of index specific IIV was conducted. For the HIV+ group, 3 distinct clusters were extracted: 1. High IIV in attention/memory and attention/executive functioning (n=3, 8%); 2. Elevated memory IIV (n=21, 52%); 3. Low IIV across all indices (n=16, 40%). For the HIV-I group, 2 distinct clusters were extracted: 1. High IIV across all 3 indices (n=7, 24%) and 2. Low IIV across all 3 indices (n=22, 76%). For the HC group, 3 distinct clusters were extracted: 1. Very low IIV across all 3 indices (n=5, 36%); 2. Elevated memory IIV (n=6, 43%); 3. Elevated attention/executive functioning IIV with very low attention/memory and memory IIV (n=3, 21%). Sample size of the HIV+I group was insufficient to extract clusters.
Conclusions:
Current findings support IIV in the Neuropsi test battery as clinically sensitive marker for cognitive impairment in Spanish speaking individuals living with HIV or who inject drugs. Furthermore, the distinct IIV cluster types identified between groups can help to better understand specific sources of variability. Implications for clinical assessment in prognosis and etiological considerations are discussed.
Injection drug use is a significant public health crisis with adverse health outcomes, including increased risk of human immunodeficiency virus (HIV) infection. Comorbidity of HIV and injection drug use is highly prevalent in the United States and disproportionately elevated in surrounding territories such as Puerto Rico. While both HIV status and injection drug use are independently known to be associated with cognitive deficits, the interaction of these effects remains largely unknown. The aim of this study was to determine how HIV status and injection drug use are related to cognitive functioning in a group of Puerto Rican participants. Additionally, we investigated the degree to which type and frequency of substance use predict cognitive abilities.
Participants and Methods:
96 Puerto Rican adults completed the Neuropsi Attention and Memory-3rd Edition battery for Spanish-speaking participants. Injection substance use over the previous 12 months was also obtained via clinical interview. Participants were categorized into four groups based on HIV status and injection substance use in the last 30 days (HIV+/injector, HIV+/non-injector, HIV-/injector, HIV-/non-injector). One-way analysis of variance (ANOVA) was conducted to determine differences between groups on each index of the Neuropsi battery (Attention and Executive Function; Memory; Attention and Memory). Multiple linear regression was used to determine whether type and frequency of substance use predicted performance on these indices while considering HIV status.
Results:
The one-way ANOVAs revealed significant differences (p’s < 0.01) between the healthy control group and all other groups across all indices. No significant differences were observed among the other groups. Injection drug use, regardless of the substance, was associated with lower combined attention and memory performance relative to injecting less than monthly (monthly: p = 0.04; 2-3x daily: p < 0.01; 4-7x daily: p = 0.02; 8+ times daily: p < 0.01). Both minimal and heavy daily use predicted poorer memory performance (p = 0.02 and p = 0.01, respectively). Heavy heroin use predicted poorer attention and executive functioning (p = 0.04). Heroin use also predicted lower performance on tests of memory when used monthly (p = 0.049) and daily or almost daily (2-6x weekly: p = 0.04; 4-7x daily: p = 0.04). Finally, moderate injection of heroin predicted lower scores on attention and memory (weekly: p = 0.04; 2-6x weekly: p = 0.048). Heavy combined heroin and cocaine use predicted worse memory performance (p = 0.03) and combined attention and memory performance (p = 0.046). HIV status was not a moderating factor in any circumstance.
Conclusions:
As predicted, residents of Puerto Rico who do not inject substances and are HIV-negative performed better in domains of memory, attention, and executive function than those living with HIV and/or injecting substances. There was no significant difference among the affected groups in cognitive ability. As expected, daily injection of substances predicted worse performance on tasks of memory. Heavy heroin use predicted worse performance on executive function and memory tasks, while heroin-only and combined heroin and cocaine use predicted worse memory performance. Overall, the type and frequency of substance use are more predictive of cognitive functioning than HIV status.
Preemergence herbicides associated with a cereal rye (Secale cereale L.) cover crop (hereafter “cereal rye”) can be an effective waterhemp [Amaranthus tuberculatus (Moq.) Sauer.] and Palmer amaranth (Amaranthus palmeri S. Watson) management strategy in soybean [Glycine max (L.) Merr.] production. Delaying cereal rye termination until soybean planting (planting green) optimizes biomass production and weed suppression but might further impact the fate of preemergence herbicides. Limited research is available on the fate of preemergence herbicides applied over living cereal rye in the planting green system. Field experiments were conducted in Illinois, Kansas, Pennsylvania, and Wisconsin in 2021 and 2022 (8 site-years) to evaluate the fate of flumioxazin and pyroxasulfone and residual Amaranthus spp. control under different cover crop management practices in soybean. A flumioxazin + pyroxasulfone herbicide premix was applied preemergence at soybean planting under no-till without cereal rye, cereal rye terminated early before soybean planting, and cereal rye terminated at soybean planting. Flumioxazin and pyroxasulfone concentrations in the soil were quantified at 0, 7, and 21 d after treatment (DAT), and Amaranthus spp. density was determined at postemergence herbicide application. Cereal rye biomass intercepted flumioxazin and pyroxasulfone at preemergence application and reduced their concentrations in the soil compared with no-till, mainly at 0 DAT. The main differences in herbicide concentration were observed between no-till and the cereal rye treatments rather than between cereal rye termination times. Despite the reduced herbicide concentrations in the soil, cereal rye biomass did not affect early-season residual Amaranthus spp. control. The adoption of effective preemergence herbicides associated with a properly managed cereal rye cover crop is an effective option for integrated Amaranthus spp. management programs in soybean production systems.
Mental health service delivery needs radical reimagination in the United States, where unmet needs for care remain large and most metrics on the burden of mental health problems have worsened, despite significant numbers of mental health professionals and substantial spending on service provision and research. The COVID-19 pandemic has exacerbated the need for mental health care. One path to a radical reimagination is “Community Initiated Care (CIC),” which equips and empowers communities to address mental health needs through brief psychosocial interventions provided by people in community settings. We co-developed a theory of change (ToC) for CIC with 24 stakeholders, including representatives from community-based, advocacy, philanthropic and faith-based organizations, to understand how CIC could be developed and adapted for specific contexts. We present a ToC which describes ways in which the CIC initiative can promote and strengthen mental health in communities in the United States with respect to community organization and leadership; community care and inclusion; and normalizing mental health. We propose 10 strategies as part of CIC and a way forward for implementation and evaluation. This CIC model is a local, tailored approach which can expand the role of community members to strengthen our response to mental health needs in the United States.
The COVID-19 pandemic increased food insufficiency: a severe form of food insecurity. Drawing on an ecological framework, we aimed to understand factors that contributed to changes in food insufficiency from April to December 2020, in a large urban population hard hit by the pandemic.
Design:
We conducted internet surveys every 2 weeks in April–December 2020, including a subset of items from the Food Insecurity Experience Scale. Longitudinal analysis identified predictors of food insufficiency, using fixed effects models.
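A minimal sketch (simulated data, hypothetical predictors) of a person fixed effects model of this general kind, estimated as a linear probability model with wave dummies and person-clustered standard errors.

```python
# Minimal sketch: biweekly food insufficiency modeled with person and wave
# fixed effects, so estimates reflect within-person change over time.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
people, waves = 150, 10
df = pd.DataFrame({
    "person":   np.repeat(np.arange(people), waves),
    "wave":     np.tile(np.arange(waves), people),
    "snap":     rng.integers(0, 2, people * waves),   # SNAP receipt this wave
    "lost_job": rng.integers(0, 2, people * waves),
})
p = 0.20 - 0.05 * df["snap"] + 0.10 * df["lost_job"]
df["food_insufficient"] = (rng.random(people * waves) < p).astype(int)

model = smf.ols("food_insufficient ~ snap + lost_job + C(person) + C(wave)",
                data=df)
result = model.fit(cov_type="cluster", cov_kwds={"groups": df["person"]})
print(result.params[["snap", "lost_job"]])   # within-person associations
```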
Setting:
Los Angeles County, which has a diverse population of 10 million residents.
Participants:
A representative sample of 1535 adults in Los Angeles County who are participants in the Understanding Coronavirus in America tracking survey.
Results:
Rates of food insufficiency spiked in the first year of the pandemic, especially among participants living in poverty, in middle adulthood and with larger households. Government food assistance from the Supplemental Nutrition Assistance Program was significantly associated with reduced food insufficiency over time, while other forms of assistance such as help from family and friends or stimulus funds were not.
Conclusions:
The findings highlight that during a crisis, there is value in rapidly monitoring food insufficiency and investing in government food benefits.
Childhood and lifetime adversity may reduce brain serotonergic (5-HT) neurotransmission by epigenetic mechanisms.
Aims
We tested the relationships of childhood adversity and recent stress to serotonin 1A (5-HT1A) receptor genotype, DNA methylation of this gene in peripheral blood monocytes and in vivo 5-HT1A receptor binding potential (BPF) determined by positron emission tomography (PET) in 13 a priori brain regions, in participants with major depressive disorder (MDD) and healthy volunteers (controls).
Method
Medication-free participants with MDD (n = 192: 110 female, 81 male, 1 other) and controls (n = 88: 48 female, 40 male) were interviewed about childhood adversity and recent stressors and genotyped for rs6295. DNA methylation was assayed at three upstream promoter sites (−1019, −1007, −681) of the 5-HT1A receptor gene. A subgroup (n = 119) had regional brain 5-HT1A receptor BPF quantified by PET. Multi-predictor models were used to test associations between diagnosis, recent stress, childhood adversity, genotype, methylation and BPF.
Results
Recent stress correlated positively with blood monocyte methylation at the −681 CpG site, adjusted for diagnosis, and had positive and region-specific correlations with 5-HT1A BPF in participants with MDD, but not in controls. In participants with MDD, but not in controls, methylation at the −1007 CpG site had positive and region-specific correlations with binding potential. Childhood adversity was not associated with methylation or BPF in participants with MDD.
Conclusions
These findings support a model in which recent stress increases 5-HT1A receptor binding, via methylation of promoter sites, thus affecting MDD psychopathology.
Several hypotheses may explain the association between substance use, posttraumatic stress disorder (PTSD), and depression. However, few studies have utilized a large multisite dataset to understand this complex relationship. Our study assessed the relationship between alcohol and cannabis use trajectories and PTSD and depression symptoms across 3 months in recently trauma-exposed civilians.
Methods
In total, 1618 (1037 female) participants provided self-report data on past 30-day alcohol and cannabis use and PTSD and depression symptoms during their emergency department (baseline) visit. We reassessed participants' substance use and clinical symptoms 2, 8, and 12 weeks posttrauma. Latent class mixture modeling determined alcohol and cannabis use trajectories in the sample. Changes in PTSD and depression symptoms were assessed across alcohol and cannabis use trajectories via a mixed-model repeated-measures analysis of variance.
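As a simplified stand-in for the latent class mixture modeling described (not the study's method), the sketch below clusters simulated per-participant use trajectories across the four assessments with a Gaussian mixture model and inspects the class mean trajectories.

```python
# Minimal sketch (simulated data): trajectory clustering with a Gaussian
# mixture as a simplified stand-in for latent class mixture modeling.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(4)
timepoints = 4   # baseline, week 2, week 8, week 12

# Simulate three latent trajectory types: low, high, and increasing use.
low  = rng.normal(1, 0.5, size=(500, timepoints))
high = rng.normal(8, 1.0, size=(300, timepoints))
inc  = np.linspace(1, 8, timepoints) + rng.normal(0, 0.5, size=(200, timepoints))
use = np.clip(np.vstack([low, high, inc]), 0, None)   # days of use per window

gmm = GaussianMixture(n_components=3, random_state=0).fit(use)
labels = gmm.predict(use)
for k in range(3):
    print(f"class {k}: n={np.sum(labels == k)}, "
          f"mean trajectory={gmm.means_[k].round(1)}")
```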
Results
Three trajectory classes (low, high, and increasing use) provided the best model fit for both alcohol and cannabis use. The low alcohol use class exhibited lower PTSD symptoms at baseline than the high use class; the low cannabis use class exhibited lower PTSD and depression symptoms at baseline than the high and increasing use classes; these symptoms greatly increased at week 8 and declined at week 12. Participants already using alcohol and cannabis exhibited greater PTSD and depression symptoms at baseline, which increased at week 8 and decreased at week 12.
Conclusions
Our findings suggest that alcohol and cannabis use trajectories are associated with the intensity of posttrauma psychopathology. These findings could potentially inform the timing of therapeutic strategies.
OBJECTIVES/GOALS: Glioblastomas (GBMs) are heterogeneous, treatment-resistant tumors that are driven by populations of cancer stem cells (CSCs). In this study, we perform an epigenetic-focused functional genomics screen in GBM organoids and identify WDR5 as an essential epigenetic regulator in the SOX2-enriched, therapy-resistant cancer stem cell niche. METHODS/STUDY POPULATION: Despite their importance for tumor growth, few molecular mechanisms critical for CSC population maintenance have been exploited for therapeutic development. We developed a spatially resolved loss-of-function screen in GBM patient-derived organoids to identify essential epigenetic regulators in the SOX2-enriched, therapy-resistant niche. Our niche-specific screens identified WDR5, an H3K4 histone methyltransferase responsible for activating specific gene expression, as indispensable for GBM CSC growth and survival. RESULTS/ANTICIPATED RESULTS: In GBM CSC models, WDR5 inhibitors blocked WRAD complex assembly and reduced H3K4 trimethylation and expression of genes involved in CSC-relevant oncogenic pathways. H3K4me3 peaks lost with WDR5 inhibitor treatment occurred disproportionately on POU transcription factor motifs, which are required for stem cell maintenance and include the POU5F1(OCT4)::SOX2 motif. We incorporated a SOX2/OCT4 motif-driven GFP reporter system into our CSC models and found that WDR5 inhibitor treatment resulted in dose-dependent silencing of stem cell reporter activity. Further, WDR5 inhibitor treatment altered the stem cell state, disrupting CSC in vitro growth and self-renewal as well as in vivo tumor growth. DISCUSSION/SIGNIFICANCE: Our results unveiled the role of WDR5 in maintaining the CSC state in GBM and provide a rationale for therapeutic development of WDR5 inhibitors for GBM and other advanced cancers. This conceptual and experimental framework can be applied to many cancers and can unmask unique microenvironmental biology and support rationally designed combination therapies.