Multicenter clinical trials are essential for evaluating interventions but often face significant challenges in study design, site coordination, participant recruitment, and regulatory compliance. To address these issues, the National Institutes of Health’s National Center for Advancing Translational Sciences established the Trial Innovation Network (TIN). The TIN offers a scientific consultation process, providing access to clinical trial and disease experts who offer input and recommendations throughout the trial’s duration, at no cost to investigators. This approach aims to improve trial design, accelerate implementation, foster interdisciplinary teamwork, and spur innovations that enhance multicenter trial quality and efficiency. The TIN leverages resources of the Clinical and Translational Science Awards (CTSA) program, complementing local capabilities at the investigator’s institution. The Initial Consultation process focuses on the study’s scientific premise, design, site development, recruitment and retention strategies, funding feasibility, and other support areas. As of 6/1/2024, the TIN has provided 431 Initial Consultations to increase efficiency and accelerate trial implementation by delivering customized support and tailored recommendations. Across a range of clinical trials, the TIN has developed standardized, streamlined, and adaptable processes. We describe these processes, provide operational metrics, and include a set of lessons learned for consideration by other trial support and innovation networks.
The Hippoboscidae are ectoparasites of birds and mammals, which, as a group, are known to vector multiple diseases. Avipoxvirus (APV) is mechanically vectored by various arthropods and causes seasonal disease in wild birds in the United Kingdom (UK). Signs of APV and the presence of louse flies (Hippoboscidae) on Dunnocks Prunella modularis were recorded over a 16·5-year period in a rural garden in Somerset, UK. Louse flies collected from this site and other sites in England were tested for the presence of APV DNA and RNA sequences. Louse flies on Dunnocks were seen to peak seasonally three weeks prior to the peak of APV lesions, an interval consistent with the previously estimated incubation period of APV in Dunnocks. APV DNA was detected on 13/25 louse flies, Ornithomya avicularia and Ornithomya fringillina, taken from Dunnocks, both with and without lesions consistent with APV, at multiple sites in England. Collectively these data support the premise that louse flies may vector APV. The detection of APV in louse flies, from apparently healthy birds, and from sites where disease has not been observed in any host species, suggests that the Hippoboscidae could provide a non-invasive and relatively cheap method of monitoring avian diseases. This could provide advanced warnings of disease, including zoonoses, before they become clinically apparent.
The association between cannabis and psychosis is established, but the role of underlying genetics is unclear. We used data from the EU-GEI case-control study and UK Biobank to examine the independent and combined effect of heavy cannabis use and schizophrenia polygenic risk score (PRS) on risk for psychosis.
Methods
Genome-wide association study summary statistics from the Psychiatric Genomics Consortium and the Genomic Psychiatry Cohort were used to calculate schizophrenia and cannabis use disorder (CUD) PRS for 1,098 participants from the EU-GEI study and 143,600 from the UK Biobank. Both datasets had information on cannabis use.
Results
In both samples, schizophrenia PRS and cannabis use independently increased risk of psychosis. Schizophrenia PRS was not associated with patterns of cannabis use in the EU-GEI cases or controls or UK Biobank cases. It was associated with lifetime and daily cannabis use among UK Biobank participants without psychosis, but the effect was substantially reduced when CUD PRS was included in the model. In the EU-GEI sample, regular users of high-potency cannabis had the highest odds of being a case independently of schizophrenia PRS (OR daily use high-potency cannabis adjusted for PRS = 5.09, 95% CI 3.08–8.43, p = 3.21 × 10⁻¹⁰). We found no evidence of interaction between schizophrenia PRS and patterns of cannabis use.
Conclusions
Regular use of high-potency cannabis remains a strong predictor of psychotic disorder independently of schizophrenia PRS, which does not seem to be associated with heavy cannabis use. These are important findings at a time of increasing use and potency of cannabis worldwide.
Cannabis use and familial vulnerability to psychosis have been associated with social cognition deficits. This study examined the potential relationship between cannabis use and cognitive biases underlying social cognition and functioning in patients with first episode psychosis (FEP), their siblings, and controls.
Methods
We analyzed a sample of 543 participants with FEP, 203 siblings, and 1168 controls from the EU-GEI study using a correlational design. We used logistic regression analyses to examine the influence of clinical group, lifetime cannabis use frequency, and potency of cannabis use on cognitive biases, accounting for demographic and cognitive variables.
Results
FEP patients showed increased odds of facial recognition processing (FRP) deficits (OR = 1.642, CI 1.123–2.402) relative to controls but not of speech illusions (SI) or jumping to conclusions (JTC) bias, with no statistically significant differences relative to siblings. Daily and occasional lifetime cannabis use were associated with decreased odds of SI (OR = 0.605, CI 0.368–0.997 and OR = 0.646, CI 0.457–0.913 respectively) and JTC bias (OR = 0.625, CI 0.422–0.925 and OR = 0.602, CI 0.460–0.787 respectively) compared with lifetime abstinence, but not with FRP deficits, in the whole sample. Within the cannabis user group, low-potency cannabis use was associated with increased odds of SI (OR = 1.829, CI 1.297–2.578), FRP deficits (OR = 1.393, CI 1.031–1.882), and JTC (OR = 1.661, CI 1.271–2.171) relative to high-potency cannabis use, with comparable effects in the three clinical groups.
Conclusions
Our findings suggest increased odds of cognitive biases in FEP patients who have never used cannabis and in low-potency users. Future studies should elucidate this association and its potential implications.
We assessed adverse events in hospitalized patients receiving selected vesicant antibiotics or vasopressors administered through midline catheters or peripherally inserted central catheters (PICC). The rates of catheter-related bloodstream infections, thrombosis, and overall events were similar across the two groups, while occlusion was higher in the PICC group.
Globally, mental disorders account for almost 20% of disease burden and there is growing evidence that mental disorders are associated with various social determinants. Tackling the United Nations Sustainable Development Goals (UN SDGs), which address known social determinants of mental disorders, may be an effective way to reduce the global burden of mental disorders.
Objectives
To examine the evidence base for interventions that seek to improve mental health through targeting the social determinants of mental disorders.
Methods
We conducted a systematic review of reviews, using a five-domain conceptual framework which aligns with the UN SDGs (PROSPERO registration: CRD42022361534). PubMed, PsycInfo, and Scopus were searched from 01 January 2012 until 05 October 2022. Citation follow-up and expert consultation were used to identify additional studies. Systematic reviews including interventions seeking to change or improve a social determinant of mental disorders were eligible for inclusion. Study screening, selection, data extraction, and quality appraisal were conducted in accordance with PRISMA guidelines. The AMSTAR-2 was used to assess included reviews and results were narratively synthesised.
Results
Over 20,000 records were screened, and 101 eligible reviews were included. Most reviews were of low, or critically low, quality. Reviews included interventions which targeted sociocultural (n = 31), economic (n = 24), environmental (n = 19), demographic (n = 15), and neighbourhood (n = 8) determinants of mental disorders. Interventions demonstrating the greatest promise for improved mental health from high and moderate quality reviews (n = 37) included: digital and brief advocacy interventions for female survivors of intimate partner violence; cash transfers for people in low-middle-income countries; improved work schedules, parenting programs, and job clubs in the work environment; psychosocial support programs for vulnerable individuals following environmental events; and social and emotional learning programs for school students. Few effective neighbourhood-level interventions were identified.
Conclusions
This review presents interventions with the strongest evidence base for the prevention of mental disorders and highlights synergies where addressing the UN SDGs can be beneficial for mental health. A range of issues across the literature were identified, including barriers to conducting randomised controlled trials and lack of follow-up limiting the ability to measure long-term mental health outcomes. Interdisciplinary and novel approaches to intervention design, implementation, and evaluation are required to improve the social circumstances and mental health experienced by individuals, communities, and populations.
Throughout the COVID-19 pandemic, many areas in the United States experienced healthcare personnel (HCP) shortages tied to a variety of factors. Infection prevention programs, in particular, faced increasing workload demands with little opportunity to delegate tasks to others without specific infectious diseases or infection control expertise. Shortages of clinicians providing inpatient care to critically ill patients during the early phase of the pandemic were multifactorial, largely attributed to increasing demands on hospitals to provide care to patients hospitalized with COVID-19 and furloughs.1 HCP shortages and challenges during later surges, including the Omicron variant-associated surges, were largely attributed to HCP infections and associated work restrictions during isolation periods and the need to care for family members, particularly children, with COVID-19. Additionally, the detrimental physical and mental health impact of COVID-19 on HCP has led to attrition, which further exacerbates shortages.2 Demands increased in post-acute and long-term care (PALTC) settings, which already faced critical staffing challenges, difficulty with recruitment, and high rates of turnover. Although individual healthcare organizations and state and federal governments have taken actions to mitigate recurring shortages, additional work and innovation are needed to develop longer-term solutions to improve healthcare workforce resiliency. The critical role of those with specialized training in infection prevention, including healthcare epidemiologists, was well-demonstrated in pandemic preparedness and response. The COVID-19 pandemic underscored the need to support growth in these fields.3 This commentary outlines the need to develop the US healthcare workforce in preparation for future pandemics.
Throughout history, pandemics and their aftereffects have spurred society to make substantial improvements in healthcare. After the Black Death in 14th century Europe, changes were made to elevate standards of care and nutrition that resulted in improved life expectancy.1 The 1918 influenza pandemic spurred a movement that emphasized public health surveillance and detection of future outbreaks and eventually led to the creation of the World Health Organization Global Influenza Surveillance Network.2 Most recently, the COVID-19 pandemic exposed many of the pre-existing problems within the US healthcare system, which included (1) a lack of capacity to manage a large influx of contagious patients while simultaneously maintaining routine and emergency care for non-COVID patients; (2) a “just in time” supply network that led to shortages and competition among hospitals, nursing homes, and other care sites for essential supplies; and (3) longstanding inequities in the distribution of healthcare and the healthcare workforce. The decades-long shift from domestic manufacturing to a reliance on global supply chains has compounded ongoing gaps in preparedness for supplies such as personal protective equipment and ventilators. Inequities in racial and socioeconomic outcomes highlighted during the pandemic have accelerated the call to focus on diversity, equity, and inclusion (DEI) within our communities. The pandemic also accelerated cooperation between government entities and the healthcare system, resulting in the swift implementation of mitigation measures and the delivery of new therapies and vaccines at unprecedented speed, despite our fragmented healthcare delivery system and political divisions.
Still, widespread misinformation or disinformation and political divisions contributed to eroded trust in the public health system and prevented an even uptake of mitigation measures, vaccines and therapeutics, impeding our ability to contain the spread of the virus in this country.3 Ultimately, the lessons of COVID-19 illustrate the need to better prepare for the next pandemic. Rising microbial resistance, emerging and re-emerging pathogens, increased globalization, an aging population, and climate change are all factors that increase the likelihood of another pandemic.4
Recent theories suggest that for youth highly sensitive to incentives, perceiving more social threat may contribute to social anxiety (SA) symptoms. In 129 girls (ages 11–13) oversampled for shy/fearful temperament, we thus examined how interactions between neural responses to social reward (vs. neutral) cues (measured during anticipation of peer feedback) and perceived social threat in daily peer interactions (measured using ecological momentary assessment) predict SA symptoms two years later. No significant interactions emerged when neural reward function was modeled as a latent factor. Secondary analyses showed that higher perceived social threat was associated with more severe SA symptoms two years later only for girls with higher basolateral amygdala (BLA) activation to social reward cues at baseline. Interaction effects were specific to BLA activation to social reward (not threat) cues, though a main effect of BLA activation to social threat (vs. neutral) cues on SA emerged. Unexpectedly, interactions between social threat and BLA activation to social reward cues also predicted generalized anxiety and depression symptoms two years later, suggesting possible transdiagnostic risk pathways. Perceiving high social threat may be particularly detrimental for youth highly sensitive to reward incentives, potentially due to mediating reward learning processes, though this remains to be tested.
Knowledge of sex differences in risk factors for posttraumatic stress disorder (PTSD) can contribute to the development of refined preventive interventions. Therefore, the aim of this study was to examine if women and men differ in their vulnerability to risk factors for PTSD.
Methods
As part of the longitudinal AURORA study, 2924 patients seeking emergency department (ED) treatment in the acute aftermath of trauma provided self-report assessments of pre-, peri-, and post-traumatic risk factors, as well as 3-month PTSD severity. We systematically examined sex-dependent effects of 16 risk factors that have previously been hypothesized to show different associations with PTSD severity in women and men.
Results
Women reported higher PTSD severity at 3 months post-trauma. Z-score comparisons indicated that for five of the 16 examined risk factors, the association with 3-month PTSD severity was stronger in men than in women. In multivariable models, interaction effects with sex were observed for pre-traumatic anxiety symptoms and acute dissociative symptoms; both showed stronger associations with PTSD in men than in women. Subgroup analyses suggested trauma type-conditional effects.
Conclusions
Our findings indicate mechanisms to which men might be particularly vulnerable, demonstrating that known PTSD risk factors might behave differently in women and men. Analyses did not identify any risk factors to which women were more vulnerable than men, pointing toward further mechanisms to explain women's higher PTSD risk. Our study illustrates the need for a more systematic examination of sex differences in contributors to PTSD severity after trauma, which may inform refined preventive interventions.
Marine litter poses a complex challenge in Indonesia, necessitating a well-informed and coordinated strategy for effective mitigation. This study investigates the seasonality of plastic concentrations around Sulawesi Island in central Indonesia during monsoon-driven wet and dry seasons. By using open data and methodologies including the HYCOM and Parcels models, we simulated the dispersal of plastic waste over 3 months during both the southwest and northeast monsoons. Our research extended beyond data analysis, as we actively engaged with local communities, researchers and policymakers through a range of outreach initiatives, including the development of a web application to visualize model results. Our findings underscore the substantial influence of monsoon-driven currents on surface plastic concentrations, highlighting the seasonal variation in the risk to different regional seas. This study adds to the evidence provided by coarser resolution regional ocean modelling studies, emphasizing that seasonality is a key driver of plastic pollution within the Indonesian archipelago. Inclusive international collaboration and a community-oriented approach were integral to our project, and we recommend that future initiatives similarly engage researchers, local communities and decision-makers in marine litter modelling results. This study aims to support the application of model results in solutions to the marine litter problem.
We examined whether cannabis use contributes to the increased risk of psychotic disorder for non-western minorities in Europe.
Methods
We used data from the EU-GEI study (collected at sites in Spain, Italy, France, the United Kingdom, and the Netherlands) on 825 first-episode patients and 1026 controls. We estimated the odds ratio (OR) of psychotic disorder for several groups of migrants compared with the local reference population, without and with adjustment for measures of cannabis use.
Results
The OR of psychotic disorder for non-western minorities, adjusted for age, sex, and recruitment area, was 1.80 (95% CI 1.39–2.33). Further adjustment of this OR for frequency of cannabis use had a minimal effect: OR = 1.81 (95% CI 1.38–2.37). The same applied to adjustment for frequency of use of high-potency cannabis. Likewise, adjustments of ORs for most sub-groups of non-western countries had a minimal effect. There were two exceptions. For the Black Caribbean group in London, after adjustment for frequency of use of high-potency cannabis the OR decreased from 2.45 (95% CI 1.25–4.79) to 1.61 (95% CI 0.74–3.51). Similarly, the OR for Surinamese and Dutch Antillean individuals in Amsterdam decreased after adjustment for daily use: from 2.57 (95% CI 1.07–6.15) to 1.67 (95% CI 0.62–4.53).
Conclusions
The contribution of cannabis use to the excess risk of psychotic disorder for non-western minorities was small. However, some evidence of an effect was found for people of Black Caribbean heritage in London and for those of Surinamese and Dutch Antillean heritage in Amsterdam.
Although food insecurity affects a significant proportion of young children in New Zealand (NZ)(1), evidence of its association with dietary intake and sociodemographic characteristics in this population is lacking. This study aims to assess the household food security status of young NZ children and its association with energy and nutrient intake and sociodemographic factors. This study included 289 caregiver and child (1-3 years old) dyads from the same household in either Auckland, Wellington, or Dunedin, NZ. Household food security status was determined using a validated and NZ-specific eight-item questionnaire(2). Usual dietary intake was determined from two 24-hour food recalls, using the multiple source method(3). The prevalence of inadequate nutrient intake was assessed using the Estimated Average Requirement (EAR) cut-point method and full probability approach. Sociodemographic factors (i.e., socioeconomic status, ethnicity, caregiver education, employment status, household size and structure) were collected from questionnaires. Linear regression models were used to estimate associations with statistical significance set at p <0.05. Over 30% of participants had experienced food insecurity in the past 12 months. Of all eight indicator statements, “the variety of foods we are able to eat is limited by a lack of money,” had the highest proportion of participants responding “often” or “sometimes” (35.8%). Moderately food insecure children exhibited higher fat and saturated fat intakes, consuming 3.0 (0.2, 5.8) g/day more fat, and 2.0 (0.6, 3.5) g/day more saturated fat compared to food secure children (p<0.05). Severely food insecure children had lower g/kg/day protein intake compared to food secure children (p<0.05). In comparison to food secure children, moderately and severely food insecure children had lower fibre intake, consuming 1.6 (2.8, 0.3) g/day and 2.6 (4.0, 1.2) g/day less fibre, respectively. 
Severely food insecure children had the highest prevalence of inadequate calcium (7.0%) and vitamin C (9.3%) intakes, compared with food secure children [prevalence of inadequate intakes: calcium (2.3%) and vitamin C (2.8%)]. Household food insecurity was more common in those of Māori or Pacific ethnicity; living in areas of high deprivation; having a caregiver who was younger, not in paid employment, or had low educational attainment; living with ≥2 other children in the household; and living in a sole-parent household. Food insecure young NZ children consume a diet that exhibits lower nutritional quality in certain measures compared to their food-secure counterparts. Food insecurity was associated with various sociodemographic factors that are closely linked with poverty or low income. As such, there is an urgent need for poverty mitigation initiatives to safeguard vulnerable young children from the adverse consequences of food insecurity.
Disparities in CHD outcomes exist across the lifespan. However, less is known about disparities for patients with CHD admitted to neonatal ICU. We sought to identify sociodemographic disparities in neonatal ICU admissions among neonates born with cyanotic CHD.
Materials & Methods:
Annual natality files from the US National Center for Health Statistics for years 2009–2018 were obtained. For each neonate, we identified sex, birthweight, pre-term birth, presence of cyanotic CHD, and neonatal ICU admission at time of birth, as well as maternal age, race, ethnicity, comorbidities/risk factors, trimester at start of prenatal care, educational attainment, and two measures of socio-economic status (Special Supplemental Nutrition Program for Women, Infants, and Children [WIC] status and insurance type). Multivariable logistic regression models were fit to determine the association of maternal socio-economic status with neonatal ICU admission. A covariate for race/ethnicity was then added to each model to determine whether race/ethnicity attenuated the relationship between socio-economic status and neonatal ICU admission.
Results:
Of 22,373 neonates born with cyanotic CHD, 77.2% had a neonatal ICU admission. Receipt of WIC benefits was associated with higher odds of neonatal ICU admission (adjusted odds ratio [aOR] 1.20, 95% CI 1.1–1.29, p < 0.01). Neonates born to non-Hispanic Black mothers had increased odds of neonatal ICU admission (aOR 1.20, 95% CI 1.07–1.35, p < 0.01), whereas neonates born to Hispanic mothers were at lower odds of neonatal ICU admission (aOR 0.84, 95% CI 0.76–0.93, p < 0.01).
Conclusion:
Maternal Black race and low socio-economic status are associated with increased risk of neonatal ICU admission for neonates born with cyanotic CHD. Further work is needed to identify the underlying causes of these disparities.
Nitrogen availability has an important influence on agricultural weed growth, because many weeds in annual cropping systems are more competitive in high-nitrogen soils. A potential method to control nitrogen availability is through soil carbon amendments, which stimulate soil microbial growth and immobilize nitrogen. Additionally, carbon amendments may alter soil microbial community composition, increase soil biological functioning, and improve soil health. In a 2-yr field experiment in corn (Zea mays L.) and soybean [Glycine max (L.) Merr.], we implemented five amendment treatments to test their ability to alter weed and crop growth through soil nitrogen availability and soil biological functioning. The treatments included: an untreated control; an unamended weed-free control; rye hay, adding 3,560 kg C ha−1 in 2020 and 3,350 kg C ha−1 in 2021; sawdust, adding 5,030 kg C ha−1 in 2020 and 4,350 kg C ha−1 in 2021; and a combined rye hay and sawdust treatment, adding 8,590 kg C ha−1 in 2020 and 7,700 kg C ha−1 in 2021. Each treatment was replicated five times in corn and six times in soybean. Each season, we explored correlations between crop and weed biomass and weed community composition and nitrogen immobilization measured through soil respiration and nitrogen availability. We also explored changes to the soil microbial community composition and soil health as a secondary result of the carbon amendment treatments. Nitrogen availability was lowest in plots treated with the highest C:N amendment. Increasing carbon improved soil health metrics, but the microbial community composition was most affected by the rye hay treatment. Amendments with high C:N reduced weed growth in both soybean and corn plots but only selected for specific weed communities in soybean, leading to improved soybean competitiveness against weeds. In corn, crop growth and weed community composition remained consistent across amendment treatments.
Targeted nitrogen immobilization may improve leguminous crop competition in some weed communities as part of an integrated weed management program.
Odd Radio Circles (ORCs) are a class of low surface brightness, circular objects approximately one arcminute in diameter. ORCs were recently discovered in the Australian Square Kilometre Array Pathfinder (ASKAP) data and subsequently confirmed with follow-up observations on other instruments, yet their origins remain uncertain. In this paper, we suggest that ORCs could be remnant lobes of powerful radio galaxies, re-energised by the passage of a shock. Using relativistic hydrodynamic simulations with synchrotron emission calculated in post-processing, we show that buoyant evolution of remnant radio lobes is alone too slow to produce the observed ORC morphology. However, the passage of a shock can produce both filled and edge-brightened ORC-like morphologies for a wide variety of shock and observing orientations. Circular ORCs are predicted to have host galaxies near the geometric centre of the radio emission, consistent with observations of these objects. Significantly offset hosts are possible for elliptical ORCs, potentially causing challenges for accurate host galaxy identification. Observed ORC number counts are broadly consistent with a paradigm in which moderately powerful radio galaxies are their progenitors.
The UK Soft Drinks Industry Levy (SDIL) (announced in March 2016; implemented in April 2018) aims to incentivise reformulation of soft drinks to reduce added sugar levels. The SDIL has been applauded as a policy success, and it has survived calls from parliamentarians for it to be repealed. We aimed to explore parliamentary reaction to the SDIL following its announcement until two years post-implementation in order to understand how health policy can become established and resilient to opposition.
Design:
Searches of Hansard for parliamentary debate transcripts that discussed the SDIL retrieved 186 transcripts, with 160 included after screening. Five stages of Applied Thematic Analysis were conducted: familiarisation and creation of initial codebooks; independent second coding; codebook finalisation through team consensus; final coding of the dataset to the complete codebook; and theme finalisation through team consensus.
Setting:
The United Kingdom Parliament
Participants:
N/A
Results:
Between the announcement (16/03/2016) and royal assent (26/04/2017), two themes were identified: (1) SDIL welcomed cross-party; (2) SDIL a good start but not enough. Between royal assent and implementation (05/04/2018), one theme was identified: (3) The SDIL worked – what next? The final theme, identified from implementation until 16/03/2020, was (4) Moving on from the SDIL.
Conclusions:
After the announcement, the SDIL had cross-party support and was recognised to have encouraged reformulation prior to implementation. Lessons for governments indicate that the combination of cross-party support and a policy’s documented success in achieving its aim can help cement its resilience to opposition and threats of repeal.
Self-concept becomes reliant on social comparison, potentially leading to excessive self-focused attention, persistently negative self-concept and increased risk for depression during early adolescence. Studies have implicated neural activation in cortical midline brain structures in self-related information processing, yet it remains unclear how this activation may underlie subjective self-concept and links to depression in adolescence. We examined these associations by assessing neural activity during negative vs. positive self-referential processing in 39 11-to-13-year-old girls. During a functional neuroimaging task, girls reported on their perceptions of self-concept by rating how true they believed positive and negative personality traits were about them. Girls reported on depressive symptoms at the scan and 6 months later. Activation in the dorsomedial and ventrolateral prefrontal cortices (dMPFC; VLPFC) and visual association area was significantly associated with subjective self-concept and/or depressive symptoms at the scan or 6 months later. Exploratory models showed higher activation in the dMPFC to Self-negative > Self-positive was indirectly associated with concurrent depressive symptoms through more negative self-concept. Higher activation in the visual association area to Self-positive > Self-negative was associated with lower depressive symptoms at follow-up through more positive self-concept. Findings highlight how differential neural processing of negative versus positive self-relevant information maps onto perceptions of self-concept and adolescent depression.