The concept of the protein transition represents a shift from a diet rich in animal proteins to one richer in plant-based alternatives, largely in response to environmental sustainability concerns. However, a simple swap by replacing dairy protein with plant protein will lead to lower protein quality and a lower intake of key micronutrients that sit naturally within the dairy matrix. Owing to antagonistic effects within the plant food matrix, micronutrients in plant sources exhibit lower bioavailability which is not reflected in food composition data or dietary guidelines. The dairy matrix effect includes moderation of blood lipid levels in which calcium plays a key role. Protein recommendations often take a muscle-centric approach. Hence, strategies to increase the anabolic potential of plant proteins have focused on increasing total protein intake to counter the suboptimal amino acid composition relative to dairy protein or leucine fortification. However, emerging evidence indicates a role for nutrient interactions and non-nutrient components (milk exosomes, bioactive peptides) of the dairy matrix in modulating postprandial muscle protein synthesis rates. To ensure the food system transformation is environmentally sustainable and optimal from a nutrition perspective, consideration needs to be given to complementary benefits of different food matrices and the holistic evaluation of foods in the protein transition. This narrative review critically examines the role of dairy in the protein transition, emphasising the importance of the food matrix in nutrient bioavailability and muscle health. By considering both nutritional and sustainability perspectives, we provide a holistic evaluation of dairy’s contribution within evolving dietary patterns.
Objectives/Goals: Cutaneous lupus erythematosus (CLE) is an inflammatory skin manifestation of lupus. CLE lesions are frequently colonized by Staphylococcus aureus, a microbe known to promote IFN production and inflammation. Here, we investigate whether type I IFN and inflammatory gene signatures in CLE lesions can be modulated with a topical antibiotic treatment. Methods/Study Population: SLE patients with active CLE lesions (n = 12) were recruited and randomized into a week of topical treatment with either 2% mupirocin or petroleum jelly vehicle. Paired samples were collected before and after 7 days of treatment to assess microbial lesional skin responses. Microbial samples from nares and lesional skin were used to determine baseline and posttreatment Staphylococcus abundance and microbial community profiles by 16S rRNA gene sequencing. Inflammatory responses were evaluated by bulk RNA sequencing of lesional skin biopsies. Immunophenotyping of CLE lesions was performed using CIBERSORTx to deconvolute the RNA-seq data into predicted cell populations impacted by treatment. Results/Anticipated Results: We identified 173 differentially expressed genes in CLE lesions after topical mupirocin treatment. Mupirocin treatment decreased the abundance of Staphylococcus associated with CLE lesions without altering the overall diversity of the skin microbiota relative to vehicle. Decreased lesional Staphylococcus burden correlated with decreased IFN pathway signaling and inflammatory gene expression and increased barrier dysfunction. Interestingly, mupirocin treatment lowered skin monocyte levels, and this mupirocin-associated depletion of monocytes correlated with decreased inflammatory gene expression. Discussion/Significance of Impact: Mupirocin treatment decreased lesional Staphylococcus burden and this correlated with decreased IFN signaling and inflammatory gene expression. 
This study suggests a topical antibiotic could be employed to decrease lupus skin inflammation and type I IFN responses by reducing Staphylococcus colonization.
Geriatric (old age) psychiatry faces growing challenges amid Europe’s ageing population. This editorial emphasises the need for specialised training, mentorship and subspecialty recognition to attract young psychiatrists. By addressing structural gaps and fostering innovation, the field offers a rewarding career in enhancing older adults’ mental healthcare and quality of life.
In major depressive disorder (MDD), only ~35% of patients achieve remission after first-line antidepressant therapy. Using UK Biobank data, we identify sociodemographic, clinical, and genetic predictors of antidepressant response through self-reported outcomes, aiming to inform personalized treatment strategies.
Methods
In UK Biobank Mental Health Questionnaire 2, participants with MDD reported whether specific antidepressants helped them. We tested whether retrospective lifetime response to four selective serotonin reuptake inhibitors (SSRIs) (N = 19,516) – citalopram (N = 8335), fluoxetine (N = 8476), paroxetine (N = 2297) and sertraline (N = 5883) – was associated with sociodemographic (e.g. age, gender) and clinical factors (e.g. episode duration). Genetic analyses evaluated the association between CYP2C19 variation and self-reported response, while polygenic score (PGS) analysis assessed whether genetic predisposition to psychiatric disorders and antidepressant response predicted self-reported SSRI outcomes.
Results
71%–77% of participants reported positive responses to SSRIs. Non-response was significantly associated with alcohol and illicit drug use (OR = 1.59, p = 2.23 × 10⁻²⁰), male gender (OR = 1.25, p = 8.29 × 10⁻⁸), and lower income (OR = 1.35, p = 4.22 × 10⁻⁷). The worst episode lasting over 2 years (OR = 1.93, p = 3.87 × 10⁻¹⁶) and no mood improvement from positive events (OR = 1.35, p = 2.37 × 10⁻⁷) were also associated with non-response. CYP2C19 poor metabolizers had nominally higher non-response rates (OR = 1.31, p = 1.77 × 10⁻²). Higher PGS for depression (OR = 1.08, p = 3.37 × 10⁻⁵) predicted negative SSRI outcomes after multiple testing corrections.
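As a reminder of the arithmetic behind odds ratios such as those reported above, a minimal sketch — the 2×2 counts below are invented for illustration and are not the study's data:

```python
def odds_ratio(a, b, c, d):
    """Odds ratio from a 2x2 table:
    a = exposed non-responders, b = exposed responders,
    c = unexposed non-responders, d = unexposed responders."""
    return (a * d) / (b * c)

# Illustrative counts only (not taken from the UK Biobank analysis):
print(round(odds_ratio(a=300, b=700, c=200, d=800), 2))  # 1.71
```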
Conclusions
Self-reported antidepressant response in the UK Biobank is influenced by sociodemographic, clinical, and genetic factors, mirroring clinical response measures. While positive outcomes are more frequent than remission reported in clinical trials, these self-reports replicate known treatment associations, suggesting they capture meaningful aspects of antidepressant effectiveness from the patient’s perspective.
Diagnosis of ventilator-associated pneumonia (VAP) is challenging and relies heavily on respiratory culture results. The results of this survey underscore the potential for a diagnostic stewardship nudge limiting culture reports to “potential colonization or contamination” in those without clinical findings of VAP to decrease unnecessary antibiotic prescribing.
Antibiotic prophylaxis in children with vesicoureteral reflux (VUR) remains controversial. We reviewed patients diagnosed with VUR after an index urinary tract infection (UTI) who subsequently received antibiotic prophylaxis. Recurrent UTIs in patients with and without urologic anomalies occurred in 57% and 33%, respectively. Multidrug-resistant organisms accounted for 25% of first UTI recurrences.
A series of webinars covering a broad range of paediatric cardiology and cardiac surgery topics was initiated by the Association for European Paediatric and Congenital Cardiology, serving as preparation for the Association for European Paediatric and Congenital Cardiology certification in paediatric and congenital cardiology. This study investigated the impact of webinars as educational tools for junior paediatric cardiologists in the post-COVID-19 pandemic era.
Materials and methods:
A cross-sectional survey study was conducted using an online survey to assess trainees. A SurveyMonkey questionnaire with open- and closed-ended questions was used to document the learners’ opinions on the webinars. Results were reported using descriptive statistical analyses.
Results:
Twenty-seven Association for European Paediatric and Congenital Cardiology junior members participated in the online survey from twelve different countries. Most of the participants were trainees in paediatric cardiology (56%), and the remainder were junior consultants in paediatric cardiology. Approximately 70% found no difficulties in participating in the webinars. The webinars were appreciated by participants, who found the webinars interactive and highly educational with contents highly applicable to everyday clinical practice. Significant heterogeneity emerged in training programmes across Europe and worldwide in terms of programme duration, number of fellows, teaching approach, and assessments. Training opportunities such as courses, grants, and more webinars were suggested as tools to support continuous learning by the Association for European Paediatric and Congenital Cardiology.
Conclusion:
The Association for European Paediatric and Congenital Cardiology webinar series has confirmed the crucial role of online learning resources for the new generation of junior paediatric cardiologists. Association for European Paediatric and Congenital Cardiology webinars and the examination in paediatric cardiology may help standardise training across Europe, promoting the highest standards in patient care.
Accurate diagnosis of bipolar disorder (BPD) is difficult in clinical practice, with an average delay between symptom onset and diagnosis of about 7 years. A depressive episode often precedes the first manic episode, making it difficult to distinguish BPD from unipolar major depressive disorder (MDD).
Aims
We use genome-wide association analyses (GWAS) to identify differential genetic factors and to develop predictors based on polygenic risk scores (PRS) that may aid early differential diagnosis.
Method
Based on individual genotypes from case–control cohorts of BPD and MDD shared through the Psychiatric Genomics Consortium, we compile case–case–control cohorts, applying a careful quality control procedure. In a resulting cohort of 51 149 individuals (15 532 BPD patients, 12 920 MDD patients and 22 697 controls), we perform a variety of GWAS and PRS analyses.
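For readers unfamiliar with PRS construction, a polygenic risk score is conventionally a weighted sum of an individual's risk-allele dosages, with weights drawn from GWAS summary statistics. A minimal sketch with made-up effect sizes and genotypes (not the consortium data):

```python
import numpy as np

# Hypothetical GWAS effect sizes (log odds ratios) for 4 variants
betas = np.array([0.12, -0.05, 0.08, 0.20])

# Genotype dosages (0, 1, or 2 risk alleles) for 3 individuals x 4 variants
dosages = np.array([
    [0, 1, 2, 1],
    [2, 2, 0, 0],
    [1, 0, 1, 2],
])

# PRS: weighted sum of dosages for each individual
prs = dosages @ betas  # ≈ [0.31, 0.14, 0.60]

# Standardize within the sample so scores are comparable across cohorts
prs_z = (prs - prs.mean()) / prs.std()
```

In practice, variant selection, clumping, and p-value thresholding (or Bayesian shrinkage) precede this weighted sum, but the core computation is this dot product.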
Results
Although our GWAS is not well powered to identify genome-wide significant loci, we find significant chip heritability and demonstrate the ability of the resulting PRS to distinguish BPD from MDD, including BPD cases with depressive onset (BPD-D). We replicate our PRS findings in an independent Danish cohort (iPSYCH 2015, N = 25 966). We observe strong genetic correlation between our case–case GWAS and that of case–control BPD.
Conclusions
We find that MDD and BPD, including BPD-D, are genetically distinct. Our findings support the view that controls, MDD and BPD patients primarily lie on a continuum of genetic risk. Future studies with larger and richer samples will likely yield a better understanding of these findings and enable the development of better genetic predictors distinguishing BPD and, importantly, BPD-D from MDD.
Clinical high-risk for psychosis (CHR-P) states exhibit diverse clinical presentations, prompting a shift towards broader outcome assessments beyond psychosis manifestation. To elucidate more uniform clinical profiles and their trajectories, we investigated CHR-P profiles in a community sample.
Methods
Participants (N = 829; baseline age: 16–40 years) comprised individuals from a Swiss community sample who were followed up over roughly 3 years. Latent class analysis was applied to CHR-P symptom data at baseline and follow-up, and classes were examined for demographic and clinical differences, as well as stability over time.
Results
Similar three-class solutions emerged at both time points. Class 1 was mainly characterized by subtle, subjectively experienced disturbances in mental processes, including thinking, speech and perception (basic symptoms [BSs]). Class 2 was characterized by subthreshold positive psychotic symptoms (i.e., mild delusions or hallucinations) indicative of an ultra-high risk for psychosis. Class 3, the largest group (comprising over 90% of participants), exhibited the lowest probability of experiencing any psychosis-related symptoms (CHR-P symptoms). Classes 1 and 2 included more participants with functional impairment and psychiatric morbidity. Class 3 participants had a low probability of having functional deficits or mental disorders at both time points, suggesting that Class 3 was the healthiest group and that their mental health and functioning remained stable throughout the study period. While 91% of Baseline Class 3 participants remained in their class over time, most Baseline Class 1 (74%) and Class 2 (88%) participants moved to Follow-up Class 3.
Conclusions
Despite some temporal fluctuations, CHR-P symptoms within community samples cluster into distinct subgroups, reflecting varying levels of symptom severity and risk profiles. This clustering highlights the largely distinct nature of BSs and attenuated positive symptoms within the community. The association of Classes 1 and 2 with Axis-I disorders and functional deficits emphasizes the clinical significance of CHR-P symptoms. These findings highlight the need for personalized preventive measures targeting specific risk profiles in community-based populations.
Anhedonia characterizes major depressive episodes in bipolar depression and is associated with more severe illness and poorer prognosis. These post hoc analyses assess the effect of cariprazine 1.5 and 3 mg/d on anhedonia symptoms in patients with bipolar I depression.
Methods
Data were pooled from 3 randomized, double-blind, placebo-controlled trials of cariprazine in bipolar I depression. Cariprazine 1.5 and 3 mg/d versus placebo were evaluated in patient subgroups stratified by median baseline MADRS anhedonia score (higher anhedonia = score ≥19; lower anhedonia = score <19). Outcomes included mean change from baseline to week 6 in MADRS total and anhedonia factor score (sum of the apparent sadness, reported sadness, concentration, lassitude, and inability to feel items). The proportion of patients with week 6 anhedonia factor response (≥50% improvement from baseline) was also determined. Changes from baseline were analyzed using a mixed-effects model for repeated measures.
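The anhedonia factor score and response criterion described above can be sketched directly; the item scores below are illustrative only (each MADRS item is rated 0–6):

```python
# The five MADRS items that make up the anhedonia factor score
ANHEDONIA_ITEMS = ["apparent_sadness", "reported_sadness",
                   "concentration", "lassitude", "inability_to_feel"]

def anhedonia_factor(madrs_items: dict) -> int:
    """Anhedonia factor score: sum of the five MADRS items above."""
    return sum(madrs_items[item] for item in ANHEDONIA_ITEMS)

def is_responder(baseline: int, week6: int) -> bool:
    """Anhedonia factor response: >=50% improvement from baseline."""
    return (baseline - week6) >= 0.5 * baseline

# Illustrative patient, not trial data
baseline = anhedonia_factor({"apparent_sadness": 4, "reported_sadness": 4,
                             "concentration": 3, "lassitude": 4,
                             "inability_to_feel": 5})
week6 = anhedonia_factor({"apparent_sadness": 2, "reported_sadness": 2,
                          "concentration": 1, "lassitude": 2,
                          "inability_to_feel": 2})
print(baseline, week6, is_responder(baseline, week6))  # 20 9 True
```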
Results
There were 760 patients in the higher anhedonia subgroup (placebo=249, cariprazine: 1.5 mg/d=261; 3 mg/d=250) and 623 patients in the lower anhedonia subgroup (placebo=211, cariprazine: 1.5 mg/d=200; 3 mg/d=212). Mean baseline MADRS total score was higher in the higher anhedonia subgroup (total=33.6) than in the lower anhedonia subgroup (total=27.6). Change from baseline to week 6 in MADRS total score was greater for both cariprazine doses versus placebo in the higher anhedonia subgroup (least squares mean difference [LSMD] and 95% confidence interval [CI]: 1.5 mg/d=-3.01 [-4.84, -1.19], P=.0012; 3 mg/d=-3.26 [-5.12, -1.40], P=.0006); in the lower anhedonia subgroup, cariprazine 1.5 mg/d was statistically significant versus placebo (-2.61 [-4.28, -0.93], P=.0024). In the higher anhedonia subgroup at week 6, change from baseline in anhedonia factor score was significant versus placebo for both cariprazine doses (1.5 mg/d=-1.97 [-3.13, -0.81], P=.0009; 3 mg/d=-2.07 [-3.26, -0.89], P=.0006); in the lower subgroup, the difference was significant versus placebo for cariprazine 1.5 mg/d (-1.70 [-2.77, -0.62], P=.0021). After adjusting for changes in other depressive symptoms, LSMDs versus placebo in the anhedonia factor score remained significant for cariprazine 1.5 mg/d (-1.21 [-2.05, -0.36], P=.0052) and 3 mg/d (-1.00 [-1.86, -0.14], P=.0233) in the higher anhedonia subgroup, and for 1.5 mg/d (-1.06 [-1.92, -0.19], P=.0164) in the lower subgroup. In the higher anhedonia subgroup, rates of anhedonia factor response were greater versus placebo (31.7%) for cariprazine 1.5 mg/d (44.8%, P=.0028) and 3 mg/d (45.6%, P=.0019); in the lower subgroup, response rates were 39.3% for placebo, 48.0% for 1.5 mg/d, and 46.7% for 3 mg/d. Adverse events reported in ≥5% of cariprazine-treated patients and at least twice the placebo rate were nausea, akathisia, restlessness, and extrapyramidal symptoms (EPS).
Importance
In patients with bipolar depression and anhedonia, cariprazine demonstrated potent antidepressant and antianhedonic effects in both the higher and lower anhedonia subgroups.
Funding
AbbVie
These data were previously presented at the European College of Neuropsychopharmacology (ECNP) Congress; Barcelona, Spain; October 7–10, 2023.
The study objective was to develop and validate a clinical decision support system (CDSS) to guide clinicians through the diagnostic evaluation of hospitalized individuals with suspected pulmonary tuberculosis (TB) in low-prevalence settings.
Methods:
The “TBorNotTB” CDSS was developed using a modified Delphi method. The CDSS assigns points based on epidemiologic risk factors, TB history, symptoms, chest imaging, and sputum/bronchoscopy results. Below a predefined point threshold, airborne isolation precautions are automatically discontinued; otherwise, additional evaluation, including infection control review, is recommended. The model was validated through retrospective application of the CDSS to all individuals hospitalized in the Mass General Brigham system from July 2016 to December 2022 with culture-confirmed pulmonary TB (cases) and an equal number of age- and testing-date-matched controls with three negative respiratory mycobacterial cultures.
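The point-and-threshold logic of such a CDSS can be sketched as follows; the factors, point values, and threshold here are hypothetical stand-ins for illustration, not the published TBorNotTB weights:

```python
# Hypothetical point weights for TB risk factors (illustrative only;
# the actual TBorNotTB weights were set via a modified Delphi process)
RISK_POINTS = {
    "residence_in_endemic_country": 3,
    "positive_igra": 3,
    "weight_loss": 2,
    "no_response_to_alternative_treatment": 2,
    "imaging_concerning_for_tb": 4,
}
THRESHOLD = 4  # below this total, airborne isolation is discontinued

def evaluate(findings: set) -> str:
    """Sum the points for the patient's findings and compare to threshold."""
    score = sum(RISK_POINTS[f] for f in findings)
    if score < THRESHOLD:
        return "discontinue airborne isolation"
    return "continue isolation; infection control review"

print(evaluate({"weight_loss"}))                                 # discontinue airborne isolation
print(evaluate({"positive_igra", "imaging_concerning_for_tb"}))  # continue isolation; infection control review
```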
Results:
104 individuals with TB (cases) and 104 controls were identified. Prior residence in a highly endemic country, positive interferon-gamma release assay, weight loss, absence of symptom resolution with treatment for alternative diagnoses, and findings concerning for TB on chest imaging were significant predictors of TB (all P < 0.05). CDSS contents and scoring were refined based on the case–control analysis. The final CDSS demonstrated 100% sensitivity and 27% specificity for TB with an AUC of 0.87.
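The reported operating characteristics follow from the standard definitions: 100% sensitivity means all 104 cases scored above threshold, and 27% specificity corresponds to roughly 28 of the 104 controls scoring below it (28/104 ≈ 0.27):

```python
def sensitivity(tp, fn):
    """Fraction of true cases correctly flagged."""
    return tp / (tp + fn)

def specificity(tn, fp):
    """Fraction of non-cases correctly ruled out."""
    return tn / (tn + fp)

# Counts consistent with the reported performance, not taken from the paper:
print(sensitivity(tp=104, fn=0))            # 1.0
print(round(specificity(tn=28, fp=76), 2))  # 0.27
```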
Conclusions:
The TBorNotTB CDSS demonstrated modest specificity and high sensitivity to detect TB even when AFB smears were negative. This CDSS, embedded into the electronic medical record system, could help reduce risks of nosocomial TB transmission, patient-time in airborne isolation, and person-time spent reviewing individuals with suspected TB.
Dual scaling (DS) is a multivariate exploratory method equivalent to correspondence analysis when analysing contingency tables. However, for the analysis of rating data, different proposals appear in the DS and correspondence analysis literature. It is shown here that a peculiarity of the DS method can be exploited to detect differences in response styles. Response styles occur when respondents use rating scales differently for reasons not related to the questions, often biasing results. A spline-based constrained version of DS is devised which can detect the presence of four prominent types of response styles, and is extended to allow for multiple response styles. An alternating nonnegative least squares algorithm is devised for estimating the parameters. The new method is appraised both by simulation studies and an empirical application.
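The alternating nonnegative least squares idea used for estimation can be illustrated on a toy rank-1 problem; this is only the generic alternating scheme with nonnegativity, not the paper's spline-constrained dual scaling estimator:

```python
import numpy as np

def alternating_nnls_rank1(X, n_iter=50, seed=0):
    """Fit X ≈ a bᵀ with a, b ≥ 0 by alternating nonnegative least squares.
    For a single factor, each NNLS subproblem has the closed form
    max(0, unconstrained least-squares solution)."""
    rng = np.random.default_rng(seed)
    b = rng.random(X.shape[1]) + 0.1  # strictly positive start
    a = np.zeros(X.shape[0])
    for _ in range(n_iter):
        a = np.maximum(0.0, X @ b / (b @ b))    # update row scores, b fixed
        b = np.maximum(0.0, X.T @ a / (a @ a))  # update column scores, a fixed
    return a, b

# Toy nonnegative rank-1 table, recovered exactly by the alternating updates
X = np.outer([1.0, 2.0, 3.0], [0.5, 1.0, 1.5, 2.0])
a, b = alternating_nnls_rank1(X)
print(np.allclose(np.outer(a, b), X))  # True
```

The full method in the paper additionally imposes spline-based constraints on the scores to capture response styles; the sketch above shows only the alternating estimation backbone.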
Tape rolls are often used for multiple patients despite manufacturers’ recommendations for single-patient use. We developed a survey to query healthcare personnel about their tape use practices and beliefs and uncovered behaviors that put patients at risk for hospital-acquired infections due to tape use.
Evidence-based insertion and maintenance bundles are effective in reducing the incidence of central line-associated bloodstream infections (CLABSI) in intensive care unit (ICU) settings. We studied the adoption and compliance of CLABSI prevention bundle programs and CLABSI rates in ICUs in a large network of acute care hospitals across Canada.
There is a growing trend for studies run by academic and nonprofit organizations to have regulatory submission requirements. As a result, there is greater reliance on REDCap, an electronic data capture (EDC) system widely used by researchers in these organizations. This paper discusses the development and implementation of the Rapid Validation Process (RVP) developed by the REDCap Consortium, aimed at enhancing regulatory compliance and operational efficiency in response to the dynamic demands of modern clinical research. The RVP introduces a structured validation approach that categorizes REDCap functionalities, develops targeted validation tests, and applies structured and standardized testing syntax. This approach ensures that REDCap can meet regulatory standards while maintaining flexibility to adapt to new challenges. Results from the application of the RVP on recent successive REDCap software version releases illustrate significant improvements in testing efficiency and process optimization, demonstrating the project’s success in setting new benchmarks for EDC system validation. The project’s community-driven responsibility model fosters collaboration and knowledge sharing and enhances the overall resilience and adaptability of REDCap. As REDCap continues to evolve based on feedback from clinical trialists, the RVP ensures that REDCap remains a reliable and compliant tool, ready to meet regulatory and future operational challenges.
The association between cannabis and psychosis is established, but the role of underlying genetics is unclear. We used data from the EU-GEI case-control study and UK Biobank to examine the independent and combined effect of heavy cannabis use and schizophrenia polygenic risk score (PRS) on risk for psychosis.
Methods
Genome-wide association study summary statistics from the Psychiatric Genomics Consortium and the Genomic Psychiatry Cohort were used to calculate schizophrenia and cannabis use disorder (CUD) PRS for 1098 participants from the EU-GEI study and 143,600 from the UK Biobank. Both datasets had information on cannabis use.
Results
In both samples, schizophrenia PRS and cannabis use independently increased risk of psychosis. Schizophrenia PRS was not associated with patterns of cannabis use in the EU-GEI cases or controls or UK Biobank cases. It was associated with lifetime and daily cannabis use among UK Biobank participants without psychosis, but the effect was substantially reduced when CUD PRS was included in the model. In the EU-GEI sample, regular users of high-potency cannabis had the highest odds of being a case independently of schizophrenia PRS (OR daily use high-potency cannabis adjusted for PRS = 5.09, 95% CI 3.08–8.43, p = 3.21 × 10⁻¹⁰). We found no evidence of interaction between schizophrenia PRS and patterns of cannabis use.
Conclusions
Regular use of high-potency cannabis remains a strong predictor of psychotic disorder independently of schizophrenia PRS, which does not seem to be associated with heavy cannabis use. These are important findings at a time of increasing use and potency of cannabis worldwide.
There is a growing awareness that diversity, health equity, and inclusion play a significant role in improving patient outcomes and advancing knowledge. The Pediatric Heart Network launched an initiative to incorporate diversity, health equity, and inclusion into its 2021 Scholar Award Funding Opportunity Announcement. This manuscript describes the process of incorporating diversity, health equity, and inclusion into the Pediatric Heart Network Scholar Award and the lessons learned. Recommendations for future Pediatric Heart Network grant application cycles are made which could be replicated by other funding agencies.
School food has a major influence on children’s diet quality and has the potential to reduce diet inequalities and non-communicable disease risk. Funded by the UK Prevention Research Partnership, we have established a UK school food system network. The overarching aim was to build a community to work towards a more health-promoting food and nutrition system in UK schools. The network has brought together a team from a range of disciplines, while the inclusion of non-academic users and other stakeholders, such as pupils and parents, has allowed the co-development of research priorities and questions. This network has used a combination of workshops, working groups and pump-priming projects to explore the school food system, as well as creating a systems map of the UK school food system and conducting network analysis of the newly established network. Through understanding the current food system and building network expertise, we hope to advance research and policy around food in schools. Further funding has been achieved based on these findings, working in partnership with policymakers and schools, while a Nutrition Society Special Interest Group has been established to ensure maximum engagement and future sustainability of the network. This review will describe the key findings and progress to date based on the work of the network, as well as a summary of the current literature, identification of knowledge gaps and areas of debate, according to key elements of the school food system.
Childhood maltreatment is linked with later depressive symptoms, but not every maltreated child will experience symptoms later in life. Therefore, we investigate whether genetic predisposition for depression (i.e., polygenic score for depression, PGSDEP) modifies the association between maltreatment and depressive symptoms, while accounting for different types of maltreatment and whether it was evaluated through prospective and retrospective reports. The sample included 541–617 participants from the Quebec Longitudinal Study of Child Development with information on maltreatment, including threat, deprivation, assessed prospectively (5 months–17 years) and retrospectively (reported at 23 years), PGSDEP and self-reported depressive symptoms (20–23 years). Using hierarchical linear regressions, we found that retrospective, but not prospective indicators of maltreatment (threat/deprivation/cumulative) were associated with later depressive symptoms, above and beyond the PGSDEP. Our findings also show the presence of gene–environment interactions, whereby the association between maltreatment (retrospective cumulative maltreatment/threat, prospective deprivation) and depression was strengthened among youth with higher PGSDEP scores. Consistent with the Diathesis-Stress hypothesis, our findings suggest that a genetic predisposition for depression may exacerbate the putative impact of maltreatment on later depressive symptoms, especially when maltreatment is retrospective. Understanding the gene–environment interplay emerging in the context of maltreatment has the potential to guide prevention efforts.
Using data from a 15-year longitudinal follow-up of a randomized controlled trial of a parenting-focused preventive intervention for divorced families (N = 240) with children aged 9–12, the current study examined alternative cascading pathways through which the intervention led to improvements in offspring’s perceived health problems, BMI, and cigarette smoking in emerging adulthood. It was hypothesized that the program would lead to improvements in these health-related outcomes during emerging adulthood through progressive associations between program-induced changes in parenting and offspring outcomes, including mental health problems, substance use, and competencies. Intervention-induced improvements in positive parenting at posttest led to improvements in mental health problems in late childhood/early adolescence, which led to lower levels of mental health and substance use problems as well as higher levels of competencies in adolescence, which led to improvements in the health-related outcomes. Academic performance predicted all three health-related outcomes and other aspects of adolescent functioning showed different relations across outcomes. Results highlight the potential for intervention effects of preventive parenting interventions in childhood to cascade over time to affect health-related outcomes in emerging adulthood.