Plotinus' Enneads is a work which is central to the history of philosophy in late antiquity. This is the second edition of the first English translation of the complete works of Plotinus in one volume in seventy years, which also includes Porphyry's Life of Plotinus. Led by Lloyd P. Gerson, a team of experts present up-to-date translations which are based on the best available text, the edition minor of Henry and Schwyzer and its corrections. The translations are consistent in their vocabulary, making the volume ideal for the study of Plotinus' philosophical arguments. This second edition includes a number of corrections, as well as additional cross-references to enrich the reader's understanding of Plotinus' sometimes very difficult presentation of his ideas. It will be invaluable for scholars of Plotinus with or without ancient Greek, as well as for students of the Platonic tradition.
Functional impairment in daily activities, such as work and socializing, is part of the diagnostic criteria for major depressive disorder and most anxiety disorders. Despite evidence that symptom severity and functional impairment are partially distinct, functional impairment is often overlooked. To assess whether functional impairment captures diagnostically relevant genetic liability beyond that of symptoms, we aimed to estimate the heritability of, and genetic correlations between, key measures of current depression symptoms, anxiety symptoms, and functional impairment.
Methods
In 17,130 individuals with lifetime depression or anxiety from the Genetic Links to Anxiety and Depression (GLAD) Study, we analyzed total scores from the Patient Health Questionnaire-9 (depression symptoms), Generalized Anxiety Disorder-7 (anxiety symptoms), and Work and Social Adjustment Scale (functional impairment). Genome-wide association analyses were performed with REGENIE. Heritability was estimated using GCTA-GREML and genetic correlations with bivariate-GREML.
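As a purely illustrative example of the phenotypic step, the sketch below computes pairwise Pearson correlations among the three questionnaire totals from a hypothetical table of responses; the file and column names are assumptions rather than GLAD Study identifiers, and the heritability and genetic-correlation estimates themselves come from GCTA-GREML and bivariate-GREML on the genotype data, not from this snippet.

```python
# Illustrative sketch only: pairwise phenotypic correlations among the three
# scale totals. The file and column names (phq9_total, gad7_total, wsas_total)
# are hypothetical, not those of the GLAD Study dataset.
from itertools import combinations

import pandas as pd
from scipy import stats

df = pd.read_csv("glad_phenotypes.csv")  # hypothetical extract of scale totals

scales = ["phq9_total", "gad7_total", "wsas_total"]
for a, b in combinations(scales, 2):
    pair = df[[a, b]].dropna()                    # complete cases for this pair
    r, p = stats.pearsonr(pair[a], pair[b])
    print(f"{a} vs {b}: Pearson r = {r:.2f} (p = {p:.2g})")
```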
Results
The phenotypic correlations were moderate across the three measures (Pearson’s r = 0.50–0.69). All three scales were found to be under low but significant genetic influence (single-nucleotide polymorphism-based heritability [h2SNP] = 0.11–0.19) with high genetic correlations between them (rg = 0.79–0.87).
Conclusions
Among individuals with lifetime depression or anxiety from the GLAD Study, the genetic variants that underlie symptom severity largely overlap with those influencing functional impairment. This suggests that self-reported functional impairment, while clinically relevant for diagnosis and treatment outcomes, does not reflect substantial additional genetic liability beyond that captured by symptom-based measures of depression or anxiety.
Sustainability in Aotearoa New Zealand’s food system is essential for environmental health (taiao ora) and human well-being (tangata ora). However, achieving resilience in our food system faces significant cross-sector challenges, requiring a national food strategy that addresses environmental, economic, and social pressures(1). This work aims to develop the first national computational model of Aotearoa New Zealand’s food system, integrating key factors into a decision support tool. The model aims to support food system resilience by offering an accessible platform that could help inform decisions to strengthen preparedness for shocks, while also providing insights to enhance everyday food security. The Kai Anamata mō Aotearoa (KAMA) model leverages new data and indigenous crop trials to combine work across agriculture, environment, and human wellbeing, forming a comprehensive tool to examine food system resilience. This model will capture the resources required, outputs produced, and wellbeing outcomes of our food system. The KAMA model was built using a flow-state modelling approach, which allows for flexible configuration of land uses and ensures that the model can adapt to future technologies and climate change scenarios. A preliminary version of the KAMA model was used to represent the current production system and was applied to a regional case study from Te Tauihu, integrating region-specific food production data, including apple, kiwifruit, mussel, wine, and hop production. Outputs included labour, carbon dioxide emissions, and mass of production. Beyond food production, this model will enable users to explore the impacts of land use for commodity production, the effects of trade, nutrient supply, and the broader implications for well-being. The model will be made publicly accessible online to allow any interested individual to explore the future of the national food system.
Although current estimates suggest that global food production is enough to meet nutritional needs, there are still significant challenges with equitable distribution(1). Tackling these disparities is essential for achieving global nutrition security now and in the future. This study uses the DELTA Model® to analyse global nutrient supply dynamics at national resolution and address nutritional shortfalls in specific countries(2). By examining the distribution of food commodities and nutrients in 2020, we project the future food and nutrient production needs for 2050 to ensure adequate global supply. Our findings indicate that while some nutrients are sufficiently supplied on a global scale, many countries face significant national deficiencies in essential nutrients such as vitamins A, B12, B2, potassium, and iron. Addressing these gaps will require substantial increases in nutrient supply or redistribution. For example, a 1% increase in global protein, targeted at countries with insufficient protein, could close the 2020 gaps. However, if current consumption patterns persist, the global food system will need a 26% increase in production by 2050 to accommodate population growth and changing consumption patterns. Our study developed a framework for exploring future production scenarios. This involves reducing surplus national nutrient supply linearly over decades while simultaneously increasing production of undersupplied nutrients. This framework provides a more practical assessment of future needs, transitioning from idealized production scenarios to realistic projections. Our study investigated a potential future for nutrient supply to meet minimum requirements by 2050. Given their severe deficiencies in 2020, calcium and vitamin E production must be increased to address significant gaps. Energy and fibre production will need to peak between 2030 and 2040 before stabilizing near 2020 levels. Predicted changes in nutrient supply from 2020 to 2050 vary: while calcium and vitamin E will need to increase, phosphorus, thiamine and the indispensable amino acids can decrease without compromising global nutrition, provided there is minor redistribution. These results are essential for determining the food supply required to achieve adequate global nutrient supply in the future. Incorporating these insights into global food balance models will provide key stakeholders with evidence, refine future projections, and inform policy decisions aimed at promoting sustainable healthy diets worldwide.
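The transition framework can be illustrated with a toy calculation; the sketch below is not the DELTA Model® itself, only a minimal example of the linear adjustment idea, with invented numbers and a hypothetical function name.

```python
# Illustrative sketch of the linear-transition idea described above (not the
# DELTA Model itself): a country's supply of a nutrient moves linearly from
# its 2020 level towards the minimum requirement by 2050. Undersupplied
# nutrients ramp up; surplus supply is reduced towards the requirement,
# freeing production for redistribution. All numbers are invented.

def supply_trajectory(supply_2020: float, requirement: float,
                      start: int = 2020, end: int = 2050) -> dict[int, float]:
    """Decadal supply values on a straight line from the 2020 level to the requirement."""
    span = end - start
    return {year: supply_2020 + (requirement - supply_2020) * (year - start) / span
            for year in range(start, end + 1, 10)}

# Hypothetical example: a country at 80% of its calcium requirement in 2020.
print(supply_trajectory(supply_2020=0.8, requirement=1.0))
# -> roughly {2020: 0.80, 2030: 0.87, 2040: 0.93, 2050: 1.00}
```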
Background: While efgartigimod usage is expected to reduce immunoglobulin (IG) utilization, evidence in clinical practice is limited. Methods: In this retrospective cohort study, patients with generalized myasthenia gravis (gMG) treated with efgartigimod for ≥1 year were identified from US medical/pharmacy claims data (April 2016-January 2024) and data from the My VYVGART Path patient support program (PSP). The number of IG courses during the 1 year before and after efgartigimod initiation (index date) was evaluated. Patients with ≥6 annual IG courses were considered chronic IG users. Myasthenia Gravis Activities of Daily Living (MG-ADL) scores before and after index were obtained from the PSP where available. Descriptive statistics were used without adjustment for covariates. Results: 167 patients with ≥1 IG claim before index were included. Prior to efgartigimod initiation, the majority of patients (62%) received IG chronically. During the 1 year after index, the number of IG courses fell by 95% (pre: 1531, post: 75). 89% (n=149/167) of patients fully discontinued IG usage. Mean (SD) best follow-up MG-ADL scores were significantly reduced after index (8.0 [4.1] to 2.8 [2.1], P<0.05, n=73/167, 44%). Conclusions: Based on US claims, IG utilization was substantially reduced among patients who continued efgartigimod for ≥1 year, with patients demonstrating a favorable MG-ADL response.
Background: Our prior six-year review (n=2165) revealed 24% of patients undergoing posterior decompression surgeries (laminectomy or discectomy) sought emergency department (ED) care within three months post-surgery. We established an integrated Spine Assessment Clinic (SAC) to enhance patient outcomes and minimize unnecessary ED visits through pre-operative education, targeted QI interventions, and early post-operative follow-up. Methods: We reviewed 13 months of posterior decompression data (n=205) following SAC implementation. These patients received individualized, comprehensive pre-operative education and follow-up phone calls within 7 days post-surgery. ED visits within 90 days post-surgery were tracked using provincial databases and compared to our pre-SAC implementation data. Results: Out of 205 patients, 24 (11.6%) accounted for 34 ED visits within 90 days post-op, showing a significant reduction in ED visits from 24% to 11.6%, and decreased overall ED utilization from 42.1% to 16.6% (when accounting for multiple visits by the same patient). Early interventions including wound monitoring, outpatient bloodwork, and prescription adjustments for pain management, helped mitigate ED visits. Patient satisfaction surveys (n=62) indicated 92% were “highly satisfied” and 100% would recommend the SAC. Conclusions: The SAC reduced ED visits after posterior decompression surgery by over 50%, with pre-operative education, focused QI initiatives, and its individualized, proactive approach.
DSM-5 specifies bulimia nervosa (BN) severity based on specific thresholds of compensatory behavior frequency. There is limited empirical support for such severity groupings. Limited support could be because the DSM-5’s compensatory behavior frequency cutpoints are inaccurate or because compensatory behavior frequency does not capture true underlying differences in severity. In support of the latter possibility, some work has suggested shape/weight overvaluation or use of single versus multiple purging methods may be better severity indicators. We used structural equation modeling (SEM) Trees to empirically determine the ideal variables and cutpoints for differentiating BN severity, and compared the SEM Tree groupings to alternate severity classifiers: the DSM-5 indicators, single versus multiple purging methods, and a binary indicator of shape/weight overvaluation.
Methods
Treatment-seeking adolescents and adults with BN (N = 1017) completed self-report measures assessing BN and comorbid symptoms. SEM Trees specified an outcome model of BN severity and recursively partitioned this model into subgroups based on shape/weight overvaluation and compensatory behaviors. We then compared groups on clinical characteristics (eating disorder symptoms, depression, anxiety, and binge eating frequency).
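To make the partitioning step concrete, the toy sketch below illustrates how a single covariate cutpoint can be chosen by likelihood gain. It is a deliberate simplification: the published analysis fitted a full structural equation model of BN severity at each node (SEM Trees, e.g. the semtree package in R), whereas this sketch uses a plain Gaussian model of a composite severity score, and all data and names are hypothetical.

```python
# Highly simplified sketch of the split-search logic behind SEM Trees.
# The "outcome model" here is reduced to a Gaussian likelihood for a single
# severity score, purely to illustrate how a cutpoint is selected.
import numpy as np
from scipy import stats

def gaussian_loglik(y: np.ndarray) -> float:
    """Log-likelihood of y under a normal model with MLE mean and variance."""
    return float(np.sum(stats.norm.logpdf(y, loc=y.mean(), scale=y.std(ddof=0) + 1e-9)))

def best_split(moderator: np.ndarray, severity: np.ndarray) -> tuple[float, float]:
    """Cutpoint of the moderator giving the largest likelihood gain over the pooled model."""
    pooled = gaussian_loglik(severity)
    best_cut, best_gain = np.nan, -np.inf
    for cut in np.unique(moderator)[1:]:            # candidate cutpoints
        left, right = severity[moderator < cut], severity[moderator >= cut]
        if len(left) < 30 or len(right) < 30:       # minimum node size
            continue
        gain = gaussian_loglik(left) + gaussian_loglik(right) - pooled
        if gain > best_gain:
            best_cut, best_gain = cut, gain
    return best_cut, best_gain

# Hypothetical data: overvaluation ratings and a composite severity score.
rng = np.random.default_rng(0)
overvaluation = rng.uniform(0, 6, 1000)
severity = 2 + 0.8 * (overvaluation > 3.75) + rng.normal(0, 1, 1000)
print(best_split(overvaluation, severity))
```

In the full SEM Tree procedure this search is applied recursively within each resulting subgroup, which is how the five overvaluation-based groups reported below arise.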
Results
SEM Tree analyses resulted in five severity subgroups, all based on shape/weight overvaluation: overvaluation <1.25; overvaluation 1.25–3.74; overvaluation 3.75–4.74; overvaluation 4.75–5.74; and overvaluation ≥5.75. SEM Tree groups explained 1.63–6.41 times the variance explained by other severity schemes.
Conclusions
Shape/weight overvaluation outperformed the DSM-5 severity scheme and single versus multiple purging methods, suggesting the DSM-5 severity scheme should be reevaluated. Future research should examine the predictive utility of this severity scheme.
Aims: The NHS Southern Gambling Service (SGS) opened in 2022 and provides evidence-based assessment and treatment for people affected by gambling disorder across the South East of England. It is known that gambling venues are often placed in highly deprived areas, where populations vulnerable to gambling disorder reside. Little is known about whether the geographical presence of gambling venues is linked to higher rates of referrals for gambling disorder to clinical services. The aim was to examine the association between the incidence of referrals to the SGS and the number of registered gambling venues across the geographical footprint of the regional service, adjusting for indices of multiple deprivation.
Methods: Service-level data for referrals per Lower-Tier Local Authority (LTLA) were merged with open-access national datasets for indices of multiple deprivation (Office for National Statistics 2021) and the number of gambling venues in each area (Gambling Commission 2024). Linear regression analyses were performed in-sample to identify the strength of the associations between the number of referrals and the number of registered venues, adjusted for indices of multiple deprivation (IMD). This service evaluation was pre-registered with the Hampshire and Isle of Wight Healthcare NHS Foundation Trust Clinical Effectiveness team. All data analysis was conducted in R version 4.4.2.
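A minimal sketch of this adjusted analysis is given below, assuming a hypothetical per-LTLA table (the evaluation itself was run in R 4.4.2); the file and column names are illustrative only.

```python
# Illustrative sketch of the regression described above, not the service's code:
# referral counts per LTLA regressed on the number of registered gambling venues,
# with and without adjustment for an IMD measure. Names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

ltla = pd.read_csv("sgs_referrals_by_ltla.csv")   # columns: referrals, venues, imd_score

unadjusted = smf.ols("referrals ~ venues", data=ltla).fit()
adjusted = smf.ols("referrals ~ venues + imd_score", data=ltla).fit()

print(unadjusted.rsquared)                        # variance explained by venues alone
print(adjusted.params["venues"],                  # venue effect after IMD adjustment
      adjusted.tvalues["venues"],
      adjusted.pvalues["venues"])
```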
Results: A total of 668 participants were referred to the SGS from September 2022 to the end of November 2024. The correlation between venues and referral incidence was strong (Pearson’s r=0.58, p<0.001). The number of venues per LTLA was statistically associated with the incidence of referrals to the SGS (t=3.9, p<0.001), including after adjusting for IMD indices. The model which included only the number of venues as a predictor explained 33.3% of the variance in incidence rate (R²=0.3325, p<0.001).
Conclusion: Number of gambling venues was strongly associated with incidence of referrals to the SGS. This association remained strong even after adjusting for indices of multiple deprivation. These insights can help the SGS in the strategic planning of development and utilization of its future resources, and highlight the need to examine sources of referrals nationally and links to contextual factors such as presence of gambling venues. Further work is warranted to define the optimal granularity for dissecting the geospatial links between the location of gambling venues and referrals to NHS Gambling Treatment Services, to further establish the stability and generalizability of these findings, as well as to explore a broader range of implicated bio-socio-economic factors.
Posttraumatic stress disorder (PTSD) has been associated with advanced epigenetic age cross-sectionally, but the association between these variables over time is unclear. This study conducted meta-analyses to test whether new-onset PTSD diagnosis and changes in PTSD symptom severity over time were associated with changes in two metrics of epigenetic aging over two time points.
Methods
We conducted meta-analyses of the association between change in PTSD diagnosis and symptom severity and change in epigenetic age acceleration/deceleration (age-adjusted DNA methylation age residuals as per the Horvath and GrimAge metrics) using data from 7 military and civilian cohorts participating in the Psychiatric Genomics Consortium PTSD Epigenetics Workgroup (total N = 1,367).
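The per-cohort and meta-analytic steps can be sketched as follows; this is a simplified illustration under stated assumptions (hypothetical column names, and a reduced covariate set compared with the published models, which included additional covariates), not the workgroup's analysis code.

```python
# Simplified sketch of the analytic steps described above. Data, column names
# and the covariate set are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

def age_residuals(df: pd.DataFrame, dnam_col: str, age_col: str) -> pd.Series:
    """Age-adjusted epigenetic age: residuals of DNAm age regressed on chronological age."""
    return smf.ols(f"{dnam_col} ~ {age_col}", data=df).fit().resid

def cohort_interaction(df: pd.DataFrame) -> tuple[float, float]:
    """Beta and SE of the T1 residual x PTSD-change interaction on T2 residuals."""
    df = df.assign(resid_t1=age_residuals(df, "horvath_t1", "age_t1"),
                   resid_t2=age_residuals(df, "horvath_t2", "age_t2"))
    fit = smf.ols("resid_t2 ~ resid_t1 * ptsd_change", data=df).fit()
    term = "resid_t1:ptsd_change"
    return fit.params[term], fit.bse[term]

def fixed_effect_meta(betas: list[float], ses: list[float]) -> tuple[float, float]:
    """Inverse-variance weighted fixed-effect meta-analysis of per-cohort betas."""
    w = 1.0 / np.square(ses)
    beta = float(np.sum(w * np.array(betas)) / np.sum(w))
    se = float(np.sqrt(1.0 / np.sum(w)))
    return beta, se
```

The meta-analytic p-value would then come from comparing the pooled beta to its standard error against a normal reference, with multiple-testing adjustment as reported below.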
Results
Meta-analysis revealed that the interaction between Time 1 (T1) Horvath age residuals and new-onset PTSD over time was significantly associated with Horvath age residuals at T2 (meta β = 0.16, meta p = 0.02, p-adj = 0.03). The interaction between T1 Horvath age residuals and changes in PTSD symptom severity over time was significantly related to Horvath age residuals at T2 (meta β = 0.24, meta p = 0.05). No associations were observed for GrimAge residuals.
Conclusions
Results indicated that individuals who developed new-onset PTSD or showed increased PTSD symptom severity over time evidenced greater epigenetic age acceleration at follow-up than would be expected based on baseline age acceleration. This suggests that PTSD may accelerate biological aging over time and highlights the need for intervention studies to determine if PTSD treatment has a beneficial effect on the aging methylome.
The Hippoboscidae are ectoparasites of birds and mammals, which, as a group, are known to vector multiple diseases. Avipoxvirus (APV) is mechanically vectored by various arthropods and causes seasonal disease in wild birds in the United Kingdom (UK). Signs of APV and the presence of louse flies (Hippoboscidae) on Dunnocks Prunella modularis were recorded over a 16·5-year period in a rural garden in Somerset, UK. Louse flies collected from this site and other sites in England were tested for the presence of APV DNA and RNA sequences. Louse flies on Dunnocks were seen to peak seasonally three weeks prior to the peak of APV lesions, an interval consistent with the previously estimated incubation period of APV in Dunnocks. APV DNA was detected on 13/25 louse flies, Ornithomya avicularia and Ornithomya fringillina, taken from Dunnocks, both with and without lesions consistent with APV, at multiple sites in England. Collectively these data support the premise that louse flies may vector APV. The detection of APV in louse flies, from apparently healthy birds, and from sites where disease has not been observed in any host species, suggests that the Hippoboscidae could provide a non-invasive and relatively cheap method of monitoring avian diseases. This could provide advanced warnings of disease, including zoonoses, before they become clinically apparent.
There is a significant mortality gap between the general population and people with psychosis. Completion rates of regular physical health assessments for cardiovascular risk in this group are suboptimal. Point-of-care testing (POCT) for diabetes and hyperlipidaemia – providing an immediate result from a finger-prick – could improve these rates.
Aims
To evaluate the impact on patient–clinician encounters and on physical health check completion rates of implementing POCT for cardiovascular risk markers in early intervention in psychosis (EIP) services in South East England.
Method
A mixed-methods, real-world evaluation study was performed, with 40 POCT machines introduced across EIP teams in all eight mental health trusts in South East England from March to May 2021. Clinician training and support was provided. Numbers of completed physical health checks, HbA1c and lipid panel blood tests completed 6 and 12 months before and 6 months after introduction of POCT were collected for individual patients. Data were compared with those from the South West region, which acted as a control. Clinician questionnaires were administered at 2 and 8 months, capturing device usability and impacts on patient interactions.
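As a toy illustration of the kind of comparison reported in the results below, the sketch computes an odds ratio and Wald 95% confidence interval for test completion after versus before POCT introduction from a single 2x2 table; the counts are invented, and the published odds ratios come from the full regional comparison rather than a calculation like this.

```python
# Illustrative only: odds ratio and Wald 95% CI from a hypothetical 2x2 table of
# HbA1c test completion (rows: post vs. pre POCT; columns: completed vs. not).
import math

def odds_ratio_ci(a: int, b: int, c: int, d: int) -> tuple[float, float, float]:
    """OR and 95% CI for the 2x2 table [[a, b], [c, d]]."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - 1.96 * se_log)
    hi = math.exp(math.log(or_) + 1.96 * se_log)
    return or_, lo, hi

# Invented counts: completed / not completed, post vs. pre POCT.
print(odds_ratio_ci(a=120, b=80, c=75, d=125))    # -> OR = 2.5 with its CI
```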
Results
Post-POCT, South East England saw significant increases in HbA1c testing (odds ratio 2.02, 95% CI 1.17–3.49), lipid testing (odds ratio 2.38, 95% CI 1.43–3.97) and total completed health checks (odds ratio 3.61, 95% CI 1.94–7.94). These increases were not seen in the South West. Questionnaires revealed improved patient engagement, clinician empowerment and patients’ preference for POCT over traditional blood tests.
Conclusions
POCT is associated with improvements in the completion and quality of physical health checks, and thus could be a tool to enhance holistic care for individuals with psychosis.
Objectives/Goals: Personalized cancer therapy based on genomic testing is advancing patient care. Genomic alterations in fibroblast growth factor receptor (FGFR) predict response to FGFR inhibitors; however, the role of RNA expression and protein activation is not known. We propose to examine the phospho-proteomic signature in FGFR-altered cancers to identify new candidates for FGFR-targeted therapies. Methods/Study Population: In our preliminary study, we have curated a cohort of FGFR2 mutants (13 FGFR2 fusions and 4 FGFR2 point mutations) with known clinical outcomes to FGFR inhibitors and 8 FGFR2 wild-type (WT) cholangiocarcinoma tumor samples to investigate the phospho-proteomic fingerprint using a clinical-grade reverse phase protein array (RPPA). RPPAs are high-throughput, quantitative, antibody-based proteomics assays that can quantify hundreds of proteins in thousands of patient tissues, providing a high degree of sensitivity through laser capture microdissection (LCM). We have selected proteins in the FGFR signaling pathway, including FGFR2, AKT, ERK1/2, STAT1/3, FRS2, and PLCγ, to define the range of phospho-proteomic signal between FGFR2 WT and mutant cancers. All samples will undergo evaluation with RNASeq for gene expression. Results/Anticipated Results: Our initial analysis defined the range of RNA expression of FGFR2 and pFGFR2 protein signal (Y653/654 and Y769) between FGFR2 WT and FGFR2 mutant samples. On average, the FGFR2 mutant cohort displayed higher FGFR2 RNA expression compared to the FGFR2 WT cohort. There was no apparent correlation between RNA expression and clinical response to FGFR-targeted therapy. However, in this small cohort, there was no significant difference in FGFR2 phosphorylation between FGFR2 WT and mutant cancers. RPPA analysis of FGFR downstream signaling proteins revealed a wide range of phosphorylation, but no significant difference between FGFR2 WT and mutant cancers. Discussion/Significance of Impact: These findings illustrate the complexities of FGFR signaling between FGFR2 WT and mutant cancers. These data suggest that tumors with genomically WT FGFR may display increased pFGFR2 and downstream signaling phospho-proteins. We propose a larger study of cholangiocarcinoma to evaluate evidence of FGFR pathway activation in WT tumors.
The Early Minimally Invasive Removal of Intracerebral Hemorrhage (ENRICH) trial demonstrated that minimally invasive surgery to treat spontaneous lobar intracerebral hemorrhage (ICH) improved functional outcomes. We aimed to explore current management trends for spontaneous lobar ICH in Canada to assess practice patterns and determine whether further randomized controlled trials are needed to clarify the role of surgical intervention.
Methods:
Neurologists, neurosurgeons, physiatrists and trainees in these specialties were invited to complete a 16-question survey exploring three areas: (1) current management for spontaneous lobar ICH at their institution, (2) perceived influence of ENRICH on their practice and (3) perceived need for additional clinical trial data. Standard descriptive statistics were used to report categorical variables. The χ2 test was used to compare responses across specialties and career stages.
Results:
The survey was sent to 433 physicians, and 101 (23.3%) responded. Sixty-eight percent of participants reported that prior to publication of the ENRICH trial, spontaneous lobar ICH was primarily managed conservatively, with surgery reserved for life-threatening situations. Forty-three percent of participants did not foresee a significant increase in surgical intervention at their institution. Of neurosurgical respondents, 33% remained hesitant to offer surgical intervention beyond lifesaving operations. Only 5% reported routinely using specifically designed technologies to evacuate ICH. Seventy percent reported that another randomized controlled trial comparing nonsurgical to surgical management for spontaneous lobar ICH is needed.
Conclusions:
There is significant practice variability in the management of spontaneous lobar ICH across Canadian institutions, stressing the need for additional clinical trial data to determine the role of surgical intervention.
We present several arguments for the preeminence of social interactions in determining and giving shape to societies. In our view, a society can emerge from social interaction and relationship patterns without the need for establishing an a priori limit on who actually belongs to it. Markers of group identity are one element among many that allow societies to persist.
Corn (Zea mays L.) is an important crop that contributes to global food security, but understanding of how farm management practices and soil health affect corn grain nutrient composition, and therefore human health, is lacking. Leveraging Rodale Institute's Farming Systems Trial—a long-term field experiment established in 1981 in Kutztown, PA, USA—this study assessed the impact of different agricultural management systems on corn grain nutrient profiles in a trial where long-term management has produced differences in soil health indicators between treatments. The main plot factor was two tillage practices (intensive and reduced) and the subplot factor was four cropping systems (non-diversified conventional [nCNV], diversified conventional [dCNV], legume-based organic [ORG-LEG], and manure-based organic [ORG-MNR]). Generally, the levels of amino acids, vitamins, and protein in corn grain were greatest in the ORG-MNR system, followed by the ORG-LEG and dCNV systems, and finally the nCNV system. It is important to consider that the observed difference between the organic and conventionally grown grain could be due to variations in the corn hybrids used in those systems. However, nutrient composition of corn also differed between cropping systems that shared hybrids but differed in management practices (diversified crop rotation and cover cropping), which also contributed to differences in soil health indicators (soil compaction, soil protein, and organic C levels) that may in turn influence grain nutrient concentrations. With the exception of methionine, nutrient concentration in corn grain was not affected by different tillage regimes. These findings provide novel information on the corn grain nutritional quality of organic and conventional cropping systems after long-term management and give insights into how system-specific components affect the nutrient composition of corn grain.
The concept of interaction classes (iClasses) for multi-environment trial data was introduced to address the problem of summarising variety performance across environments in the presence of variety by environment interaction (VEI). The approach involves the fitting of a factor analytic linear mixed model (FALMM), with the resultant estimates of factor loadings being used to form groups of environments (iClasses) that discriminate varieties with different patterns of VEI. It is then meaningful to summarise variety performance across environments within iClasses. The iClass methodology was developed with respect to a FALMM in which the genetic effects for different varieties were assumed independent. This was done for pedagogical reasons but it was pointed out that the accuracy of variety selection is greatly enhanced by considering the genetic relatedness of varieties, either via ancestral or genomic information. The focus of the current paper is therefore to extend the iClass approach for FALMMs which incorporate such information. In addition, a measure of stability of variety performance across iClasses is defined. The utility of the approach for variety selection is illustrated using a multi-environment trial dataset from the lentil breeding programme operated by Agriculture Victoria.
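For readers less familiar with this model class, the factor analytic structure for the genetic effects is conventionally written as below. This is a sketch of the standard formulation with k factors, not a reproduction of the paper's own notation, and the ordering of varieties and environments varies between presentations.

$$
\boldsymbol{u} = (\boldsymbol{\Lambda} \otimes \mathbf{I}_m)\,\boldsymbol{f} + \boldsymbol{\delta},
\qquad
\operatorname{var}(\boldsymbol{f}) = \mathbf{I}_k \otimes \mathbf{G},
\qquad
\operatorname{var}(\boldsymbol{\delta}) = \boldsymbol{\Psi} \otimes \mathbf{G},
$$

where $\boldsymbol{u}$ contains the genetic effects of the $m$ varieties in the $p$ environments, $\boldsymbol{\Lambda}$ is the $p \times k$ matrix of environment loadings from which the iClasses are formed, $\boldsymbol{f}$ holds the variety scores, $\boldsymbol{\delta}$ is the lack of fit with diagonal specific variances $\boldsymbol{\Psi}$, and $\mathbf{G}$ is an identity matrix in the independent-varieties case but is replaced by a pedigree-based or genomic relationship matrix when relatedness is incorporated, as in the extension described here.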