Functional impairment in daily activities, such as work and socializing, is part of the diagnostic criteria for major depressive disorder and most anxiety disorders. Despite evidence that symptom severity and functional impairment are partially distinct, functional impairment is often overlooked. To assess whether functional impairment captures diagnostically relevant genetic liability beyond that of symptoms, we aimed to estimate the heritability of, and genetic correlations between, key measures of current depression symptoms, anxiety symptoms, and functional impairment.
Methods
In 17,130 individuals with lifetime depression or anxiety from the Genetic Links to Anxiety and Depression (GLAD) Study, we analyzed total scores from the Patient Health Questionnaire-9 (depression symptoms), Generalized Anxiety Disorder-7 (anxiety symptoms), and Work and Social Adjustment Scale (functional impairment). Genome-wide association analyses were performed with REGENIE. Heritability was estimated using GCTA-GREML and genetic correlations with bivariate-GREML.
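For readers unfamiliar with these quantities, the SNP-based heritability and genetic correlation reported below follow their standard GREML definitions (standard background, not specific to this study): the proportion of phenotypic variance attributable to additive genetic effects captured by genotyped SNPs, and the genetic covariance between two traits scaled by their genetic variances:

$$ h^2_{\mathrm{SNP}} = \frac{\sigma^2_g}{\sigma^2_g + \sigma^2_e}, \qquad r_g = \frac{\sigma_{g_{12}}}{\sqrt{\sigma^2_{g_1}\,\sigma^2_{g_2}}} $$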
Results
The phenotypic correlations were moderate across the three measures (Pearson’s r = 0.50–0.69). All three scales were found to be under low but significant genetic influence (single-nucleotide polymorphism-based heritability [h2SNP] = 0.11–0.19) with high genetic correlations between them (rg = 0.79–0.87).
Conclusions
Among individuals with lifetime depression or anxiety from the GLAD Study, the genetic variants that underlie symptom severity largely overlap with those influencing functional impairment. This suggests that self-reported functional impairment, while clinically relevant for diagnosis and treatment outcomes, does not reflect substantial additional genetic liability beyond that captured by symptom-based measures of depression or anxiety.
Lesbian, gay, and bisexual (LGB) individuals are more than twice as likely to experience anxiety and depression compared with heterosexuals. Minority stress theory posits that stigma and discrimination contribute to chronic stress, potentially affecting clinical treatment. We compared psychological therapy outcomes between LGB and heterosexual patients by gender.
Methods
Retrospective cohort data were obtained from seven NHS talking therapy services in London, from April 2013 to December 2023. Of 100,389 patients, 94,239 reported sexual orientation, of whom 7,422 identified as LGB. The primary outcome was reliable recovery from anxiety and depression. Secondary outcomes were reliable improvement, depression and anxiety severity, therapy attrition, and engagement. Analyses were stratified by gender and employed multilevel regression models, adjusting for sociodemographic and clinical covariates.
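As a rough illustration of this kind of analysis (not the study's code), the sketch below fits a logistic model of reliable recovery with cluster-robust standard errors by service, a simplified stand-in for the multilevel models described above; the file and column names (recovered, orientation, age, imd, baseline_phq9, service) are hypothetical.

```python
# Simplified stand-in for a multilevel logistic model of reliable recovery:
# cluster-robust standard errors by service instead of random intercepts.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("talking_therapy_episodes.csv")  # hypothetical input file

model = smf.logit(
    "recovered ~ C(orientation, Treatment('heterosexual')) + age + imd + baseline_phq9",
    data=df,
).fit(cov_type="cluster", cov_kwds={"groups": df["service"]})

print(model.summary())  # exponentiate coefficients to obtain odds ratios
```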
Results
After adjustment, gay men had higher odds of reliable recovery (OR: 1.23, 95% CI: 1.13–1.34) and reliable improvement (OR: 1.16, 95% CI: 1.06–1.28) than heterosexual men, with lower attrition (OR: 0.88, 95% CI: 0.80–0.97) and greater reductions in depression (MD: 0.51, 95% CI: 0.28–0.74) and anxiety (MD: 0.45, 95% CI: 0.25–0.65). Bisexual men (OR: 0.67, 95% CI: 0.54–0.83) and bisexual women (OR: 0.84, 95% CI: 0.77–0.93) had lower attrition than heterosexuals. Lesbian and bisexual women, and bisexual men, attended slightly more sessions (MD: 0.02–0.03, 95% CI: 0.01–0.04) than heterosexual patients. No other differences were observed.
Conclusions
Despite significant mental health burdens and stressors, LGB individuals had similar, if not marginally better, outcomes and engagement with psychological therapy compared with heterosexual patients.
Cardiometabolic diseases, including type 2 diabetes (T2DM) and cardiovascular disease (CVD), are common. Approximately one in three deaths annually in Aotearoa New Zealand (AoNZ) is caused by CVD(1). The Mediterranean dietary pattern is associated with a reduced risk of cardiometabolic disease in epidemiological and interventional studies(2,3). However, implementing the Mediterranean diet in non-Mediterranean populations can be challenging(4). These challenges include facilitating consumption of unfamiliar foods and accommodating the cultural and social context of food consumption. AoNZ produces a rich supply of high-quality foods consistent with a Mediterranean dietary pattern. He Rourou Whai Painga is a collaborative project that combines contributions from food industry partners into a Mediterranean diet pattern and provides foods, recipes and other support to whole households/whānau. The aim was to test whether a New Zealand food-based Mediterranean diet (NZMedDiet) with a behavioural intervention improves cardiometabolic health and wellbeing in individuals at risk. This presentation will review the background to the research, the process of forming a collaboration between researchers and the food industry, and the design and implementation of a complex study design (see protocol paper)(5), with results from the initial randomised controlled trial. We conducted several pilot studies(6,7,8) to inform the final design of the research, which was a combination of two randomised controlled trials (RCT 1 and RCT 2) and a longitudinal cohort study. RCT 1 compared 12 weeks of the NZMedDiet with usual diet in participants with increased cardiometabolic risk (metabolic syndrome severity score (MetSSS) >0.35). The intervention group were provided with food and recipes to meet 75% of their energy requirements, supported by a behavioural intervention to improve adherence. The primary outcome measure was MetSSS after 12 weeks. Two hundred individuals (mean (SD) age 49.9 (10.9) years; 62% women) were enrolled with their household/whānau. After 12 weeks, the mean (SD) MetSSS was 1.0 (0.7) in the control (n = 98) and 0.8 (0.5) in the intervention (n = 102) group; estimated difference (95% CI) of -0.05 (-0.16 to 0.06), p=0.35. A Mediterranean diet score (PyrMDS) was greater in the intervention group (1.6 (1.1 to 2.1), p<0.001), consistent with a change to a more Mediterranean dietary pattern. Weight was reduced in the NZMedDiet group compared with control (-1.9 kg (-2.0 to -0.34), p=0.006), and wellbeing, assessed by the SF-36 quality of life questionnaire, improved across all domains (p<0.001). In participants with increased cardiometabolic risk, food provision with a Mediterranean dietary pattern and a behavioural intervention did not improve a metabolic risk score but was associated with reduced weight and improved quality of life.
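The abstract does not state the exact analysis model behind the estimated between-group difference; one common approach for a two-arm trial is a baseline-adjusted (ANCOVA-style) comparison, sketched below with hypothetical file and column names.

```python
# Illustrative ANCOVA-style comparison of 12-week MetSSS between groups,
# adjusting for baseline MetSSS. Not the trial's actual analysis code.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("rct1_metsss.csv")  # hypothetical: group, metsss_baseline, metsss_12wk

fit = smf.ols("metsss_12wk ~ group + metsss_baseline", data=df).fit()
# term label depends on how `group` is coded (here assumed control/intervention)
print(fit.params["group[T.intervention]"])           # adjusted mean difference
print(fit.conf_int().loc["group[T.intervention]"])   # 95% confidence interval
```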
As temperatures globally continue to rise, sporting events such as marathons will take place on warmer days, increasing the risk of exertional heat stroke (EHS).
Methods
The medical librarian developed and executed comprehensive searches in Ovid MEDLINE, Ovid Embase, CINAHL, SPORTDiscus, Scopus, and Web of Science Core Collection. Relevant keywords were selected. The results underwent title, abstract, and full text screening in a web-based tool called Covidence, and were analyzed for pertinent data.
Results
A total of 3918 results were retrieved. After duplicate removal and title, abstract, and full text screening, 38 articles remained for inclusion. There were 22 case reports, 12 retrospective reviews, and 4 prospective observational studies. The races included half marathons, marathons, and other long distances. In the case reports and retrospective reviews, the mean environmental temperatures were 21.3°C and 19.8°C, respectively. Discussions emphasized that increasing environmental temperatures result in higher incidences of EHS.
Conclusion
With rising global temperatures from climate change, athletes are at higher risk of EHS. Early ice water immersion is the best treatment for EHS. Earlier start times and cooling stations for races may mitigate incidences of EHS. Future work needs to concentrate on the establishment of EHS prevention and mitigation protocols.
CBRN incidents require specialized hazmat decontamination protocols to prevent secondary contamination and systemic toxicity. While wet decontamination is standard, it can present challenges in cold weather or when resources are limited. Dry decontamination offers an alternative and supportive approach, though its effectiveness across different contaminants remains unclear. This scoping review evaluates the effectiveness, advantages, and limitations of dry decontamination in hazmat incidents.
Methods
A scoping review was conducted using MEDLINE, CINAHL, and other databases. Following the PRISMA-ScR approach, 9 studies were selected from 234 identified articles. The review assessed decontamination techniques, materials, and effectiveness across different contaminants.
Results
Dry decontamination is rapid, resource-efficient, and suitable for immediate use in pre-hospital and hospital settings, especially during mass casualty incidents (MCIs). Dry decontamination is highly effective for liquid contaminants, with blue roll and sterile trauma dressings removing over 80% of contaminants within minutes. However, dry decontamination is less effective for hair and particulate contaminants. Blotting and rubbing techniques significantly enhance decontamination efficiency.
Conclusions
Dry decontamination can be an effective alternative to wet decontamination, particularly for liquid contaminants, and is a suitable first-line approach in scenarios where wet decontamination is not practical for logistical or environmental reasons. However, dry decontamination is less effective than wet decontamination for hair and particulate contaminants. Combining dry and wet decontamination has been shown to be more effective. Including dry decontamination as an integral part of the CBRN response plan improves the overall efficacy of decontamination.
Approximately 15% of Australia’s workforce are shift workers, who are at greater risk for obesity and related conditions, such as type 2 diabetes and cardiovascular disease.(1,2,3) While current guidelines for obesity management prioritise diet-induced weight loss as a treatment option, there are few weight-loss studies involving night shift workers and no exploration to date of the factors associated with engagement in weight-loss interventions. The Shifting Weight using Intermittent Fasting in night shift workers (SWIFt) study was a randomised controlled trial that compared three 24-week weight-loss interventions: continuous energy restriction (CER), and 500-calorie intermittent fasting (IF) for 2 days per week, either during the day (IF:2D) or during the night shift (IF:2N). The current study used a convergent, mixed-methods, experimental design to: 1) explore the relationship between participant characteristics, dietary intervention group and time to dropout in the SWIFt study (quantitative); and 2) understand why some participants were more likely to drop out of the intervention (qualitative). Participant characteristics included age, gender, ethnicity, occupation, shift schedule, number of night shifts per four weeks, number of years in shift work, weight at baseline, weight change at four weeks, and quality of life at baseline. A Cox regression model was used with time to dropout from the intervention as the dependent variable, and purposive selection was used to determine predictors for the model. Semi-structured interviews were conducted at baseline and 24 weeks, and audio diaries were collected every two weeks from participants selected using a maximum variation sampling approach; both were analysed using the five steps of framework analysis.(4) A total of 250 participants were randomised to the study between October 2019 and February 2022. Two participants were excluded from analysis due to retrospective ineligibility. Twenty-nine percent (n = 71) of participants dropped out of the study over the 24-week intervention. Greater weight at baseline, fewer years working in shift work, less weight change at four weeks, and being a woman (compared with a man) were associated with a significantly increased rate of dropout from the study (p < 0.05). Forty-seven interviews from 33 participants were conducted and 18 participants completed audio diaries. Lack of time, fatigue and emotional eating were barriers more frequently reported by women. Participants with a higher weight at baseline more frequently reported fatigue and emotional eating barriers, and limited guidance on non-fasting days as a barrier for the IF interventions. This study provides important considerations for refining shift-worker weight-loss interventions for future implementation in order to increase engagement and mitigate the adverse health risks experienced by this essential workforce.
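The time-to-dropout analysis described above is a Cox proportional hazards regression; a minimal sketch of that model is shown below (not the study's code), with hypothetical file and column names and gender coded as a 0/1 indicator.

```python
# Illustrative Cox regression of time to dropout from a weight-loss intervention.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("swift_participants.csv")  # hypothetical input file

cph = CoxPHFitter()
cph.fit(
    df[["weeks_in_study", "dropped_out", "baseline_weight",
        "years_shift_work", "weight_change_4wk", "female"]],
    duration_col="weeks_in_study",   # follow-up time in weeks
    event_col="dropped_out",         # 1 = dropped out, 0 = completed/censored
)
cph.print_summary()  # hazard ratios for each predictor
```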
It remains unclear which individuals with subthreshold depression benefit most from psychological intervention, and what long-term effects this has on symptom deterioration, response and remission.
Aims
To synthesise psychological intervention benefits in adults with subthreshold depression up to 2 years, and explore participant-level effect-modifiers.
Method
Randomised trials comparing psychological intervention with inactive control were identified via systematic search. Authors were contacted to obtain individual participant data (IPD), analysed using Bayesian one-stage meta-analysis. Treatment–covariate interactions were added to examine moderators. Hierarchical-additive models were used to explore treatment benefits conditional on baseline Patient Health Questionnaire 9 (PHQ-9) values.
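As a simplified, frequentist stand-in for the Bayesian one-stage model described above (the original analysis is Bayesian and hierarchical-additive), the sketch below fits a linear mixed model with random study intercepts and a treatment-by-baseline-severity interaction as a participant-level moderator; variable names are hypothetical.

```python
# Illustrative one-stage IPD analysis: pooled participant data, random intercept
# per study, and a treatment x baseline-severity interaction term.
import pandas as pd
import statsmodels.formula.api as smf

ipd = pd.read_csv("ipd_subthreshold_depression.csv")  # hypothetical pooled IPD

model = smf.mixedlm(
    "phq9_followup ~ treatment * phq9_baseline",
    data=ipd,
    groups=ipd["study_id"],   # one-stage model: studies as grouping factor
).fit()

print(model.summary())  # the interaction term estimates effect modification
```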
Results
IPD of 10 671 individuals (50 studies) could be included. We found significant effects on depressive symptom severity up to 12 months (standardised mean-difference [s.m.d.] = −0.48 to −0.27). Effects could not be ascertained up to 24 months (s.m.d. = −0.18). Similar findings emerged for 50% symptom reduction (relative risk = 1.27–2.79), reliable improvement (relative risk = 1.38–3.17), deterioration (relative risk = 0.67–0.54) and close-to-symptom-free status (relative risk = 1.41–2.80). Among participant-level moderators, only initial depression and anxiety severity were highly credible (P > 0.99). Predicted treatment benefits decreased with lower symptom severity but remained minimally important even for very mild symptoms (s.m.d. = −0.33 for PHQ-9 = 5).
Conclusions
Psychological intervention reduces the symptom burden in individuals with subthreshold depression up to 1 year, and protects against symptom deterioration. Benefits up to 2 years are less certain. We find strong support for intervention in subthreshold depression, particularly with PHQ-9 scores ≥ 10. For very mild symptoms, scalable treatments could be an attractive option.
The treatment recommendation based on a network meta-analysis (NMA) is usually the single treatment with the highest expected value (EV) on an evaluative function. We explore approaches that recommend multiple treatments and that penalise uncertainty, making them suitable for risk-averse decision-makers. We introduce loss-adjusted EV (LaEV) and compare it to GRADE and three probability-based rankings. We define properties of a valid ranking under uncertainty and other desirable properties of ranking systems. A two-stage process is proposed: the first identifies treatments superior to the reference treatment; the second identifies those that are also within a minimal clinically important difference (MCID) of the best treatment. Decision rules and ranking systems are compared on stylised examples and 10 NMAs used in NICE (National Institute for Health and Care Excellence) guidelines. Only LaEV reliably delivers valid rankings under uncertainty and has all the desirable properties. In 10 NMAs comparing between 5 and 41 treatments, an EV decision maker would recommend 4–14 treatments, and LaEV 0–3 (median 2) fewer. GRADE rules give rise to anomalies, and, like the probability-based rankings, the number of treatments recommended depends on arbitrary probability cutoffs. Among treatments that are superior to the reference, GRADE privileges the more uncertain ones, and in 3/10 cases, GRADE failed to recommend the treatment with the highest EV and LaEV. A two-stage approach based on MCID ensures that EV- and LaEV-based rules recommend a clinically appropriate number of treatments. For a risk-averse decision maker, LaEV is conservative, simple to implement, and has an independent theoretical foundation.
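The two-stage rule described above can be illustrated on synthetic posterior samples from an NMA; the sketch below uses plain EV for ranking (the paper's LaEV would substitute a score that additionally penalises uncertainty, whose exact form is defined in the paper), and simplifies "superior to the reference" to EV greater than zero. All numbers are synthetic.

```python
# Illustrative two-stage recommendation rule: (1) superior to reference,
# (2) within an MCID of the best remaining treatment.
import numpy as np

rng = np.random.default_rng(0)
# posterior samples of relative effects vs reference (higher = better)
samples = {
    "A": rng.normal(0.30, 0.05, 10_000),
    "B": rng.normal(0.35, 0.30, 10_000),   # similar mean, much more uncertain
    "C": rng.normal(0.05, 0.05, 10_000),
    "D": rng.normal(-0.10, 0.10, 10_000),
}
MCID = 0.10

ev = {t: s.mean() for t, s in samples.items()}

stage1 = [t for t in ev if ev[t] > 0]                 # superior to reference (simplified)
best = max(ev[t] for t in stage1)
stage2 = [t for t in stage1 if best - ev[t] <= MCID]  # within an MCID of the best

print("EVs:", {t: round(v, 3) for t, v in ev.items()})
print("recommended set:", stage2)
```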
Precision or “Personalized Medicine” and “Big Data” are growing trends in the biomedical research community and highlight an increased focus on access to larger datasets to effectively explore disease processes at the molecular level, versus the previously common one-size-fits-all approach. This focus necessitated a local transition from independent lab and siloed projects to a single software application utilizing a common ontology to create access to data from multiple repositories. Use of a common system has allowed for increased ease of collaboration and access to quality biospecimens that are extensively annotated with clinical, molecular, and patient-associated data. The software needed to function at an enterprise level while continuing to allow investigators the autonomy and security access they desire. To identify a solution, a working group comprising representatives from independent repositories and areas of research focus across departments was established and made responsible for the review and implementation of an enterprise-wide biospecimen management system. Central to this process was the creation of a unified vocabulary across all repositories, including consensus around source of truth, standardized field definitions, and shared terminology.
The recent expansion of cross-cultural research in the social sciences has led to increased discourse on methodological issues involved when studying culturally diverse populations. However, discussions have largely overlooked the challenges of construct validity – ensuring instruments are measuring what they are intended to – in diverse cultural contexts, particularly in developmental research. We contend that cross-cultural developmental research poses distinct problems for ensuring high construct validity owing to the nuances of working with children, and that the standard approach of transporting protocols designed and validated in one population to another risks low construct validity. Drawing upon our own and others’ work, we highlight several challenges to construct validity in the field of cross-cultural developmental research, including (1) lack of cultural and contextual knowledge, (2) dissociating developmental and cultural theory and methods, (3) lack of causal frameworks, (4) superficial and short-term partnerships and collaborations, and (5) culturally inappropriate tools and tests. We provide guidelines for addressing these challenges, including (1) using ethnographic and observational approaches, (2) developing evidence-based causal frameworks, (3) conducting community-engaged and collaborative research, and (4) the application of culture-specific refinements and training. We discuss the need to balance methodological consistency with culture-specific refinements to improve construct validity in cross-cultural developmental research.
The adipofascial anterolateral thigh (AF-ALT) free flap represents a versatile technique in head and neck reconstructions, with its applications increasingly broadening. The objective was to detail the novel utilization of the AF-ALT flap in orbital and skull base reconstruction, along with salvage laryngectomy onlay in our case series.
Method
We conducted a retrospective analysis at Roswell Park Comprehensive Cancer Center, spanning July 2019 to June 2023, focusing on patient demographics and reconstructive parameters.
Results
The AF-ALT flap was successfully employed in eight patients (average age 59, body mass index [BMI] 32.0) to repair various defects. Noteworthy outcomes were observed in skull base reconstructions, with no flap failures or major complications over an average 12-month follow-up. Donor sites typically healed well with minimal interventions.
Conclusion
Our series is the first to report the AF-ALT flap's efficacy in anterior skull base and orbital reconstructions, demonstrating an additional innovation in complex head and neck surgeries.
From early on, infants show a preference for infant-directed speech (IDS) over adult-directed speech (ADS), and exposure to IDS has been correlated with language outcome measures such as vocabulary. The present multi-laboratory study explores this issue by investigating whether there is a link between early preference for IDS and later vocabulary size. Infants’ preference for IDS was tested as part of the ManyBabies 1 project, and follow-up CDI data were collected from a subsample of this dataset at 18 and 24 months. A total of 341 (18 months) and 327 (24 months) infants were tested across 21 laboratories. In neither preregistered analyses with North American and UK English, nor exploratory analyses with a larger sample did we find evidence for a relation between IDS preference and later vocabulary. We discuss implications of this finding in light of recent work suggesting that IDS preference measured in the laboratory has low test-retest reliability.
Attention-deficit/hyperactivity disorder (ADHD) is a highly prevalent psychiatric condition that frequently originates in early development and is associated with a variety of functional impairments. Despite a large functional neuroimaging literature on ADHD, our understanding of the neural basis of this disorder remains limited, and existing primary studies on the topic include somewhat divergent results.
Objectives
The present meta-analysis aims to advance our understanding of the neural basis of ADHD by identifying the most statistically robust patterns of abnormal neural activation throughout the whole brain in individuals diagnosed with ADHD compared to age-matched healthy controls.
Methods
We conducted a meta-analysis of task-based functional magnetic resonance imaging (fMRI) activation studies of ADHD. Following PRISMA guidelines, this included a comprehensive PubMed search, predetermined inclusion criteria, and two independent coding teams who evaluated studies and included all task-based, whole-brain fMRI activation studies that compared participants diagnosed with ADHD to age-matched healthy controls. We then performed multilevel kernel density analysis (MKDA), a well-established, whole-brain, voxelwise approach that quantitatively combines existing primary fMRI studies, with ensemble thresholding (p<0.05-0.0001) and multiple comparisons correction.
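To make the core of MKDA concrete, the sketch below builds a per-study indicator map by placing a spherical kernel around each reported peak and then averages the maps across studies (weighted, for example, by sample size) to give a voxelwise density of reported activations. Permutation-based thresholding and the ensemble thresholds used in the paper are omitted; all coordinates are synthetic.

```python
# Minimal MKDA-style density map from synthetic peak coordinates.
import numpy as np

GRID = (40, 48, 40)   # toy voxel grid
RADIUS = 2.5          # kernel radius in voxels (roughly 10 mm at 4 mm voxels)

def indicator_map(peaks, grid=GRID, radius=RADIUS):
    """1 where a voxel lies within `radius` of any reported peak, else 0."""
    ii, jj, kk = np.indices(grid)
    out = np.zeros(grid, dtype=float)
    for (pi, pj, pk) in peaks:
        dist = np.sqrt((ii - pi) ** 2 + (jj - pj) ** 2 + (kk - pk) ** 2)
        out[dist <= radius] = 1.0
    return out

# synthetic "studies": (list of peak coordinates, sample size)
studies = [
    ([(20, 24, 20), (10, 30, 22)], 25),
    ([(21, 25, 19)], 40),
    ([(5, 10, 30)], 18),
]

weights = np.array([n for _, n in studies], dtype=float)
maps = np.stack([indicator_map(peaks) for peaks, _ in studies])
density = np.average(maps, axis=0, weights=weights)   # MKDA statistic per voxel

print("max weighted density:", density.max())
```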
Results
Participants diagnosed with ADHD (N=1,550), relative to age-matched healthy controls (N=1,340), exhibited statistically significant (p<0.05-0.0001; FWE-corrected) patterns of abnormal activation in multiple regions of the cerebral cortex and basal ganglia across a variety of cognitive control tasks.
Conclusions
This study advances our understanding of the neural basis of ADHD and may aid in the development of new brain-based clinical interventions as well as diagnostic tools and treatment matching protocols for patients with ADHD. Future studies should also investigate the similarities and differences in neural signatures between ADHD and other highly comorbid psychiatric disorders.
Food cravings are one of several important complexities in the relationship between psychological and physiological triggers for food consumption. Cravings are commonly cited as contributing to over-consumption of hyperpalatable foods (sugary, salty, and fatty foods) and may be causal in obesity(1). The Mediterranean dietary pattern (MedDiet) is linked to reduced disease risk and improved health and wellbeing(2). Despite a lower intake of sugary and salty foods compared to a Western diet, free-living adults switching to the MedDiet find it satiating and achieve high adherence in Western countries. The MedDiet is known to improve mood and wellbeing, is high in fibre and monounsaturated fat, low in added sugar, and has a low glycaemic load, which could separately and synergistically reduce food cravings. The relationship between adherence to the MedDiet and food cravings has never been investigated. In the MedLey randomised controlled trial, we investigated the effects of a MedDiet on food cravings, compared with a habitual Australian diet (HabDiet)(3). Adherence to the MedDiet was scored out of 15 (maximum adherence). Participants completed three food cravings questionnaires at baseline and 6 months. The State questionnaire measures momentary cravings and has a maximum score of 75, indicating maximum food cravings. The Trait-reduced questionnaire measures general cravings and has a maximum score of 126, indicating more frequent and intense cravings for foods. The Food Cravings Inventory (FCI) measures cravings for four food domains: fatty foods, fast foods, sugary foods, and high-carbohydrate (CHO) foods. MedDiet group (n = 58) responses were compared with the HabDiet group (n = 53) across visits using linear mixed effects modelling. Predicted differences were obtained for adherence scores of ≤8 (median adherence) and ≥9. Means ± SD or CIs are presented. Mean adherence increased from 7.1 ± 1.8 to 10.7 ± 1.48 in the MedDiet group (P<0.01), with no change in the HabDiet group (P = 1.00). Trait-reduced scores were not significantly different between groups at 6 months (P = 0.11), although there was a 5.57-point reduction within the MedDiet group (CI −12.56, −1.96, P = 0.04). The State score was significantly lower in the MedDiet group than the HabDiet group at 6 months (−4.4 (CI −7.53, −0.39), P = 0.03), and was significantly lower than at baseline (−5.9 (CI −9.33, −0.24), P = 0.04). There were no differences between groups for the four domains of the FCI (P>0.05). Cravings for sugary foods were significantly reduced within the MedDiet group (−0.26 (CI −0.46, −0.05), P = 0.01). Predictive modelling suggested that moving from an adherence score of 8 to 9 was associated with lower cravings for sugary foods (−0.03 ± 0.01, P = 0.03), fast food (−0.04 ± 0.02, P = 0.02) and CHO foods (−0.05 ± 0.02, P = 0.02). These results suggest that higher adherence to a MedDiet could reduce cravings compared with the habitual Australian diet, and that the MedDiet may specifically reduce sugar cravings. Further investigation is warranted through observational and intervention trials.
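A minimal sketch of the group-by-visit linear mixed model described above is shown below (not the trial's code); the file, column names, and coding of group and visit are hypothetical, and a random intercept per participant accounts for repeated measures.

```python
# Illustrative linear mixed model: craving score by diet group across visits.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("medley_cravings.csv")  # hypothetical long-format data

fit = smf.mixedlm(
    "state_craving ~ group * visit",     # group = MedDiet/HabDiet, visit = baseline/6 months
    data=df,
    groups=df["participant_id"],         # random intercept per participant
).fit()
print(fit.summary())
```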
This article examines the development, early operation and subsequent failure of the Tot-Kolowa Red Cross irrigation scheme in Kenya’s Kerio Valley. Initially conceived as a technical solution to address regional food insecurity, the scheme aimed to scale up food production through the implementation of a fixed pipe irrigation system and the provision of agricultural inputs for cash cropping. A series of unfolding circumstances, however, necessitated numerous modifications to the original design as the project became increasingly entangled with deep and complex histories of land use patterns, resource allocation and conflict. Failure to understand the complexity of these dynamics ultimately led to the project’s collapse as the region spiralled into a period of significant unrest. In tracing these events, we aim to foreground the lived realities of imposed development, including both positive and negative responses to the scheme’s participatory obligations and its wider impact on community resilience.
A pharmacist-driven protocol for methicillin-resistant Staphylococcus aureus nares screening and empiric vancomycin discontinuation was instituted in a community healthcare system utilizing a tele-antimicrobial stewardship program to reduce inappropriate use of vancomycin. The protocol and associated intervention resulted in a significant decrease in both vancomycin utilization and the rate of acute kidney injury.
The late Pleistocene to Holocene subaerial pyroclastic deposits of the Quill stratovolcano on the Caribbean island of St Eustatius form seven stratigraphic divisions. New radiocarbon ages of charcoal are presented for the second, third and seventh divisions in order to better constrain the Quill’s eruption history. Three samples from the same layer of Division 2 at two localities on the northeast coast yield ages of 18,020 ± 40 (1σ), 18,310 ± 45 and 18,490 ± 45 14C yr BP (∼19,800–20,600 yr cal BC). These are considerably younger (∼4400 yr) than a previously published result for this division. A single sample of Division 3 gave an age of 8090 14C yr BP (∼7100 yr cal BC) and overlaps with previously published 14C ages for this division. A charred root in the pyroclastic unit deposited by the last eruption of the Quill (Division 7) gave an age of 919 14C yr BP (∼1100–1200 yr cal AD). This result is ∼600 years younger than a previously published age, and its origin is attributed to human activity. The timing of the last eruption of the Quill therefore remains poorly constrained but is older than 600 AD. Terrestrial gastropods found in paleosols and organic material found in small streams that developed in Division 3 indicate that Division 4 must be younger than 6100 ± 500 yr cal BC. The oxygen and carbon isotope composition of the terrestrial gastropods derived from Division 3 paleosols indicates that the C4 and CAM-type vegetation was dominant and that the climate subsequently changed to wetter conditions. The minimum eruption frequency for the Quill volcano is one eruption every ∼1400 years during the past 22,000 years. This eruption frequency of the Quill volcano is of the same order of magnitude as other recent northern Lesser Antilles volcanoes, Soufrière Hills (Montserrat, ∼5000 years) and Mt Liamuiga (St. Kitts, ∼2500 years).
Different fertilization strategies can be adopted to optimize the productive components of integrated crop–livestock systems. The current research evaluated how the application of P and K to soybean (Glycine max (L.) Merr.) or Urochloa brizantha (Hochst. ex A. Rich.) R. D. Webster cv. BRS Piatã, with or without nitrogen in the pasture phase, affects the accumulation and chemical composition of forage and animal productivity. The treatments were distributed in randomized blocks with three replications. Four fertilization strategies were tested: (1) conventional fertilization with P and K in the crop phase (CF–N); (2) conventional fertilization with nitrogen in the pasture phase (CF + N); (3) system fertilization with P and K in the pasture phase (SF–N); (4) system fertilization with nitrogen in the pasture phase (SF + N). System fertilization increased forage accumulation from 15 710 to 20 920 kg DM/ha/year compared with conventional fertilization without nitrogen. Stocking rate (3.1 vs. 2.8 AU/ha; SEM = 0.12) and gain per area (458 vs. 413 kg BW/ha; SEM = 27.9) were higher in the SF–N than in the CF–N treatment, although the average daily gain was lower (0.754 vs. 0.792 kg LW/day; SEM = 0.071). N application in the pasture phase, under both conventional and system fertilization, resulted in higher crude protein, stocking rate and gain per area. Applying nitrogen and relocating P and K from the crop to the pasture phase increases animal productivity and improves forage chemical composition in integrated crop–livestock systems.
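For the randomized block design described above, a standard analysis is an OLS fit with block and treatment terms followed by an ANOVA; the sketch below illustrates this for forage accumulation with hypothetical file and column names (the abstract does not specify the exact analysis software or model).

```python
# Illustrative randomized complete block ANOVA for forage accumulation.
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

df = pd.read_csv("icls_forage.csv")  # hypothetical: block, strategy, forage_kg_dm_ha

fit = smf.ols("forage_kg_dm_ha ~ C(block) + C(strategy)", data=df).fit()
print(anova_lm(fit, typ=2))   # fertilization strategy tested against residual error
```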