Background
It remains unclear which individuals with subthreshold depression benefit most from psychological intervention, and what long-term effects such intervention has on symptom deterioration, response and remission.
Aims
To synthesise the benefits of psychological intervention for adults with subthreshold depression at follow-ups of up to 2 years, and to explore participant-level effect modifiers.
Method
Randomised trials comparing psychological intervention with inactive control were identified via systematic search. Authors were contacted to obtain individual participant data (IPD), which were analysed using Bayesian one-stage meta-analysis. Treatment–covariate interactions were added to examine moderators. Hierarchical-additive models were used to explore treatment benefits conditional on baseline Patient Health Questionnaire-9 (PHQ-9) scores.
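As an illustrative sketch only (the notation here is ours, and the exact likelihoods, priors and covariate coding are not specified in this abstract), a one-stage IPD model with a treatment–covariate interaction can be written as
\[ y_{ij} \sim \mathcal{N}\!\big(\alpha_j + \delta_j t_{ij} + \beta x_{ij} + \gamma\, t_{ij}(x_{ij} - \bar{x}_j),\ \sigma_j^2\big), \qquad \delta_j \sim \mathcal{N}(\delta, \tau^2), \]
where y_{ij} is the follow-up symptom score of participant i in study j, t_{ij} the treatment indicator, \alpha_j a study-specific intercept, x_{ij} a baseline covariate such as the PHQ-9 score, \delta the pooled treatment effect with between-study heterogeneity \tau^2, and \gamma the treatment–covariate interaction examined as a moderator.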
Results
IPD from 10 671 individuals (50 studies) could be included. We found significant effects on depressive symptom severity up to 12 months (standardised mean difference [s.m.d.] = −0.48 to −0.27). Effects could not be ascertained up to 24 months (s.m.d. = −0.18). Similar findings emerged for 50% symptom reduction (relative risk = 1.27–2.79), reliable improvement (relative risk = 1.38–3.17), deterioration (relative risk = 0.67–0.54) and close-to-symptom-free status (relative risk = 1.41–2.80). Among participant-level moderators, only initial depression and anxiety severity were highly credible (P > 0.99). Predicted treatment benefits decreased with lower baseline symptom severity but remained minimally important even for very mild symptoms (s.m.d. = −0.33 for PHQ-9 = 5).
Conclusions
Psychological intervention reduces the symptom burden in individuals with subthreshold depression up to 1 year, and protects against symptom deterioration. Benefits up to 2 years are less certain. We find strong support for intervention in subthreshold depression, particularly with PHQ-9 scores ≥ 10. For very mild symptoms, scalable treatments could be an attractive option.
The treatment recommendation based on a network meta-analysis (NMA) is usually the single treatment with the highest expected value (EV) on an evaluative function. We explore approaches that recommend multiple treatments and that penalise uncertainty, making them suitable for risk-averse decision-makers. We introduce loss-adjusted EV (LaEV) and compare it with GRADE and three probability-based rankings. We define properties of a valid ranking under uncertainty and other desirable properties of ranking systems. A two-stage process is proposed: the first stage identifies treatments superior to the reference treatment; the second identifies those that are also within a minimal clinically important difference (MCID) of the best treatment. Decision rules and ranking systems are compared on stylised examples and 10 NMAs used in NICE (National Institute for Health and Care Excellence) guidelines. Only LaEV reliably delivers valid rankings under uncertainty and has all the desirable properties. In 10 NMAs comparing between 5 and 41 treatments, an EV decision-maker would recommend 4–14 treatments, and LaEV 0–3 (median 2) fewer. GRADE rules give rise to anomalies, and, like the probability-based rankings, the number of treatments recommended depends on arbitrary probability cut-offs. Among treatments that are superior to the reference, GRADE privileges the more uncertain ones, and in 3 of 10 cases GRADE failed to recommend the treatment with the highest EV and LaEV. A two-stage approach based on MCID ensures that EV- and LaEV-based rules recommend a clinically appropriate number of treatments. For a risk-averse decision-maker, LaEV is conservative, simple to implement and has an independent theoretical foundation.
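To illustrate the two-stage structure only (the LaEV formula itself is defined in the article and is not reproduced here), a minimal Python sketch with assumed function names and simulated posterior draws might look as follows; the scoring function is a plug-in, with EV shown and LaEV or any other score substituted in the same place.

import numpy as np

def expected_value(samples):
    """Expected value (EV) of each treatment, estimated from posterior samples.

    samples: array of shape (n_draws, n_treatments); higher values are better.
    """
    return samples.mean(axis=0)

def two_stage_recommendation(samples, score_fn, mcid, reference=0):
    """Two-stage rule: (1) keep treatments whose score exceeds that of the
    reference treatment; (2) of those, keep treatments whose score is within
    one MCID of the best remaining treatment."""
    scores = score_fn(samples)
    superior = [k for k in range(samples.shape[1])
                if k != reference and scores[k] > scores[reference]]
    if not superior:
        return []
    best = max(scores[k] for k in superior)
    return [k for k in superior if scores[k] >= best - mcid]

# Hypothetical example: 4 treatments (column 0 = reference), simulated posterior draws.
rng = np.random.default_rng(0)
draws = rng.normal(loc=[0.0, 0.3, 0.5, 0.45],
                   scale=[0.10, 0.10, 0.30, 0.10],
                   size=(4000, 4))
print(two_stage_recommendation(draws, expected_value, mcid=0.1))  # e.g. [2, 3]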
Precision or “Personalized Medicine” and “Big Data” are growing trends in the biomedical research community, reflecting an increased focus on access to larger datasets to effectively explore disease processes at the molecular level rather than the previously common one-size-fits-all approach. This focus necessitated a local transition from independent labs and siloed projects to a single software application utilizing a common ontology to provide access to data from multiple repositories. Use of a common system has allowed for increased ease of collaboration and access to quality biospecimens that are extensively annotated with clinical, molecular, and patient-associated data. The software needed to function at an enterprise level while continuing to allow investigators the autonomy and security access they desire. To identify a solution, a working group comprising representatives from independent repositories and areas of research focus across departments was established and made responsible for review and implementation of an enterprise-wide biospecimen management system. Central to this process was the creation of a unified vocabulary across all repositories, including consensus around the source of truth, standardized field definitions, and shared terminology.
This article replicates and “stress tests” a recent finding by Eckel and Grossman (2003) that matching subsidies generate substantially higher charity receipts than theoretically comparable rebate subsidies. In a first replication treatment, we show that most choices are consistent with a “constant (gross) contribution” rule, suggesting that inattention to the subsidies’ differing net consequences may explain the higher revenues elicited with matching subsidies. Results of additional treatments suggest that (a) the charity dimension of the decision problems has little to do with the result, and (b) extra information regarding the net consequences of decisions reduces but does not eliminate the result.
Preconception, pregnancy and infancy form a period of the life course in which optimal nutrition and food security are crucial for the life-long health and wellbeing of women/birthing parents and infants.(1) It is estimated that nearly one in every four UK households with pre-school children (0-4 years) experiences food insecurity (FI).(2) Yet we lack an evidence base exploring experiences of FI at this life course stage.(3,4) This study aimed to explore women’s experiences of food insecurity during and after pregnancy, including its influence on infant feeding decisions.
This study was ethically approved (Ref No: LRS/DP-23/24-39437) and pre-registered on OSF Registries (https://osf.io/9hn6r). Semi-structured, mixed-format individual interviews were conducted between November 2023 and February 2024. Pregnant individuals and those who had given birth ≤12 months previously, who were ≥18 years old, food insecure, residing in South London and with recourse to public funds, were recruited through purposive sampling. The topic guide was informed by the FI, pregnancy and postpartum literature and piloted (n = 2). Interviews were audio-recorded and professionally transcribed. Demographic data were summarised using SPSS. Inductive thematic analysis was used to analyse the data and was completed using NVivo.
Eleven food-insecure participants (2 pregnant, 9 new mothers; 2 White European, 9 Black African/Caribbean/British women) took part in the study. Six women were 0-6 months postpartum, and 3 women were 6-12 months postpartum. The preliminary findings are represented by three themes: 1) A dichotomy: knowing vs affording, 2) Adaptive food coping strategies, and 3) Infant feeding practices. Participants shared detailed accounts of valuing a healthy diet and adapting food practices, yet they were still unable to meet their dietary needs and desires during and after pregnancy. Participants described worry around breastmilk supply, quality and quantity. Complementary feeding was also identified as a source of worry. “She is still breastfeeding fully. I don’t want to change to milk, which maybe, sometimes, I might not be able to afford it…I won’t stop until she is 1.” Meanwhile, the cost of formula feeding was a driver of a more severe experience of FI.
Policy and practice recommendations include enhancing local breastfeeding support to address FI-specific concerns around breastmilk supply and, at the national level, advocating for greater support for adequate healthy food provision and for a price cap on infant formula. Future interventions must support maternal mental health, given the high cognitive stress associated with living with FI during and after pregnancy. Further high-quality research is needed 1) amongst asylum seekers, refugees and non-English speakers, who may also experience FI, and 2) exploring cultural influences on breastfeeding and their relationship with FI.
From early on, infants show a preference for infant-directed speech (IDS) over adult-directed speech (ADS), and exposure to IDS has been correlated with language outcome measures such as vocabulary. The present multi-laboratory study explores this issue by investigating whether there is a link between early preference for IDS and later vocabulary size. Infants’ preference for IDS was tested as part of the ManyBabies 1 project, and follow-up vocabulary (CDI) data were collected from a subsample of this dataset at 18 and 24 months. A total of 341 (18 months) and 327 (24 months) infants were tested across 21 laboratories. Neither the preregistered analyses with North American and UK English samples nor exploratory analyses with a larger sample found evidence for a relation between IDS preference and later vocabulary. We discuss implications of this finding in light of recent work suggesting that IDS preference measured in the laboratory has low test-retest reliability.
This article examines the development, early operation and subsequent failure of the Tot-Kolowa Red Cross irrigation scheme in Kenya’s Kerio Valley. Initially conceived as a technical solution to address regional food insecurity, the scheme aimed to scale up food production through the implementation of a fixed pipe irrigation system and the provision of agricultural inputs for cash cropping. A series of unfolding circumstances, however, necessitated numerous modifications to the original design as the project became increasingly entangled with deep and complex histories of land use patterns, resource allocation and conflict. Failure to understand the complexity of these dynamics ultimately led to the project’s collapse as the region spiralled into a period of significant unrest. In tracing these events, we aim to foreground the lived realities of imposed development, including both positive and negative responses to the scheme’s participatory obligations and its wider impact on community resilience.
Both cortical and parasympathetic systems are believed to regulate emotional arousal in the service of healthy development. Systemic coordination, or coupling, between putative regulatory functions begins in early childhood. Yet the degree of coupling between cortical and parasympathetic systems in young children remains unclear, particularly in relation to the development of typical or atypical emotion function. We tested whether cortical (error-related negativity [ERN]) and parasympathetic (respiratory sinus arrhythmia [RSA]) markers of regulation were coupled during cognitive challenge in preschoolers (N = 121). We found no main effect of RSA predicting ERN. We then tested children’s typical and atypical emotion behavior (context-appropriate/context-inappropriate fear, anxiety symptoms, neuroendocrine reactivity) as moderators of early coupling in an effort to link patterns of coupling to adaptive emotional development. Negative coupling (i.e., smaller ERN with more RSA suppression, or larger ERN with less RSA suppression) at age 3 was associated with more atypical and less typical emotion behaviors, indicative of greater risk. Negative age 3 coupling was also observed for children who had greater Generalized Anxiety Disorder symptoms and blunted cortisol reactivity at age 5. Results suggest that negative coupling may reflect a maladaptive pattern across regulatory systems that is identifiable during the preschool years.
Soil amelioration via strategic deep tillage is occasionally utilized within conservation tillage systems to alleviate soil constraints, but its impact on weed seed burial and subsequent weed growth within the agronomic system is poorly understood. This study assessed the effects of different strategic deep-tillage practices, including soil loosening (deep ripping), soil mixing (rotary spading), or soil inversion (moldboard plow), on weed seed burial and subsequent weed growth, compared with a no-till control. The tillage practices were applied in 2019 at Yerecoin and Darkan, WA, and data on weed seed burial and growth were collected during the following 3-yr winter crop rotation (2019 to 2021). Soil inversion buried 89% of rigid ryegrass (Lolium rigidum Gaudin) and ripgut brome (Bromus diandrus Roth) seeds to a depth of 10 to 20 cm at both sites, while soil loosening and mixing left between 31% and 91% of the seeds in the top 0 to 10 cm of soil, with broad variation between sites. Few seeds were buried beyond 20 cm despite tillage working depths exceeding 30 cm at both sites. Soil inversion reduced the density of L. rigidum to <1 plant m−2 for 3 yr after strategic tillage. Bromus diandrus density was initially reduced to 0 to 1 plant m−2 by soil inversion, but increased to 4 plants m−2 at Yerecoin in 2020 and 147 plants m−2 at Darkan in 2021. Soil loosening or mixing did not consistently decrease weed density. The field data were used to parameterize a model that predicted weed density following strategic tillage with greater accuracy for soil inversion than for loosening or mixing. The findings provide important insights into the effects of strategic deep tillage on weed management in conservation agriculture systems and demonstrate the potential of models for optimizing weed management strategies.
Different fertilization strategies can be adopted to optimize the productive components of integrated crop–livestock systems. The current research evaluated how the application of P and K to soybean (Glycine max (L.) Merr.) or Urochloa brizantha (Hochst. ex A. Rich.) R. D. Webster cv. BRS Piatã, with or without nitrogen in the pasture phase, affects the accumulation and chemical composition of forage and animal productivity. The treatments were distributed in randomized blocks with three replications. Four fertilization strategies were tested: (1) conventional fertilization with P and K in the crop phase (CF–N); (2) conventional fertilization with nitrogen in the pasture phase (CF + N); (3) system fertilization with P and K in the pasture phase (SF–N); (4) system fertilization with nitrogen in the pasture phase (SF + N). System fertilization increased forage accumulation from 15 710 to 20 920 kg DM ha/year compared with conventional fertilization without nitrogen. Stocking rate (3.1 vs. 2.8 AU/ha; SEM = 0.12) and gain per area (458 vs. 413 kg BW/ha; SEM = 27.9) were higher in SF–N than CF–N, although the average daily gain was lower (0.754 vs. 0.792 kg LW/day; SEM = 0.071). N application in the pasture phase, under both conventional and system fertilization, resulted in higher crude protein, stocking rate and gain per area. Applying nitrogen and relocating P and K from the crop to the pasture phase increases animal productivity and improves forage chemical composition in integrated crop–livestock systems.
Skin-based samples (leather, skin, and parchment) in archaeological, historic and museum settings are among the most challenging materials to radiocarbon (14C) date in terms of removing exogenous carbon sources—comparable to bone collagen in many respects but with much less empirical study to guide pretreatment approaches. In the case of leather, the 14C content of materials used in manufacturing the leather can vary greatly. The presence of leather manufacturing chemicals before pretreatment and their absence afterward is difficult to demonstrate, and the accuracy of dates depends upon isolating the original animal proteins and removing exogenous carbon. Parchments differ in production technique from leather but include similar unknowns. It is not clear that lessons learned in the treatment of one are always salient for treating the other. We measured the 14C content of variously pretreated leather, parchment, skin samples, and extracts, producing apparent ages that varied by hundreds or occasionally thousands of years depending upon sample pretreatment. Fourier Transform Infrared Spectroscopy (FTIR) and C:N ratios provided insight into the chemical composition of carbon reservoirs contributing to age differences. The results of these analyses demonstrated that XAD column chromatography resulted in the most accurate 14C dates for leather and samples of unknown tannage, and FTIR allowed for the detection of contamination that might have otherwise been overlooked.
Delirium is characterised by an acute, fluctuating change in cognition, attention and awareness (Wilson et al. Nature Reviews Disease Primers 2020; 6). This presentation can make the diagnosis of delirium extremely challenging for clinicians (Gofton. Canadian Journal of Neurological Sciences 2011; 38: 673-680). It is commonly reported in hospitalised patients, particularly in those over the age of 65 (NICE. Delirium: prevention, diagnosis and management. 2010).
Objectives
Our aim is to identify which investigations and cognitive assessments are completed prior to referral to liaison psychiatry services in patients with symptoms of delirium.
Methods
Referrals (N = 6012) to the liaison psychiatry team at Croydon University Hospital made between April and September 2022 were screened. Search parameters used to identify referrals related to a potential diagnosis of delirium were selected by the authors. The terms used were: confusion; delirium; agitation; aggression; cognitive decline or impairment; disorientation; challenging behaviour. Data were collected on the completion rates of investigations for delirium as advised by the NICE clinical knowledge summaries. Further data were gathered on neuroimaging (CT or MRI), cognitive assessment tools (MoCA/MMSE) and delirium screening tools (4AT/AMTS).
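For illustration only, a keyword screen of this kind could be implemented along the following lines in Python; the data layout, example referral texts and matching rules are assumptions made for the sketch, not the authors' actual procedure.

# Illustrative keyword screen for delirium-related referrals (assumed free-text input).
SEARCH_TERMS = [
    "confusion", "delirium", "agitation", "aggression",
    "cognitive decline", "cognitive impairment",
    "disorientation", "challenging behaviour",
]

def flag_referral(referral_text):
    """Return True if the referral free text mentions any search term."""
    text = referral_text.lower()
    return any(term in text for term in SEARCH_TERMS)

# Hypothetical referral texts, purely for illustration.
referrals = [
    "82-year-old with acute confusion and disorientation on the ward",
    "34-year-old with low mood following job loss",
]
print(sum(flag_referral(r) for r in referrals))  # -> 1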
Results
The search identified 114 referrals (61 males and 53 females), with 82% of patients over 65 years of age at the time of referral. In 96% of referrals, U&E and CRP were performed. Sputum culture (1%), urine toxin screen (4%) and free T3/T4 (8%) were the least utilised tests. Neuroimaging was completed in 41% of referrals (see Graph 1 for a full breakdown of results).
A formal cognitive assessment or delirium screening tool was completed in 32% of referrals. The AMTS and 4AT tools were documented for 65% and 24%, respectively. A total of 19 referrals explicitly stated that the patient was suspected to have dementia. A delirium screening tool was documented in 47% of these cases; however, a formal cognitive assessment was documented in only 5% of these patients.
Following psychiatric assessment, 47% of referrals were confirmed as delirium.
Conclusions
Our data highlight the low rate of completion of the NICE-recommended delirium screen prior to referral to liaison psychiatry. Effective implementation of a delirium screen and cognitive assessment is paramount to reducing the number of inappropriate psychiatric referrals in hospital and helps to identify reversible organic causes of delirium. This in turn will ensure timely treatment of reversible causes and reduce the length of hospital admission.