Several factors shape the neurodevelopmental trajectory. A key aim of neurodevelopmental research is to identify the factors that exert the greatest influence on the brain and can tip the balance from typical to atypical development.
Methods
Utilizing a dissimilarity maximization algorithm on the dynamic mode decomposition (DMD) of resting-state functional MRI data, we classified subjects from the cVEDA neurodevelopmental cohort (n = 987, aged 6–23 years) into homogeneously patterned DMD (representing typical development in 809 subjects) and heterogeneously patterned DMD (indicative of atypical development in 178 subjects).
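The dissimilarity-maximization step is specific to this study and not detailed in the abstract, but the underlying exact DMD of a snapshot matrix is standard. As a minimal illustrative sketch (function and variable names are my own, not from the paper):

```python
import numpy as np

def dmd_modes(X, r):
    """Exact DMD of a snapshot matrix X (rows = signals, columns = time
    points), truncated to rank r. Returns the DMD eigenvalues and modes."""
    X1, X2 = X[:, :-1], X[:, 1:]           # paired snapshots x_k -> x_{k+1}
    U, s, Vh = np.linalg.svd(X1, full_matrices=False)
    U, s, Vh = U[:, :r], s[:r], Vh[:r, :]  # rank-r truncation
    # Low-rank approximation of the linear propagator A, where X2 ≈ A X1
    A_tilde = U.conj().T @ X2 @ Vh.conj().T @ np.diag(1.0 / s)
    eigvals, W = np.linalg.eig(A_tilde)    # temporal dynamics of each mode
    modes = X2 @ Vh.conj().T @ np.diag(1.0 / s) @ W  # spatial DMD modes
    return eigvals, modes
```

Each eigenvalue encodes a mode's growth/decay rate and oscillation frequency; comparing subjects by the (dis)similarity of their mode patterns is the rough idea behind the classification described above.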
Results
Significant DMD differences were primarily identified in default mode network (DMN) regions across these groups (p < 0.05, Bonferroni corrected). While the groups were comparable in cognitive performance, the atypical group had more frequent exposure to adversities and higher rates of abuse (p < 0.05, Bonferroni corrected). Upon evaluating brain–behavior correlations, we found that correlation patterns between adversity and DMN dynamic modes exhibited age-dependent variations for atypical subjects, hinting at differential utilization of the DMN under chronic adversity.
Conclusion
Adversities (particularly abuse) maximally influence the DMN during neurodevelopment and can lead to failure to develop a coherent DMN system. Whereas DMN integrity is preserved in typical development, atypically developing individuals show contrasting age-dependent variability. The flexibility of the DMN might be a compensatory mechanism that protects an individual in an abusive environment. However, such adaptability might deprive the neural system of normal functioning and may incur long-term effects on the psyche.
INDUCT (Interdisciplinary Network for Dementia Using Current Technology) and DISTINCT (Dementia Inter-sectorial Strategy for Training and Innovation Network for Current Technology) are two Marie Sklodowska-Curie-funded International Training Networks. They aimed to develop a multi-disciplinary, inter-sectorial educational research framework for Europe to improve technology and care for people with dementia, and to provide evidence of how technology can improve the lives of people with dementia.
Methods:
In INDUCT (2016–2020), 15 Early Stage Researchers worked on projects in the areas of technology to support everyday life, technology to promote meaningful activities, and healthcare technology. In DISTINCT (2019–2023), 15 Early Stage Researchers worked on technology to promote social health in three domains: fulfilling one's potential and obligations in society, managing one's own life, and participating in social and other meaningful activities.
Both networks adopted three transversal objectives: 1) To determine practical, cognitive and social factors needed to make technology more useable for people with dementia; 2) To evaluate the effectiveness of specific contemporary technology; 3) To trace facilitators and barriers for implementation of technology in dementia care.
Results:
The main recommendations resulting from all research projects are integrated in a web-based digital Best Practice Guidance on Human Interaction with Technology in Dementia which was recently updated (Dec 2022 and June 2023) and will be presented at the congress. The recommendations are meant for different target groups, i.e. people in different stages of dementia, their (in)formal carers, policy makers, designers and researchers, who can easily find the recommendations relevant to them in the Best Practice Guidance by means of a digital selection tool.
Conclusions:
The INDUCT/DISTINCT Best Practice Guidance informs on how to improve the development, usage, impact and implementation of technology for people with dementia in various technology areas. This Best Practice Guidance is the result of intensive collaborative partnership of INDUCT and DISTINCT with academic and non-academic partners as well as the involvement of representatives of the different target groups throughout the projects.
Iodine is a trace element required to produce the thyroid hormones, which are critical for development, growth and metabolism. To ensure appropriate population iodine nutrition, convenient and accurate methods of monitoring are necessary. Current methods for determining iodine status either involve a significant participant burden or are subject to considerable intra-individual variation. The continuous secretion of iodide in saliva potentially permits its use as a convenient, non-invasive assessment of status in populations. To assess its likely effectiveness, we reviewed studies analysing the association between salivary iodide concentration (SIC) and dietary iodine intake, urinary iodide concentration (UIC) and/or 24-h urinary iodide excretion (UIE). Eight studies conducted in different countries met the inclusion criteria, including data for 921 subjects: 702 healthy participants and 219 with health conditions. SIC correlated positively with UIC and/or UIE in four studies, with the strength of the relationship ranging from r = 0·19 to r = 0·90 depending on sampling protocol, age, and whether salivary values were corrected for protein concentration. Additionally, SIC correlated positively with dietary intake, most strongly when saliva was collected after dinner. SIC varied with external factors, including thyroid function, use of some medications, smoking and overall health status. The evidence provided here supports the use of SIC as a viable, low-burden method for determining iodine status in populations. However, small sample sizes and high variability indicate the need for more extensive analyses across age groups, ethnicities, disease states and dietary groups to clarify the relative accuracy and reliability in each case and to standardise procedures.
Vegan and vegetarian diets are widely supported and adopted, but individuals following such diets remain at greater risk of iodine deficiency. This systematic review and meta-analysis was conducted to assess iodine intake and status in adults following a vegan or vegetarian diet in the modern day. A systematic review and quality assessment were conducted from October 2020 to December 2022 according to Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) and Meta-analysis of Observational Studies in Epidemiology (MOOSE) guidance. Studies were identified in Ovid MEDLINE, Web of Science, PubMed, and Scopus. Eleven articles were eligible for review, containing 4421 adults (aged ≥ 18 years). Vegan groups had the lowest median urinary iodine concentration (mUIC) (12·2 µg/l). None of the dietary groups had mUIC within the optimal range for iodine status (100–200 µg/l) (WHO). Vegan diets had the poorest iodine intake (17·3 µg/d) and were strongly associated with lower iodine intake (P < 0·001) compared with omnivorous diets. Lower intake in vegan diets was influenced by sex (P = 0·007), the presence of voluntary or absence of Universal Salt Iodisation (USI) programmes (P = 0·01 and P < 0·001), and living in a country with adequate iodine nutrition (P < 0·001). Vegetarians, and particularly vegans, living in countries with no current USI programme continue to have an increased risk of low iodine status, iodine deficiency and inadequate iodine intake. Further research into the usefulness of mandatory fortification of vegan-appropriate foods is required.
Reduction in dietary vitamin B6 intake is associated with an increased relative risk of diseases such as cancer, atherosclerosis and cognitive dysfunction. This study assessed vitamin B6 intakes and plasma pyridoxal 5′-phosphate (PLP) concentrations as a marker of vitamin B6 status among the UK adult (≥ 19 years) population. It was carried out using a cross-sectional analysis of the National Diet and Nutrition Survey Rolling Programme (NDNS) (2008–2017). The impacts of lifestyle factors, including type of diet, smoking, alcohol consumption, and commonly used medications grouped by therapeutic usage, were determined, and data were analysed using IBM SPSS®. Results are expressed as medians (25th–75th percentiles), with P values ≤ 0·05 considered statistically significant. Among UK adults, the median dietary vitamin B6 intake of the total population met the reference nutrient intake, and median plasma PLP concentrations were above the cut-off for vitamin B6 deficiency; however, both vitamin B6 intake and plasma PLP concentration decreased with age group (P < 0·001). Smokers had significantly lower plasma PLP concentrations than non-smokers (P < 0·001). Moreover, regression analysis showed that some commonly used medications were associated with reduced plasma PLP concentrations (P < 0·05). Taken together, we report a tendency for dietary vitamin B6 intake and plasma PLP concentrations to decrease with age and with lifestyle factors such as smoking and medication usage. This information could have important implications for smokers and for elderly people using multiple medications (polypharmacy).
Avian endoparasites play important roles in conservation, biodiversity and host evolution. Currently, little is known about the epidemiology of intestinal helminths and protozoans infecting wild birds of Britain and Ireland. This study aimed to determine the rates of parasite prevalence, abundance and infection intensity in wild passerines. Fecal samples (n = 755) from 18 bird families were collected from 13 sites across England, Wales and Ireland from March 2020 to June 2021. A conventional sodium nitrate flotation method allowed morphological identification and abundance estimation of eggs/oocysts. Associations with host family and age were examined alongside spatiotemporal and ecological factors using Bayesian phylogenetically controlled models. Parasites were detected in 20.0% of samples, with corvids and finches having the highest prevalences and intensities, respectively. Syngamus (33%) and Isospora (32%) were the most prevalent genera observed. Parasite prevalence and abundance differed amongst avian families and seasons, while infection intensity varied between families and regions. Prevalence was affected by diet diversity, while abundance differed by host age and habitat diversity. Infection intensity was higher in birds using a wider range of habitats, and doubled in areas with feeders present. The elucidation of these patterns will increase the understanding of parasite fauna in British and Irish birds.
This chapter considers why it is important to involve patients (and their carers) in medical student teaching and training. The educational case for their involvement is considered, along with some of the policy and ethical considerations. The Derby “Expert Patient” project is described as an example of how to set up a sustainable educational programme with patient involvement.
Healthcare workers (HCWs) have faced considerable pressures during the COVID-19 pandemic. For some, this has resulted in mental health distress and disorder. Although interventions have sought to support HCWs, few have been evaluated.
Aims
We aimed to determine the effectiveness of the ‘Foundations’ application (app) on general (non-psychotic) psychiatric morbidity.
Method
We conducted a multicentre randomised controlled trial of HCWs at 16 NHS trusts (trial registration number EudraCT: 2021-001279-18). Participants were randomly assigned to the app or a wait-list control group. Measures were assessed at baseline and after 4 and 8 weeks. The primary outcome was general psychiatric morbidity (measured with the General Health Questionnaire). Secondary outcomes included well-being, presenteeism, anxiety, depression and insomnia. The primary analysis used mixed-effects multivariable regression, presented as adjusted mean differences (aMD).
Results
Between 22 March and 3 June 2021, 1002 participants were randomised (500:502), and 894 (89.2%) were followed up. The sample was predominantly women (754/894, 84.3%), with a mean age of 44.3 years (interquartile range (IQR) 34–53). Participants randomised to the app had a reduction in psychiatric morbidity symptoms (aMD = −1.39, 95% CI −2.05 to −0.74), an improvement in well-being (aMD = 0.54, 95% CI 0.20 to 0.89) and a reduction in insomnia (adjusted odds ratio (aOR) = 0.36, 95% CI 0.21 to 0.60). No other significant differences were found, and no adverse events were reported.
Conclusions
The app reduced psychiatric morbidity symptoms in this sample of HCWs. Given that it is scalable and had no adverse effects, the app could be used as part of an organisation's tiered staff support package. Further evidence is needed on long-term effectiveness and cost-effectiveness.
Developmental adversities early in life are associated with later psychopathology. Clustering may be a useful approach for grouping multiple diverse risks together and studying their relation with psychopathology. We aimed to generate risk clusters of children, adolescents, and young adults based on adverse environmental exposures and developmental characteristics, and to examine the association of these risk clusters with manifest psychopathology. Participants (n = 8300) between 6 and 23 years of age were recruited from seven sites in India. We administered questionnaires to elicit history of previous exposure to adverse childhood environments, family history of psychiatric disorders in first-degree relatives, and a range of antenatal and postnatal adversities. We used these variables to generate risk clusters. The Mini-International Neuropsychiatric Interview-5 was administered to evaluate manifest psychopathology. Two-step cluster analysis revealed two clusters, designated the high-risk cluster (HRC) and low-risk cluster (LRC), comprising 4197 (50.5%) and 4103 (49.5%) participants, respectively. HRC had higher frequencies of family history of mental illness, antenatal and neonatal risk factors, developmental delays, history of migration, and exposure to adverse childhood experiences than LRC. There were significantly higher risks of any psychiatric disorder [Relative Risk (RR) = 2.0, 95% CI 1.8–2.3], externalizing (RR = 4.8, 95% CI 3.6–6.4) and internalizing disorders (RR = 2.6, 95% CI 2.2–2.9), and suicidality (RR = 2.3, 95% CI 1.8–2.8) in HRC. Social-environmental and developmental factors could classify Indian children, adolescents and young adults into homogeneous clusters at high or low risk of psychopathology. These biopsychosocial determinants of mental health may have practice, policy and research implications for people in low- and middle-income countries.
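The relative risks above pair a point estimate with a 95% CI. For reference, an RR and its Wald interval can be computed directly from group counts; a small sketch follows (the counts used in the example are illustrative, not the study's data):

```python
import math

def relative_risk(events_exposed, n_exposed, events_unexposed, n_unexposed, z=1.96):
    """Relative risk of an outcome in an exposed vs an unexposed group,
    with a Wald confidence interval computed on the log scale."""
    p1 = events_exposed / n_exposed
    p2 = events_unexposed / n_unexposed
    rr = p1 / p2
    # Standard error of log(RR)
    se = math.sqrt(1 / events_exposed - 1 / n_exposed
                   + 1 / events_unexposed - 1 / n_unexposed)
    lower = math.exp(math.log(rr) - z * se)
    upper = math.exp(math.log(rr) + z * se)
    return rr, lower, upper

# Illustrative: 100 of 200 in the exposed group vs 50 of 200 unexposed
# gives RR = 2.0 with an asymmetric CI around it.
```

Working on the log scale is what makes the interval asymmetric around the point estimate, as in the intervals reported above.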