Ultra-processed foods (UPF), defined using frameworks such as NOVA, are increasingly linked to adverse health outcomes, driving interest in ways to identify and monitor their consumption. Artificial intelligence (AI) offers potential, yet its application in classifying UPF remains underexamined. To address this gap, we conducted a scoping review mapping how AI has been used, focusing on techniques, input data, classification frameworks, accuracy and application. Studies were eligible if peer-reviewed, published in English (2015–2025), and applied AI approaches to assess or classify UPF using recognised or study-specific frameworks. A systematic search in May 2025 across PubMed, Scopus, Medline and CINAHL identified 954 unique records, with eight ultimately meeting the inclusion criteria; one additional study was added in October following an updated search after peer review. Records were independently screened and extracted by two reviewers. Extracted data covered AI methods, input types, frameworks, outputs, validation and context. Studies used diverse techniques, including random forest classifiers, large language models and rule-based systems, applied across various contexts. Four studies explored practical settings: two assessed consumption or purchasing behaviours, and two developed substitution tools for healthier options. All relied on NOVA or modified versions to categorise processing. Several studies reported predictive accuracy, with F1 scores from 0·86 to 0·98, while another showed alignment between clusters and NOVA categories. Findings highlight the potential of AI tools to improve dietary monitoring and the need for further development of real-time methods and validation to support public health.
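The F1 scores cited above are the harmonic mean of precision and recall. A minimal sketch of the calculation follows, using illustrative precision and recall values, not figures from any of the reviewed studies:

```python
def f1_score(precision: float, recall: float) -> float:
    """Harmonic mean of precision and recall."""
    return 2 * precision * recall / (precision + recall)

# Illustrative values only, not taken from the reviewed studies
print(round(f1_score(0.9, 0.8), 3))  # 0.847
```

An F1 near 1, as in the 0·86–0·98 range reported, indicates a classifier that is simultaneously precise and sensitive.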
Collagen supplementation (CS) has emerged as a promising therapeutic approach with potential benefits for managing metabolic syndrome (MetS)-related risk factors. This narrative review integrates human evidence with preclinical mechanistic insights into the metabolic actions of collagen. Anti-obesity effects are attributed to increased satiety, gastric distension, GLP-1 secretion and enhanced fatty acid oxidation mediated by PPAR-α activation and AMPK signalling. In type 2 diabetes, collagen improves glucose homeostasis by enhancing insulin sensitivity, upregulating GLUT-4 and inhibiting dipeptidyl peptidase IV (DPP-IV), thereby prolonging incretin activity (GLP-1 and GIP) and supporting β-cell function. The antihypertensive effect of collagen peptides (CP) is primarily linked to angiotensin-converting enzyme (ACE) inhibition, which reduces angiotensin II levels while promoting bradykinin-mediated vasodilation and nitric oxide release. In addition, CP has shown potential in improving lipid profiles by modulating PPAR-γ and AMPK, increasing HDL-C and reducing LDL-C and triacylglycerols. Emerging evidence also supports a role for collagen in restoring gut microbiota balance, increasing short-chain fatty acid production and reducing pro-inflammatory and oxidative pathways, contributing to systemic metabolic regulation. Overall, these findings suggest CS exerts multi-targeted benefits on MetS components through modulation of endocrine, inflammatory and metabolic pathways. Nevertheless, larger, long-term clinical trials are warranted to determine optimal dosing regimens, evaluate long-term efficacy, and further elucidate microbiota-mediated effects.
Individuals with severe mental illness face a significantly reduced life expectancy compared to the general population. Addressing key modifiable risk factors is essential to reduce these alarming rates of mortality in this population. Nutritional psychiatry has emerged as an important field of research, highlighting the role of nutrition in mental health outcomes. However, individuals with severe mental illness often encounter barriers to healthy eating, including poor diet quality, medication-related side effects such as increased appetite and weight gain, food insecurity and limited autonomy over food choices. While nutrition interventions play a key role in improving health outcomes and should be a standard part of care, their implementation remains challenging. Digital technology presents a promising alternative support model, with the potential to address many of the structural and attitudinal barriers experienced by this population. Nonetheless, issues such as digital exclusion and low digital literacy persist. Integrating public and patient involvement, along with behavioural science frameworks, into the design and delivery of digital nutrition interventions can improve their relevance, acceptability and impact. This review discusses the current and potential role of digital nutrition interventions for individuals with severe mental illness, examining insights, challenges and future directions to inform research and practice.
This is the first study in a Middle Eastern population to investigate the association between the global diet quality score (GDQS) and the risk of hypertension (HTN) in Iranian adults.
Design:
This population-based cohort study was conducted on 5718 individuals aged ≥ 18 years from the third and fourth Tehran Lipid and Glucose Study surveys, who were followed until the sixth survey (mean follow-up: 7·8 years). Dietary data were collected using a validated FFQ to calculate GDQS as a novel food-based metric designed to assess diet quality across diverse populations. It evaluates the adequacy of healthy food groups (e.g. fruits, vegetables and whole grains) while monitoring the moderation of unhealthy or excessive intake (e.g. refined grains, processed meats and sugary foods).
Setting:
Tehran Lipid and Glucose Study.
Participants:
Iranian men and women.
Results:
Participants had a mean (sd) age of 37·7 (12·8) years, BMI of 26·6 (4·7) kg/m² and GDQS of 25·3 (4·4). During the 7·8-year follow-up, 1302 (18 %) new cases of HTN were identified. Higher GDQS and its healthy components were associated with reduced HTN risk (hazard ratio (HR): 0·83; 95 % CI: 0·70, 0·98; Ptrend = 0·034 and HR: 0·78; 95 % CI: 0·65, 0·92; Ptrend = 0·005, respectively), while unhealthy components of GDQS showed no association with HTN risk (HR: 1·14; 95 % CI: 0·98, 1·33; Ptrend = 0·059). These protective associations were observed across all weight categories and both genders, with stronger effects among obese individuals (for GDQS: HR: 0·75; 95 % CI: 0·58, 0·98; P = 0·041; for healthy components: HR: 0·75; 95 % CI: 0·57, 0·99; P = 0·044) and females (for GDQS: HR: 0·77; 95 % CI: 0·62, 0·97; P = 0·028; for healthy components: HR: 0·76; 95 % CI: 0·60, 0·96; P = 0·023).
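A hazard ratio below 1 can be read as a relative reduction in risk over follow-up. A minimal sketch of this standard conversion (the function name is ours, chosen for illustration):

```python
def percent_risk_reduction(hazard_ratio: float) -> float:
    # For HR < 1, (1 - HR) * 100 gives the relative risk reduction in percent
    return round((1 - hazard_ratio) * 100, 1)

print(percent_risk_reduction(0.83))  # 17.0, i.e. a 17 % lower hazard of HTN
```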
Conclusions:
A higher GDQS was associated with a reduced risk of incident HTN among Iranian adults. Adherence to a high-quality diet, particularly focusing on the healthy dietary components of GDQS, may serve as an effective strategy for preventing HTN, especially among obese individuals and women.
Sarcopenia, the age-related decline in muscle mass and strength, is a contributor to frailty and reduced quality of life. Emerging evidence suggests a role for the gut microbiome in modulating skeletal muscle through microbial species and metabolites, such as short-chain fatty acids (SCFAs), potentially influencing inflammation, nutrient absorption, and glucose and protein metabolism. This review considers the potential of probiotics, prebiotics, and synbiotics as interventions to mitigate sarcopenia based on animal and human studies, while providing a critique of present barriers that need to be addressed. Preclinical models, including germ-free mice and faecal microbiota transplantation, demonstrate that gut microbiota from healthy or young donors may enhance overall muscle health via reductions in inflammatory and muscle atrophy markers. Limited human studies show that probiotics such as Lactobacillus and Bifidobacterium could improve branched-chain amino acid (BCAA) bioavailability and potentially sarcopenia indices, although findings have been inconsistent. In particular, challenges including inconsistent microbial assessments, lack of dietary control and interindividual variability due to diet, age, genetics, comorbidities and medications may hinder progress in this field. Delivery methods (e.g. capsules, fermented foods or fortified products) could further complicate efficacy through probiotic stability and dietary restrictions in older adults. Standardised protocols [e.g. Strengthening The Organisation and Reporting of Microbiome Studies (STORMS) checklist] and multi-omics approaches may be critical to address these limitations and identify microbial signatures linked to sarcopenia outcomes. While preclinical evidence highlights mechanistic pathways pertinent to amino acid metabolism, translating findings to humans requires rigorous experimental trials.
Groundwater iron varies geographically, and iron intake through drinking water can minimise iron deficiency (ID). Rice, which supplies a major share of daily meals (∼70% of total energy) in Bangladesh, absorbs a substantial amount of water during cooking. This study aimed to estimate the contribution of groundwater iron entrapped in cooked rice and its implications for the recommended iron intake. A cross-sectional study was conducted among 25 households, selected by the iron content of their drinking groundwater source in Sirajganj district, Bangladesh. Each household, pre-supplied with 600 g of raw rice (300 g for each cooking), was instructed to cook ‘water-draining rice’ (WDR) and ‘water-sitting rice’ (WSR). Using atomic absorption spectrophotometry, iron content in filtered and non-filtered water was measured as 0.4 ± 0.2 mg/L and 6.1 ± 2.0 mg/L, respectively. After adjusting for water filtration, the weighted mean of total iron content in WDR and WSR was 6.18 mg and 5.70 mg, respectively. Assuming the average rice intake, iron content in WDR and WSR fulfilled approximately 98.15% and 90.62% of the average requirement for non-pregnant and non-lactating (NPNL) women. The water-entrapped iron in cooked WDR and WSR fulfilled about 23.77% and 20.4% of Recommended Dietary Allowances, and 52.83% and 45.30% of Estimated Average Requirements, respectively, in NPNL women, suggesting that groundwater entrapped in cooked rice is an influential dietary iron source. The substantial amount of iron from cooked rice adds an additional layer to the environmental contribution of iron in this setting, with the potential to contribute to ID prevention.
There is substantial international variation in recommended vitamin C intake levels. In the USA, the recommendation is 90 mg/d for men and 75 mg/d for women, while in the UK, the current recommendation – established in 1991 – is only 40 mg/d for adults. This UK level was based on the 1953 Sheffield study, which found that 10 mg/d prevents scurvy, with 40 mg/d chosen as the recommended level for yielding somewhat higher plasma levels. In this commentary, we argue that the UK recommendation overlooked key evidence available at the time. Specifically, at least six controlled trials published before 1991 reported benefits from vitamin C supplementation in participants whose baseline vitamin C intake was already 40 mg/d or higher. One randomised controlled trial, published in 1993, found benefits from vitamin C supplementation even at a baseline intake of about 500 mg/d; however, this trial involved ultramarathon runners, and the findings should not be broadly generalised. Nonetheless, such results challenge the assumption that 40 mg/d is universally adequate to maintain full health. We also highlight that the UK recommendations were narrowly focused on preventing dermatological symptoms of scurvy, despite strong evidence – even at the time – that vitamin C deficiency can also cause cardiac dysfunction and greater morbidity due to respiratory infections. We conclude that the current UK vitamin C recommendation should be re-evaluated in light of controlled trial evidence and broader clinical outcomes.
Although many online-based dietary surveys have been developed in recent years, systems that easily survey the dietary balance based on the Japanese diet are insufficient. This study aimed to evaluate the relationship between dietary balance scores from an online survey system based on the Japanese Food Guide Spinning Top, and nutrient/food intake calculated using the weighing method from dietary records (DRs), as well as to assess the system’s utility and applicability. An online dietary balance survey and semi-weighted DR assessment with food photographs were conducted in Japanese participants (n = 34). Registered dietitians entered the balance scores into the system based on the participants’ food photographs, and the scores were calculated using the system. Significant positive correlations (p < 0.001) were found between the online dietary balance scores and nutrient/food intake from DRs; especially for ‘grain dishes’ and carbohydrates (r = 0.704); ‘vegetable dishes’ and the vegetable dish group (sum of potatoes, vegetables, mushrooms, and algae) (r = 0.774); ‘main dishes’ and protein (r = 0.661); ‘milk’ and the milk and milk products group (r = 0.744); and ‘fruits’ and the fruits group (r = 0.748). Bland–Altman analysis showed that the dietary balance scores obtained by this system tended to underestimate the intake compared with the weighing method. Although there are limitations to the accurate estimation of nutrient and food intake, the online dietary balance scores obtained from the online dietary balance survey system were useful for understanding the dietary balance in the Japanese diet.
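The Bland–Altman comparison above summarises agreement between two methods as the mean difference (bias) and its 95 % limits of agreement. A minimal sketch of that calculation, using made-up paired scores rather than the study's data:

```python
import statistics

def bland_altman_limits(method_a, method_b):
    # Per-pair differences between the two measurement methods
    diffs = [a - b for a, b in zip(method_a, method_b)]
    bias = statistics.mean(diffs)  # systematic over- or underestimation
    sd = statistics.stdev(diffs)
    # 95 % limits of agreement: bias +/- 1.96 sd
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Hypothetical paired scores (online system vs weighing method)
online = [10, 12, 11, 14]
weighed = [11, 13, 13, 13]
bias, lower, upper = bland_altman_limits(online, weighed)
print(round(bias, 2))  # -0.75: the online scores underestimate on average
```

A negative bias, as reported in the study, indicates that the online system tends to score lower than the weighing-method reference.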
Kids SIPsmartER is a school-based behavioural intervention for rural Appalachia middle school students with an integrated two-way short message service (SMS) strategy for caregivers. When tested in a cluster randomized controlled trial, the intervention led to significant improvements in sugar-sweetened beverage (SSB) consumption among students and caregivers. This study explores changes in secondary caregiver outcomes, including changes in caregiver SSB-related theory of planned behaviour constructs (affective attitudes, instrumental attitudes, subjective norms, perceived behavioural control, and intentions), parenting practices, and the home environment. Participants included 220 caregivers (93% female, 88% White, 95% non-Hispanic, mean age 40.6 years) in Virginia and West Virginia at baseline and 7 months post-intervention. Relative to control caregivers (n = 102), intervention caregivers (n = 118) showed statistically significant improvements in instrumental attitudes (Coef. = 0.53, 95% CI [0.04, 1.01], p = 0.033), behavioural intentions (Coef. = 0.46, 95% CI [0.05, 0.88], p = 0.027), parenting practices (Coef. = 0.22, 95% CI [0.11, 0.33], p < 0.001), and total home SSB availability (Coef. = –0.25, 95% CI [–0.39, –0.11], p < 0.001), with specific improvements for sweetened juice drinks (Coef. = –0.18, 95% CI [–0.35, –0.01], p = 0.043) and regular soda/soft drinks (Coef. = –0.31, 95% CI [–0.55, –0.07], p = 0.010). In contrast, there were no significant between-group changes for affective attitudes, subjective norms, or perceived behavioural control. Our findings highlight future research areas and fill gaps in the intervention literature. This study is among the few to develop and evaluate a scalable, theory-based caregiver SMS component in a rural, school-based intervention. Combined with evidence that Kids SIPsmartER improved SSB behaviours, our results emphasize the potential of theory-guided SMS interventions to impact SSB-related outcomes.
Despite the multiple advantages of 25-hydroxyvitamin D (calcifediol or 25(OH)D) compared to cholecalciferol, it is used sparingly. This study was planned to assess the safety and efficacy of supplementing daily 25 µg of calcifediol capsules vis-à-vis 100 µg (4000 IU) of cholecalciferol sachets in apparently healthy individuals with vitamin D deficiency in Chandigarh, India (latitude 30.7° N, longitude 76.8° E). It was a prospective, interventional study to evaluate the effects of calcifediol vis-à-vis cholecalciferol. Following initial screening of 70 subjects in each group, 62 were included in the calcifediol and 41 in the cholecalciferol group. Forty-six from the calcifediol and 37 from the cholecalciferol group completed the 6-month follow-up. There was a significant increase in serum 25(OH)D (355% in the cholecalciferol and 574% in the calcifediol groups, respectively, p < 0.001) and 1,25(OH)2D (p < 0.001), with a marked decrease in iPTH (p < 0.001) and ALP (p = 0.016) in both groups. Though serum ALP decreased significantly more in the calcifediol group than the cholecalciferol group, no appreciable difference in other biochemical parameters was noted between the groups. No episodes of hypercalcaemia or incidence of new renal stone disease were observed during follow-up. However, hypercalciuria (spot urine calcium:creatinine ratio > 0.2 mg/mg) was noted in 8/46 individuals in the calcifediol group and 5/37 individuals in the cholecalciferol group at the final visit, with no significant difference between the two groups. This study establishes the efficacy and safety of correcting vitamin D deficiency with daily 25 µg calcifediol capsules as an alternative to 4000 IU (100 µg) cholecalciferol sachets.
Responsible zooarchaeology encompasses: (1) care of reference collections, (2) management of zooarchaeological collections during study, (3) dissemination of results, and (4) long-term curation. Our responses to these challenges must be governed by shared values regarding the professional and ethical treatment of our natural and cultural heritage.
Zooarchaeological research reveals that humans are simultaneously resilient in the face of environmental change, and culpable as drivers of environmental change. Recent research indicates that even habitats thought to be unmodified by human activities were substantially, and often intentionally, altered by humans in the past. Zooarchaeological approaches to studying past environmental conditions generally fall within two primary themes: (1) the interactions between humans, animals, and the environments in which they live, and (2) the consequences of those interactions for both humans and animals.
Zooarchaeological research is guided by the scientific method. Zooarchaeologists distinguish between primary data, which are descriptive observations, and secondary data, which are analytical products derived from primary data. As much primary data as possible should be clearly recorded during the initial study, and these data should be accessible to future researchers.
The ultimate goal of zooarchaeological analysis is to use animal remains, alongside other evidence, to make inferences regarding the biological, cultural, and ecological behavior of people in the past. Secondary data, which are often mathematically derived from primary data, link primary observations about zooarchaeological specimens to larger cultural and ecological processes.