This review summarises evidence from cohort and intervention studies on the relationships between nutrition in early life, epigenetics and lifelong health. Established links include maternal diet quality with conception rates, micronutrient sufficiency before and during pregnancy with preterm birth prevention, gestational vitamin D intake with offspring bone health, preconception iodine status with child IQ, maternal adiposity with offspring obesity and maternal stress with childhood atopic eczema. Animal studies demonstrate that early-life environmental exposures induce lasting phenotypic changes via epigenetic mechanisms, including DNA methylation, histone modifications and non-coding RNA, with DNA methylation of non-imprinted genes the most extensively studied. Human data show that nutrition during pregnancy induces epigenetic changes associated with childhood obesity risk, such as methylation variation at the long non-coding RNA ANRIL (Antisense Non-coding RNA in the INK4 Locus), linked to obesity and replicated across multiple populations. Emerging insights reveal that paternal nutrition and lifestyle also modify the sperm epigenome and influence offspring development. Although randomised nutritional trials in pregnancy remain limited, the NiPPeR trial found widespread preconception micronutrient deficiencies and indicated that maternal nutritional supplementation before and during pregnancy can reduce preterm birth and early childhood obesity. The randomised trials UPBEAT and MAVIDOS have shown that nutritional intervention can affect offspring epigenetics. Postnatal nutritional exposures further influence offspring epigenetic profiles, exemplified by ALSPAC cohort findings linking rapid infant weight gain to later methylation changes and increased obesity risk. Together, these studies support a persistent impact of maternal and early-life nutrition on child health and development, underpinned by modifiable epigenetic processes.
Governments are seeking to regulate food environments to promote health by restricting sales and marketing of processed foods high in fat, sugar and sodium. We aimed to evaluate whether the legal instruments in member states of the Western Pacific Region (WPR) mandate the declaration of nutrient composition for nutrients of concern in relation to Codex Alimentarius and non-communicable disease (NCD) prevention.
Design:
We undertook content analysis of legal instruments governing food quality and safety, documenting mandates for nutrient declarations in the WPR. Legal instruments were purposefully sourced through a systematic search of regional legal databases and Google. We performed qualitative and quantitative analysis, using an adapted version of Reeve and Magnusson’s Framework for Analysing and Improving the Performance of Regulatory Instruments.
Setting:
Legal instruments governing food quality and safety in twenty-eight member states of the WPR.
Results:
There was substantial variation in the nutrient declaration mandates within legal instruments, with only three out of twenty-eight countries mandating nutrient declarations in full alignment with Codex recommendations (energy, protein, available carbohydrate, fat, saturated fat, sodium and total sugars). Just four countries mandated the display of sodium, sugar, saturated fat and trans-fats, in line with NCD prevention recommendations. Sodium labelling was mandated in ten countries, sugar in seven and saturated fat in six.
Conclusions:
There is scope for countries to strengthen legal instruments for nutrient declarations to better support diet-related NCD prevention efforts. Regional support agencies can play a key role in promoting greater policy coherence and alignment with international best practice.
To evaluate the nutritional status of adults with beta thalassemia major in Vietnam by analysing body composition parameters and assessing the adequacy of energy, macronutrient and micronutrient intake.
Design:
A cross-sectional study was conducted among adult patients with beta thalassemia major. Nutritional status was assessed using three components: BMI, body composition and dietary intake.
Setting:
Department of Thalassemia, The National Institute of Hematology and Blood Transfusion in Vietnam.
Participants:
317 adult patients with beta thalassemia major (54·2 % female; median age 32 years).
Results:
Nearly half (49·5 %) of the patients had a normal BMI, while 18·3 % were severely underweight and 32·2 % were underweight. Severely underweight patients exhibited lower body fat, bone mineral content and visceral fat. Average daily energy intake (1449·9 kcal) was significantly below the estimated requirement (2079·5 kcal), with 81·4 % of patients consuming less than 85 % of their energy needs. Severely underweight patients consumed 12·06 g/d less fat (95 % CI: 6·85, 17·26) and 10·42 g/d less protein (95 % CI: 4·42, 16·42) than normal-weight patients. Severe deficiencies in Ca, Mg, Fe and B-complex vitamins were prevalent, with the lowest mean probability of adequacy for minerals and vitamins observed in severely underweight patients.
Conclusion:
This study provides the first comprehensive assessment of the nutritional status of beta thalassemia major patients in Vietnam, revealing critical gaps in energy and nutrient intake. Evidence-based strategies, including dietary education and interventions tailored to the unique needs of this population, are urgently needed to improve nutritional outcomes and overall health.
To describe and evaluate nutrition-related policy, system and environmental (PSE) change strategies implemented in a rural, volunteer-run Georgia food pantry, exploring facilitators and barriers and changes in clients’ perceptions of food distributed following implementation of nutrition-related PSE changes.
Design:
The mixed-methods evaluation used pre-post key informant interviews, client surveys and programme documents to assess implementation and outcomes of a nutrition policy and other PSE changes.
Setting:
Hancock County, Georgia.
Participants:
Survey respondents were food pantry clients who completed surveys both in January 2021 and March 2022 (n 155). Key informants were programme staff, a local coalition member and food pantry leadership (n 9).
Results:
Nutrition-related PSE changes included a nutrition policy, produce procurement partnerships and enhanced refrigeration; an awareness campaign and nutrition education were also conducted. Facilitators included the implementation approach (e.g., encouraging small steps and joint policy development), relationship formation and partnerships. Barriers were modest capacity (e.g., funding and other resources), staffing/volunteers and limited experience with food policy and procurement processes. Client surveys in 2021–2022 showed canned/dried foods as most commonly received, with significant (p < 0.05) increases at follow-up in always receiving meat/poultry/seafood and significant decreases in always receiving canned fruits and dry beans/lentils. In both 2021 and 2022, substantial proportions of respondents reported food insecurity (>60 %), having obesity (>40 %), poor/fair health (>30 %) and a household member with hypertension/high blood pressure (>70 %).
Conclusions:
Nutrition-related PSE changes in rural food pantries to improve the healthfulness of foods distributed require substantial resources, yet if sustained, may increase client access to healthy foods and improve diets.
Body composition (BC) offers essential insights into the physical condition and performance capacity of athletes. Several factors can influence athletes’ BC, such as nutrition, which can improve lean mass (LM) and body fat percentage (%BF). This longitudinal observational study aimed to investigate the factors influencing BC in professional female football players, including hormones, dietary habits and physical activity, as these are relevant to their sport performance and health. Data on dietary habits, dual-energy X-ray absorptiometry measurements, serum hormones, menstruation and global positioning system (GPS) metrics were collected in November 2023 and late March 2024 from thirty-eight female football players from the Real Sociedad team. Of the thirty-eight players enrolled, thirty-five completed all assessments and were included in the final analyses. Spearman correlations and linear regression analyses were performed. Statistically significant models were obtained for %BF and LM (adjusted R2 = 0·55 and 0·47, respectively). For %BF, total testosterone had a positive influence, while high-speed running per minute, follicle-stimulating hormone, distance covered per minute, prolactin and fat intake had negative influences. In the LM model, total testosterone, progesterone, age, adrenocorticotropic hormone and carbohydrate intake showed positive associations, while insulin, distance covered per minute and sex hormone binding globulin showed negative associations. These results emphasise the complexity of the factors influencing BC in female football players. Personalising and periodising carbohydrate intake and monitoring training loads are crucial to prevent adverse effects such as higher %BF and muscle catabolism. Establishing healthy nutritional practices is essential for long-term health and performance.
The global syndemic of obesity, undernutrition and climate change – three interconnected challenges – threatens both human and planetary health. This review focuses on one critical intersection: older populations living with overweight and obesity in the context of sustainable nutrition. Obesity and sarcopenia, particularly their co-occurrence known as sarcopenic obesity, are often overlooked until the onset or exacerbation of other diseases necessitates secondary care. Preventing sarcopenic obesity requires reducing excess fat mass while preserving muscle mass and function. This involves lowering total energy intake while ensuring adequate protein intake in terms of quantity, quality and distribution, combined with physical activity, particularly resistance exercise. Short-term studies show that both the source and dose of dietary protein significantly influence muscle protein synthesis rates. Longer-term studies examining the impact of plant-based diets on muscle health in older adults with or without overweight or obesity remain limited. Animal proteins have shown a modest advantage over most plant-based proteins in supporting muscle mass. Qualitative studies suggest that emphasising both the health benefits and palatability of plant-based protein sources is key to promoting dietary changes in older adults. In older adults with obesity, it is challenging to combine energy restriction with higher protein intake, especially when protein sources are plant-based. To prevent and treat sarcopenic obesity in older adults and support planetary health, a shift toward more plant-based protein sources is required, while ensuring sufficient protein quantity and quality to preserve muscle health during weight loss.
This review aims to (1) provide an overview of research investigating the relationship between body composition, specifically fat-free mass (FFM) and fat mass (FM), appetite and energy intake (EI) and (2) investigate potential mechanisms underlying these relationships, with a focus on ageing. Appetite and EI are influenced by complex, multifactorial pathways involving physiological, psychological, environmental, social and cultural factors. Early research investigating the association of body composition with appetite and EI focused on FM; however, the role of FFM in appetite control is gaining increasing attention. Studies have shown that FFM is positively associated with EI in younger populations, including infants, adolescents and adults. In contrast, FM appears to have no association or a weak inverse association with appetite/EI. However, research in older adults is limited, and the underlying mechanisms are not fully understood. It has been suggested that one way in which FFM may influence appetite and EI is by impacting resting metabolic rate (RMR). FFM, which includes metabolically active tissues such as skeletal muscle and organs, represents the largest determinant of RMR and therefore may influence appetite and EI by ensuring the energetic requirements of crucial tissues, organs and metabolic processes are met. Given that declines in FFM and RMR are common with ageing, they may be possible targets for interventions aimed at improving appetite and EI. While current evidence in older adults supports a positive association between FFM and appetite, further longitudinal studies are needed to explore this relationship in different contexts, along with the underlying mechanisms.
To assess the frequency and correlates of meal-kit use across five countries using population-level data.
Design:
Online surveys conducted in 2022 assessed meal-kit use in the past week. Binary logistic regression models examined sociodemographic and nutrition-related correlates of meal-kit use, including self-reported home meal preparation and cooking skills, commercially prepared meal consumption and healthy eating, weight change and sustainability efforts.
Setting:
Canada, Australia, the UK, the USA and Mexico.
Participants:
20,401 adults aged 18–100 years.
Results:
Overall, 14 % of participants reported using meal-kits in the past week. Use was highest in the USA (18 %) and lowest in Canada (9 %). Meal-kit use was greater among individuals who were younger, male, of minority ethnicity, had high educational attainment, had higher income adequacy or had children living in the household (P < 0·01 for all). Use was greater for those who participated in any food shopping (v. none), those who prepared food sometimes (3–4 d/week or less v. never) and those who reported ‘fair’ or better cooking skills (v. poor; P < 0·05 for all). Consuming any ‘ready-to-eat’ food (v. none) and visiting restaurants more recently (v. > 6 months ago; P < 0·001 for all) were associated with greater meal-kit use. Eating fruits/vegetables more than 2 times/d and engaging in diet modification efforts were also associated with increased meal-kit use, as was engaging in weight change or sustainability efforts (P < 0·001 for all).
Conclusions:
Meal-kits tend to be used by individuals who make efforts to support their health and sustainability, potentially valuing ‘convenient’ alternatives to traditional home meal preparation; however, use is concentrated amongst those with higher income adequacy.
This study assessed the suitability of nutritional composition data from a commercial dataset for policy evaluation in Brazil.
Design:
We compared the proportions of packaged foods and beverages, classified according to the Nova food classification, and the nutritional composition of matched products, using data from a commercial database of food labels (Mintel Global New Products Database (Mintel-GNPD)) and the Brazilian Food Labels Database (BFLD), collected in 2017, as a ‘gold standard’. We evaluated the agreement between the two datasets using paired t tests, the Wilcoxon–Mann–Whitney test and the intraclass correlation coefficient (ICC) for energy, carbohydrates, total sugars, proteins, total fats, saturated fats, trans-fats, sodium and fibre.
Setting:
Brazil.
Participants:
In total, 11 434 packaged foods and beverages collected in 2017 provided by BFLD and 67 042 packaged foods and beverages launched from 2001 to 2017 provided by Mintel-GNPD.
Results:
The proportions of ultra-processed foods (UPF) were similar in both datasets. Paired products exhibited an excellent correlation (ICC > 0·80), with no statistically significant difference in the mean values (P ≥ 0·05) of most nutrients analysed. Discrepancies in fibre and fat content were noted in some UPF subcategories, including sweet biscuits, ice cream, candies, dairy beverages, sauces and condiments.
Conclusion:
The Mintel-GNPD dataset closely aligns with the BFLD in UPF distribution and shows a similar nutritional composition for matched foods available for purchase in stores. This indicates its potential for monitoring and evaluating food labelling policies in Brazil, including verification of policy compliance, and for studies of food and beverage composition in food retail.
Ultra-processed foods (UPF), defined using frameworks such as NOVA, are increasingly linked to adverse health outcomes, driving interest in ways to identify and monitor their consumption. Artificial intelligence (AI) offers potential, yet its application in classifying UPF remains underexamined. To address this gap, we conducted a scoping review mapping how AI has been used, focusing on techniques, input data, classification frameworks, accuracy and application. Studies were eligible if they were peer-reviewed, published in English (2015–2025) and applied AI approaches to assess or classify UPF using recognised or study-specific frameworks. A systematic search in May 2025 across PubMed, Scopus, Medline and CINAHL identified 954 unique records, with eight ultimately meeting the inclusion criteria; one additional study was added in October following an updated search after peer review. Records were independently screened and extracted by two reviewers. Extracted data covered AI methods, input types, frameworks, outputs, validation and context. Studies used diverse techniques, including random forest classifiers, large language models and rule-based systems, applied across various contexts. Four studies explored practical settings: two assessed consumption or purchasing behaviours, and two developed substitution tools for healthier options. All relied on NOVA or modified versions to categorise processing. Several studies reported predictive accuracy, with F1 scores from 0·86 to 0·98, while another showed alignment between clusters and NOVA categories. Findings highlight the potential of AI tools to improve dietary monitoring and the need for further development of real-time methods and validation to support public health.
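For readers interpreting the reported F1 scores of 0·86 to 0·98: F1 is the harmonic mean of precision and recall. A short sketch with hypothetical confusion counts (not taken from any reviewed study) shows how the metric is computed for a binary UPF / non-UPF classifier.

```python
def f1_score(tp: int, fp: int, fn: int) -> float:
    """F1 = harmonic mean of precision and recall, from a confusion matrix."""
    precision = tp / (tp + fp)  # of items labelled UPF, how many truly are
    recall = tp / (tp + fn)     # of true UPF items, how many were found
    return 2 * precision * recall / (precision + recall)

# Hypothetical counts: 430 true positives, 35 false positives, 45 false negatives.
print(round(f1_score(tp=430, fp=35, fn=45), 2))  # -> 0.91
```

Because F1 ignores true negatives, it is well suited to UPF detection, where the non-UPF class may dominate the product database.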
Collagen supplementation (CS) has emerged as a promising therapeutic approach with potential benefits for managing metabolic syndrome (MetS)-related risk factors. This narrative review integrates human evidence with preclinical mechanistic insights into the metabolic actions of collagen. Anti-obesity effects are attributed to increased satiety, gastric distension, GLP-1 secretion and enhanced fatty acid oxidation mediated by PPAR-α activation and AMPK signalling. In type 2 diabetes, collagen improves glucose homeostasis by enhancing insulin sensitivity, upregulating GLUT-4 and inhibiting dipeptidyl peptidase IV (DPP-IV), thereby prolonging incretin activity (GLP-1 and GIP) and supporting β-cell function. The antihypertensive effect of collagen peptides (CP) is primarily linked to angiotensin-converting enzyme (ACE) inhibition, which reduces angiotensin II levels while promoting bradykinin-mediated vasodilation and nitric oxide release. In addition, CP has shown potential in improving lipid profiles by modulating PPAR-γ and AMPK, increasing HDL-C and reducing LDL-C and triacylglycerols. Emerging evidence also supports a role for collagen in restoring gut microbiota balance, increasing short-chain fatty acid production and reducing pro-inflammatory and oxidative pathways, contributing to systemic metabolic regulation. Overall, these findings suggest CS exerts multi-targeted benefits on MetS components through modulation of endocrine, inflammatory and metabolic pathways. Nevertheless, larger, long-term clinical trials are warranted to determine optimal dosing regimens, evaluate long-term efficacy, and further elucidate microbiota-mediated effects.
Individuals with severe mental illness face a significantly reduced life expectancy compared with the general population. Addressing key modifiable risk factors is essential to reduce these alarming rates of mortality in this population. Nutritional psychiatry has emerged as an important field of research, highlighting the role of nutrition in mental health outcomes. However, individuals with severe mental illness often encounter barriers to healthy eating, including poor diet quality, medication-related side effects such as increased appetite and weight gain, food insecurity and limited autonomy over food choices. While nutrition interventions play a key role in improving health outcomes and should be a standard part of care, their implementation remains challenging. Digital technology presents a promising alternative support model, with the potential to address many of the structural and attitudinal barriers experienced by this population. Nonetheless, issues such as digital exclusion and low digital literacy persist. Integrating public and patient involvement, along with behavioural science frameworks, into the design and delivery of digital nutrition interventions can improve their relevance, acceptability and impact. This review discusses the current and potential role of digital nutrition interventions for individuals with severe mental illness, examining insights, challenges and future directions to inform research and practice.
This is the first study in a Middle Eastern population to investigate the association between the global diet quality score (GDQS) and risk of hypertension (HTN) in Iranian adults.
Design:
This population-based cohort study was conducted on 5718 individuals aged ≥ 18 years from the third and fourth Tehran Lipid and Glucose Study surveys, who were followed until the sixth survey (mean follow-up: 7·8 years). Dietary data were collected using a validated FFQ to calculate GDQS as a novel food-based metric designed to assess diet quality across diverse populations. It evaluates the adequacy of healthy food groups (e.g. fruits, vegetables and whole grains) while monitoring the moderation of unhealthy or excessive intake (e.g. refined grains, processed meats and sugary foods).
Setting:
Tehran Lipid and Glucose Study.
Participants:
Iranian men and women.
Results:
Participants had a mean (sd) age of 37·7 (12·8) years, BMI of 26·6 (4·7) kg/m2 and GDQS of 25·3 (4·4). During the 7·8-year follow-up, 1302 (18 %) new cases of HTN were identified. Higher GDQS and its healthy components were associated with reduced HTN risk (hazard ratio (HR): 0·83; 95 % CI: 0·70, 0·98; Ptrend = 0·034 and HR: 0·78; 95 % CI: 0·65, 0·92; Ptrend = 0·005, respectively), while unhealthy components of GDQS showed no association with HTN risk (HR: 1·14; 95 % CI: 0·98, 1·33; Ptrend = 0·059). These protective associations were observed across all weight categories and both genders, with stronger effects among obese individuals (for GDQS: HR: 0·75; 95 % CI: 0·58, 0·98; P = 0·041; for healthy components: HR: 0·75; 95 % CI: 0·57, 0·99; P = 0·044) and females (for GDQS: HR: 0·77; 95 % CI: 0·62, 0·97; P = 0·028; for healthy components: HR: 0·76; 95 % CI: 0·60, 0·96; P = 0·023).
Conclusions:
A higher GDQS was associated with a reduced risk of incident HTN among Iranian adults. Adherence to a high-quality diet, particularly focusing on the healthy dietary components of GDQS, may serve as an effective strategy for preventing HTN, especially among obese individuals and women.
Sarcopenia, the age-related decline in muscle mass and strength, is a contributor to frailty and reduced quality of life. Emerging evidence suggests a role of the gut microbiome in modulating skeletal muscle through microbial species and metabolites, such as short-chain fatty acids (SCFAs), potentially influencing inflammation, nutrient absorption, and glucose and protein metabolism. This review considers the potential of probiotics, prebiotics, and synbiotics as interventions to mitigate sarcopenia based on animal and human studies, while providing a critique of present barriers that need to be addressed. Preclinical models, including germ-free mice and faecal microbiota transplantation, demonstrate that gut microbiota from healthy or young donors may enhance overall muscle health via reductions in inflammatory and muscle atrophy markers. Limited human studies show that probiotics such as Lactobacillus and Bifidobacterium could improve branched-chain amino acid (BCAA) bioavailability and potentially sarcopenia indices, although findings have been inconsistent. In particular, challenges including inconsistent microbial assessments, lack of dietary control and interindividual variability due to diet, age, genetics, comorbidities and medications may hinder progress in this field. Delivery methods (e.g. capsules, fermented foods or fortified products) could further complicate efficacy through probiotic stability and dietary restrictions in older adults. Standardised protocols [e.g. Strengthening The Organisation and Reporting of Microbiome Studies (STORMS) checklist] and multi-omics approaches may be critical to address these limitations and identify microbial signatures linked to sarcopenia outcomes. While preclinical evidence highlights mechanistic pathways pertinent to amino acid metabolism, translating findings to humans requires rigorous experimental trials.
Groundwater iron varies geographically, and iron intake through drinking water can minimise iron deficiency (ID). Rice, a major share of daily meals (∼70% of total energy) in Bangladesh, absorbs a substantial amount of water during cooking. This study aimed to estimate the contribution of groundwater iron entrapped in cooked rice and its implications for recommended iron intake. A cross-sectional study was conducted among 25 households in Sirajganj district, Bangladesh, selected by the iron content of their drinking groundwater source. Each household, pre-supplied with 600 g of raw rice (300 g for each cooking), was instructed to cook ‘water-draining rice’ (WDR) and ‘water-sitting rice’ (WSR). Using atomic absorption spectrophotometry, iron content in filtered and non-filtered water was measured as 0.4 ± 0.2 mg/L and 6.1 ± 2.0 mg/L, respectively. After adjusting for water filtration, the weighted mean total iron content in WDR and WSR was 6.18 mg and 5.70 mg, respectively. Assuming average rice intake, the iron content in WDR and WSR fulfilled approximately 98.15% and 90.62% of the average requirement for non-pregnant, non-lactating (NPNL) women. The water-entrapped iron in cooked WDR and WSR fulfilled about 23.77% and 20.4% of the Recommended Dietary Allowance, and 52.83% and 45.30% of the Estimated Average Requirement, respectively, in NPNL women, suggesting that groundwater entrapped in cooked rice is an influential dietary iron source. This substantial amount of iron from cooked rice adds a further layer to the environmental contribution of iron in this setting, with the potential to contribute to ID prevention.
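The adequacy percentages above are simple ratios of measured iron content to a reference requirement. A minimal sketch of that arithmetic, using the abstract's WDR figure of 6.18 mg at 98.15% adequacy to back-calculate the implied reference value (the abstract does not state which requirement figure was used, so this is an inference for illustration):

```python
def pct_adequacy(intake_mg: float, requirement_mg: float) -> float:
    """Percentage of a daily iron requirement met by a given intake."""
    return 100.0 * intake_mg / requirement_mg

# 6.18 mg meeting ~98.15% of the average requirement implies a reference
# of roughly 6.18 / 0.9815 ≈ 6.3 mg/d (inferred, not stated in the abstract).
implied_requirement = 6.18 / 0.9815

# Applying the same reference to the WSR figure of 5.70 mg:
print(round(pct_adequacy(5.70, implied_requirement), 1))  # ~90.5
```

The result lands close to the abstract's 90.62% for WSR; the small gap likely reflects rounding in the reported weighted means.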
There is substantial international variation in recommended vitamin C intake levels. In the USA, the recommendation is 90 mg/d for men and 75 mg/d for women, while in the UK, the current recommendation – established in 1991 – is only 40 mg/d for adults. This UK level was based on the 1953 Sheffield study, which found that 10 mg/d prevents scurvy, with 40 mg/d chosen as the recommended level for yielding somewhat higher plasma levels. In this commentary, we argue that the UK recommendation overlooked key evidence available at the time. Specifically, at least six controlled trials published before 1991 reported benefits from vitamin C supplementation in participants whose baseline vitamin C intake was already 40 mg/d or higher. One randomised controlled trial, published in 1993, found benefits from vitamin C supplementation even at a baseline intake of about 500 mg/d; however, this trial involved ultramarathon runners, and the findings should not be broadly generalised. Nonetheless, such results challenge the assumption that 40 mg/d is universally adequate to maintain full health. We also highlight that the UK recommendations were narrowly focused on preventing dermatological symptoms of scurvy, despite strong evidence – even at the time – that vitamin C deficiency can also cause cardiac dysfunction and greater morbidity due to respiratory infections. We conclude that the current UK vitamin C recommendation should be re-evaluated in light of controlled trial evidence and broader clinical outcomes.
Although many online dietary surveys have been developed in recent years, systems that can easily assess dietary balance based on the Japanese diet are lacking. This study aimed to evaluate the relationship between dietary balance scores from an online survey system based on the Japanese Food Guide Spinning Top and nutrient/food intake calculated using the weighing method from dietary records (DRs), as well as to assess the system’s utility and applicability. An online dietary balance survey and a semi-weighed DR assessment with food photographs were conducted in Japanese participants (n = 34). Registered dietitians entered dietary information into the system based on the participants’ food photographs, and the balance scores were calculated by the system. Significant positive correlations (p < 0.001) were found between the online dietary balance scores and nutrient/food intake from DRs, especially for ‘grain dishes’ and carbohydrates (r = 0.704); ‘vegetable dishes’ and the vegetable dish group (sum of potatoes, vegetables, mushrooms, and algae) (r = 0.774); ‘main dishes’ and protein (r = 0.661); ‘milk’ and the milk and milk products group (r = 0.744); and ‘fruits’ and the fruits group (r = 0.748). Bland–Altman analysis showed that the dietary balance scores obtained by this system tended to underestimate intake compared with the weighing method. Although there are limitations to the accurate estimation of nutrient and food intake, the scores obtained from the online dietary balance survey system were useful for understanding dietary balance in the Japanese diet.
Kids SIPsmartER is a school-based behavioural intervention for rural Appalachia middle school students with an integrated two-way short message service (SMS) strategy for caregivers. When tested in a cluster randomized controlled trial, the intervention led to significant improvements in sugar-sweetened beverage (SSB) consumption among students and caregivers. This study explores changes in secondary caregiver outcomes, including changes in caregiver SSB-related theory of planned behaviour constructs (affective attitudes, instrumental attitudes, subjective norms, perceived behavioural control, and intentions), parenting practices, and the home environment. Participants included 220 caregivers (93% female, 88% White, 95% non-Hispanic, mean age 40.6 years) in Virginia and West Virginia at baseline and 7 months post-intervention. Relative to control caregivers (n = 102), intervention caregivers (n = 118) showed statistically significant improvements in instrumental attitudes (Coef. = 0.53, 95% CI [0.04, 1.01], p = 0.033), behavioural intentions (Coef. = 0.46, 95% CI [0.05, 0.88], p = 0.027), parenting practices (Coef. = 0.22, 95% CI [0.11, 0.33], p < 0.001), and total home SSB availability (Coef. = –0.25, 95% CI [–0.39, –0.11], p < 0.001), with specific improvements for sweetened juice drinks (Coef. = –0.18, 95% CI [–0.35, –0.01], p = 0.043) and regular soda/soft drinks (Coef. = –0.31, 95% CI [–0.55, –0.07], p = 0.010). In contrast, there were no significant between-group changes for affective attitudes, subjective norms, or perceived behavioural control. Our findings highlight future research areas and fill gaps in the intervention literature. This study is among the few to develop and evaluate a scalable, theory-based caregiver SMS component in a rural, school-based intervention. Combined with evidence that Kids SIPsmartER improved SSB behaviours, our results emphasize the potential of theory-guided SMS interventions to impact SSB-related outcomes.
Despite the multiple advantages of 25-hydroxyvitamin D (calcifediol or 25(OH)D) compared with cholecalciferol, it is used sparingly. This study was planned to assess the safety and efficacy of supplementing daily 25 µg of calcifediol capsules vis-à-vis 100 µg (4000 IU) of cholecalciferol sachets in apparently healthy individuals with vitamin D deficiency in Chandigarh, India (latitude 30.7° N, longitude 76.8° E). It was a prospective, interventional study to evaluate the effects of calcifediol vis-à-vis cholecalciferol. Following initial screening of 70 subjects in each group, 62 were included in the calcifediol and 41 in the cholecalciferol group. Forty-six from the calcifediol and 37 from the cholecalciferol group completed the 6-month follow-up. There was a significant increase in serum 25(OH)D (355% in the cholecalciferol and 574% in the calcifediol groups, respectively, p < 0.001) and 1,25(OH)2D (p < 0.001), with a marked decrease in iPTH (p < 0.001) and ALP (p = 0.016) in both groups. Though serum ALP decreased significantly more in the calcifediol group than in the cholecalciferol group, no appreciable difference in other biochemical parameters was noted between the groups. No episodes of hypercalcaemia or incidence of new renal stone disease were observed during follow-up. However, hypercalciuria (spot urine calcium:creatinine ratio > 0.2 mg/mg) was noted in 8/46 individuals in the calcifediol group and 5/37 individuals in the cholecalciferol group at the final visit, with no significant difference between the two groups. This study establishes the efficacy and safety of correcting vitamin D deficiency with daily 25 µg calcifediol capsules as an alternative to 4000 IU (100 µg) cholecalciferol sachets.