Inflammation is an underlying problem in many disease states and has been implicated in iron deficiency (ID). This study aimed to determine whether epigallocatechin-3-gallate (EGCG) improves iron status by reducing inflammation. Thirty-two male Sprague–Dawley rats were fed an iron-deficient diet for 2 weeks and then randomly divided into four groups (n 8 each) for 3 additional weeks: positive controls, negative controls, lipopolysaccharide (LPS, 0⋅5 mg/kg body weight) and LPS + EGCG (LPS plus 600 mg EGCG/kg diet). Both control groups were treated with saline; the positive control group was fed a regular diet containing standard iron, while the negative control group was fed an iron-deficient diet. Of the two treatment groups, the first was given LPS, while the second was administered LPS and fed an EGCG-supplemented diet. Iron status, hepcidin, C-reactive protein (CRP), serum amyloid A (SAA) and interleukin-6 (IL-6) were measured. There were no differences between the treatment and control groups in CRP, hepcidin or liver iron concentrations. Serum iron concentrations were significantly lower in the LPS (P = 0⋅02) and LPS + EGCG (P = 0⋅01) groups than in the positive control group. Compared with the positive control group, spleen iron concentrations were significantly lower in the negative control group (P < 0⋅001) but not in either LPS group. SAA concentrations were significantly lower in the LPS + EGCG group than in the LPS-alone group. EGCG reduced SAA concentrations but did not affect hepcidin, improve serum iron concentrations or change other iron markers.
The aim of this review is to provide an overview of dietary interventions delivered during pregnancy for the prevention of gestational diabetes mellitus (GDM). GDM increases the risk of adverse pregnancy and neonatal outcomes, and also increases future cardiometabolic risks for both the mother and the offspring. Carrying or gaining excessive weight during pregnancy increases the risk of developing GDM, and several clinical trials in women with overweight or obesity have tested whether interventions aimed at limiting gestational weight gain (GWG) could help prevent GDM. Most dietary interventions have provided general healthy eating guidelines, while some had a specific focus, such as low glycaemic index, increased fibre intake, reducing saturated fat or a Mediterranean-style diet. Although trials have generally been successful in attenuating GWG, the majority have been unable to reduce GDM risk, which suggests that limiting GWG may not be sufficient in itself to prevent GDM. The trials which have shown effectiveness in GDM prevention have included intensive face-to-face dietetic support, and/or provision of key foods to participants, but it is unclear whether these strategies could be delivered in routine practice. The mechanism behind the effectiveness of some interventions over others remains unclear. Dietary modifications from early stages of pregnancy seem to be key, but the optimum dietary composition is unknown. Future research should focus on designing acceptable and scalable dietary interventions to be tested early in pregnancy in women at risk of GDM.
Reducing meat consumption is essential to curb further climate change and limit the catastrophic environmental degradation resulting from the current global food system. However, consumers in industrialised countries are hesitant to reduce their meat intake, often because they find plant-based foods less appealing. Despite the climate emergency, eating meat is still perceived as the norm, and recommended in most national dietary guidelines. To support the transition to more sustainable diets by providing insights for increasing the appeal of plant-based foods to mainstream consumers, this review presents recent research findings on how people think and communicate about meat-based and plant-based foods. The key findings we review include: (1) while vegans think about plant-based foods in terms of enjoyable eating experiences, omnivores think about plant-based foods in terms of health, vegan identity and other abstract information that does not motivate consumption in the moment. (2) Packages of ready-meals and social media posts on Instagram present plant-based foods with fewer references to enjoyable eating experiences than meat-based foods. (3) Presenting plant-based foods with language that references enjoyable eating experiences increases their appeal, especially for habitual meat eaters. This language includes words about sensory features of the food (e.g., crunchy, creamy), eating context (e.g. pub; with family) and immediate positive consequences of eating (e.g. comforting, delicious). In contrast, the term ‘vegan’ is strongly associated with negative stereotypes. Hence, rather than referring to being vegan, meat-free or healthy, the language used for plant-based foods should refer to sensory appeal, attractive eating situations and enjoyment.
Brain ageing, the primary risk factor for cognitive impairment, occurs because of the accumulation of age-related neuropathologies. Identifying effective nutrients that increase cognitive function may help maintain brain health. Tomatoes and lemons have various bioactive functions and exert protective effects against oxidative stress, ageing and cancer. Moreover, they have been shown to enhance cognitive function. In the present study, we aimed to investigate the effects of tomato and lemon ethanolic extracts (TEE and LEE, respectively) and their possible synergistic effects on the enhancement of cognitive function and neurogenesis in aged mice. The molecular mechanisms underlying the synergistic effect of TEE and LEE were investigated. For the in vivo experiment, TEE, LEE or their mixture was orally administered to 12-month-old mice for 9 weeks. A single administration of either TEE or LEE improved cognitive function and neurogenesis in aged mice to some extent, as determined using the novel object recognition test and doublecortin immunohistochemical staining, respectively. However, a significant enhancement of cognitive function and neurogenesis in aged mice was observed after the administration of the TEE + LEE mixture, which had a synergistic effect. N-methyl-d-aspartate receptor 2B, postsynaptic density protein 95, and brain-derived neurotrophic factor (BDNF) levels and tropomyosin receptor kinase B (TrkB)/extracellular signal-regulated kinase (ERK) phosphorylation also synergistically increased after the administration of the mixture compared with those in the individual treatments. In conclusion, compared with their separate treatments, treatment with the TEE + LEE mixture synergistically improved the cognitive function, neurogenesis and synaptic plasticity in aged mice via the BDNF/TrkB/ERK signalling pathway.
Objective:
To assess the effect of daily egg consumption for six months on linear growth (primary outcome) and on weight-for-age, weight-for-length, mid-upper arm circumference-for-age and head circumference-for-age Z-scores, gross motor milestone development, anaemia and iron status (secondary outcomes) in a low socioeconomic community.
Participants:
Infants aged 6 to 9 months living in the peri-urban Jouberton area, in the Matlosana Municipality, South Africa.
Design:
A randomised controlled trial with a parallel design was implemented. Eligible infants were randomly allocated to the intervention group (n 250), receiving one egg/day, or the control group (n 250), receiving no intervention. The participants were visited weekly to monitor morbidity and gross motor development, with information on adherence collected for the intervention group. Trained assessors took anthropometric measurements, and a blood sample was collected to assess anaemia and iron status. The anthropometric assessors were blinded to group allocation during measurements, as was the statistician during the analysis.
Results:
Baseline prevalence of stunting, underweight, wasting, overweight and anaemia was 23·8 %, 9·8 %, 1·2 %, 13·8 % and 29·2 %, respectively, and did not differ between groups. Overall, 230 and 216 participants in the intervention and control groups completed the study, respectively. There was no intervention effect on length-for-age, weight-for-age, weight-for-length Z-scores, gross motor milestone development, anaemia and iron status.
Conclusions:
Daily egg intake did not affect linear growth, underweight, wasting, motor milestone development, anaemia or iron status. Other interventions are necessary to understand the effect of animal-source food intake on children’s growth and development. This trial was registered at https://clinicaltrials.gov/ (NCT05168085).
This study was designed to assess the relationship between the dietary insulin index (DII), dietary insulin load (DIL) and rheumatoid arthritis (RA) risk in a case–control study. It enrolled ninety-five newly diagnosed RA patients and 200 age- and sex-matched healthy controls. Dietary intakes were assessed using a validated 168-item semi-quantitative FFQ. DII and DIL were calculated using food insulin index values from previously published data. In the unadjusted model, individuals in the highest DIL tertile had significantly higher odds of RA than those in the lowest tertile (OR = 1·32, 95 % CI (1·15, 1·78), P for trend = 0·009). After adjusting for confounders, the risk of RA was 2·73 times higher for participants in the highest DIL tertile than for those in the lowest (OR = 2·73, 95 % CI (1·22, 3·95), P for trend < 0·001). In addition, patients in the highest DII tertile had a higher risk of RA than those in the first tertile (OR = 2·22, 95 % CI (1·48, 3·95), P for trend = 0·008). This association persisted after adjusting for potential confounders (OR = 3·75, 95 % CI (3·18, 6·78), P for trend = 0·002). Our findings suggest that diets with a high DII and DIL may increase the risk of developing RA, independent of other potential confounders. These findings should be verified by further research, particularly studies with a prospective design.
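As background for these measures, the sketch below implements the commonly used definitions: DIL as the sum over foods of the food insulin index multiplied by the energy that food contributes, and DII as DIL divided by total energy intake. This is a minimal illustration under those assumed definitions; the function names and example values are placeholders, not data or code from the study.

```python
# Minimal sketch of the usual dietary insulin load/index definitions.
# Food insulin index (FII) values and intakes below are illustrative only.

def dietary_insulin_load(items):
    """DIL: sum over foods of (food insulin index x energy from that food)."""
    return sum(fii * kcal for fii, kcal in items)

def dietary_insulin_index(items):
    """DII: DIL divided by total energy intake."""
    total_kcal = sum(kcal for _, kcal in items)
    return dietary_insulin_load(items) / total_kcal

# Each tuple is (food insulin index, energy contributed in kcal), e.g. one day:
day = [(62, 450), (40, 300), (95, 520)]  # hypothetical bread, yoghurt, potato
print(dietary_insulin_load(day))           # -> 89300 (DIL)
print(round(dietary_insulin_index(day)))   # -> 70 (DII)
```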
Fatigue and insomnia, potentially induced by inflammation, are distressing symptoms experienced by colorectal cancer (CRC) survivors. Emerging evidence suggests that not only the nutritional quality and quantity of dietary intake but also its timing, frequency and regularity (chrono-nutrition) could be important for alleviating these symptoms. We investigated longitudinal associations of circadian eating patterns with sleep quality, fatigue and inflammation in CRC survivors. In a prospective cohort of 459 stage I–III CRC survivors, four repeated measurements were performed between 6 weeks and 24 months post-treatment. Chrono-nutrition variables included meal energy contribution, meal frequency (a maximum of six meals could be reported each day), irregularity and the time window (TW) of energetic intake, operationalised from 7-d dietary records. Outcomes included sleep quality, fatigue and plasma concentrations of inflammatory markers. Longitudinal associations of chrono-nutrition variables with outcomes from 6 weeks until 24 months post-treatment were analysed by confounder-adjusted linear mixed models, including hybrid models to disentangle intra-individual changes from inter-individual differences over time. An hour longer TW of energetic intake between individuals was associated with less fatigue (β: −6·1; 95 % CI (−8·8, −3·3)) and insomnia (β: −4·8; 95 % CI (−7·4, −2·1)). A higher meal frequency of on average 0·6 meals/d between individuals was associated with less fatigue (β: −3·7; 95 % CI (−6·6, −0·8)). An hour increase in TW of energetic intake within individuals was associated with less insomnia (β: −3·0; 95 % CI (−5·2, −0·8)) and inflammation (β: −0·1; 95 % CI (−0·1, 0·0)). Our results suggest that longer TWs of energetic intake and higher meal frequencies may be associated with less fatigue, insomnia and inflammation among CRC survivors. Future studies with larger contrasts in chrono-nutrition variables are needed to confirm these findings.
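The "hybrid" models mentioned here separate a time-varying exposure into a between-person component (the person's mean across visits) and a within-person component (each visit's deviation from that mean). A minimal sketch of that decomposition, assuming hypothetical file and column names and a deliberately simplified confounder set, might look like this:

```python
# Sketch of a hybrid (within/between) linear mixed model: the time window of
# energetic intake is split into a person mean (between-individual differences)
# and visit-level deviations from it (intra-individual change), with a random
# intercept per participant. File and column names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("crc_repeated_measures.csv")  # long format: one row per visit

df["tw_between"] = df.groupby("participant_id")["time_window_h"].transform("mean")
df["tw_within"] = df["time_window_h"] - df["tw_between"]

model = smf.mixedlm(
    "fatigue ~ tw_between + tw_within + age + sex",  # confounders abbreviated
    data=df,
    groups=df["participant_id"],
).fit()
print(model.summary())  # tw_between: inter-individual; tw_within: intra-individual
```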
The link between school feeding programmes (SFP) and the promotion of healthy eating and health is being explored in studies performed in different countries. The coronavirus disease 2019 (COVID-19) pandemic revealed flaws and weaknesses in contemporary food systems, with many school-age children experiencing food insecurity and hunger. There is intense debate among policymakers over whether government SFP should be universal or targeted. Countries such as Brazil and India, which have two of the most comprehensive universal free-of-charge programmes, have shown the benefits of SFP, including improved nutritional status, support for more sustainable food systems, and better attendance and academic performance. The evidence supports actions advocating that it is time to offer free, healthy school meals to all students.
The weight, urine colour and thirst (WUT) Venn diagram is a practical hydration assessment tool; however, it has only been investigated at first-morning timepoints. This study investigated the accuracy of the WUT Venn diagram at morning and afternoon timepoints compared with blood and urine markers. Twelve men (21 ± 2 years; 81·0 ± 15·9 kg) and twelve women (22 ± 3 years; 68·8 ± 15·2 kg) completed the study. Body mass, urine colour, urine specific gravity (USG), urine osmolality (UOSM), thirst and plasma osmolality (POSM) were collected at first-morning and afternoon timepoints for 3 consecutive days in free-living (FL) and euhydrated states. The number of markers indicating dehydration was categorised as 3, 2, 1 or 0 WUT markers. Receiver operating characteristic analysis was used to calculate the sensitivity and specificity of 1, 2 or 3 hydration markers in detecting dehydration or euhydration. Specificity values across morning and afternoon exhibited high diagnostic accuracy for USG (0·890–1·000), UOSM (0·869–1·000) and POSM (0·787–0·990) when 2 and 3 WUT markers were met. Sensitivity values across both timepoints exhibited high diagnostic accuracy for USG (0·826–0·941) and UOSM (0·826–0·941), but not for POSM in the afternoon (0·324), when 0 and 1 WUT markers were met. The WUT Venn diagram is accurate in detecting dehydration at 2 and 3 WUT markers based on USG, UOSM and POSM at both first-morning and afternoon timepoints. Applied medical, sport and occupational practitioners can use this tool in field settings for hydration assessment, not only at various timepoints throughout the day but also in FL individuals.
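To make the counting logic concrete, here is a small sketch of how WUT markers could be tallied and checked against a urine-specific-gravity criterion. The cut-offs shown (≥ 1 % body-mass loss, urine colour ≥ 5, thirst ≥ 5, USG ≥ 1·020) are commonly cited field values assumed for illustration, not thresholds taken from this paper.

```python
# Hypothetical WUT tally and diagnostic-accuracy check; all cut-offs assumed.

def wut_count(body_mass_loss_pct, urine_colour, thirst):
    """Count how many of the three WUT markers indicate dehydration."""
    return sum([
        body_mass_loss_pct >= 1.0,  # weight: body-mass loss vs. baseline
        urine_colour >= 5,          # urine colour chart (1-8)
        thirst >= 5,                # thirst scale (1-9)
    ])

def sens_spec(wut_counts, usg_values, n_markers=2, usg_cutoff=1.020):
    """Sensitivity/specificity of '>= n_markers WUT markers' vs. a USG criterion."""
    tp = fp = tn = fn = 0
    for count, usg in zip(wut_counts, usg_values):
        flagged, dehydrated = count >= n_markers, usg >= usg_cutoff
        tp += flagged and dehydrated
        fp += flagged and not dehydrated
        fn += (not flagged) and dehydrated
        tn += (not flagged) and not dehydrated
    return tp / (tp + fn), tn / (tn + fp)

sens, spec = sens_spec([3, 2, 0, 1], [1.025, 1.022, 1.008, 1.015])
print(f"sensitivity={sens:.2f}, specificity={spec:.2f}")  # -> 1.00, 1.00
```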
Previous observational research showed that one of the most common strategies used to lose weight is to avoid or restrict the consumption of specific food items. However, how people actually behave and implement such strategies in real decision-making situations involving food choices for weight loss remains unclear. This experimental study using a food buffet aimed to examine people’s dietary strategies and motives when selecting foods for an entire day for weight loss purposes compared with a normal-day (ND) food selection. A total of 111 participants (55 % women) chose foods for both a ND and a weight loss day (WLD) (within-subject design). Kilocalories and nutrients were calculated based on the weights of the foods selected, and food choice motives were assessed using a questionnaire. The results showed that for weight loss purposes, the participants selected more vegetables (both sexes) and unsweetened beverages (only men) while reducing their choices of high-fat and high-energy products (both sexes). Participants’ food choices in both conditions (ND and WLD) differed from official nutrition recommendations: they chose less carbohydrate and fibre and more fat and sugar than recommended. Health, kilocalories and nutrient content (carbohydrates, sugar, fat and protein) were more important food choice motives for weight loss purposes than for a ND food selection, while taste became less important. In conclusion, the participants appeared well capable of implementing several appropriate dietary strategies. Further research is needed to explore strategies that help them maintain these dietary changes over the long term.
Sufficient sleep is necessary for optimal health, daytime performance and wellbeing, and the amount required is age-dependent, decreasing across the lifespan. Sleep duration is affected by age and by a range of cultural, social, psychological, behavioural, pathophysiological and environmental factors. This review considers how much sleep children and adults need, why this is important, what the consequences of insufficient sleep are and how we can improve sleep. A lack of the recommended amount of sleep for a given age group has been shown to be associated with detrimental effects on health, including effects on metabolism, endocrine function, immune function and haemostatic pathways. Obesity has increased worldwide in the last few decades, and the WHO has now declared it a global epidemic. A lack of sleep is associated with an increased risk of obesity in children and adults, which may lead to poor future health outcomes. Data from studies in both children and adults suggest that the relationship between sleep and obesity may be mediated by several different mechanisms, including alterations in appetite and satiety, sleep timing, circadian rhythm and energy balance. Moreover, there is evidence to suggest that improvements in sleep, in both children and adults, can be beneficial for weight management and diet, and certain foods might be important to promote sleep. In conclusion, this review demonstrates that a wide body of evidence suggests that sleep and obesity are causally related, and recommends further research to inform policy and societal change.
Evidence suggests that differences in meal timing between weekends and weekdays can disrupt the body’s circadian rhythm, leading to a higher BMI. We aimed to investigate the associations between mealtime variation from weekdays to weekends (‘eating midpoint jetlag’), dietary intake and anthropometric parameters, based on individuals’ chronotype. The study utilised data from the National Health and Nutrition Examination Survey (NHANES) 2017–2018. Food consumption was estimated as the weighted average of participants’ food intake on weekdays and weekends. Eating midpoint jetlag, defined as the difference between the midpoint of the first and last mealtimes on weekends and on weekdays, was calculated. Chronotype was assessed by participants’ mid-sleep time on weekends, adjusted for sleep debt. Linear regression analysis was conducted to investigate the associations between variables. The sample was categorised into chronotype tertiles. Among individuals in the third chronotype tertile, there was a positive association between eating midpoint jetlag and BMI (β = 1·2; 95 % CI (1·13, 1·27)). Individuals in the first tertile showed a positive association between eating midpoint jetlag and energy (β = 96·9; 95 % CI (92·9, 101·7)), carbohydrate (β = 11·96; 95 % CI (11·2, 12·6)), fat (β = 3·69; 95 % CI (3·4, 3·8)), cholesterol (β = 32·75; 95 % CI (30·9, 34·6)) and sugar (β = 8·84; 95 % CI (8·3, 9·3)) intake on weekends. Among individuals with an evening tendency, delaying meals on weekends appears to be linked to a higher BMI. Conversely, among individuals with a morning tendency, eating meals later on weekends is associated with higher energy intake on weekends.
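Following the definition given in the abstract, eating midpoint jetlag can be computed directly from the first and last mealtimes on weekdays and weekends. A minimal sketch, with times as decimal hours and invented example values:

```python
# Eating midpoint jetlag, per the definition above: the difference between the
# midpoints of the eating window on weekends and on weekdays. Example data only.

def eating_midpoint(first_meal_h, last_meal_h):
    """Midpoint of the daily eating window, in decimal hours."""
    return (first_meal_h + last_meal_h) / 2

def eating_midpoint_jetlag(weekday, weekend):
    """Absolute weekend-weekday difference in eating midpoints (hours)."""
    return abs(eating_midpoint(*weekend) - eating_midpoint(*weekday))

# Weekdays: first meal 07:30, last 19:30; weekends: 10:00 and 22:00.
print(eating_midpoint_jetlag((7.5, 19.5), (10.0, 22.0)))  # -> 2.5 hours
```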
This study aimed to identify the longitudinal association between seaweed intake and type 2 diabetes mellitus (T2DM) in the Korean population. Data from 148 404 Korean adults aged 40 years and older without a history of T2DM, cardiovascular disease or cancer at baseline were obtained from the Korean Genome and Epidemiology Study. The participants’ seaweed intake was obtained using a validated semi-quantitative food frequency questionnaire, and the diagnosis of T2DM was surveyed through a self-reported questionnaire during follow-up. The hazard ratio (HR) and 95 % confidence interval (CI) for T2DM were calculated using Cox proportional hazards regression, and the dose–response relationship was analysed using restricted cubic spline regression. Participants had a mean follow-up period of 5 years. Participants with the highest seaweed intake had a 7 % lower risk of T2DM than the group with the lowest intake (HR: 0·93, 95 % CI (0·87, 0·99)). Interestingly, this association was stronger in those with normal weight (HR: 0·88, 95 % CI (0·81, 0·95)), while no association was observed in participants with obesity. Spline regression revealed an inverse linear relationship between seaweed intake and T2DM risk in participants with normal weight, showing a trend whereby increased seaweed intake was related to a lower incidence of T2DM (P for non-linearity = 0·48). Seaweed intake is inversely associated with the onset of T2DM in Korean adults with normal weight.
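For illustration, a Cox proportional-hazards analysis of this kind could be fitted with the lifelines package along the following lines; the file name, column names and covariate set are assumptions, and covariates are taken to be numerically coded.

```python
# Sketch of a Cox proportional-hazards fit for T2DM onset vs. seaweed intake.
# Dataset and columns are hypothetical; categorical covariates assumed encoded.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("koges_seaweed.csv")  # one row per participant

cph = CoxPHFitter()
cph.fit(
    df[["followup_years", "t2dm", "seaweed_tertile", "age", "sex", "bmi"]],
    duration_col="followup_years",  # time to T2DM diagnosis or censoring
    event_col="t2dm",               # 1 = incident T2DM, 0 = censored
)
cph.print_summary()  # hazard ratios with 95 % confidence intervals
```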
Undernutrition is a major public health problem in developing countries; around 40·2 % of children in Pakistan are stunted. This longitudinal study aimed to assess the effectiveness of locally produced ready-to-use supplementary foods in the prevention of stunting by detecting change in the length-for-age z-score (LAZ) of children in the intervention v. control arm against the 2006 WHO growth reference. A community-based non-randomised cluster-controlled trial was conducted from January 2018 to December 2020 in the district of Kurram, Khyber Pakhtunkhwa, Pakistan. A total of 80 clusters (each comprising ≈ 250–300 households) were defined in the catchment population of twelve health facilities. Children aged 6–18 months were recruited (n 1680). The intervention was a daily ration of 50 g of locally produced ready-to-use supplementary food (Wawa-Mum). The main outcome was change in LAZ v. WHO growth standards. Comparisons between the interventions were made by t test and ANOVA. Cox proportional hazard models were used to assess the association between stunting occurrence and utilisation of the locally produced supplement. Of the total 1680 children, 51·1 % (n 859) were male, and mean age was 13·9 months (sd 4·4). At baseline, 36·9 % (n 618) were stunted. In the intervention group, mean LAZ increased significantly from −1·13 (sd 2·2) at baseline to −0·93 (sd 1·8) at 6-month follow-up (P = 0·01) compared with the control group. The incidence rate of stunting was 1·3 per person-year in the intervention arm v. 3·4 per person-year in the control arm. The control group had a significantly increased likelihood of stunting (hazard ratio (HR) 1·7, 95 % CI 1·46, 2·05, P < 0·001) v. the intervention group. Locally produced ready-to-use supplementary food is an effective intervention for reducing stunting in children below 2 years of age and can be provided as part of a malnutrition prevention package to address the alarming rates of stunting in Pakistan.
This study aimed to evaluate the early introduction of ultra-processed foods (UPF) and identify its association with overweight and anaemia in Brazilian children living in situations of social vulnerability. A population-based cross-sectional study was conducted in a Brazilian capital. Children aged 12–59 months were included. The presence of overweight and anaemia was evaluated, as was the introduction of twelve different UPF in the children’s first year of life. Association analysis was performed using Poisson regression with robust variance estimates. A total of 561 children were studied; 85·5 % had consumed at least one of the UPF evaluated in the first year of life, 19·1 % were overweight and 52·0 % were anaemic. Adjusted multivariate analyses identified that the early introduction of soft drinks (prevalence ratio (PR) = 1·18, 95 % CI (1·02, 1·38)), packaged snacks (PR = 1·17, 95 % CI (1·05, 1·30)) and powdered soft drinks (PR = 1·36, 95 % CI (1·16, 1·60)) increased the likelihood of children being overweight, and the early introduction of chocolate drinks (PR = 1·25, 95 % CI (1·02, 1·53)) increased the likelihood of them being anaemic, when comparing children who consumed these UPF before 1 year of age with those who first consumed them at 12 months of age or older. These results show a relationship between the early introduction of UPF and both overweight and anaemia. Public health policies to combat malnutrition therefore need to be intensified, promoting proper and healthy eating, especially during the phase of food introduction and among populations living in socially vulnerable situations.
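The analysis described here (Poisson regression with robust variance estimates, yielding prevalence ratios for a binary outcome) could be sketched as follows; the file, variable names and confounder set are assumptions for illustration.

```python
# Prevalence ratios via Poisson regression with robust (sandwich) variance,
# as is standard for binary outcomes in cross-sectional data. Names assumed.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("children_upf.csv")  # hypothetical: one row per child

fit = smf.glm(
    "overweight ~ early_soft_drink + age_months + sex + income",
    data=df,
    family=sm.families.Poisson(),
).fit(cov_type="HC0")  # robust variance estimate

pr = np.exp(fit.params["early_soft_drink"])           # prevalence ratio
lo, hi = np.exp(fit.conf_int().loc["early_soft_drink"])
print(f"PR = {pr:.2f}, 95 % CI ({lo:.2f}, {hi:.2f})")
```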
Metabolic-associated fatty liver disease (MAFLD) was proposed in 2020 to replace the term non-alcoholic fatty liver disease (NAFLD). The association between micronutrients and MAFLD has not been reported. Therefore, this study aimed to explore the association between micronutrient intake and MAFLD. This was a cross-sectional study based on the National Health and Nutrition Examination Survey (NHANES). The dietary intake of copper, zinc, iron and selenium was evaluated using the 24-h dietary recall interview. Logistic regression analysis was used to explore the association between micronutrients and MAFLD, and the results are shown as odds ratios (OR) with 95 % confidence intervals (CI). A total of 5976 participants were included in the analysis, with 3437 in the MAFLD group. After adjusting for potential confounders, copper intake in quartiles Q3 (OR = 0⋅68, 95 % CI 0⋅50, 0⋅93) and Q4 (OR = 0⋅60, 95 % CI 0⋅45, 0⋅80) was associated with lower odds of MAFLD. Iron intake in Q2 (OR = 0⋅64, 95 % CI 0⋅45, 0⋅92) and Q3 (OR = 0⋅61, 95 % CI 0⋅41, 0⋅91) was associated with lower odds of MAFLD. These findings indicate that higher intake of copper and adequate intake of iron were associated with lower odds of MAFLD, which may provide guidance for the management of MAFLD.
Research on the link between diet and multimorbidity is scarce, despite significant studies investigating the relationship between diet and individual chronic conditions. This study examines the association of dietary intake of macro- and micronutrients with multimorbidity in Cyprus's adult population. It was conducted as a cross-sectional study, with data collected using a standardised questionnaire between May 2018 and June 2019. The questionnaire included sociodemographic information, anthropometrics, medical history, dietary habits, sleep quality, smoking habits, and physical activity. The participants were selected using a stratified sampling method from adults residing in the five government-controlled municipalities of the Republic of Cyprus. The study included 1137 adults with a mean age of 40⋅8 years, of whom 26 % had multimorbidity. Individuals with multimorbidity consumed higher levels of sodium (P = 0⋅009) and vitamin A (P = 0⋅010) compared to those without multimorbidity. Additionally, higher fibre and sodium intake were also observed in individuals with at least one chronic disease of the circulatory system or endocrine system, compared to those with no chronic diseases in these systems (P < 0⋅05). Logistic regression models revealed that individuals with ≥2 chronic diseases compared to 0 or 1 chronic disease had higher fat intake (OR = 1⋅06, 95 % CI: 1⋅02, 1⋅10), higher iron intake (OR = 1⋅05, 95 % CI: 1⋅01, 1⋅09), lower mono-unsaturated fat intake (OR = 0⋅91, 95 % CI: 0⋅86, 0⋅96), and lower zinc intake (OR = 0⋅98, 95 % CI: 0⋅96, 0⋅99). Future research should replicate these results to further explore the intricate relationships between nutrient intake and multimorbidity. Our study's findings suggest that specific dietary components may contribute to preventing and managing multimorbidity.
During industrial processing, heat treatments applied to infant formulas may affect protein digestion. Recently, innovative processing routes have been developed to produce minimally heat-processed infant formula. Our objective was to compare the in vivo protein digestion kinetics and protein quality of a minimally processed (T−) and a heat-treated (T+++) infant formula. Sixty-eight male Wistar rats (21 d old) were fed either a diet containing 40 % T− (n 30) or T+++ (n 30), or a milk protein control diet (n 8), for 2 weeks. T− and T+++ rats were then sequentially euthanised 0, 1, 2, 3 or 6 h (n 6/time point) after ingestion of a meal containing their experimental diet. Control rats were euthanised 6 h after ingestion of a protein-free meal to determine endogenous nitrogen and amino acid losses. Nitrogen and amino acid true caecal digestibility was high for both the T− and T+++ diets (> 90 %), but a tendency towards higher nitrogen digestibility was observed for the T− diet (96·6 ± 3·1 %) compared with the T+++ diet (91·9 ± 5·4 %, P = 0·0891). This slightly higher digestibility led to a greater increase in total amino acid concentration in plasma after ingestion of the T− diet (P = 0·0010). Protein quality was comparable between the two infant formulas, with a digestible indispensable amino acid score of 0·8. In conclusion, this study showed that minimal processing routes used to produce native infant formula do not modify protein quality but tend to enhance true nitrogen digestibility and increase postprandial plasma amino acid kinetics in rats.
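As a worked illustration of how true digestibility is typically derived once endogenous losses are known from the protein-free control meal, consider the standard formula below; the numbers are invented, not the study's data.

```python
# Standard true-digestibility calculation: endogenous nitrogen (estimated from
# rats fed a protein-free meal) is subtracted from caecal nitrogen before
# computing the digested fraction. Example values are illustrative only.

def true_digestibility(n_intake, n_caecal, n_endogenous):
    """True digestibility (%) = (intake - (caecal - endogenous)) / intake * 100."""
    return (n_intake - (n_caecal - n_endogenous)) / n_intake * 100

# e.g. 200 mg N ingested, 15 mg N recovered in the caecum, 8 mg endogenous N
print(true_digestibility(200, 15, 8))  # -> 96.5 (%)
```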
The Türkiye–Syria earthquake struck eleven provinces directly in Türkiye on 6 February 2023. Emergency nutrition care is indispensable for sustaining the lives of victims and rescue personnel. To optimally support their well-being, emergency food must be both healthy (i.e. aligned with dietary guidelines) and safe. However, globally, there is a dearth of research on the emergency nutrition conditions in shelters in the immediate aftermath of natural disasters. This lack of scientific evidence could limit the extent to which nutritional gaps can be identified and remedied for future relief efforts. Therefore, the aim of this research was to evaluate the nutrition environment and nutritional quality of emergency meals distributed to survivors in Malatya, a heavily affected province in Türkiye. The rapid assessment was conducted in thirteen locations by using an embedded case–study design to evaluate the nutrition environment both quantitatively and qualitatively. Meals served to earthquake victims and volunteers were found to be insufficient in protein, fat, fibre, vitamin C, Ca and Fe, but Na levels were higher than the maximum threshold in many of the centres. The qualitative analysis illustrated insufficiency in three domains of the emergency food and nutrition environment: foods and beverages offered, cooking/food preparation and food safety and dining areas and other facilities. Given the major nutritional gaps identified in this study, future disaster preparations should implement emergency nutrition plans that ensure healthy, nutritious and safe food for survivors. Better coordination and use of technology are necessary for interventions to prevent malnutrition.
Objective:
This systematic review aimed to investigate the association between dietary inflammatory potential and liver cancer, to provide evidence for dietary health education.
Design:
Systematic review and meta-analysis.
Setting:
A comprehensive literature search was conducted in the PubMed, EMBASE, Cochrane and Web of Science databases to identify case–control or cohort studies involving the dietary inflammation index (DII)/empirical dietary inflammation pattern (EDIP) and liver cancer. Using combinations of DII/EDIP and liver cancer as search terms, the associations between DII/EDIP and liver cancer were then assessed.
Participants:
Three case–control studies and two cohort studies were included in the meta-analysis, comprising 225 713 enrolled participants.
Results:
Meta-analysis of categorical variables showed that the highest category of DII/EDIP increased the risk of liver cancer compared with the lowest category (relative risk (RR) = 2·35; 95 % CI 1·77, 3·13; P = 0·000), with low heterogeneity across studies (I² = 40·8 %, P = 0·119). Meta-analysis of continuous variables showed a significant positive association between liver cancer and DII/EDIP scores (RR = 1·24; 95 % CI 1·09, 1·40; P = 0·001), with no heterogeneity (I² = 0·0 %, P = 0·471). Stratified by study design, there was a significant positive association between liver cancer and DII/EDIP scores in both cohort studies (RR = 2·16; 95 % CI 1·51, 3·07; P = 0·000) and case–control studies (RR = 2·75; 95 % CI 1·71, 4·41; P = 0·000); an inverse-variance pooling sketch follows this abstract.
Conclusion:
The higher the DII/EDIP score, the higher the risk of liver cancer. This finding may have prominent implications for the general population.
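As referenced in the Results above, here is a minimal sketch of fixed-effect inverse-variance pooling of log relative risks, with Cochran's Q and I² reported alongside the pooled estimate. The three (RR, 95 % CI) inputs are placeholders, not the included studies' results; published meta-analyses typically use dedicated software (e.g. Stata or R's metafor) rather than hand-rolled code like this.

```python
# Fixed-effect inverse-variance pooling of log relative risks with an I^2
# heterogeneity statistic. Input studies are illustrative placeholders.
import math

def pool_rr(studies):
    """studies: iterable of (rr, ci_lower, ci_upper) tuples."""
    logs, weights = [], []
    for rr, lo, hi in studies:
        se = (math.log(hi) - math.log(lo)) / (2 * 1.96)  # SE back-calculated from CI
        logs.append(math.log(rr))
        weights.append(1 / se ** 2)                      # inverse-variance weight
    pooled = sum(w * l for w, l in zip(weights, logs)) / sum(weights)
    q = sum(w * (l - pooled) ** 2 for w, l in zip(weights, logs))  # Cochran's Q
    i2 = max(0.0, (q - (len(logs) - 1)) / q) * 100 if q > 0 else 0.0
    se_pooled = math.sqrt(1 / sum(weights))
    ci = (math.exp(pooled - 1.96 * se_pooled), math.exp(pooled + 1.96 * se_pooled))
    return math.exp(pooled), ci, i2

rr, ci, i2 = pool_rr([(2.1, 1.4, 3.2), (1.8, 1.1, 2.9), (2.6, 1.5, 4.5)])
print(f"pooled RR = {rr:.2f}, 95 % CI ({ci[0]:.2f}, {ci[1]:.2f}), I2 = {i2:.0f} %")
```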