The Keyhole is an internationally recognised front-of-pack nutrition label, guiding consumers to healthier food options. It identifies products that meet specific criteria for dietary fats, sugars, fibres, salt and wholegrains. The objective of this study was to simulate the potential impact of the Keyhole on adolescents’ energy and nutrient intakes by modelling a shift from reported food intakes to foods meeting the Keyhole criteria.
Design:
Self-reported dietary intake data were derived from a cross-sectional survey. Multiple replacement scenarios were calculated, in which foods meeting the Keyhole criteria replaced reported non-compliant foods at varying replacement proportions.
Setting:
Dietary survey ‘Riksmaten Adolescents 2016–2017’ in schools across Sweden.
Participants:
A nationally representative sample of 3099 adolescents in school years 5, 8 and 11 (55 % girls).
Results:
Overall, replacement with foods meeting the Keyhole criteria led to more adolescents meeting nutrition recommendations. The largest median intake improvements were seen for wholegrains (+196 %), SFA (−13 %), PUFA (+17 %) and fibres (+15 %). The smallest improvements were seen for free sugars (−3 %) and salt (−2 %), partly explained by the ineligibility of the main food sources of free sugars for the Keyhole and the non-inclusion of ready meals, which are often high in salt. Most micronutrient intakes were stable or improved. Unintended effects included decreases in vitamin A, MUFA and energy intakes. The largest potential improvements in fat and fibre sources were observed in the youngest age group.
Conclusions:
A shift to Keyhole alternatives for everyday foods would improve adolescents’ nutrient intakes, even with smaller exchanges.
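To make the replacement modelling concrete, the sketch below recomputes nutrient totals after substituting a chosen proportion of each non-compliant food with a Keyhole-compliant counterpart, as in the scenarios above. It is a minimal illustration in Python: the food names, nutrient values and data layout are hypothetical assumptions, not the authors’ data or code.

```python
import pandas as pd

# Hypothetical reported intakes: one row per food, with total nutrient
# amounts for the consumed weight and a Keyhole-compliance flag.
intakes = pd.DataFrame({
    "food":      ["white bread", "wholegrain bread", "sweetened yogurt"],
    "grams":     [120.0, 50.0, 150.0],
    "sfa_g":     [1.2, 0.4, 3.0],
    "fibre_g":   [3.1, 3.0, 0.2],
    "compliant": [False, True, False],
})

# Hypothetical Keyhole-compliant counterparts, nutrient content per 100 g.
substitutes = {
    "white bread":      {"sfa_g": 0.8, "fibre_g": 6.0},
    "sweetened yogurt": {"sfa_g": 1.5, "fibre_g": 0.2},
}

def replace_scenario(intakes, substitutes, proportion):
    """Replace `proportion` (by weight) of each non-compliant food with
    its compliant counterpart and return total nutrient intake."""
    totals = {"sfa_g": 0.0, "fibre_g": 0.0}
    for _, row in intakes.iterrows():
        replaceable = (not row["compliant"]) and row["food"] in substitutes
        kept = 1.0 - proportion if replaceable else 1.0
        for nutrient in totals:
            totals[nutrient] += kept * row[nutrient]
            if replaceable:
                # The replaced share takes the counterpart's nutrient
                # content, scaled from per-100 g to the consumed weight.
                per_g = substitutes[row["food"]][nutrient] / 100.0
                totals[nutrient] += proportion * per_g * row["grams"]
    return totals

for p in (0.25, 0.5, 1.0):  # varying replacement proportions
    print(p, replace_scenario(intakes, substitutes, p))
```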
This study systematically reviewed the evidence on interventions seeking to improve functional, interactive and critical Food and Nutrition Literacy (FNLIT) skills in primary school-aged children. Electronic databases, including PubMed/MEDLINE, SCOPUS, Web of Science, Cochrane, ProQuest and Google Scholar, were systematically searched. Randomised and non-randomised controlled trials, pre-/post-test and case–control designs were included. The primary outcomes were the three levels of FNLIT: functional, interactive and critical. All citations, full-text articles and abstract data were screened by two independent reviewers, and any conflicts were resolved through discussion. The quality of the included studies was individually evaluated using the Effective Public Health Practice Project (EPHPP) quality assessment tool. Two reviewers extracted data from the included studies, and a descriptive analysis was performed. The quality of all eligible studies (n 19) was rated as moderate/weak. A wide variety of skill-building activities were introduced by the programmes, including recipe skills/food preparation, food label literacy, food tasting, gardening and harvesting, and supporting cultural practices and ethnic foods. Only four studies measured food literacy (FL) (food label literacy) via a valid measure. Most interventions focused on the functional level of FL, except for two programmes (one rated weak and one rated moderate). In most of the studies (n 15), delivery of intervention content was facilitated by teachers. Promising interventions were tailored to the needs and interests of students, incorporated into the existing curriculum and facilitated by teachers. The successful intervention strategies led to improvements in functional skills and, in part, in interactive and critical skills. Future interventions should focus holistically on all aspects of FNLIT, especially interactive and critical skills.
Few studies have examined the association between coffee consumption and muscle mass; their results are conflicting. Therefore, we examined the association between coffee consumption and low muscle mass prevalence. We also performed an exploratory investigation of the potential effect modification by demographic, health status-related and physical activity-related covariates. This cross-sectional study included 2085 adults aged 40–87 years. The frequency of coffee consumption was assessed using a self-administered questionnaire. Muscle mass was assessed as appendicular skeletal muscle mass/height² using a multifrequency bioelectrical impedance analyser. We defined low muscle mass using cut-offs recommended by the Asian Working Group for Sarcopenia. Multivariable-adjusted OR for low muscle mass prevalence were estimated using a logistic regression model. The prevalence of low muscle mass was 5·4 % (n 113). Compared with the lowest coffee consumption group (< 1 cup/week), the multivariable-adjusted OR (95 % CI) of low muscle mass prevalence were 0·62 (0·30, 1·29) for 1–3 cups/week, 0·53 (0·29, 0·96) for 4–6 cups/week or 1 cup/d and 0·28 (0·15, 0·53) for ≥ 2 cups/d (P for trend < 0·001). There were no significant interactions among the various covariates after Bonferroni correction. In conclusion, coffee consumption may be inversely associated with low muscle mass prevalence.
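A minimal sketch of the kind of analysis reported above: multivariable-adjusted OR by coffee-consumption category, plus a trend test across the ordinal categories, fitted with statsmodels on simulated data. The variable names, confounder set and effect sizes are illustrative assumptions, not the study’s data.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 2085
# Hypothetical analysis dataset: coffee category (0 = <1 cup/week ...
# 3 = >=2 cups/d), plus two example confounders.
df = pd.DataFrame({
    "coffee_cat": rng.integers(0, 4, n),
    "age": rng.uniform(40, 87, n),
    "sex": rng.integers(0, 2, n),
})
logit_p = -3.0 - 0.4 * df["coffee_cat"] + 0.02 * (df["age"] - 60)
df["low_muscle_mass"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

# Category-wise OR: coffee category as categorical, lowest group as reference.
m_cat = smf.logit("low_muscle_mass ~ C(coffee_cat) + age + sex", data=df).fit(disp=0)
print(np.exp(m_cat.params))      # multivariable-adjusted OR
print(np.exp(m_cat.conf_int()))  # 95 % CI

# P for trend: enter the ordinal category score as a continuous term.
m_trend = smf.logit("low_muscle_mass ~ coffee_cat + age + sex", data=df).fit(disp=0)
print(m_trend.pvalues["coffee_cat"])
```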
The main objective was to investigate the association of household food insecurity (HFI) with child oral health. A secondary objective was to explore potential dietary and non-dietary mediators of the HFI–child oral health relationship.
Design:
Cross-sectional data from the nationally representative Ecuadorian National Health and Nutrition Survey (2018) were analysed. The data included self-reported child oral health, HFI (Food Insecurity Experience Scale), diet (FFQ) and oral care behaviours (toothbrushing frequency, toothpaste use). The association of HFI with the reported number of oral health problems was examined with stereotype logistic regression. Parallel mediation analysis was used to explore potential dietary (highly fermentable carbohydrate foods, plain water) and non-dietary (toothbrushing) mediators of the HFI–oral health relationship. Bias-corrected standard errors and 95 % CI were obtained using non-parametric bootstrapping (10 000 repetitions). Effect size was measured by percent mediation (PM).
Setting:
Ecuador.
Participants:
5–17-year-old children (n 23 261).
Results:
HFI affected 23 % of children’s households, and 38·5 % of children had at least one oral health problem. HFI was associated with a greater number of oral health problems: 1–2 problems (adjusted odds ratio (AOR) = 1·37; 95 % CI (1·15, 1·58); P = 0·0001), 3–4 problems (AOR = 2·21; 95 % CI (1·98, 2·44); P = 0·0001) and 5–6 problems (AOR = 2·57; 95 % CI (2·27, 2·88); P = 0·0001). The HFI–oral health relationship was partially mediated by highly fermentable carbohydrate foods (PM = 4·3 %), plain water (PM = 1·8 %) and toothbrushing frequency (PM = 3·3 %).
Conclusions:
HFI was associated with poorer child oral health. The HFI–oral health relationship was partially mediated by dietary and non-dietary factors. Longitudinal studies are needed to replicate our findings and investigate the role of other potential mediators.
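The parallel mediation logic above can be illustrated with a single mediator. The sketch below estimates percent mediation (indirect effect as a share of the total effect) with a non-parametric bootstrap, as in the study, but uses simple linear models on simulated data rather than the stereotype logistic regression the authors applied; all variables and effect sizes are hypothetical.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 1000
# Hypothetical data: HFI exposure, toothbrushing frequency as one mediator
# and an oral-health problem count treated as continuous for simplicity.
hfi = rng.binomial(1, 0.23, n)
brushing = 2.0 - 0.3 * hfi + rng.normal(0, 0.5, n)
problems = 1.0 + 0.6 * hfi - 0.2 * brushing + rng.normal(0, 1.0, n)
df = pd.DataFrame({"hfi": hfi, "brushing": brushing, "problems": problems})

def percent_mediation(d):
    a = smf.ols("brushing ~ hfi", data=d).fit().params["hfi"]  # exposure -> mediator
    m = smf.ols("problems ~ hfi + brushing", data=d).fit()
    b = m.params["brushing"]                                   # mediator -> outcome
    direct = m.params["hfi"]
    indirect = a * b
    return 100.0 * indirect / (indirect + direct)              # PM in %

# Non-parametric bootstrap percentile CI (the study used 10 000 repetitions;
# fewer here to keep the sketch fast).
boots = [percent_mediation(df.sample(n, replace=True)) for _ in range(1000)]
print(percent_mediation(df), np.percentile(boots, [2.5, 97.5]))
```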
Ingestion of mycoprotein stimulates skeletal muscle protein synthesis (MPS) rates to a greater extent than concentrated milk protein when matched for leucine content, potentially attributable to the wholefood nature of mycoprotein. We hypothesised that bolus ingestion of mycoprotein as part of its wholefood matrix would stimulate MPS rates to a greater extent compared with a leucine-matched bolus of protein concentrated from mycoprotein. Twenty-four healthy young (age, 21 ± 2 years; BMI, 24 ± 3 kg·m⁻²) males received primed, continuous infusions of L-[ring-²H₅]phenylalanine and completed a bout of unilateral resistance leg exercise before ingesting either 70 g mycoprotein (MYC; 31·4 g protein, 2·5 g leucine; n 12) or 38·2 g of a protein concentrate obtained from mycoprotein (PCM; 28·0 g protein, 2·5 g leucine; n 12). Blood and muscle samples (vastus lateralis) were taken pre- and (4 h) post-exercise/protein ingestion to assess postabsorptive and postprandial myofibrillar protein fractional synthetic rates (FSR) in resting and exercised muscle. Protein ingestion increased plasma essential amino acid and leucine concentrations (P < 0·0001), but more rapidly (both 60 v. 90 min; P < 0·0001) and to greater magnitudes (1367 v. 1346 μmol·l⁻¹ and 298 v. 283 μmol·l⁻¹, respectively; P < 0·0001) in PCM compared with MYC. Protein ingestion increased myofibrillar FSR (P < 0·0001) in both rested (MYC, Δ0·031 ± 0·007 %·h⁻¹ and PCM, Δ0·020 ± 0·008 %·h⁻¹) and exercised (MYC, Δ0·057 ± 0·011 %·h⁻¹ and PCM, Δ0·058 ± 0·012 %·h⁻¹) muscle, with no differences between conditions (P > 0·05). Mycoprotein ingestion results in equivalent postprandial stimulation of resting and post-exercise myofibrillar protein synthesis rates irrespective of whether it is consumed within or without its wholefood matrix.
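For readers unfamiliar with FSR, the value is conventionally obtained from the precursor-product equation; the sketch below shows that standard calculation with purely illustrative enrichment values, and is not the authors’ analysis pipeline.

```python
def myofibrillar_fsr(e_bound_pre, e_bound_post, e_precursor, hours):
    """Fractional synthetic rate (%/h) from the precursor-product equation:
    FSR = (change in protein-bound tracer enrichment) /
          (precursor enrichment x incorporation time) x 100.
    Enrichments are assumed to be in mole percent excess (MPE)."""
    return (e_bound_post - e_bound_pre) / (e_precursor * hours) * 100.0

# Illustrative numbers only, not study data: a 0.015 MPE rise in bound
# enrichment over 4 h with a 7.5 MPE precursor pool gives ~0.05 %/h.
print(myofibrillar_fsr(0.010, 0.025, 7.5, 4.0))
```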
The ongoing nutrition transition in lower- and middle-income countries (LMIC) in South East Asia may have a positive impact on protein nutrition. This study assessed the diversity of plant and animal protein food sources in relation to essential amino acid (EAA) adequacy in a population-based sample (n 1665) in Indonesia. In-person 24 h dietary recalls provided data on energy intakes and on protein intakes (in g/d) from plants (grains, legumes), from meat, poultry and fish, and from eggs and dairy. Protein diversity scores were based on the number of protein food sources over 24 h. EAA scores were the ratio of amino acid intakes to recommended values. Protein diversity and EAA scores were then compared across multiple socio-demographic indices. Analysis of variance and χ² tests were used to test for differences among groups. Energy intakes were 1678 kcal/d for men and 1435 kcal/d for women. Average protein intakes (and prevalence of inadequacy) were 59·4 g/d (41·7 %) for men and 51·5 g/d (51·1 %) for women. In regression analyses, higher protein diversity scores were associated with higher protein intakes, more animal protein and less plant protein, and with higher EAA scores. Lower protein diversity scores were associated with lower intakes of lysine, leucine and valine relative to requirements, as well as with lower EAA scores, rural settings, less wealth and less modernisation. Greater diversity of animal protein food sources, observed among groups of higher socio-economic status, was linked to better amino acid adequacy and protein nutrition.
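The two scores defined above are straightforward to compute. The sketch below shows one hedged interpretation: the intake and reference values are illustrative only (actual recommended EAA values depend on body weight and the reference used), and the food-group list is an assumption about how sources were grouped.

```python
# Hypothetical 24 h recall totals for one respondent (mg/d of each EAA).
eaa_intake = {"lysine": 2400.0, "leucine": 3900.0, "valine": 2100.0}

# Illustrative recommended values (mg/d for a reference adult).
eaa_reference = {"lysine": 2100.0, "leucine": 2730.0, "valine": 1820.0}

# EAA score: ratio of intake to the recommended value, per amino acid.
eaa_scores = {aa: eaa_intake[aa] / eaa_reference[aa] for aa in eaa_intake}

# Protein diversity score: number of distinct protein food source groups
# reported over 24 h (group labels assumed for illustration).
PROTEIN_GROUPS = {"grains", "legumes", "meat", "poultry", "fish", "eggs", "dairy"}
reported = {"grains", "fish", "eggs"}
diversity_score = len(reported & PROTEIN_GROUPS)

print(eaa_scores, diversity_score)
```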
Traditional methods of dietary assessment are prone to measurement error, with energy intake often under-reported. The 24-h recall is widely used in dietary assessment; however, its reliance on self-report without verification of consumption can result in inaccuracies in estimated nutrient intake. Wearable cameras may provide a complementary approach to improve self-report accuracy by providing an objective and passive measure of food consumption. The purpose of the present study was to determine whether a wearable camera improves the accuracy of a 24-h recall compared with a 24-h recall alone in twenty adults aged 18–65 years. The study also explored limitations associated with wearable cameras. Participants wore the camera for 1 d, and a 24-h recall was conducted the following day, before and after viewing the camera images. Dietary data were analysed using Nutritics dietary analysis software, while eating habits were assessed by a self-report questionnaire. Energy and nutrient intakes were compared between the recall alone and the camera-assisted recall. Results showed a significant increase in mean energy intake with the camera-assisted recall compared with the recall alone (9677·8 ± 2708·0 kJ/d v. 9304·6 ± 2588·5 kJ/d, respectively; P = 0·003). Intakes of carbohydrates, total sugars and saturated fats were also significantly higher with the camera-assisted recall. In terms of challenges, there were occasional technological issues, such as participants positioning the camera incorrectly. In conclusion, reporting of energy and nutrient intake may be enhanced when a traditional method of dietary assessment, the 24-h recall, is assisted by a wearable camera.
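The core comparison here is a paired one: each participant contributes an energy-intake estimate from the recall alone and from the camera-assisted recall of the same day. A minimal sketch on simulated data follows; the abstract does not name the statistical test, so a paired t-test is assumed, and the numbers are made up.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
# Hypothetical paired energy intakes (kJ/d) for 20 adults.
recall_alone = rng.normal(9300, 2600, 20)
# Camera-assisted recall tends to recover forgotten items, shifting intake up.
camera_assisted = recall_alone + rng.normal(370, 500, 20)

t, p = stats.ttest_rel(camera_assisted, recall_alone)
print(camera_assisted.mean() - recall_alone.mean(), p)
```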
Plant-based diets may increase the risk of vitamin B12 deficiency due to limited intake of animal-source foods, while dietary folate intake tends to increase when adhering to plant-based diets. In this cross-sectional study, we evaluated the B12 and folate status of Norwegian vegans and vegetarians using dietary B12 intake, B12 and folic acid supplement use, and biomarkers (serum B12 (B12), plasma total homocysteine (tHcy), plasma methylmalonic acid (MMA) and serum folate). Vegans (n 115) and vegetarians (n 90) completed a 24-h dietary recall and a FFQ and provided a non-fasting blood sample. cB12, a combined indicator for evaluation of B12 status, was calculated. B12 status was adequate in both vegans and vegetarians according to the cB12 indicator; however, 4 % had elevated B12. Serum B12, tHcy and MMA concentrations and the cB12 indicator (overall medians: 357 pmol/l, 9·0 µmol/l, 0·18 µmol/l and 1·30, respectively) did not differ between vegans and vegetarians, unlike folate (vegans: 25·8 nmol/l; vegetarians: 21·6 nmol/l; P = 0·027). Serum B12 concentration < 221 pmol/l was found in 14 % of all participants. Vegetarians had the highest proportion of participants below the recommended daily intake of 2 µg/d including supplements (40 v. 18 %, P < 0·001). Predictors of higher serum B12 concentrations were average daily supplement use and older age. Folate deficiency (< 10 nmol/l) was uncommon overall (< 2·5 %). The combined indicator cB12 suggested that none of the participants was B12-depleted; however, low serum B12 concentrations were found in 14 % of the participants. Folate concentrations were adequate, indicating adequate folate intake in Norwegian vegans and vegetarians.
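A small sketch of how the prevalence of low status can be classified against the cut-offs used above (< 221 pmol/l for serum B12, < 10 nmol/l for folate); the biomarker distributions are simulated for illustration, not study data, and the cB12 indicator itself is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(3)
# Hypothetical biomarker values for 205 participants, loosely centred on
# the reported overall medians (357 pmol/l B12, ~24 nmol/l folate).
serum_b12 = rng.lognormal(np.log(357), 0.35, 205)
serum_folate = rng.lognormal(np.log(24), 0.4, 205)

# Proportions below the study's cut-offs.
print("low B12 (<221 pmol/l):", round(np.mean(serum_b12 < 221) * 100, 1), "%")
print("folate deficiency (<10 nmol/l):", round(np.mean(serum_folate < 10) * 100, 1), "%")
```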
Unhealthy dietary habits can contribute to the development of colorectal cancer (CRC). Such habits may also be associated with post-treatment symptoms experienced by CRC survivors. Therefore, we aimed to assess longitudinal associations of post-treatment unhealthy dietary habits, i.e. intake of ultra-processed foods (UPF), red and processed meat, alcohol and sugar-sweetened drinks, with health-related quality of life (HRQoL), fatigue and chemotherapy-induced peripheral neuropathy (CIPN) in CRC survivors from 6 weeks up to 24 months post-treatment. In a prospective cohort of stage I–III CRC survivors (n 396), five repeated home visits from diagnosis up to 24 months post-treatment were conducted. Dietary intake was measured by 7-d dietary records to quantify consumption of UPF, red and processed meat, alcohol and sugar-sweetened drinks. HRQoL, fatigue and CIPN were measured by validated questionnaires. We applied confounder-adjusted linear mixed models to analyse longitudinal associations from 6 weeks until 24 months post-treatment, and a post hoc time-lag analysis for alcohol to explore directionality. Results showed that higher post-treatment intake of UPF and sugar-sweetened drinks was longitudinally associated with worsened HRQoL and more fatigue, while higher intake of UPF and processed meat was associated with increased CIPN symptoms. In contrast, post-treatment increases in alcohol intake were longitudinally associated with better HRQoL and less fatigue; however, the time-lag analysis attenuated these associations. In conclusion, unhealthy dietary habits are longitudinally associated with lower HRQoL and more symptoms, except for alcohol. Results from the time-lag analysis suggest no biological effect of alcohol; hence, the longitudinal association for alcohol should be interpreted with caution.
In Ethiopia, information is limited about energy and micronutrient intakes from complementary foods consumed by children in Productive Safety Net Program districts. Therefore, we assessed feeding practices and intakes of energy and selected micronutrients from complementary foods of children aged 6–23 months in a food-insecure rural area of Ethiopia. Data were collected using a structured questionnaire, and energy and micronutrient intakes were estimated from multiple-pass 24 h recalls. Only 1·9 % of children aged 6–8 months met the recommended minimum dietary diversity of ≥5 food groups; this value increased slightly to 4 % and 10·1 % in the older age groups (9–11 months and 12–23 months, respectively). Strikingly, none of the children aged 9–11 months received the minimum acceptable diet (the proportions receiving it were 4 % and 2·6 % among those aged 6–8 months and 12–23 months, respectively). The overall prevalence of stunting was 34 % in younger children (6–8 months) and 51 % in older children aged 12–23 months. Median energy and selected micronutrient intakes from complementary foods were below the corresponding WHO recommendations, assuming average breast-milk amount and composition. The worst shortfalls were for vitamins A and C and for Ca. In contrast, median iron, protein and niacin intakes and densities were above the WHO recommendations. Caretakers and community leaders in the study setting need nutrition education on infant and young child feeding (IYCF) practices and on the importance of men's involvement in IYCF. Ensuring the accessibility and affordability of animal source foods (ASFs), fruits and vegetables, and feasible complementary foods is critical to improving the quality of complementary feeding. This can be achieved through promoting nutrition-sensitive agriculture, such as poultry and home gardening, in this setting.
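The minimum dietary diversity indicator referenced above is a simple count. The sketch below shows the logic under the assumption of the WHO eight-food-group definition with the ≥5-group threshold stated in the abstract; group labels are illustrative.

```python
# WHO IYCF indicator sketch: minimum dietary diversity = consumption of
# foods from at least 5 of 8 defined food groups on the previous day.
FOOD_GROUPS = [
    "breast milk", "grains/roots/tubers", "legumes/nuts", "dairy",
    "flesh foods", "eggs", "vitamin A-rich fruits/vegetables",
    "other fruits/vegetables",
]

def meets_minimum_dietary_diversity(consumed_groups, threshold=5):
    """True if the child consumed foods from >= `threshold` groups."""
    return len(set(consumed_groups) & set(FOOD_GROUPS)) >= threshold

# A child consuming only three groups does not meet the indicator.
print(meets_minimum_dietary_diversity(["grains/roots/tubers", "dairy", "eggs"]))
```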
As we continue to elucidate the mechanisms underlying age-related brain diseases, the reductionist strategy in nutrition–brain function research has focused on establishing the impact of individual foods. However, the biological processes connecting diet and cognition are complex. Therefore, consideration of a combination of nutritional compounds may be most efficacious. One barrier to establishing the efficacy of multi-nutrient interventions is that the area lacks an established set of evidence-based guidelines for studying their effect on brain health. This review is an output of the International Life Sciences Institute (ILSI) Europe. A multi-disciplinary expert group was assembled with the aim of developing a set of considerations to guide research into the effects of multi-nutrient combinations on brain functions. Consensus recommendations converged on six key issues that should be considered to advance research in this area: (1) establish working mechanisms of the combination and contributions of each individual compound; (2) validate the relevance of the mechanisms for the targeted human condition; (3) include current nutrient status, intake or dietary pattern as inclusion/exclusion criteria in the study design; (4) select a participant population that is clinically and biologically appropriate for all nutritional components of the combination; (5) consider a range of cognitive outcomes; (6) consider the limits of reductionism and the ‘gold standard’ randomised controlled trial. These guiding principles will enhance our understanding of the interactive/complementary activities of dietary components, thereby strengthening the evidence base for recommendations aimed at delaying cognitive decline.
It has been suggested that added sugar intake is associated with non-alcoholic fatty liver disease (NAFLD). However, previous studies have focused mainly on sugar-sweetened beverages; evidence for associations with total added sugars and their sources is scarce. This study aimed to examine the associations of total added sugars, their physical forms (liquid v. solid) and their food sources with risk of NAFLD among adults in Tianjin, China. We used data from 15 538 participants free of NAFLD, other liver diseases, CVD, cancer or diabetes at baseline (2013–2018). Added sugar intake was estimated from a validated 100-item FFQ. NAFLD was diagnosed by ultrasonography after exclusion of other causes of liver disease. Multivariable Cox proportional hazards models were fitted to calculate hazard ratios (HR) and corresponding 95 % CI for NAFLD risk with added sugar intake. During a median follow-up of 4·2 years, 3476 incident NAFLD cases were documented. After adjusting for age, sex, BMI and its change from baseline to follow-up, lifestyle factors, personal and family medical history and overall diet quality, the multivariable HR of NAFLD risk were 1·18 (95 % CI 1·06, 1·32) for total added sugars, 1·20 (95 % CI 1·08, 1·33) for liquid added sugars and 0·96 (95 % CI 0·86, 1·07) for solid added sugars when comparing the highest quartiles of intake with the lowest. In this prospective cohort of Chinese adults, higher intakes of total added sugars and liquid added sugars, but not solid added sugars, were associated with a higher risk of NAFLD.
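A hedged sketch of the survival analysis described above, using the lifelines package on simulated data: intake quartiles are entered categorically with the lowest quartile as reference, as in a highest-v.-lowest comparison. Column names, covariates and effect sizes are assumptions, not the study’s data.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(4)
n = 2000
# Hypothetical cohort: added-sugar intake quartile (1-4) plus two covariates.
df = pd.DataFrame({
    "sugar_q": rng.integers(1, 5, n),
    "age": rng.uniform(20, 70, n),
    "bmi": rng.normal(24, 3, n),
})
# Simulated event times with a hazard rising across quartiles, censored at 6 y.
hazard = 0.05 * np.exp(0.08 * (df["sugar_q"] - 1))
time = rng.exponential(1 / hazard)
df["time"] = np.minimum(time, 6.0)
df["nafld"] = (time < 6.0).astype(int)

cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="nafld",
        formula="C(sugar_q) + age + bmi")  # quartile 1 as reference
print(cph.summary[["exp(coef)", "exp(coef) lower 95%", "exp(coef) upper 95%"]])
```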
A global target of increasing exclusive breast-feeding (EBF) to at least 50 % by the year 2025 was set by the WHO for infants under 6 months. The lowest prevalence in the world was found in the Eastern Mediterranean region in 2010–18, and little is known about the status of mothers’ feeding practices in Saudi Arabia. The present study aimed to assess mothers’ actual feeding and weaning practices with their infants across mothers’ age groups. The study was conducted among 247 mothers of infants aged 4–12 months who were attending public well-baby clinics. Quantitative data were obtained by nutritionists using an electronic semi-structured questionnaire about mothers’ feeding practices. Only 5·3 % of mothers engaged in EBF, 44·9 % breast-fed their infants within an hour of birth, and 92·7 % of infants had ever been breast-fed. The average intended duration of continued breast-feeding was 4·9 (±3·1) months. Younger mothers introduced weaning foods around 4 weeks earlier than older mothers (mean difference −0·4; 95 % CI −0·71, −0·13; P = 0·031). A total of 64·3 % of infants received complementary feeding before completing 17 weeks. Maternal age group and delivery mode were the only factors associated with the early introduction of complementary feeding. Among reasons given for early introduction, 69·2 % of the mothers believed that ‘it is a good time’ and 61·1 % felt that ‘infants are hungry and need other sources of food’. Online sources and family advice were the top sources of information on mothers’ feeding practices. Provision of professional advice about EBF and optimal weaning practices is a significant area for improvement in terms of compliance with recommended infant feeding practices.
The relationship between a diet low in fibre and mortality has not been fully evaluated. This study aims to assess the burden of non-communicable chronic diseases (NCD) attributable to a diet low in fibre globally from 1990 to 2019.
Design:
All data were from the Global Burden of Disease (GBD) Study 2019, in which mortality, disability-adjusted life-years (DALY) and years lived with disability (YLD) were estimated with Bayesian geospatial regression using data at the global, regional and country levels acquired from an extensive systematic review.
Setting:
All data sourced from the GBD Study 2019.
Participants:
All age groups for both sexes.
Results:
The age-standardised mortality rates (ASMR) declined in most GBD regions; however, in Southern sub-Saharan Africa, the ASMR increased from 4·07 (95 % uncertainty interval (UI) (2·08, 6·34)) to 4·60 (95 % UI (2·59, 6·90)), and in Central sub-Saharan Africa, the ASMR increased from 7·46 (95 % UI (3·64, 11·90)) to 9·34 (95 % UI (4·69, 15·25)). Upward trends were observed in the age-standardised YLD rates attributable to a diet low in fibre in a number of GBD regions. The burden caused by diabetes mellitus increased in Central Asia, Southern sub-Saharan Africa and Eastern Europe.
Conclusions:
The burdens of disease attributable to a diet low in fibre in Southern sub-Saharan Africa and Central sub-Saharan Africa and the age-standardised YLD rates in a number of GBD regions increased from 1990 to 2019. Therefore, greater efforts are needed to reduce the disease burden caused by a diet low in fibre.
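For reference, an age-standardised rate is a directly standardised rate: age-specific rates weighted by a standard population’s age structure. A minimal sketch with illustrative numbers follows (GBD uses its own standard population and many more age groups than shown here).

```python
import numpy as np

# Illustrative age-specific mortality rates attributable to the risk factor
# (deaths per 100 000) and a made-up standard population age structure.
age_specific_rates = np.array([0.5, 1.2, 3.8, 9.1, 21.4])
standard_weights = np.array([0.30, 0.25, 0.20, 0.15, 0.10])  # sums to 1

# Direct standardisation: weighted sum of the age-specific rates.
asmr = np.sum(age_specific_rates * standard_weights)
print(f"ASMR: {asmr:.2f} per 100 000")
```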
The aim of this study was to investigate the effects of guarana supplementation on cognitive performance before and after a bout of maximal intensity cycling, and to compare this with an equivalent caffeine dose. Twenty-five participants completed the randomised double-blind crossover trial, performing cognitive tests with one of three supplements on three different days: guarana (125 mg/kg), caffeine (5 mg/kg) or placebo (65 mg/kg protein powder). After 30 min of rest, participants performed simple (SRT) and choice reaction time (CRT) tests, an immediate word recall test and the Bond–Lader mood scale. This was followed by a cycling V̇O2max test, after which the cognitive tests were immediately repeated. Guarana supplementation decreased CRT before exercise (407 (sd 45) ms) in comparison with placebo (421 (sd 46) ms, P = 0·030) but not caffeine (417 (sd 42) ms). SRT after exercise decreased following guarana supplementation (306 (sd 28) ms) in comparison with placebo (323 (sd 32) ms, P = 0·003) but not caffeine (315 (sd 32) ms). Intra-individual variability in CRT significantly improved from before (111·4 (sd 60·5) ms) to after exercise (81·85 (sd 43·1) ms) following guarana supplementation, with no differences observed for caffeine or placebo (P > 0·05). Alertness scores significantly improved following guarana supplementation (63·3 (sd 13·8)) in comparison with placebo (57·4 (sd 13·4), P = 0·014) but not caffeine (61·2 (sd 12·8)). There were no changes in V̇O2max, immediate word recall or any other Bond–Lader mood scale. Guarana supplementation appears to affect several parameters of cognition. These results tentatively support the use of guarana supplementation to maintain speed of attention immediately following a maximal intensity exercise test (V̇O2max).
Alternative plant-based meats have recently grown in popularity with consumers, and researchers are examining the potential health effects, or risks, of consuming these products. Because no studies to date have specifically assessed the effects of plant-based meats on biomarkers of inflammation, the purpose of this work was to conduct a secondary analysis of the Study With Appetizing Plantfood – Meat Eating Alternatives Trial (SWAP-MEAT). SWAP-MEAT was a randomised crossover trial in which generally healthy adults ate 2 or more servings of plant-based meats per day for 8 weeks (the Plant phase) followed by 2 or more servings of animal meats per day for 8 weeks (the Animal phase). Results of linear mixed-effects models indicated that only 4 of 92 biomarkers reached statistical significance. The results were contrary to our hypothesis: we had expected relative improvements in biomarkers of inflammation from the plant-based meats.
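A sketch of the modelling approach named above: a linear mixed-effects model with a random intercept per participant for one biomarker, plus a multiple-testing adjustment that would be natural with 92 outcomes (the abstract does not state which correction, if any, was applied). All data below are simulated and the variable names are assumptions.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.multitest import multipletests

rng = np.random.default_rng(5)
subjects = 36
# Hypothetical long-format data: one biomarker measured in each crossover
# phase for every participant.
df = pd.DataFrame({
    "subject": np.repeat(np.arange(subjects), 2),
    "phase": ["plant", "animal"] * subjects,
})
df["biomarker"] = 1.0 + 0.1 * (df["phase"] == "plant") + rng.normal(0, 0.5, len(df))

# Linear mixed-effects model with a random intercept per participant.
m = smf.mixedlm("biomarker ~ phase", df, groups=df["subject"]).fit()
print(m.pvalues["phase[T.plant]"])

# With 92 biomarkers, a false-discovery-rate correction is one option;
# the p-values here are simulated placeholders.
pvals = rng.uniform(0, 1, 92)
rejected, p_adj, _, _ = multipletests(pvals, alpha=0.05, method="fdr_bh")
print(rejected.sum())
```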
To explore the relationship between ultra-processed food (UPF) consumption and dietary, lifestyle and social determinants using path analysis at the baseline of the Cohort of Universities of Minas Gerais (CUME project).
Design:
Cross-sectional study, in which path analysis was used to estimate direct and indirect effects of dietary practices, sleep, time on the computer and professional status on UPF consumption.
Setting:
Data were collected in 2016, through an online questionnaire composed of sociodemographic, anthropometric, lifestyle and dietary practices questions, and a FFQ.
Participants:
Baseline participants from the CUME Project (n 2826), adults who graduated from Universidade Federal de Viçosa or Universidade Federal de Minas Gerais, Brazil.
Results:
Being employed (P = 0·024), time spent on the computer (P = 0·031) and the frequency of fried food intake (P < 0·001) were positively and directly associated with UPF consumption, whereas sleep duration (P = 0·007) and the number of meals per d (P < 0·001) were negatively and directly associated with UPF consumption. Indirect effects on UPF consumption were observed for being employed, mediated by sleep duration (P = 0·032) and fried food intake (P = 0·005), and for being a student, mediated by time spent on the computer (P = 0·048).
Conclusion:
Time spent on the computer, sleep duration and fried food consumption showed direct effects on UPF consumption and also acted as mediators of the relationship between professional status and UPF consumption. In addition, the number of meals eaten each day was directly associated with UPF consumption.
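Path analysis of this kind can be reproduced in Python with the semopy package (an assumption; the authors do not state their software). The sketch below fits a simplified version of the model on simulated data, with direct paths to UPF consumption and mediator equations for the indirect effects; all variable names and effect sizes are illustrative.

```python
import numpy as np
import pandas as pd
from semopy import Model  # SEM/path-analysis package for Python

rng = np.random.default_rng(6)
n = 2826
# Hypothetical standardised variables mirroring the model's structure.
employed = rng.binomial(1, 0.6, n).astype(float)
sleep = -0.1 * employed + rng.normal(0, 1, n)
computer = 0.1 * employed + rng.normal(0, 1, n)
fried = 0.1 * employed + rng.normal(0, 1, n)
upf = (0.1 * employed - 0.1 * sleep + 0.1 * computer
       + 0.2 * fried + rng.normal(0, 1, n))
df = pd.DataFrame({"employed": employed, "sleep": sleep,
                   "computer": computer, "fried": fried, "upf": upf})

# Path model: a direct path from professional status to UPF plus mediator
# equations, allowing direct and indirect effects to be decomposed.
desc = """
upf ~ employed + sleep + computer + fried
sleep ~ employed
computer ~ employed
fried ~ employed
"""
model = Model(desc)
model.fit(df)
print(model.inspect())  # path coefficients, SE and P-values
```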
The present study aimed to determine the prevalence of adiposity-based chronic disease (ABCD) and its association with anthropometric indices in the Mexican population. A cross-sectional study was conducted in 514 adults seen at a clinical research unit. The American Association of Clinical Endocrinologists/American College of Endocrinology (AACE/ACE) criteria were used to diagnose ABCD by first identifying subjects with BMI ≥ 25 kg/m² and those with BMI of 23–24·9 kg/m² and waist circumference ≥ 80 cm in women or ≥ 90 cm in men. The presence of metabolic and clinical complications associated with adiposity, such as factors related to metabolic syndrome, prediabetes, type 2 diabetes, dyslipidaemia and arterial hypertension, was subsequently evaluated. Anthropometric indices related to cardiometabolic risk factors were then determined. The results showed that the prevalence of ABCD was 87·4 % overall, 91·5 % in men and 86 % in women. The prevalence of ABCD stage 0 was 2·4 %, stage 1 was 33·7 % and stage 2 was 51·3 %. The prevalence of obesity according to BMI was 57·6 %. The waist/hip circumference index (prevalence ratio (PR) = 7·57; 95 % CI 1·52, 37·5) and the conicity index (PR = 3·46; 95 % CI 1·34, 8·93) were the better predictors of ABCD, while appendicular skeletal mass % and skeletal muscle mass % were associated with a lower prevalence of ABCD (PR = 0·93; 95 % CI 0·90, 0·96; and PR = 0·95; 95 % CI 0·93, 0·98, respectively). In conclusion, the prevalence of ABCD in our study was 87·4 %, and it increased with age. It is important to emphasise that one out of two subjects had severe obesity-related complications (ABCD stage 2).
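Prevalence ratios for a binary outcome in cross-sectional data are commonly estimated with Poisson regression and robust (sandwich) standard errors; the sketch below shows that general approach on simulated data. The abstract does not state the authors’ exact estimation method, so this is an assumption, and the variable names and effect sizes are illustrative.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(7)
n = 514
# Hypothetical data: standardised conicity index and an ABCD indicator
# with a high baseline prevalence, as reported in the study.
df = pd.DataFrame({"conicity": rng.normal(0, 1, n)})
df["abcd"] = rng.binomial(1, 1 / (1 + np.exp(-(1.8 + 0.5 * df["conicity"]))), n)

# Poisson regression with robust errors yields prevalence ratios directly.
m = smf.glm("abcd ~ conicity", data=df,
            family=sm.families.Poisson()).fit(cov_type="HC0")
print(np.exp(m.params["conicity"]))          # PR per SD of conicity
print(np.exp(m.conf_int().loc["conicity"]))  # 95 % CI
```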
We used data from the Campinas Health Survey (ISACamp 2014/15) and the Food Consumption and Nutritional Status Survey (ISACamp-Nutri 2015/16) to estimate the prevalence of consumption of foods and beverages containing low-calorie sweeteners (LCS) among individuals aged ≥ 10 years and the population’s dietary exposure to high levels of LCS. We first estimated the prevalence of consuming LCS-containing foods and beverages and identified the top sources of LCS consumption. We then verified whether the prevalence of consumption varied according to individual-level characteristics or the presence of obesity and diabetes. Finally, we estimated the population’s dietary exposure to high levels of LCS and compared it with the acceptable daily intake (ADI) levels. Over 40 % of the study population consumed at least one LCS-containing food or beverage. Across all age groups, education levels and income levels, the consumption of LCS-containing foods and beverages ranged from 35 % to 55 %. Sweetened beverages, tabletop sweeteners and dairy beverages were the top contributors to the consumption of LCS. The prevalence was only slightly greater among higher-income 40–59-year-olds than among other income groups and was not higher among individuals with obesity or diabetes. Although dietary exposure to LCS did not exceed the ADI levels, we identified several limitations in our ability to measure exposure to high levels of LCS. Because of these challenges and the unclear evidence linking LCS to better health outcomes, the consumption of LCS-containing foods and beverages should be closely monitored.
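Comparing exposure with an ADI reduces to a unit conversion to mg per kg body weight per day. A minimal sketch with illustrative figures follows: the ADI shown (40 mg/kg bw/d for aspartame) is the JECFA value, but the intake and body weight are made up, not survey results.

```python
# Sketch of comparing estimated dietary exposure with the ADI for one
# sweetener; intake and body weight are illustrative assumptions.
body_weight_kg = 60.0
aspartame_mg_per_day = 96.0   # hypothetical intake from all dietary sources
adi_mg_per_kg_bw = 40.0       # JECFA ADI for aspartame

exposure = aspartame_mg_per_day / body_weight_kg  # mg/kg bw per day
print(f"{exposure:.2f} mg/kg bw/d = "
      f"{100 * exposure / adi_mg_per_kg_bw:.1f} % of the ADI")
```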