Dietary behaviours and the food systems in which they occur have a significant impact on climate change. The 2022 Intergovernmental Panel on Climate Change (IPCC) reports and other major climate reports have identified population-level dietary shifts towards balanced, sustainable healthy diets as an important mitigation (i.e. prevention) solution for climate change. Thus, public health nutrition researchers and practitioners have a crucial role to play in combatting the climate crisis. They have the content expertise, interdisciplinary training and technical skills needed to facilitate wide-scale dietary behaviour changes at multiple levels of influence and ultimately improve both human and planetary health. This commentary article: (i) summarises how dietary behaviours and food systems contribute to climate change, with a particular focus on high-income countries; (ii) reviews food-system-related climate change mitigation solutions most relevant to public health nutrition researchers and practitioners; and (iii) identifies key gaps in the literature and future research directions for the field.
This study aimed to identify patterns of anthropometric trajectories throughout life and to analyse their association with the occurrence of sarcopenia in people from the Longitudinal Study of Adult Health (ELSA-Brasil). It is a cross-sectional study involving 9670 public servants, aged 38–79 years, who answered the call for new data collection and exams, conducted approximately 4 years after the study baseline (2012–2014). Data sequence analysis was used to identify patterns of anthropometric trajectory. A theoretical model was elaborated based on the directed acyclic graph (DAG) to select the variables of minimum adjustment in the analysis of the causal effect between trajectory and sarcopenia. Poisson regression with robust variance was adopted for data analysis. The patterns of change in the anthropometric trajectory were classified as stable weight (T1); change to normal weight (T2); change to excess weight (T3); weight fluctuation (T4) and change to low weight (T5). The prevalence of sarcopenia in men and women whose anthropometric trajectory changed to low weight was twice that of participants with a stable weight trajectory. A protective effect of the excess weight trajectory on the occurrence of sarcopenia was also observed. The results point to the need for health policies that encourage the proper management of body components in order to prevent and control obesity, as well as to preserve the quantity and quality of skeletal muscle mass throughout life, especially in older adults.
The goal of the present study was to evaluate the association between depression and ultra-processed food (UPF) consumption as risk factors for developing type 2 diabetes (T2D).
Design:
A prospective community study.
Setting:
Baseline data (2009–2010) from CARTaGENE community health study from Quebec, Canada, were used. Food and drink consumption was assessed using the Canadian-Diet History Questionnaire II and grouped according to their degree of processing by the NOVA classification, and participants were categorised into tertiles of UPF (g/d). Depression was defined using either a validated cut-off score on the Patient Health Questionnaire-9 or antidepressant use. The outcome was the incidence of T2D, examined in 3880 participants by linking survey data with administrative health insurance data. Cox regression models estimated the associations between UPF, depression and incident T2D.
Participants:
40–69-year-old individuals at baseline.
Results:
In total, 263 (6·8 %) individuals developed T2D. Participants with high depressive symptoms and high UPF consumption showed the highest risk for T2D (adjusted hazard ratios (aHR) = 1·58, 95 % CI (0·98, 2·68)), compared to those with low depressive symptoms and low UPF consumption. The risk for T2D was similar when high depressive symptoms and antidepressant use were combined with high UPF (aHR 1·62, 95 % CI (1·02, 2·57)).
Conclusions:
This study shows that co-occurring depression and high UPF consumption were associated with a higher risk for T2D. Early management and monitoring of both risk factors might be essential for diabetes prevention.
The aim of this study was to examine the relative validity of the online Meal-based Diet History Questionnaire (MDHQ) for assessing the overall diet quality and quality of each meal type (breakfast, lunch, dinner and snacks). In total, 222 Japanese adults (111 for each sex) aged 30–76 years completed the online MDHQ and then the 4-non-consecutive-day weighed dietary record (DR). Diet quality was assessed using the Healthy Eating Index-2015 (HEI-2015) and Nutrient-Rich Food Index 9.3 (NRF9.3). For the HEI-2015, compared with the DR, the MDHQ provided higher median values for breakfast (in women only) and dinner and lower median values for snacks. No significant differences were observed for overall diet and lunch. For the NRF9.3, the MDHQ provided higher median values for breakfast and dinner and a lower median value for overall diet than the DR in women, with no significant differences for lunch and snacks. In men, no significant difference was observed, except for overall diet (the MDHQ providing a lower median value). For the HEI-2015, the median Spearman correlation coefficient was 0·43, with a range from 0·12 (snacks in women) to 0·68 (breakfast in men). For the NRF9.3, the median Spearman correlation coefficient was 0·47, with a range from 0·26 (snacks in men) to 0·65 (breakfast in men). Bland–Altman plots showed wide limits of agreement and, in some cases, proportional bias. In conclusion, the online MDHQ showed an acceptable ability to rank individuals according to the quality of overall diet, breakfast, lunch and dinner, but not snacks.
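The two validity statistics used in this abstract, Spearman rank correlation and Bland–Altman limits of agreement, can be computed as sketched below. The scores are simulated with an assumed positive bias and random error; they are not the MDHQ/DR data.

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(1)
n = 222

# Hypothetical diet-quality scores: a reference record (dr) and a questionnaire
# (mdhq) measuring the same construct with assumed bias (+2) and noise (sd 8).
dr = rng.normal(60, 10, n)
mdhq = dr + rng.normal(2, 8, n)

# Ranking agreement: Spearman correlation between the two methods.
rho, p_value = spearmanr(mdhq, dr)

# Bland–Altman statistics: mean bias and 95 % limits of agreement.
diff = mdhq - dr
bias = diff.mean()
sd = diff.std(ddof=1)
loa_low, loa_high = bias - 1.96 * sd, bias + 1.96 * sd
```

A Bland–Altman plot simply scatters `diff` against the pairwise means with horizontal lines at `bias`, `loa_low` and `loa_high`; proportional bias appears as a trend in that scatter.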
Non-celiac gluten sensitivity is characterised by the presence of gastrointestinal and extraintestinal symptoms following gluten ingestion. Recent studies have suggested an association between non-celiac gluten sensitivity and the consumption of fermentable oligosaccharides, disaccharides, monosaccharides and polyols (FODMAP). This systematic review aimed to examine the literature evidence on the relationship between non-celiac gluten sensitivity and FODMAP intake. A comprehensive search was carried out for randomised clinical trials addressing gastrointestinal symptoms as the primary outcome, published between 2010 and 2020 in Portuguese, English or Spanish, and indexed in Scopus, PubMed, SciELO, Cochrane Library, CINAHL, Embase or VHL (LILACS) databases. The systematic review was performed using the population, intervention, comparison and outcome (PICO) framework. A total of 1133 articles were retrieved for further assessment. Three articles were selected for systematic review, one of which included two interventions with different periods and assessments. Quality of evidence was assessed according to the GRADE protocol. The selected articles used different instruments to measure gastrointestinal symptoms and quality of life, hindering comparison of data. Clinical trials identified an association between decreased gastrointestinal symptoms and FODMAP restriction. There are few studies on the topic, and those available used different instruments to assess gastrointestinal symptoms and quality of life. Nevertheless, current evidence supports that the gluten-free diet still represents first-line therapy, although FODMAP restriction can decrease gastrointestinal symptoms in individuals with non-celiac gluten sensitivity. Further research is needed to confirm this finding.
When compared with the general population, people living with severe mental illness (SMI) are 1·8 times more likely to have obesity, while in adult mental health secure units rates of obesity are 20 % higher than in the general population. In England, there are currently 490 000 people living with SMI. The aim of this systematic review was to collate and synthesise the available quantitative and qualitative evidence on a broad range of weight management interventions for adults living with SMI and overweight or obesity. Primary outcomes were reductions in BMI and body weight. Following screening, eighteen papers were included in the final review, which detailed the results of nineteen different interventions; however, there was a lack of qualitative evidence. Pooled results for three studies (MD − 3·49, 95 % CI − 6·85, −0·13, P = 0·04) indicated a small effect in terms of body weight reduction, but no effect on BMI was observed for four studies (MD − 0·42, 95 % CI − 1·27, 0·44, P = 0·34). Key recommendations for future research included integration of qualitative methodology into experimental study design, a review of outcome measures and for study authors to follow standardised reporting guidelines to facilitate complete and transparent reporting.
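The pooling step behind results like "MD − 3·49, 95 % CI − 6·85, −0·13" is an inverse-variance weighted average of per-study mean differences. The sketch below uses made-up study estimates chosen only to illustrate the arithmetic; it does not reproduce the review's meta-analysis.

```python
import numpy as np

# Hypothetical per-study mean differences in body weight (kg) and their
# standard errors; three studies, as in a small fixed-effect pooling.
md = np.array([-4.1, -2.0, -4.5])
se = np.array([1.5, 1.2, 2.0])

# Inverse-variance weights: precise studies count for more.
w = 1.0 / se**2

# Fixed-effect pooled mean difference and its standard error.
pooled = np.sum(w * md) / np.sum(w)
pooled_se = np.sqrt(1.0 / np.sum(w))

# 95 % confidence interval for the pooled estimate.
ci = (pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se)
```

A random-effects model would widen `se` by a between-study variance term (e.g. DerSimonian–Laird) before weighting, which is the usual choice when heterogeneity is present.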
The impact of early life nutrition on relevant health outcomes across the lifespan laid the foundation for the field of the developmental origins of health and disease. Studies in this area initially concentrated on nutrition and the risk of adverse cardio-metabolic and cancer outcomes. More recently, the role of nutrition in early brain development and its subsequent influence on later mental health has become more evident. Scientific breakthroughs have elucidated two mechanisms behind long-term nutrient effects on the brain: the existence of critical periods for certain nutrients during brain development and nutrient-driven epigenetic modifications of chromatin. While multiple nutrients and nutritional conditions have the potential to modify brain development, iron can serve as a paradigm to understand both mechanisms. New horizons in nutritional medicine include leveraging the mechanistic knowledge of nutrient–brain interactions to propose novel nutritional approaches that protect the developing brain through better timing of nutrient delivery and potential reversal of negative epigenetic marks. The main challenge in the field is detecting whether a change in nutritional status truly affects the brain's development and performance in human subjects. To that end, a strong case can be made to develop and utilise bioindicators of a nutrient's effect on the developing brain instead of relying exclusively on biomarkers of the nutrient's status.
To examine changes in the proportions of daily, weekly and occasional consumers of sugar-sweetened soda in six European countries that introduced/updated a tax between 2001–2002 and 2017–2018 and in neighbouring comparison countries (without a tax).
Design:
Repeated cross-sectional surveys.
Setting:
Health Behaviour in School-aged Children study, spanning five survey years (school years 2001–2002 to 2017–2018).
Participants:
Nationally representative samples of 13-year- and 15-year-old adolescents (n 236 623, 51·0 % girls).
Results:
Tax sizes (€0·02/l to €0·22/l) and pre-tax soda consumption were heterogeneous across countries. Prevalence of daily soda consumption fell in the survey year following tax implementation in Latvia (from 17·9 to 11·9 %, P = 0·01), Finland (4·2 to 2·5 %, P = 0·001), Belgium (35·1 to 27·8 %, P < 0·001) and Portugal (17·4 to 14·9 %, P = 0·02), but not in Hungary (29·8 to 31·3 %, P = 0·47) or France (29·4 to 28·2 %, P = 0·27). However, these reductions were similar to (Finland) or smaller than (Belgium, Portugal) those in the comparison countries, except in Latvia, where the reduction was larger (Pinteraction < 0·001). Prevalence of weekly soda consumption remained stable (Finland, Hungary and France) or increased (Latvia, Belgium); only Portugal experienced a decline (P < 0·001), which was larger than in the comparison country (Pinteraction < 0·001). Prevalence of occasional soda consumption (<1x/week) did not rise after implementation of the tax in Latvia, Finland, Hungary, France or Belgium, and the rise in Portugal was similar to that in its comparison country (Pinteraction = 0·15).
Conclusions:
Countries with a soda tax did not experience larger beneficial changes in post-tax adolescent consumption frequency of soda than comparison countries. Further studies, with different taxation types, are needed in the adolescent population.
Food insecurity on college campuses is a major public health problem and has been documented for the last decade. Sufficient food access is a crucial social determinant of health, thus campuses across the country have implemented various programmes, systems and policies to enhance access to food which have included food pantries, campus gardens, farmers’ markets, meal share or voucher programmes, mobile food applications, campus food gleaning, food recovery efforts, meal deliveries and task force/working groups. However, little is understood about how to best address food insecurity and support students who are struggling with basic needs. The impact of food insecurity on students’ academic and social success, in addition to their overall well-being, should be investigated and prioritised at each higher education institution. This is especially true for marginalised students, such as minority or first-generation students, who are at heightened risk for food insecurity. In order to create a culture of health equity, in which most at-risk students are provided resources and opportunities to achieve optimal well-being, higher education institutions must prioritise mitigating food insecurity on the college campus. Higher education institutions could benefit from adopting comprehensive and individualised approaches to promoting food security for marginalised students in order to facilitate equal opportunity for optimal scholastic achievement among students of all socio-demographic backgrounds.
Anaemia remains among the most prevalent nutritional problems among children in developing countries. In Ethiopia, more than half of children <5 years of age are anaemic. In the early stages of life, anaemia leads to poor cognitive performance, delayed psychomotor development and decreased working capacity in later life. The present study aimed to assess the prevalence and associated factors of anaemia among children aged 6–23 months in the Bale zone. A community-based cross-sectional study was conducted from 1 to 30 June 2021. Multistage stratified sampling and simple random sampling techniques were employed to select 770 participants. An interviewer-administered questionnaire was used to collect data on socio-demographic characteristics, child health and feeding practices. Haemoglobin levels were estimated using a portable Hemosmart machine. Children with haemoglobin values below 11 g/dl were considered anaemic. Binary logistic regression analysis was performed to identify factors associated with anaemia. Statistical significance was set at P < 0⋅05. The prevalence of anaemia was 47⋅9 % (95 % CI (44⋅4, 51⋅5)). The multivariate analysis showed that child age (6–11 months) (AOR 1⋅47; 95 % CI (1⋅06, 2⋅03)), household food insecurity (AOR 1⋅44; 95 % CI (1⋅01, 2⋅04)), having diarrhoea and cough in the past 2 weeks (AOR 1⋅70; 95 % CI (1⋅18, 2⋅44) and AOR 1⋅97; 95 % CI (1⋅28, 3⋅04), respectively), not consuming the recommended dietary diversity (AOR 2⋅72; 95 % CI (1⋅96, 3⋅77)) and stunting (AOR 1⋅88; 95 % CI (1⋅31, 2⋅70)) were significantly associated with anaemia. Anaemia in children aged 6–23 months was a severe public health problem in the study area. Integrated nutritional interventions combined with iron fortification and supplementation are recommended.
A mother's nutritional status and participation in household decision-making, a proxy for empowerment, are known determinants of improved nutrition and health outcomes for infants and young children; however, little is known about these associations among adolescents. We examined the association between maternal nutritional status, decision-making autonomy and adolescent girls’ nutritional status. We analysed data of 711 mother–adolescent girl pairs aged 10–17 years from the Mion District, Ghana. Maternal nutritional status and decision-making autonomy were the independent variables, while the outcomes were adolescent girls’ nutritional status as defined by anaemia, stunting and body mass index-for-age Z-score categories. Girl-level (age, menarche status and the frequency of animal-source food consumption), mother-level (age, education level and monthly earnings) and household-level (wealth index, food security status and family size) covariates were adjusted for in the analysis. All associations were examined with hierarchical survey logistic regression. There was no association between maternal height and adolescent girls being anaemic, underweight or overweight/obese. Increasing maternal height reduced the odds of being stunted [adjusted odds ratio (OR) 0⋅92, 95 % CI (0⋅89, 0⋅95)] for the adolescent girl. Maternal overweight/obesity was positively associated with the girl being anaemic [OR 1⋅35, 95 % CI (1⋅06, 1⋅72)]. The adolescent girl was more than five times as likely to be thin [OR 5⋅28, 95 % CI (1⋅64, 17⋅04)] when the mother was underweight. Maternal decision-making autonomy was inversely associated with stunting [OR 0⋅88, 95 % CI (0⋅79, 0⋅99)] among the girls. Our findings suggest that intergenerational linkages of a mother's nutritional status are not limited to childhood but extend into adolescence.
The objective of this paper is to review the global effort to eliminate iodine deficiency and its impact on public health. Iodine is an essential component of hormones produced by the thyroid gland. Iodine deficiency has multiple adverse effects in humans due to inadequate thyroid hormone production that are termed the iodine deficiency disorders. The major adverse effect is impaired cognition in children. The WHO's first estimate of the global prevalence of goitre in 1960 suggested that 20–60 % of the world's population was affected, with most of the burden in low- and middle-income countries. Iodine deficiency was identified as a key global risk factor for impaired child development where the need for intervention was urgent. This spurred a worldwide effort to eliminate iodine deficiency led by a coalition of international organisations working closely with national governments and the salt industry. In most countries, the best strategy to control iodine deficiency is carefully monitored iodisation of salt. The reach of current iodised salt programmes is remarkable: in 2018, 88 % of the global population used iodised salt. The number of countries with adequate iodine intake has nearly doubled over the past 20 years from 67 in 2003 to 118 in 2020. The resulting improvement in cognitive development and future earnings suggests a potential global economic benefit of nearly $33 billion. Iodine programmes are appealing for national governments because the health and economic consequences are high and can be easily averted by salt iodisation, a low-cost and sustainable intervention.
The objective of this research was to determine if, based on gender, adolescents were exposed to different marketing techniques that promoted food and beverages over social media.
Design:
A secondary analysis of adolescent boy (n 26) and girl (n 36) exposures (n 139) to food and beverage marketing was conducted. Mann–Whitney U and Fisher’s exact tests were conducted to compare the number, healthfulness and the marketing techniques of exposures viewed by boys and girls.
Setting:
Ottawa, Ontario, Canada.
Participants:
Sixty-two adolescents aged 12–16 years.
Results:
Boys and girls were exposed to similar volumes of food marketing instances (median = 2 for both boys and girls, Mann–Whitney U = 237, P = 0·51) per 10-min period of social media use. More girls viewed products that were excessive in total fat compared to boys (67 % v. 35 %, P = 0·02). Boys were more likely to view instances of food marketing featuring a male as the dominant user (50 % v. 22 %, P = 0·03), appeals to achievement (42 % v. 17 %, P = 0·04), an influencer (42 % v. 14 %, P = 0·02) and appeals to athleticism (35 % v. 11 %, P = 0·03), whereas girls were more likely to view instances of food marketing featuring quizzes, surveys or polls (25 % v. 0 %, P = 0·01).
Conclusions:
Food and beverage companies utilise marketing techniques that differ based on gender. More research examining the relationship between digital food and beverage marketing and gender is required to inform the development of gender-sensitive policies aimed at protecting adolescents from unhealthy food marketing.
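The comparisons reported in this abstract rest on two standard tests: Mann–Whitney U for exposure counts and Fisher's exact test for proportions. A minimal sketch with invented counts (not the study's data) is shown below.

```python
from scipy.stats import mannwhitneyu, fisher_exact

# Hypothetical numbers of marketing exposures per 10-min session of social
# media use, by gender.
boys_counts = [1, 2, 2, 3, 0, 2, 4, 1]
girls_counts = [2, 2, 1, 3, 2, 0, 5, 2]

# Mann-Whitney U compares the two distributions without assuming normality.
u_stat, p_mw = mannwhitneyu(boys_counts, girls_counts,
                            alternative="two-sided")

# Hypothetical 2x2 table: viewed a product excessive in total fat (yes/no)
# by gender; Fisher's exact test compares the proportions.
table = [[9, 17],   # boys: yes, no
         [24, 12]]  # girls: yes, no
odds, p_fisher = fisher_exact(table)
```

Fisher's exact test is preferred over chi-squared here because some cells (e.g. 0 % of girls viewing a given technique) would be too small for the chi-squared approximation.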
Observational research, mainly prospective cohort studies (PCS), has represented a long-standing challenge for those attempting to draw up consistent policy recommendations in the area of diet and health. This has been due to the inherent limitations in ascribing causality from observed associations due to problems of confounding of the findings and publication and citation bias. Developments in nutritional epidemiology research over the past 20–30 years have enabled causal criteria to be derived from observational studies and the totality of the primary literature to be reviewed objectively, reducing previous focus on narrative accounts of individual studies. The gold standard approach to assessing causal relationships is via randomised controlled trials (RCT), but neither RCT nor PCS provide direct evidence for biological plausibility, which is a key criterion for assessing causality. Although extensive mechanistic data are available in the literature, a systematic approach to select and assess quality and relevance of published studies has not been available. This limits their use in the development of diet and health policy. Recent studies have investigated a proposed two-step framework and novel methodologies for integrating heterogeneous data from cell, animal and human studies. Pilot and feasibility studies have shown this to be a useful novel approach to studies of diet and cancer, but further refinements are required, including development of appropriate quality criteria which are less dependent on RCT designs. Future studies are needed to fully verify the approach and its potential for use in other diet–disease relationships.
Contemporary diets in Western countries are largely acid-inducing and deficient in potassium alkali salts, resulting in low-grade metabolic acidosis. The chronic consumption of acidogenic diets abundant in animal-based foods (meats, dairy, cheese and eggs) poses a substantial challenge to the human body's buffering capacities, and the chronic retention of acid with progressive loss of bicarbonate stores can cause cellular and tissue damage. An elevated dietary acid load (DAL) has been associated with systemic inflammation and other adverse metabolic conditions. In this narrative review, we examine DAL quantification methods and summarise observational and clinical evidence on the role of plant-based diets, chiefly vegetarian and vegan, in reducing DAL. Quantitation of protein and amino acid composition and of intake of alkalising organic potassium salts and magnesium shows that plant-based diets are most effective at reducing DAL. Results from clinical studies and recommendations in the form of expert committee opinions suggest that for a number of common illnesses in which metabolic acidosis is a contributing factor, the regular inclusion of plant-based foods offers measurable benefits for disease prevention and management. Based on available evidence, dietary shifts toward plant-based nutrition effectively reduce diet-induced, low-grade metabolic acidosis.
Alternate day fasting (ADF), with consumption of up to 25 % of daily energy intake on fast days, is one of the most widely used intermittent fasting regimens and is promoted as a promising alternative approach for treating obesity. Feelings of appetite are critical for adherence to dietary approaches, and therefore for the success of dietary interventions. This systematic review aimed to assess the effects of a minimum of 8 weeks of ADF on subjective feelings of appetite and body weight in adults with overweight and obesity. We conducted the review in accordance with the Cochrane guidelines, including systematic searches in four databases. Because of the high level of clinical and methodological heterogeneity, a narrative approach was used to synthesise the results. Eight studies with a total of 456 participants met the eligibility criteria: three randomised controlled trials and five uncontrolled before-after studies. Seven of the studies had a high risk of bias. Feelings of appetite were assessed by hunger in eight studies, fullness in seven studies, satisfaction in four studies and desire to eat in one study. All the studies assessed weight loss. The certainty of the evidence was rated low or very low for all outcomes; thus, no firm conclusions can be drawn about the potential benefits of ADF on subjective feelings of appetite and body weight. Despite the high interest in ADF, good quality evidence is still needed to determine its effectiveness, and, if offered in clinical practice, ADF should be introduced cautiously and concomitantly evaluated.
Access to and utilisation of antenatal care (ANC) are important for optimising health and nutrition during pregnancy. This study aimed to assess adherence to and factors associated with ANC and antenatal supplement use among Laotian women, and to consider culturally appropriate strategies to increase micronutrient intakes. Mother–child (aged 21 d to <18 months) dyads (n 699) enrolled in a hospital-based prospective cohort study with a community comparison group in Luang Prabang province were interviewed about their antenatal history, supplement use, household sociodemographic characteristics and dietary practices, including postpartum food avoidances. Ninety percent of women (mean age 24⋅7 ± 6⋅3 years) reported receiving ANC during their pregnancy, with the majority reporting four to seven contacts, while 84⋅6 and 17⋅3 % reported supplement use during pregnancy and lactation, respectively. Adequate ANC contacts (≥8) and supplement use were more likely among women with complete primary education and from higher socioeconomic status households, and less likely among women belonging to ethnic minority populations and those who delivered their child at home. All women continued to consume salt while adhering to postpartum food avoidances; however, 58⋅5 and 38⋅7 % of habitual consumers restricted fish and soy sauces, respectively. Eighty-six percent of women reported they would be willing to take supplements while adhering to postpartum dietary restrictions. Overall, women's reported ANC attendance and antenatal supplement use were suboptimal. Understanding predictors of and barriers to ANC and supplement use may help implement effective public health strategies to improve adherence. Alongside targeted supplementation, salt fortification with micronutrients may be a viable population-wide intervention that warrants further evaluation.
This study used publicly available Form 990 tax documents to quantify food industry donations to patient advocacy organisations (PAO) dedicated to supporting patients with non-communicable diseases.
Design:
Observational, cross-sectional assessment of significant national and international food industry donations to US-based non-communicable disease-focussed PAO between 2000 and 2018. Researchers recorded and categorised the: (1) frequency and value of donations; (2) reason for donation; (3) name and type of PAO recipient and (4) non-communicable disease focus of the PAO.
Setting:
Form 990 tax documents.
Participants:
Nine food and beverage companies that donated to non-communicable disease-focussed PAO.
Results:
Adjusting for inflation, nine food and beverage companies collectively donated $10 672 093 (n 2709) to the PAO between 2001 and 2018. The largest category of donations was ‘matching gifts’ (67·9 %, median amount = $115·16), followed by ‘general operations support’ (25·8 %, median amount = $107·79). Organisations focussing on cancer received the largest number and amount of donations ($6 265 861, n 1968). Eight of the nine companies made their largest monetary value of donation to PAO focussed on cancer.
Conclusions:
Publicly available tax data provide robust information on food industry donation practices. Our findings document the food industry’s role in supporting patient advocacy organisations and raise questions regarding conflicts of interest. Increased awareness of food industry donation practices involving PAO may generate pressure for policies mandating transparency or encourage donors and recipients to voluntarily disclose donations. If public disclosure becomes widespread, constituents, advocates, researchers and policymakers can better supervise and address potential conflicts of interest.
Iron (Fe) status among healthy male and female blood donors, aged 18–65 years, was estimated. General characteristics and lifestyle factors, dietary habits and major one-carbon metabolism-related polymorphisms were also investigated. An explorative cross-sectional study design was used to examine a sample of blood donors attending the Transfusion Medicine Unit of the Verona University Hospital, Italy. From April 2016 to May 2018, 499 subjects were enrolled (255 men, 244 women, 155 of whom of childbearing age). Major clinical characteristics including lifestyle, dietary habits and Fe status were analysed. The MTHFR 677C > T, cSHMT 1420C > T, DHFR 19bp ins/del and RFC1 80G > A polymorphisms were also assayed. Mean plasma concentrations of Fe and ferritin were 16·6 µmol/l (95 % CI 16·0, 17·2) and 33·8 µg/l (95 % CI 31·5, 36·2), respectively. Adequate plasma Fe concentrations (> 10·74 µmol/l) were detected in 84·3 % and adequate ferritin concentrations (20–200 µg/l) were found in 72·5 % of the whole cohort. Among the folate-related polymorphisms analysed, carriers of the DHFR 19bp del/del mutant allele showed lower ferritin concentrations when compared with DHFR 19bp ins/del genotypes. In a sample of Italian healthy blood donors, adequate plasma concentrations of Fe and ferritin were reached in a large proportion of subjects. The relationship of Fe status with lifestyle factors and folate-related polymorphisms requires more investigation to clarify further gene–nutrient interactions between folate and Fe metabolism.