
Risk–benefit analysis of mineral intakes: case studies on copper and iron

Published online by Cambridge University Press:  22 September 2010

Susan J. Fairweather-Tait*, Linda J. Harvey and Rachel Collings
Affiliation: School of Medicine, University of East Anglia, Norwich NR4 7TJ, UK
*Corresponding author: Professor Susan Fairweather-Tait, email s.fairweather-tait@uea.ac.uk, fax +44 1603 593752

Abstract

Dietary reference values for essential trace elements are designed to meet requirements with minimal risk of deficiency and toxicity. Risk–benefit analysis requires data on habitual dietary intakes, an estimate of their variation, and the effects of deficiency and excess on health. For some nutrients, the range between the upper and lower limits may be extremely narrow or even overlap, which creates difficulties when setting safety margins. A new approach for estimating optimal intakes, taking into account several health biomarkers, has been developed and applied to selenium, but at present there are insufficient data to extend this technique to other micronutrients. The existing methods for deriving reference values for Cu and Fe are described. For Cu, there are no sensitive biomarkers of status or health relating to marginal deficiency or toxicity, despite the well-characterised genetic disorders of Menkes and Wilson's disease which, if untreated, lead to lethal deficiency and overload, respectively. For Fe, the wide variation in bioavailability confounds the relationship between intake and status and complicates risk–benefit analysis. As with Cu, health effects associated with deficiency or toxicity are not easy to quantify; status is therefore the most accessible variable for risk–benefit analysis. Serum ferritin reflects Fe stores but is affected by infection/inflammation, and therefore additional biomarkers are generally employed to assess Fe status. Characterising the relationship between health and dietary intake is problematic for both these trace elements because of the confounding effects of bioavailability, inadequate biomarkers of status and a lack of sensitive and specific biomarkers for health outcomes.

Information

Type
Symposium on ‘Nutrition: getting the balance right in 2010’
Copyright
Copyright © The Authors 2010

Fig. 1. Dose–response curves showing evidence-based (open boxes) and derived (shaded box) data used to set dietary reference values, using European Union (EU), Institute of Medicine (IOM) and United Nations University (UNU) terminology respectively: AR, average requirement; EAR, estimated average requirement; ANR, average nutrient requirement; PRI, population reference intake; RDA, recommended dietary allowance; INLx, individual nutrient level for x% of the population; UL, tolerable upper intake level; UL, upper level; UNL, upper nutrient level.


Table 1. Information required for risk–benefit analysis of mineral intakes


Fig. 2. Intake–status–health relationships and examples of confounding factors for Cu and Fe.


Fig. 3. Population distribution for (a) glutathione peroxidase activity with low selenium intakes and (b) frank selenosis with high selenium intakes. (c) Population distribution modelling for other possible effects resulting from different levels of intake of selenium. (Reprinted with permission from Renwick et al.(18).)