Understanding individual variability in response to dietary interventions is emerging as an important consideration in nutrition research. Prior research has demonstrated varying success of interventions. For example, Gardner and colleagues (2007) compared four weight loss diets over a 12-month period(1), noting that weight loss ranged from 3.1 kg to 6.3 kg depending on the individual and the diet. Song and colleagues (2023) examined postprandial glucose response (PPGR) to four different carbohydrate meals; depending on the meal, the PPGR varied significantly between individuals(2). As such, it is inappropriate to assume that one dietary pattern is appropriate for all individuals. Understanding the factors driving individual variation in response to specific foods and dietary patterns will allow interventions to be tailored to create optimal health outcomes for each individual. The aim of our study was to examine individual responses to different diets promoted for health. We investigated the biological diversity in response to the same dietary inputs among 23 participants at risk of type 2 diabetes and chronic disease over a two-week period. All participants completed four days on each of three dietary interventions (Mediterranean, Australian and low carbohydrate diets). Urine, serum, plasma and faecal samples were collected, alongside continuous glucose monitoring data, to explore metabolic and glycaemic responses. Our findings reveal significant individual differences in blood glucose levels and metabolic outcomes. When examining fasting blood glucose levels, the low carbohydrate and Australian diets were optimal for eight participants each, while the Mediterranean diet was optimal for seven participants. However, this did not always correlate with optimisation of postprandial blood glucose levels. While blood, urine and faecal samples are yet to be analysed, these are expected to provide further understanding of individual biological responses.
These results underscore the limitations of a universal dietary approach for optimising glycaemic control and highlight the necessity of personalised dietary recommendations that consider individual metabolic profiles. Our study provides crucial insights for future advances in precision nutrition, suggesting that personalised nutrition plans could lead to more effective management and prevention of type 2 diabetes (T2D).
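The per-participant comparison described above can be sketched in Python. The readings below are hypothetical, not the trial's data; they simply illustrate labelling the diet with the lowest mean fasting glucose as "optimal" for each participant.

```python
import pandas as pd

# Hypothetical fasting glucose readings (mmol/L) for two participants across
# the three diets; illustrative only, not the trial's data.
readings = pd.DataFrame({
    "participant": [1, 1, 1, 2, 2, 2],
    "diet": ["Mediterranean", "Australian", "LowCarb"] * 2,
    "fasting_glucose": [5.6, 5.9, 5.2, 5.1, 5.0, 5.4],
})

# For each participant, call the diet with the lowest mean fasting glucose
# across their days on that diet "optimal".
means = readings.groupby(["participant", "diet"])["fasting_glucose"].mean()
optimal = means.groupby(level="participant").idxmin().apply(lambda ix: ix[1])
print(optimal.to_dict())  # {1: 'LowCarb', 2: 'Australian'}
```

With repeated daily readings per diet, the same `groupby` chain averages within each participant-diet pair before the comparison.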
A recent umbrella review found that overall greater exposure to ultra-processed foods (UPF) was associated with worse outcomes. For cancer incidence overall (k = 7) and for colorectal cancer (k = 7) there was suggestive low-quality evidence of a positive association; there was no evidence of an association with cancers of other sites(1). Analysis of data from EPIC found positive associations for UPF consumption with head and neck cancer and oesophageal adenocarcinoma, with a small proportion of the association mediated via adiposity(2). This study aims to examine the association between UPF consumption and incidence of cancer, using data from the Melbourne Collaborative Cohort Study (MCCS). Adults aged 40–69 years and born in Australia, Greece or Italy (n = 41,513) were recruited between 1990 and 1994; invasive cancer incidence was identified up until June 30th, 2021, by linkage to cancer registries. Dietary data were collected using a food frequency questionnaire developed for the MCCS, and the NOVA classification was used to identify UPF(3). After exclusions, 35,039 people (21,244 females and 13,795 males) were included in the analysis. UPF consumption (% of total grams intake) was modelled as quintiles and as a continuous variable. Flexible parametric models were fitted to estimate hazard ratios (HR) and 95% confidence intervals (CI) for cancer risk associated with UPF consumption after adjusting for sex, country of birth, socio-economic position, average lifetime alcohol intake (grams/day), smoking status and intensity, education, and physical activity score. Overall cancer, overall obesity-related cancer, and individual obesity-related cancers with more than 100 cases were considered as the outcomes.
There were 10,445 incident cancers identified, of which 4,237 were considered obesity-related, with more than 100 cases for cancers of the pancreas (n = 270), colorectum (n = 1369), endometrium (n = 223), kidney (n = 216), ovary (n = 159), multiple myeloma (n = 187), and post-menopausal breast (n = 1479). The highest UPF consumers included more males and people born in Australia than the lowest consumers. Positive associations were observed for all cancers (HR for continuous variable 1.03, 95% CI (1.01, 1.05); Q5 vs Q1 HR 1.09, 95% CI (1.02, 1.16)); obesity-related cancers (HR for continuous variable 1.04, 95% CI (1.01, 1.08); Q5 vs Q1 HR 1.10, 95% CI (1.00, 1.22)); and post-menopausal breast cancer (HR for continuous variable 1.07, 95% CI (1.00, 1.14); Q5 vs Q1 HR 1.12, 95% CI (0.90, 1.41)). Direct associations between UPF consumption and cancer outcomes were found, although some of those associations were weak. Limiting the consumption of UPF may reduce cancer risk.
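The quintile exposure variable described above can be illustrated in a few lines. The UPF consumption values here are simulated, not the MCCS data; the sketch only shows how quintile categories (Q1 = lowest consumers, Q5 = highest) are formed for modelling.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)

# Hypothetical UPF consumption (% of total grams intake) for 1,000 people;
# the cohort's actual distribution is not reproduced here.
upf_pct = pd.Series(rng.uniform(5, 80, size=1000))

# Quintiles as in the analysis: Q1 = lowest consumers, Q5 = highest.
quintile = pd.qcut(upf_pct, q=5, labels=["Q1", "Q2", "Q3", "Q4", "Q5"])

# Each quintile holds an equal share (~20%) of participants.
print(quintile.value_counts().sort_index())
```

In the modelling step, `quintile` would enter the survival model as a categorical covariate with Q1 as the reference, alongside the continuous `upf_pct` in a separate model.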
There has been a proliferation of ultra-processed foods (UPF) in the food environment since the 1980s, and UPF have been linked to a growing number of non-communicable diseases (NCD) including cardiovascular disease, cancers, type 2 diabetes, fatty liver disease, depression, frailty, and hypertension(1). There is intense debate over whether the mechanism for the negative health effects of UPF consumption is their nutritional composition or their processing(1). There is growing evidence that macronutrient ratios are important for chronic disease risk, predict longevity and may adversely affect micronutrient intakes(2). Intake of macronutrients from ultra-processed sources may lead to macronutrient imbalances and higher energy intakes(3) while being deficient in micronutrients, making it difficult to achieve energy balance and meet micronutrient requirements. Using nationally representative nutrition surveillance data on the Australian population, the National Nutrition and Physical Activity Survey, we employ the Geometric Framework for Nutrition(2) to examine the multidimensional dietary composition of UPF. Diet was assessed for adults (n = 9,341) with two 24-hour recalls. Diets were classified by degree of processing according to the NOVA classification system as ultra-processed diets (UPD, > 60% energy from UPF), moderate in UPF, or low in UPF, i.e., minimally processed diets (MPD, < 20% energy from UPF). Outcomes included the nutrient rich food index (NRF 9.3)(4), the Nutri-Score(5), and macronutrient and micronutrient intakes. Micronutrients were plotted over macronutrient ratios for MPD and UPD to determine whether micronutrient intakes could be met within the acceptable macronutrient distribution ranges (AMDR). Scheffé polynomials were fitted to the data for total energy intake, macronutrient intake and micronutrient intake. Vitamin and mineral intakes were higher for MPD compared to UPD (p < 0.001).
Overall nutrient density decreased with processing: NRF 9.3 scores were 399.2 for MPD and 297.7 for UPD (p < 0.001). For the Nutri-Score, MPD scored A (highest quality) and UPD scored C (moderate quality). Poorer scores reflected the higher energy density, saturated fat, added sugar and sodium of UPD, while protein, dietary fibre, micronutrient density and fruit, vegetable, nut and legume ratios decreased. Diets met the estimated average requirement (EAR) for all micronutrients within the AMDR for MPD but not for UPD. Regardless of processing, in almost all nutritional indicators of health, diets high in UPF were unsatisfactory relative to nutritional recommendations. We conclude that compositional factors alone point to mechanisms through which ultra-processed dietary patterns could lead to poor health, in the full understanding that processing likely has additional effects over and above composition that exacerbate the problem.
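The diet classification rule used above is simple enough to state as code. This is a minimal sketch of the cut-points given in the abstract (> 60% energy from UPF for UPD, < 20% for MPD); the function name is an illustration, not part of the study's analysis code.

```python
def classify_diet(pct_energy_from_upf: float) -> str:
    """Classify a diet by its share of energy from ultra-processed food,
    using the cut-points given in the abstract (> 60% UPD, < 20% MPD)."""
    if pct_energy_from_upf > 60:
        return "UPD"       # ultra-processed diet
    if pct_energy_from_upf < 20:
        return "MPD"       # minimally processed diet
    return "moderate"      # moderate in UPF

print([classify_diet(p) for p in (15, 45, 70)])  # ['MPD', 'moderate', 'UPD']
```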
Most Australians consume excess sodium and inadequate potassium, both of which contribute to high blood pressure, the leading risk factor for death in Australia(1). Switching from regular salt to potassium-enriched, reduced-sodium salt is a novel solution, shown to lower blood pressure, cardiovascular disease risk and premature death(2). The aim of this study was to explore Australian adults' knowledge, attitudes and behaviours related to potassium-enriched salt. Adults aged ≥ 18 years who were the main or joint grocery buyer were recruited through a web panel provider between February and March 2024 to complete an online survey. Quotas were used to achieve representation of the age, sex and geographical distribution of Australia. The survey was developed based on existing questionnaires of consumers' perception of potassium-enriched salt identified in a systematic review, and on the behaviour change wheel framework, to allow for a systematic assessment of consumers' capability, opportunity and motivation to switch to potassium-enriched salt. Survey responses completed in less than one-third of the median time (< two minutes and 20 seconds) were excluded from analysis. All data were collated and analysed using the statistical program Stata/SE 14.0 (StataCorp LP). A total of 4113 adults (52% female) with a mean (SD) age of 47.9 (18.3) years completed the survey and were eligible for inclusion. About half (47%) of participants reported that they had seen or heard of potassium-enriched salt. Of those, 41% always, often or sometimes used potassium-enriched salt, with the main reasons being that it is what's available at home (32%) and that it was recommended by a healthcare professional (29%). However, only 3% of all participants reported that potassium-enriched salt or low-sodium salt was the main type of salt used during cooking and eating at home.
Most participants reported that they could be influenced to switch to potassium-enriched salt by the following factors: if it was better for their health (85%), affordable (78%), recommended by healthcare professionals (75%) or tasted good (73%). Most participants believed that individuals (77%), food manufacturers (68%) and fast-food chains (65%) were responsible for making the switch to potassium-enriched salt when told it was a healthier alternative to regular salt. A greater proportion of participants correctly identified that eating more potassium is beneficial for health (73%) than correctly identified that eating more sodium is not (57%). While current knowledge and use of potassium-enriched salt is low in Australia, the study identified existing and potential drivers for switching to potassium-enriched salt. The study highlights that greater awareness-raising activities about the health benefits and acceptable taste of potassium-enriched salt, particularly by healthcare professionals, could help scale up the switch to potassium-enriched salt.
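The data-cleaning rule described above (excluding responses completed in less than one-third of the median time) can be sketched with the standard library. The completion times here are hypothetical, not the survey's data.

```python
import statistics

# Hypothetical completion times (seconds); the exclusion rule mirrors the
# abstract: drop responses faster than one-third of the median time.
times = [420, 600, 330, 90, 510, 135, 480]
cutoff = statistics.median(times) / 3   # median 420 s -> cutoff 140 s
kept = [t for t in times if t >= cutoff]
print(kept)  # [420, 600, 330, 510, 480]
```

Note this rule depends on the whole sample's median, so it is applied once after data collection rather than per-response.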
Identifying reliable blood pressure biomarkers is essential for understanding how dietary interventions might support a reduction in hypertension. Metabolomics, which involves the analysis of small molecules in biological samples(1), offers a valuable tool for uncovering metabolic biomarkers linked to both dietary patterns and blood pressure, providing insights for more effective dietary strategies to manage or prevent hypertension. The aim was to evaluate associations between plasma and urinary metabolite concentrations and blood pressure measures (systolic blood pressure [SBP] and diastolic blood pressure [DBP]) in healthy Australian adults. This cross-sectional secondary analysis used baseline data from a randomised, cross-over feeding trial(2). Plasma and urinary metabolomic data were generated using Ultra-high Performance Liquid Chromatography-Tandem Mass Spectrometry (UHPLC-MS/MS) through Metabolon Inc.'s (Morrisville, USA) Global Discovery Panel. Blood pressure was assessed in clinic using the Uscom BP+ supra-systolic oscillometric central blood pressure device, with the cuff positioned on the upper arm at the strongest pulse signal location. Participants sat relaxed and comfortably for 5 minutes before their measurements were taken. They remained seated with legs uncrossed, feet flat on the floor, and were instructed to maintain even breathing throughout the tests. Blood pressure was measured with three consecutive readings taken from the supported left arm, with a 1-minute rest between each reading. The first reading was discarded, and the average of the remaining two was used as the final measurement. Metabolite concentrations were log-transformed. Associations between blood pressure measures and urinary or plasma metabolites were evaluated using linear regression models, adjusting for age and sex. Baseline data from 34 healthy Australian adults (mean age 38.4 ± 18.1 years, 53% female) were included.
After adjusting for multiple comparisons using the Benjamini-Hochberg procedure with a significance threshold of q < 0.2, a negative association between two urinary metabolites (gamma-glutamyl histidine and gamma-glutamyl phenylalanine) and DBP was identified. In addition, 32 plasma metabolites were associated with SBP: 18 showed a negative association, including 1,2-dilinoleoyl-GPC (18:2/18:2) and 1-linoleoyl-GPC (18:2), and 14 showed a positive association, including beta-hydroxyisovalerate and 3-hydroxyisobutyrate. Potential mechanisms based on existing research that might explain these associations include the role of gamma-glutamyl peptides in lowering DBP by reducing oxidative stress and improving endothelial function(3). In contrast, 3-hydroxyisobutyrate may elevate blood pressure through metabolic disturbances linked to impaired branched-chain amino acid catabolism(4). Furthermore, 1,2-dilinoleoyl-GPC and 1-linoleoyl-GPC both contain linoleic acid, which could contribute to lowering SBP by mitigating vascular inflammation(5). Although some of these metabolites have been implicated in blood pressure regulation in prior research, others revealed new associations. These findings suggest potential candidate nutritional biomarkers for blood pressure, but further research is needed to confirm their reproducibility and causal role in blood pressure regulation.
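The Benjamini-Hochberg step-up procedure named above has a compact implementation. This sketch uses made-up p-values purely to show the mechanics of the q < 0.2 threshold used in the analysis.

```python
import numpy as np

def benjamini_hochberg(pvals, q=0.2):
    """Boolean mask of discoveries under the Benjamini-Hochberg procedure
    at level q (the analysis above used q < 0.2)."""
    p = np.asarray(pvals, dtype=float)
    m = p.size
    order = np.argsort(p)
    ranked = p[order]
    # Find the largest k such that p_(k) <= (k/m) * q ...
    below = ranked <= (np.arange(1, m + 1) / m) * q
    mask = np.zeros(m, dtype=bool)
    if below.any():
        k = np.nonzero(below)[0].max()
        mask[order[: k + 1]] = True   # ...and reject hypotheses 1..k
    return mask

pvals = [0.001, 0.02, 0.04, 0.30, 0.70]  # illustrative p-values only
mask = benjamini_hochberg(pvals, q=0.2)
print(mask)  # the first three pass, the last two do not
```

The step-up rule controls the false discovery rate, which is why results are reported against a q threshold rather than a raw p threshold.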
Genome-wide association studies (GWAS) of food preferences(1) and intake(2) have identified hundreds of loci, most previously linked to health conditions. This suggests these loci may reflect participants' health status, leaving their direct influences on eating behaviour unclear. Given that taste and olfactory perception play a crucial role in food preferences and choices(3), this study aims to: i) investigate the influence of genetic variants within taste and olfactory receptor genes on food preferences and ii) use these variants to investigate the potential causal influence of food preferences on health. We assess the associations across 1,214 nonsynonymous variants (minor allele frequency ≥ 0.01) within 425 non-pseudogene taste and olfactory receptor genes and 140 food-liking traits in the UK Biobank (n = 162,006 unrelated Europeans; mean age = 57). Food likings were measured on a 9-point scale, from 1 ('Extremely dislike') to 9 ('Extremely like'). We identify 700 associations (FDR-corrected p < 0.05), of which 88 are also associated with their corresponding food intake traits in the UK Biobank. We replicate 84 associations in the younger Avon Longitudinal Study of Parents and Children (ALSPAC; n = 2,802 unrelated Europeans; mean age = 25), including OR2T6 rs6587467 for onion liking (p = 5.4 × 10⁻⁴¹ in UK Biobank, p = 2.9 × 10⁻⁴ in ALSPAC), whereas others cannot be replicated (e.g., OR4K17 rs8005245 for garlic liking, p = 1.9 × 10⁻⁶⁹ in UK Biobank, p = 0.66 in ALSPAC). These variants account for greater phenotypic variance in food-liking traits in ALSPAC than in the UK Biobank (e.g., 0.54% and 0.25% for garlic liking in ALSPAC and the UK Biobank, respectively), suggesting genetically determined sensory perception has a larger impact on food preferences in young adulthood.
Lastly, we use an epidemiological technique, Mendelian randomisation(4), to assess the potential causal influence of food preferences on health outcomes using food-liking-associated variants and summary results from large-scale GWAS. Taking likings for onions and bananas as examples, our results show that both are causally associated with lower systolic blood pressure (onions: beta = −1.257, p = 0.001; bananas: beta = −3.166, p = 0.005; unit = mmHg/liking score). While liking for onions decreases the risk of type 2 diabetes (odds ratio [OR, 95% confidence interval] = 0.856 [0.781, 0.939]), liking for bananas increases it (OR = 1.289 [1.051, 1.579]). We found no evidence for causal associations with coronary artery disease (onions: OR = 0.995 [0.879, 1.126]; bananas: OR = 0.982 [0.742, 1.299]). This study furthers current knowledge of direct genetic influences on food preferences, which helps explain individual differences in eating behaviour and has implications for personalised nutrition. Results from causal modelling provide complementary evidence for previous observational studies and could be used to guide future trials.
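A common way to combine multiple genetic instruments in Mendelian randomisation, as used in analyses like the one above, is the inverse-variance-weighted (IVW) pooling of per-variant Wald ratios. The effect sizes below are illustrative, not the study's GWAS estimates.

```python
import numpy as np

# Inverse-variance-weighted Mendelian randomisation estimate from per-variant
# Wald ratios. Effect sizes below are illustrative, not the study's GWAS data.
bx = np.array([0.10, 0.08, 0.12])     # variant -> food liking (exposure)
by = np.array([-0.12, -0.10, -0.16])  # variant -> blood pressure (outcome)
se = np.array([0.03, 0.04, 0.05])     # SE of the outcome associations

wald = by / bx             # per-variant causal estimates
w = (bx / se) ** 2         # inverse-variance weights
beta_ivw = np.sum(wald * w) / np.sum(w)
print(round(beta_ivw, 3))  # pooled causal effect (mmHg per liking unit)
```

The IVW estimate assumes all variants are valid instruments; sensitivity analyses (e.g., MR-Egger, weighted median) are typically run alongside it.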
Cognitive function is pivotal for athletic success, encompassing decision-making processes, concentration, and overall performance. Sports beverages include carbohydrates for energy and sodium to enhance rehydration(1). Research has suggested they offer numerous advantages for mental function during exercise(2). The intensity of exercise also has a significant impact on an athlete's cognitive ability. High-intensity exercise gradually enhances arousal to an optimal level, shifting from a state of rest to high alert, consequently leading to improved cognitive performance, as confirmed by electroencephalogram measurements(3). The aim of this study was to examine the acute effect of a beverage containing carbohydrate and sodium on cognitive performance, including interference control, response inhibition, working memory and cognitive flexibility, following moderate-intensity continuous exercise (MICE) or high-intensity interval exercise (HIIE). Healthy, experienced recreational runners were recruited. Baseline data were collected before the first trial. Participants (n = 11) underwent four trials in a randomised crossover design: 8 km MICE running (64–76% of maximum heart rate) and 8 km HIIE running (77–93% of maximum heart rate), each completed while ingesting either a sports beverage (SB; carbohydrate: 6.2%, sodium: 21 mEq/L) or plain water (W); average intake: 308 ± 188 ml. Computer-based cognitive performance tests (Simon task, Go and No-Go task, N-Back task and Stroop task) were conducted after the completion of each exercise session, with response time and accuracy measured. Paired t-tests were used to determine where differences existed between treatments and versus baseline. A significant impairment of the Simon effect was seen in MICE+SB and HIIE+SB compared to baseline (p = 0.022, p = 0.005). Response time for congruent stimuli was significantly improved in HIIE+SB compared to baseline and to MICE+SB (p = 0.008, p = 0.021).
Response time in incongruent stimuli was significantly improved in HIIE compared with MICE in both SB and W ingestion (p = 0.001, p = 0.042) and significantly impaired in MICE+SB compared with baseline (p < 0.001). In the Go and No-Go task, no significant differences were observed in response inhibition. In the N-Back task, working memory response time was significantly improved in MICE+SB compared with baseline (p = 0.019). Response time in the Stroop task in incongruent stimuli was significantly improved in MICE+SB and HIIE+W compared to baseline (p = 0.039, p = 0.047). In conclusion, beverages containing carbohydrate and sodium were found to generally improve cognitive performance. In addition, HIIE can be considered a practical approach to improve acute cognitive performance. However, further studies are required to more accurately investigate the optimal combination of exercise intensity and beverage carbohydrate concentration required to maximise cognitive performance.
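The within-subject comparisons above rest on paired t-tests. A minimal sketch follows; the response times are simulated for the same 11 runners under two conditions and are not the study's data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Hypothetical response times (ms) for the same 11 runners at baseline and
# after HIIE with the sports beverage; values are simulated, not study data.
baseline = rng.normal(520, 40, size=11)
hiie_sb = baseline - rng.normal(25, 10, size=11)  # simulate faster responses

# Paired t-test, mirroring the within-subject comparisons in the abstract.
t_stat, p_value = stats.ttest_rel(baseline, hiie_sb)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```

Pairing each runner with themselves removes between-subject variability, which is why the crossover design can detect effects with only 11 participants.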
The food environment plays an important role in nutrition-related health outcomes. The influence of market settings on overall diet quality may be substantial where individuals regularly attend a given market. Salamanca Market (Hobart, Tasmania) and Carriageworks Market (Sydney, New South Wales; NSW) are two popular Saturday markets attended regularly by locals. Salamanca Market provides a diverse range of edible (~30% of stalls) and non-edible (~70% of stalls) goods with a non-exclusive focus on 'Tasmania's own' products. Carriageworks 'Farmers Market' features primarily edible products (~90% of stalls), requiring products to be grown/made/produced by the stallholder in NSW or the Australian Capital Territory(1). While benefiting from tourism, these two markets are strongly attended by locals(2), making an important contribution to the health of individuals as a food environment experienced frequently and repeatedly. This study examined changes in food/beverage stall offerings at these markets over a 10- to 13-year period, as a reflection of potential changes in weekly markets, and hence food environments, across Australia. Stallholder information for Salamanca (summer stallholders 2011–12 to 2023–24) and Carriageworks (2015–2024 inclusive) markets was obtained from the respective official websites and analysed for stall categorisation. The current range of available products and categorisations was confirmed through in-person audits in January 2024. Descriptive statistics were used to summarise the findings. Over the study period, Salamanca Market food and beverage offerings grew by 21% to comprise 31.7% (89/291) of all stalls in 2024. This was driven by an increase in stalls offering primarily 'discretionary' foods/beverages: Take Away Food stalls increased by 58% to comprise 38%, and Confectionery nearly doubled to 11%, of food/beverage stalls.
Stalls selling alcohol as their primary product increased from 13% to 64% of all beverage stalls over this time; vendors selling primarily whisky/spirits grew from 1 stall in 2012–13 to 9 stalls in 2023–24. This occurred simultaneously with a reduction in the availability of 'core' foods: Fruit and Vegetable stalls dropped from 24% to 10%, and Dairy halved to 3%, of food/beverage stalls. By contrast, the self-titled 'farmers market' Carriageworks demonstrated relative stability in stall composition over 10 years. Notable changes included a 50% reduction in the number of Meat/Fish/Poultry/Alternatives stalls and an almost halving of Condiments stalls; however, other stall types (Fruit/Vegetables; Dairy) remained relatively unchanged in number and proportion, and almost two-thirds of stalls primarily sold 'core' foods. In summary, stall composition at a market prizing a diversity of offerings demonstrated a transition to a poorer food environment, whereas stall composition at a 'farmers market' demonstrated less change and a 'healthier' composition of food/beverage offerings over 10–13 years. The food environment offered at 'farmers markets' is likely to be superior for healthful food purchasing and positive impacts on nutrition-related outcomes, compared to mixed-business markets.
Evidence supports plant-based dietary patterns for preventing cardiovascular disease (CVD)(1). Fat mass is a strong predictor of CVD(2); however, it is unclear whether fat mass mediates the relationship between plant-based dietary patterns and CVD. Hence, the aim of this study was to determine whether longitudinal associations between plant-based dietary patterns and incidence of CVD events, CVD mortality and all-cause mortality are mediated by fat mass in mid-life. Dietary data (Oxford WebQ) from 14,247 adults (median age 56 years [IQR 49–61]) in the UK Biobank cohort study were used to derive diet quality index scores for an overall plant-based diet (PDI), a healthy plant-based diet (hPDI), and a less healthy plant-based diet (uPDI). Health registries and national records provided CVD event and mortality data. Percentage fat mass was measured by dual X-ray absorptiometry. Cox proportional hazards models estimated hazard ratios (95% CI) for associations between each diet quality index and CVD events, CVD mortality or all-cause mortality. Regression-based mediation analysis was used to identify the direct effect (of the plant-based diet quality indices on CVD mortality, CVD events, or all-cause mortality) and the indirect effect, mediated by fat mass. New CVD events (n = 364), CVD mortality (n = 52) and all-cause mortality (n = 220) were identified over a mean follow-up of 11.6 (SD ± 0.4) years for CVD events and 11.5 (SD ± 0.7) years for mortality. The mean score for the PDI was 50.5 (SD ± 5.9), hPDI 52.8 (SD ± 7.2), and uPDI 54.0 (SD ± 6.8). The PDI and hPDI were inversely associated with fat mass, and the uPDI was positively associated with fat mass (p < 0.001). There was no association between the diet quality indices and health outcomes, with (direct effect) or without (total effect) the fat mass mediator, for males or females (p ≥ 0.1).
Fat mass was associated with risk of mortality in some models after controlling for the indices, such as a lower risk of all-cause mortality after controlling for the hPDI (p < 0.1). There was a significant negative indirect effect of hPDI on CVD mortality via fat mass for females only (observed coefficient 0.81; 95% CI 0.60–0.99). Overall, there was limited evidence of a mediating effect of fat mass in the association between plant-based dietary patterns and incidence of CVD events, CVD mortality and all-cause mortality. Studies with larger samples and longer follow-up are needed to determine whether the mediating effect of fat mass on hPDI and CVD mortality in females is reproducible.
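The regression-based mediation analysis above decomposes the total effect into direct and indirect parts. A simple sketch of the product-of-coefficients idea follows, with simulated data; the coefficients are illustrative, not the study's estimates, and survival outcomes would in practice use the study's mediation framework rather than ordinary regression.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 500

# Simulated data illustrating the product-of-coefficients approach to
# mediation: diet score -> fat mass (path a), fat mass -> outcome (path b).
diet = rng.normal(size=n)
fat = -0.3 * diet + rng.normal(size=n)
outcome = 0.5 * fat + 0.1 * diet + rng.normal(size=n)

a = np.polyfit(diet, fat, 1)[0]                    # path a: diet -> mediator
X = np.column_stack([np.ones(n), fat, diet])
b = np.linalg.lstsq(X, outcome, rcond=None)[0][1]  # path b, adjusted for diet

indirect = a * b  # the portion of the effect transmitted through fat mass
print(round(indirect, 3))
```

Here the indirect effect is roughly a × b ≈ −0.3 × 0.5 = −0.15; the direct effect is the diet coefficient in the adjusted model.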
Nutrition misinformation is pervasive on frequently accessed online sources such as social media and websites(1). Young adults in particular are at higher risk of viewing or engaging with this content due to their higher Internet and social media usage(2). As such, this study aimed to understand the preferences, perceptions and use of online nutrition content in this age group. Young Australian adults (aged 18–25 years; n = 20) were individually interviewed via the video calling platform Zoom. Interviews ranged between 19 and 42 minutes. The interviewer followed a semi-structured format, and questions were guided by a piloted template. Reflexive thematic analysis was conducted using NVivo. Quotes addressing the research questions were coded; codes were grouped into themes and sub-themes and summarised in a narrative format. Results showed that all but one participant used social media (n = 19) and Internet websites (n = 16) to view nutrition content. Content viewed or accessed from social media varied, whereas website content catered to the consumers' goals and interests. While content from social media was perceived as easy to use and accessible, perceived reliability varied. Short-form content, prevalent on online platforms, was considered less reliable despite its engaging nature, suggesting a trade-off between engagement with and trust in nutrition content. Additionally, content containing sponsorships or product endorsements was less trusted. On the other hand, participants were more likely to trust content created by health professionals. The oversaturation of content also demotivated participants from evaluating its reliability. When asked about preferences, participants valued personalised content, mixed formats (i.e., short- and long-form content), and evidence-based information such as statistics and references.
They also preferred casual and entertaining content that incorporated modern, high-quality audiovisual elements (e.g., voiceovers). In conclusion, young Australian adults in the study recognise that unreliable nutrition content is not exclusive to certain platforms. The findings suggest that the accessibility and engagement of content, and the ambiguity of professional 'credentials', may lead them to trust information that is potentially of low quality and accuracy, or alternatively, to disregard high-quality information. Findings also show that a balance is needed between engaging formats and presenting evidence-based information when designing nutrition content. Future research should explore how the factors influencing perceptions and preferences of online nutrition content in young Australian adults, as identified in this study, impact the usage of online nutrition content and dietary behaviours. Further consultation with this cohort can inform tailored interventions that aim to enhance young adults' food and nutrition literacy and diet quality.
One in five Australian children are food insecure(1), and the majority consume inadequate vegetables and too many energy-dense, nutrient-poor foods(2). A healthy school lunch program provided to all children would ensure fair and equitable access to essential nourishment during the school day, which can improve academic achievement(3), attention, behaviour and concentration(4), and mental health and wellbeing(5). However, prior to implementing a school meal program, the opinions of key stakeholders, including teachers, should be explored. The aim of this study was to explore Victorian primary school teachers' perceptions and opinions of current lunch practices and school-provided lunch programs. An online survey of primary school teachers in Victoria was administered via Qualtrics. Frequencies and percentages of responses were calculated, and chi-square tests were used to explore associations with demographic variables (gender; school type (government, independent, Catholic, other)). A total of n = 322 Victorian primary school teachers completed the survey (95% female, 81% government schools). All year levels (Prep–grade 6) were represented. Thirty percent agreed and 45% were unsure whether there should be a school-provided lunch program. Most teachers (91%) believed that school-provided lunches would allow children to eat healthy foods, and 73% believed it would be convenient for parents. Perceived barriers included: cost for parents (81%), teachers not wanting to serve food (75%) and the time it would take to serve the food (70%). There was no difference between teachers in government and independent schools in preferences for a school lunch program or perceived potential benefits. However, more teachers from independent schools believed it would take too much time to serve food to all children (81%) compared to teachers from government schools (68%; Pearson's chi-square test, p = 0.045).
Additionally, more teachers from government schools than independent schools thought that delivering a school-provided lunch program may reduce teaching time (52% vs 37%, respectively; Pearson's chi-square test, p = 0.041). While most teachers agreed school-provided lunches would give children an opportunity to eat healthy food for lunch and would be convenient for parents, less than a third agreed there should be a lunch program and many were unsure. Concerns included the time it would take to serve food and the belief that teachers would need to serve the food themselves. This may be due to a lack of understanding of school-provided lunch programs. Further research can investigate effective ways to mitigate some of the identified barriers when designing a school meal program.
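The group comparisons above use Pearson's chi-square test on a 2×2 table. A minimal sketch follows; the abstract reports only percentages, so the counts below are hypothetical numbers chosen to match the reported 81% vs 68% split, not the survey's actual cell counts.

```python
from scipy.stats import chi2_contingency

# Hypothetical 2x2 counts (agree / disagree by school type). The abstract
# reports only percentages (81% vs 68%), so these counts are illustrative.
table = [[42, 10],     # independent schools: agree, disagree (42/52 = 81%)
         [177, 84]]    # government schools: agree, disagree (177/261 = 68%)

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.3f}")
```

Note that the p-value depends on the absolute counts, not just the percentages, which is why these illustrative counts need not reproduce the reported p = 0.045.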
Unhealthy diet-related behaviour is linked to an increased risk of colorectal cancer (CRC), and people at increased risk of CRC are therefore advised to follow healthy dietary recommendations. Assessing disparities in diet quality based on sociodemographic factors could help to tailor dietary interventions(1). We aimed to determine the relationship between diet quality and sociodemographic factors in people at increased risk of CRC. This was a cross-sectional study of adults at increased risk of CRC due to a prior history of colorectal neoplasia and/or a known significant family history of CRC. From October 2023 to July 2024, participants completed a survey that included the Australian Eating Survey (AES)(2) and collected demographic characteristics including age, gender, education, and socioeconomic indices (SEI). The AES was used to calculate diet quality via the Australian Recommended Food Score (ARFS)(2). The ARFS was calculated by summing eight sub-scales, which include vegetables, protein foods, breads/cereals, dairy foods, water, and spreads/sauces. The total ARFS ranges from 0–73, with a higher score indicating higher diet quality. Associations between diet quality and sociodemographic factors were determined using a log Poisson regression model with robust variance estimation. In total, 1940 individuals (52% female) completed the survey. The median age was 67.44 years (IQR: 59.56–72.66), with 11.49% (n = 223) aged under 50 years, 86.0% (n = 1669) aged 50–79 years and 2.5% (n = 48) aged over 79 years. The mean (± SD) ARFS was 28.76 ± 10.48 points. The ARFS did not significantly differ by gender (males: 29.0 ± 10.57; females: 28.6 ± 10.36), family history of CRC (family history: 28.7 ± 10.55; no family history: 28.8 ± 10.44), or SEI (higher tertile: 28.76 ± 10.41; lower tertile: 28.92 ± 10.65) (p > 0.05).
Diet quality was associated with age: ARFS was lower in younger (18–49 y) participants (28.72 ± 10.18) than in older (80–89 y) participants (31.19 ± 8.5) (p < 0.05). Regarding dietary components, dairy intake was lower in females than males (relative risk (RR) = 0.94, 95% confidence interval (CI) 0.90–0.99), individuals in the middle SEI tertile had lower fruit intake than those in the highest tertile (RR = 0.94, 95% CI 0.84–0.99), and those who left school before year 12 had lower vegetable intake than those with tertiary education (RR = 1.07, 95% CI 1.01–1.13). This study has shown that individuals at elevated risk of CRC have poorer diet quality than the general population, with greater disparities seen in younger individuals. Further differences were observed in dairy, vegetable and fruit intakes based on sex, education, and socioeconomic status. There is a need for further promotion of dietary interventions in people at elevated risk of CRC.
Iron deficiency is the most common nutritional deficiency globally. Premenopausal women are at particular risk due to increased iron requirements associated with menstrual blood loss and pregnancy. To prevent iron deficiency, recommended intakes have been developed based on physiological requirements for absorbed iron and iron bioavailability. However, iron bioavailability is difficult to estimate as it depends on the composition of the diet and an individual’s absorptive efficiency. Several algorithms have been proposed to estimate iron bioavailability from diets based on the form of the iron and the presence of absorption modifiers. These algorithms can be complex and often underestimate bioavailability. Recently, a new approach was developed by Dainty et al.(1,2), which is based on calculated iron requirements, total dietary iron intakes, and the distribution of serum ferritin concentration values in the population. This model has been used by the European Food Safety Authority to set recommended iron intakes for adults(3). In contrast, the recommended iron intakes for Australian adults are based on iron bioavailability estimates from the US Institute of Medicine, which were primarily derived from 15 free-living US adults(4). Therefore, the aim of this study was to predict dietary iron absorption in a representative sample of premenopausal Australian women using the model developed by Dainty et al.(1,2). Dietary iron intake and serum ferritin data from the 2011–13 Australian National Nutrition and Physical Activity Survey and National Health Measures Survey were analysed for 503 premenopausal women aged 18–49 years. Women were excluded if they were pregnant or lactating, had elevated C-reactive protein, consumed iron-containing supplements, or misreported energy intake. Dietary iron intake was assessed via two non-consecutive 24-hour recalls. Usual daily iron intake was determined using the Multiple Source Method.
Dietary iron absorption was estimated using the predictive model developed by Dainty et al.(1,2) and the Institute of Medicine’s distribution of individual dietary iron requirements(4). Mean (SD) usual dietary iron intake was 10.4 (2.6) mg/d. The prevalence of serum ferritin < 15 μg/L was 14.1% (95% CI: 10.2%, 19.3%), and < 30 μg/L was 37.0% (95% CI: 31.8%, 42.5%). Predicted dietary iron absorption at serum ferritin concentrations of < 15 μg/L was 29.5%, and at serum ferritin concentrations of < 30 μg/L it was 19%. Our findings do not support the bioavailability assumption of 18% used to develop the Australian recommended iron intakes for premenopausal women based on the need to maintain serum ferritin concentrations of 15 μg/L. Our results may be useful in revising the recommended iron intakes for Australian premenopausal women.
Food is a key lever for human and planetary health(1). Shifting to more plant-based foods supports environmentally sustainable, healthy and affordable diets(1). Taste preferences are formed in early childhood(2), presenting an opportunity to influence plant-based food intake throughout the lifespan. Early Childhood Education and Care (ECEC) settings are important food environments due to high attendance rates for long hours(3), during which children receive half of their daily nutritional needs(4). This study aimed to understand the provision of plant-based versus animal-based protein foods in ECEC, their contribution to key nutrients, and their costs. Two weeks’ menus and recipes were collected from Victorian ECEC services between 2018 and 2019 and entered into FoodWorks 10 for nutritional analysis. Desktop analysis categorised meals (lunches and snacks) by protein type as animal-based (red meat, white meat, fish, eggs, dairy, processed meat), plant-based (legumes, protein-enriched plant milk, seeds), or combined (both). Recipe items were priced at a metropolitan supermarket in March 2024 to determine cost per child per day and cost per child per lunch meal. A restricted maximum-likelihood mixed-effects model was used to estimate mean differences in lunch meal costs between the meal protein types, adjusted for serving size. Iron bioavailability was assessed using previously published algorithms. Total daily energy, protein, calcium and iron were compared to 50% of the Australian Recommended Dietary Intake for 2–3 year olds(5). Eighteen centres provided menus (n = 180 days, 540 meals). Preliminary findings indicated that 73% of meals contained animal-based protein, 7% a combination of animal- and plant-based protein, and 4% plant-based protein only. Animal-based protein meals most often contained dairy foods (64%, n = 253), followed by red meat (13%, n = 53). Plant-based protein meals mostly contained legumes (85%, n = 17). Mean (± SD) iron provision was below recommendations (2.86 mg ± 1.47 mg).
Total protein (26 g ± 12 g) and calcium provision (271 mg ± 137.21 mg) were above recommendations. Mean food cost per child per day was AUD 2.46 (± AUD 1.09) and mean lunch meal cost per child was AUD 1.36 (± AUD 0.84). Animal-based lunches were AUD 0.45 more expensive than plant-based lunches (p ≤ 0.01, 95% CI: AUD 0.15–AUD 0.73). These findings highlight very low provision of plant-based proteins on ECEC menus. Low red meat and iron provision suggest that plant-based protein should not displace current red meat on menus. High dairy provision and more than sufficient calcium may indicate that ‘meat-free meals’ are predominantly dairy-based, providing an opportunity to introduce plant-based proteins in these meals. Plant-based protein lunches were a third cheaper than their animal-based counterparts, suggesting an affordable option. Young children attending ECEC settings are currently missing the opportunity for exposure to plant-based proteins as healthy, environmentally sustainable and affordable additions to menus.
Colorectal cancer is a prevalent global health issue. In Australia, it ranks as the third most common newly diagnosed cancer, with around 15,000 new cases annually(1). Despite treatment advances, high incidence and mortality rates highlight the need for effective prevention and new therapies. Polyphenols, abundant in plant-based foods, have shown promise in inhibiting cancer cell growth and inducing apoptosis, offering a dietary approach to reduce cancer risk and improve outcomes. Whole-grain cereals like sorghum are recognised sources of phenolic compounds and can scavenge free radicals(2). This study aimed to evaluate the role of sorghum-derived polyphenols in modulating the major cancer development pathways. The impact of processing techniques (cooking and fermentation) on sorghum polyphenols and their activity against cancer cells was also evaluated. Polyphenols were extracted from raw, cooked, fermented, and fermented-cooked sorghum flour samples(3). The phenolic content was measured using benchtop chemical assays, including the DPPH radical scavenging assay and the ferric-reducing ability of plasma (FRAP) assay. UHPLC analysis coupled with an online ABTS assay characterised the polyphenols present in these extracts and provided their antioxidant activities. Using these extracts, a resazurin cytotoxicity assay was performed on HT-29 colorectal cancer cells to determine the optimal concentrations for the downstream experiments. HT-29 cells were incubated with the black sorghum phenolic extracts (500 μg/mL and 2000 μg/mL) for 12 and 24 hours. Following this, the gene expression of several cancer regulatory genes (APC, KRAS, TTN, GLUT-1, HIF-1α and HIF-1β) was evaluated by qPCR. Treatment of HT-29 cells with raw sorghum phenolic extracts significantly (p < 0.05) upregulated the APC and TTN genes at the 12- and 24-hour time points, and the KRAS gene at the 24-hour time point, compared to the control.
This indicates that sorghum-derived polyphenols influence genome mutation and instability pathways in cancer development. Treatment at 500 μg/mL also significantly (p < 0.05) upregulated the expression of GLUT-1, suggesting an impact on the dysregulated cellular metabolism pathway of cancer development. Processed sorghum phenolic extracts also significantly regulated KRAS gene expression. Overall, the results from this study showed that sorghum polyphenols modulate the expression of key cancer development pathway-associated genes in HT-29 cells. The findings underscore the potential of dietary polyphenols in cancer prevention and highlight the need for further research to optimise their use and understand their mechanisms of action in vivo.
Pre-school children’s dietary intake in Australia is substandard, with only 18% of children aged 2–3 years meeting the recommended intake for vegetables, and more than one-third of their daily kilojoules coming from energy-dense, nutrient-poor foods. Several child eating behaviour traits (e.g., food fussiness, enjoyment of food, satiety responsiveness and food responsiveness) are associated with the dietary intake of pre-school children(1). However, the associations between child eating behaviour traits and overall dietary quality in pre-school children have not been examined; this is important because children do not consume food groups or nutrients in isolation. It is also important to understand how biological factors such as age may influence child eating behaviours, given that traits such as food fussiness can develop and change with age(2). Therefore, the aims of this study were to examine the associations between pre-school children’s eating behaviour traits and their dietary quality, and to examine the moderating effect of children’s age on these associations. Cross-sectional survey data were collected online from mothers of pre-school aged children (2–5 years) from across Australia. The Children’s Eating Behaviour Questionnaire (CEBQ) measured four child eating behaviour traits: food fussiness, enjoyment of food, food responsiveness and satiety responsiveness. A validated thirteen-item food frequency questionnaire measured child dietary quality; 5 items measured healthy foods/behaviours and 8 measured discretionary foods/unhealthy behaviours, with a maximum score of 65(3). Linear regression assessed associations between child eating behaviour traits and dietary quality, including interactions between child eating behaviour traits and child age. Of the 1367 respondents, half of the children were male (50.2%) and the mean age of the children was 3.3 years (SD = 1.0). The mean child dietary quality score was 51.9 (out of 65, range 21 to 64).
Enjoyment of food was positively associated with dietary quality (B coefficient: 2.51, p < 0.001), whilst food fussiness and satiety responsiveness were inversely associated with dietary quality (B coefficients: −2.59 and −2.25, respectively, p < 0.001), and food responsiveness was not related to dietary quality. Child age moderated the association between food fussiness and dietary quality (B coefficient: −0.38, p = 0.025), but not those of the other eating behaviour traits. The difference in dietary quality between lower and higher food fussiness was most pronounced among 5-year-old children. In conclusion, the findings from this study suggest that future interventions targeting the poor dietary quality of pre-school children should consider targeting children with lower food enjoyment or higher food fussiness or satiety responsiveness as possible ways to improve child dietary quality. Future interventions should also have a particular focus on strategies to reduce food fussiness in older preschoolers, as well as fussiness prevention strategies for younger preschoolers.
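The age-moderation analysis above corresponds to a linear model containing a fussiness × age interaction term. A hypothetical sketch on simulated data (the variable names, effect sizes and noise levels are illustrative only, not the study's estimates), using the statsmodels formula API:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 800
df = pd.DataFrame({
    "fussiness": rng.normal(3.0, 1.0, n),  # hypothetical CEBQ food-fussiness score
    "age": rng.uniform(2.0, 5.0, n),       # child age in years
})
df["age_c"] = df["age"] - df["age"].mean()  # centre age for interpretable main effects

# Simulate a diet-quality score whose (negative) fussiness slope
# steepens with age, mirroring the kind of moderation reported.
df["diet_quality"] = (
    55.0 - 2.0 * df["fussiness"]
    - 0.6 * df["fussiness"] * df["age_c"]
    + rng.normal(0.0, 3.0, n)
)

model = smf.ols("diet_quality ~ fussiness * age_c", data=df).fit()
# A negative fussiness:age_c coefficient means the inverse association
# between fussiness and diet quality is stronger in older children.
```

Centring age means the `fussiness` coefficient is the slope at the mean age, which makes the interaction easier to interpret.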
Diabetes-related foot ulcers (DFU) are common, with 56,000 Australians presenting with DFUs every year(1). Optimal nutrition is critical in wound healing; however, in DFU there are limited data available on nutritional status and healing outcomes. Therefore, the aim of this work was to summarise the studies leading to the development of a co-designed intervention for individuals with DFU and to determine the dietary intake of individuals with DFU. Three separate studies were conducted: (1) qualitative interviews with individuals with DFU, (2) qualitative interviews with health care professionals working with people with DFU, and (3) comparison of current dietary intakes of individuals with DFU against international guidelines. To explore the perspectives of individuals with DFU, a qualitative study using a reflexive thematic approach with conversational-style interviews was undertaken. A targeted heterogeneous sample of individuals with an active or recent history of DFU was recruited from a high-risk foot service in New South Wales, Australia. To gain an understanding of the perspectives of Australian health professionals involved in the care of individuals with DFU regarding nutrition assessment and management, semi-structured interviews were conducted nation-wide. To compare dietary intake, descriptive analysis was conducted. Dietary intake was collected using the Australian Eating Survey food frequency questionnaire and used to generate nutrient intake data. Nineteen interviews with individuals with DFU identified negative experiences with dietitians, who were seen as judgemental. Dietary misconceptions were common, with many participants holding an unhealthy perception of food, and no participants had previously been given personalised dietary advice for wound healing. However, it was evident that participants were willing to do anything to improve their wound healing, including a dietary intervention. Participants expressed a strong preference for personalised, face-to-face dietary advice.
A total of 19 health professionals participated in interviews. Major barriers to the implementation of nutrition assessment and management were identified: inadequate time, lack of knowledge and lack of clinical guidance. Facilitators included professional development, a standardised clinical pathway and screening tool, and a resource addressing wound healing and diabetes management. One hundred and fifteen participants with DFU were included in the dietary analysis. Most individuals with DFU did not meet current consensus guidelines for optimal dietary intake for wound healing. Inadequate intake of protein, vitamin A, vitamin C, vitamin E and zinc was identified in 46%, 45%, 26%, 86%, and 37% of participants, respectively. This body of work highlighted that individuals with DFU are interested in receiving personalised medical nutrition therapy but are not currently meeting the higher nutrient requirements for wound healing. Furthermore, health care professionals are not confident in supporting these individuals. These findings suggest that interventions for individuals with DFU need to be co-designed from both the patient and clinician perspectives to increase the acceptability of the nutrition intervention.
Higher-quality Australian diets are reported to taste more bitter(1), have healthier nutritional profiles and align more closely with the recommendations of the Australian Dietary Guidelines(2). Greater consumption of bitter foods may benefit health, but most research has focused on green leafy vegetables(3). However, there are other foods and beverages (F&Bs) that taste bitter and could increase the bitterness of diets if consumed in greater amounts(2). Yet, strategies to increase bitter F&B consumption and enhance the bitterness of diets remain largely underexplored. An online cross-sectional survey of Australian adults was conducted (in July and August 2023) to explore barriers, facilitators, and strategies associated with willingness to try or increase consumption of bitter F&Bs. Eight non-discretionary bitter F&Bs available in the Australian market (coffee, tea, soda water, Brussels sprouts, rocket, grapefruit, walnuts, and eggplant) were selected. The design of the survey questions was guided by conceptual models of food choice. Respondents were asked about their familiarity with and consumption habits of bitter F&Bs and their willingness to incorporate more bitter F&Bs into their diets. Respondents were grouped into those who had never tried bitter F&Bs, non-consumers, or consumers; consumers were further categorised into low-, moderate- or high-consumers. This analysis focused on respondents with low bitter F&B consumption, non-consumers, and individuals who had never tried bitter F&Bs, as the potential to increase consumption was greatest in these groups. The study enrolled 879 respondents across Australia. Respondents had previously tried an average of six of the eight bitter F&Bs (median = 6). Most respondents (85.4%) were willing to increase their consumption of bitter F&Bs. While the bitter taste was consistently reported as the main barrier to greater consumption, the reported facilitators and strategies varied between consumer groups and the different F&Bs.
More than half of the respondents (61.1%) had never tried bitter vegetables (i.e., Brussels sprouts and rocket). For this group, ‘nutrition education’ (selected by 34%) and ‘appealing presentation’ (selected by 25.9%) were the most commonly selected facilitator and preferred strategy, respectively. Non-consumers of other bitter foods in the survey reported ‘price’ (selected by 43.8%) and ‘convenience’ (selected by 16.5%) as the most important facilitator and strategy, respectively. While ‘food availability’ (selected by 39.3%) was the most common facilitator among low-consumers of bitter beverages, ‘easier preparation’ and ‘altering the taste’ (selected by 19.9% and 17.3%, respectively) were the most preferred strategies. This study provides valuable insights into the acceptability of bitter F&Bs among Australian adults. These findings could help tailor dietary interventions to groups of individuals based on their consumption habits of particular bitter F&Bs to support increased consumption. Further research is needed to understand whether increasing bitter F&B consumption raises the bitterness of diets overall and whether this is associated with improved health outcomes.
Low vitamin D status (circulating 25-hydroxyvitamin D [25(OH)D] concentration < 50 nmol/L) affects nearly one in four Australian adults(1). The primary source of vitamin D is sun exposure; however, a safe level of sun exposure for optimal vitamin D production has not been established. As supplement use is uneven, increasing vitamin D in food is the logical option for improving vitamin D status at a population level. The dietary supply of vitamin D is low since few foods are naturally rich in vitamin D. While there is no Australia-specific estimated average requirement (EAR) for vitamin D, the Institute of Medicine recommends an EAR of 10 μg/day for all ages. Vitamin D intake is low in Australia, with mean usual intake ranging from 1.8–3.2 μg/day across sex/age groups(2), suggesting a need for data-driven nutrition policy to improve the dietary supply of vitamin D. Food fortification has proven effective in other countries. We aimed to model four potential vitamin D fortification scenarios to determine an optimal strategy for Australia. We used food consumption data for people aged ≥ 2 years (n = 12,153) from the 2011–2012 National Nutrition and Physical Activity Survey, and analytical food composition data for vitamin D3, 25(OH)D3, vitamin D2 and 25(OH)D2(3). Certain foods are permitted for mandatory or voluntary fortification in Australia. As industry uptake of the voluntary option is low, Scenario 1 simulated addition of the maximum permitted amount of vitamin D to all foods permitted under the Australia New Zealand Food Standards Code (dairy products/plant-based alternatives, edible oil spreads, formulated beverages and permitted ready-to-eat breakfast cereals (RTEBC)). 
Scenarios 2–4 modelled higher concentrations than those permitted for fluid milk/alternatives (1 μg/100 mL) and edible oil spreads (20 μg/100 g) within an expanding list of food vehicles: Scenario 2—dairy products/alternatives, edible oil spreads, formulated beverages; Scenario 3—Scenario 2 plus RTEBC; Scenario 4—Scenario 3 plus bread (which is not permitted for vitamin D fortification in Australia). Usual intake was modelled for the four scenarios across sex and age groups using the National Cancer Institute Method(4). Assuming equal bioactivity of the D vitamers, the range of mean usual vitamin D intake across age groups for males for Scenarios 1 to 4, respectively, was 7.2–8.8, 6.9–8.3, 8.0–9.7 and 9.3–11.3 μg/day; the respective values for females were 5.8–7.5, 5.8–7.2, 6.4–8.3 and 7.5–9.5 μg/day. No participant exceeded the upper level of intake (80 μg/day) under any scenario. Systematic fortification of all foods permitted for vitamin D fortification could substantially improve vitamin D intake across the population. However, the optimal strategy would require permissions for bread as a food vehicle, and addition of higher than permitted concentrations of vitamin D to fluid milks/alternatives and edible oil spreads.
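At the level of an individual food record, the scenario modelling above amounts to adding a fortificant concentration (μg per 100 g or 100 mL) to each eligible food before re-estimating usual intake. A simplified sketch of that per-record arithmetic using the Scenario 2 concentrations quoted above (the day's food records themselves are hypothetical):

```python
# Scenario 2 fortification levels from the text: μg vitamin D
# per 100 mL (fluid milk) or per 100 g (edible oil spreads).
LEVELS = {"fluid milk": 1.0, "edible oil spread": 20.0}

def daily_vitamin_d(records, levels):
    """Sum native + added vitamin D (μg) over one day's food records.

    records: list of (food_group, amount_g, native_vitd_ug) tuples.
    """
    total = 0.0
    for group, amount, native in records:
        added = levels.get(group, 0.0) * amount / 100.0
        total += native + added
    return total

# One hypothetical day: fortification lifts intake from 1.7 to 6.2 μg.
day = [
    ("fluid milk", 250, 0.1),        # +2.5 μg added
    ("edible oil spread", 10, 1.6),  # +2.0 μg added
    ("bread", 70, 0.0),              # not fortified under Scenario 2
]
print(daily_vitamin_d(day, LEVELS))  # 6.2
```

In the actual analysis this adjustment is applied across all survey records before usual intake is modelled with the National Cancer Institute Method; the sketch only shows the fortification step.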
Nuts are nutrient-rich, energy-dense foods that are associated with better diet quality in children(1), yet intake in Australian children remains low(2). Prospective studies have demonstrated positive associations between nut consumption and cognitive performance in children(3), while randomised controlled trials (RCTs) assessing nut consumption and cognitive performance in adults have reported inconsistent findings(4). This two-phase cross-over RCT examined the feasibility of Australian children eating an almond-enriched diet (30 g almonds, 5 days per week) compared with a nut-free diet for 8 weeks each. Associated changes in diet quality, lifestyle factors and cognitive performance were also measured. Forty children (48% female, 8–13 years) who were low habitual nut consumers (< 30 g/day) and free from nut allergies and cognitive, behavioural or medical conditions that could affect study outcomes were enrolled. Feasibility outcomes included retention, compliance with study foods and changes in ratings of liking and palatability of almonds. Other outcomes were assessed before and after each 8-week diet phase, separated by a 2-week washout. Parent/guardian–child dyads completed questionnaires about diet (diet quality score), physical activity, and sleep behaviour. Sleep quality and length were recorded for 7 nights prior to clinic visits. At each visit, sleepiness was captured (Karolinska Sleepiness Scale) before children completed a computerised test battery (COMPASS) to assess cognitive performance across attention/concentration, executive function, memory, processing speed and verbal fluency domains. Analyses were performed using SPSS 26.0 software, with statistical significance defined as p < 0.05. Data were analysed using mixed-effects models, with diet and time as fixed effects and a random effect of participant ID, controlling for diet order, age, sex and sleepiness. Retention was excellent, with all participants completing the study, and mean compliance with almonds was 98%.
Mean liking and palatability ratings declined after 8 weeks (−23 points, p = 0.006) but remained favourable. There were no significant changes in diet quality, physical activity or sleep (behaviour, length or quality) during the study. Changes in cognitive performance over time and between diets ranged from trivial to small (Cohen’s d = 0.01–0.28) for all tests, and did not reach significance except for simple reaction time (faster response over time, d = −0.1, F(1,115.7) = 4.455, p = 0.037) and Peg and Ball response time (faster after the nut-free diet, d = 0.28, F(1,115.4) = 4.176, p = 0.043). This study demonstrated that it was feasible to conduct an almond-enriched dietary intervention in Australian children, with excellent retention and compliance with study requirements. While few significant changes were observed in the scientific outcomes, the study was not powered for these outcomes. Rather, these data will be valuable for determining required sample sizes in future studies assessing nut interventions and cognitive performance in children.
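The mixed-effects analysis described above (diet and time as fixed effects, a random effect of participant ID) can be sketched in its simplest form as follows; the data are simulated and the effect sizes arbitrary, so this illustrates only the model structure, not the study's results or its full covariate set:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n_children = 40

# Four simulated observations per child: pre/post on each diet.
ids = np.repeat(np.arange(n_children), 4)
diet = np.tile(["almond", "almond", "nut_free", "nut_free"], n_children)
time = np.tile([0, 1, 0, 1], n_children)

child_effect = rng.normal(0, 5, n_children)  # random intercept per child
score = 100 + child_effect[ids] + 2.0 * time + rng.normal(0, 2, 4 * n_children)

df = pd.DataFrame({"id": ids, "diet": diet, "time": time, "score": score})

# Diet and time as fixed effects; a random intercept for each child
# accounts for repeated measures within the cross-over design.
model = smf.mixedlm("score ~ diet + time", df, groups=df["id"]).fit()
```

The random intercept is what lets each child serve as their own control across the two diet phases; the study's actual model additionally adjusts for diet order, age, sex and sleepiness.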