In adults, a diet low in sodium and high in potassium helps reduce blood pressure and risk of cardiovascular disease(1). Whether this same relationship exists among children is less clear. There is some evidence to indicate that reducing sodium intake is favourable for maintaining healthy blood pressure levels across childhood(2). However, it is not clear how sodium intake during childhood influences blood pressure and whether sex or body mass index moderates any effects. The effects of potassium intake and the sodium-to-potassium ratio on children’s blood pressure have been generally inconsistent(3). It is important to understand the relationship of sodium and potassium intake with childhood blood pressure, as elevated blood pressure across childhood increases the risk of future hypertension and target organ damage(4). Few studies in children have utilised 24-hour urinary electrolyte excretion, an objective measure of sodium intake. Therefore, this study examined the relationship between 24-hour urinary sodium, potassium and sodium-to-potassium molar ratio and blood pressure among Australian schoolchildren aged 4–12 years, and whether these associations were moderated by body weight or sex. Data from 793 children who participated in the Salt and Other Nutrient Intakes in Children study were included in this analysis. Children recruited from primary schools (n = 61) located across the state of Victoria provided one 24-hour urine collection, and anthropometric and blood pressure measurements. Blood pressure z-scores standardised for age, sex and height were calculated. Multiple linear regression analysis, with adjustment for covariates (age, sex, socioeconomic position and weight category), was conducted. Body weight (underweight/healthy weight, overweight and obese) and sex subgroup analyses, including interaction terms, were completed. Mean (SD) sodium excretion was 2386 (1046) mg/d and 72% of children exceeded the recommended upper level for sodium intake. Mean (SD) potassium excretion was 1796 (675) mg/d and the sodium-to-potassium molar ratio was 2.4 (1.1). Eighteen percent of children had elevated blood pressure. Overall, there were no associations between 24-hour sodium or potassium excretion and blood pressure in adjusted regression models. However, there was a significant positive association between sodium excretion and systolic blood pressure z-score in children with obesity (b-coefficient 0.70 [95% CI 0.05, 1.33], p = 0.04, n = 23) and among girls (b-coefficient 0.09 [95% CI 0.01, 0.17], p = 0.02, n = 365). We found a positive association between 24-hour urinary sodium excretion and systolic blood pressure in girls and children living with obesity, providing further support to the hypothesis that body weight is a moderator of this relationship through heightened salt sensitivity. Public health interventions aiming to reduce elevated blood pressure during childhood are likely to be most effective by reducing sodium intake in conjunction with promoting healthy weight.
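A minimal sketch of the kind of adjusted regression and moderation analysis described above (not the authors' code), assuming a data frame with hypothetical column names sbp_z, sodium_mg, age, sex, sep and weight_cat:

```python
# Illustrative only: adjusted linear regression of systolic BP z-score on
# 24-hour urinary sodium, plus a sodium x weight-category interaction to test
# moderation. Column names and file are hypothetical placeholders.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("sonic_children.csv")  # hypothetical file

# Main adjusted model (covariates as listed in the abstract)
main = smf.ols("sbp_z ~ sodium_mg + age + C(sex) + C(sep) + C(weight_cat)", data=df).fit()

# Moderation: sodium x weight-category interaction
interact = smf.ols("sbp_z ~ sodium_mg * C(weight_cat) + age + C(sex) + C(sep)", data=df).fit()

print(main.summary())
print(interact.summary())  # interaction terms indicate moderation by weight category
```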
The theory of Developmental Origins of Health and Disease (DOHaD) suggests that the foetal origins of adult diseases are determined by perinatal exposures. Therefore, pregnancy is an opportune time for dietary intake to influence future disease susceptibility in infants. ORIGINS is a longitudinal birth cohort study that aims to reduce the rising epidemic of non-communicable diseases through ‘a healthy start to life’(1). We aimed to describe the dietary intakes of pregnant women in this cohort in Western Australia and compare them to the Nutrient Reference Values (NRVs) and Australian Recommended Food Score (ARFS)(2). The dietary intakes of women were collected using the Australian Eating Survey (AES), a semi-quantitative Food Frequency Questionnaire (FFQ). A total of 374 women completed the AES FFQ(3) at both 18 and 36 weeks’ gestation between 2016 and 2023. A descriptive cross-sectional analysis using Stata was conducted to explore the macronutrient, micronutrient and food group intakes at the two time points. Participants had a mean age of 32 years, were of Caucasian background (82.6%), had a tertiary education (78.1%), and the majority were in the normal (37.7%) or overweight (29.4%) BMI category. Overall, the energy contribution from carbohydrate was low compared with the recommended range (44% vs 45–65%), whereas total fat (37% vs 30–35%) and saturated fat (14% vs < 10%) were high. Participants’ micronutrient intakes were below the NRVs for calcium (~17–21% below), iron (~52% below), iodine (~21–23% below) and folate (~40% below) at 18 or 36 weeks. Participants consumed approximately double the sodium NRV (~198–201% of the NRV), and had low diet quality scores for all food groups (score = 35/73) at both timepoints. These findings suggest that despite ongoing promotion of healthy eating during pregnancy, more dietary support and education may be required during pregnancy for both the mother and the long-term health of their offspring.
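As a brief illustration of the percent-of-NRV comparison reported above, a sketch with invented mean intakes; the NRV figures are indicative only and should be checked against the NHMRC Nutrient Reference Values for pregnancy:

```python
# Hypothetical group-mean intakes (not study data) compared with indicative
# pregnancy NRVs; verify the NRVs against the NHMRC tables before reuse.
nrv_pregnancy = {"calcium_mg": 1000, "iron_mg": 27, "iodine_ug": 220, "folate_ug": 600}
mean_intake = {"calcium_mg": 820, "iron_mg": 13, "iodine_ug": 172, "folate_ug": 360}

for nutrient, nrv in nrv_pregnancy.items():
    pct = 100 * mean_intake[nutrient] / nrv
    print(f"{nutrient}: {pct:.0f}% of NRV ({100 - pct:.0f}% below)")
```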
Dietary nitrate is a precursor to nitric oxide, for which plausible mechanisms exist for both beneficial and detrimental influences in multiple sclerosis (MS)(1,2). Whether dietary nitrate has any role in MS onset is unclear. We aimed to test associations between nitrate intake from food sources (plant, vegetable, animal, processed meat, and unprocessed meat) and likelihood of a first clinical diagnosis of central nervous system demyelination (FCD). We used data from the Ausimmune Study (264 cases, 474 controls). Case participants (aged 19–59 years) presenting to medical professionals in four latitudinally different regions of Australia were referred to the study with an FCD. The Australian Electoral Roll was used to recruit one to four controls per case, matched by age (± 2 years), sex and study region. Habitual dietary intake representing the 12-month period preceding the study interview was assessed to determine dietary nitrate intake. In addition to the matching variables, data on education, smoking history, history of infectious mononucleosis, weight and height were collected. A blood sample was taken for measurement of serum 25-hydroxyvitamin D concentration, which was de-seasonalised. To test associations, we used logistic regression with full propensity score matching. We used two levels of covariate matching: in model 1, cases and controls were matched on the original matching variables (age, sex, and study region); in model 2, cases and controls were additionally matched on well-established/potential risk factors for MS (education, smoking history, and history of infectious mononucleosis) and dietary factors (total energy intake and dietary misreporting). In females only (n = 573; 368 controls and 205 cases), higher nitrate intake (per 60 mg/day) from plant-based foods (fully adjusted odds ratio [aOR] = 0.50, 95% CI 0.31, 0.81, p < 0.01) or vegetables (aOR = 0.44, 95% CI 0.27, 0.73, p < 0.01) was statistically significantly associated with a lower likelihood of FCD. No association was found between nitrate intake (from any source) and likelihood of FCD in males. To our knowledge, this is the first study to investigate dietary nitrate intake in relation to FCD. Our finding that higher intake of nitrate from plant-based foods (mainly vegetables) was associated with a lower likelihood of FCD in females supports our previous findings showing that following a Mediterranean diet (rich in vegetables) is associated with a lower likelihood of FCD(3). The lack of association in males may be due to low statistical power and/or differing food preferences and pathological processes among males and females. Our results support further research to delineate the independent effect of nitrate from other dietary factors and to explore a possible beneficial role for plant-derived nitrate in people at high risk of MS.
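For orientation, a simplified covariate-adjusted logistic regression (a stand-in for, not a reproduction of, the full propensity score matching used in the study), with hypothetical file and column names:

```python
# Simplified sketch: logistic regression of FCD status on plant-derived nitrate
# intake scaled per 60 mg/day, adjusted for covariates named in the abstract.
# The study itself used full propensity score matching via dedicated software.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("ausimmune_females.csv")           # hypothetical file
df["nitrate_per60"] = df["plant_nitrate_mg"] / 60    # per 60 mg/day increment

model = smf.logit(
    "fcd ~ nitrate_per60 + age + C(region) + C(education) + C(smoking)"
    " + C(mononucleosis) + energy_kj",
    data=df,
).fit()

or_per60 = np.exp(model.params["nitrate_per60"])     # odds ratio per 60 mg/day
ci_low, ci_high = np.exp(model.conf_int().loc["nitrate_per60"])
print(f"OR per 60 mg/day: {or_per60:.2f} (95% CI {ci_low:.2f}, {ci_high:.2f})")
```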
Adolescents with obesity may engage in dieting to facilitate weight loss(1). However, dieting is also associated with eating disorder risk and body dissatisfaction. This study aimed to understand adolescent dieting behaviours and associations with eating disorder risk, weight bias internalisation and body appreciation. Adolescents (n = 141), median (IQR) age of 14.8 (12.9 to 17.9) years, mean (SD) BMI 35.39 (4.17) kg/m², with obesity and ≥ 1 related complication were enrolled into the Fast Track to Health trial(2), which aimed to compare two dietary interventions. At the first dietetic visit, adolescents were asked whether they had previously seen a dietitian (yes or no) and if they had previously tried any diets; the types of diets tried were categorised. Self-report questionnaires, including the Eating Disorder Examination Questionnaire, Binge Eating Scale, Weight Bias Internalization Scale and Body Appreciation Scale, were administered. One-way ANOVA was used to investigate the difference in questionnaire scores based on the number of diets trialled (no diets, one diet, two/three diets). Of 141 adolescents enrolled, 68 (48.2%) had previously seen a dietitian and 106 (75.2%) had tried at least one diet. Most adolescents had tried one type of diet (n = 74, 52.5%), with 29 (20.6%) having tried two or three different diets. There were no associations between sex or age and history of seeing a dietitian or attempting to diet. Adolescents with a higher BMI, expressed as a percentage of the 95th percentile, were more likely to have seen a dietitian, but there was no association between BMI and the number of diets used. Adolescents who had tried two/three diets had higher scores on the Eating Disorder Examination Questionnaire compared to those who reported not dieting (mean [SD] 2.81 [1.12] vs 1.98 [1.08], p = 0.016). There were no differences in scores on the Binge Eating Scale, Weight Bias Internalization Scale and Body Appreciation Scale based on the number of diets trialled (p > 0.05). Many adolescents presenting to obesity treatment will have tried one or more diets with or without the support of a dietitian. Repeated dieting attempts may be an early indicator of eating disorder risk in this population. However, further research is needed to understand the duration of dieting and specific dieting practices used. Clinicians providing nutrition education and prescribing diet interventions should be aware of this and the potential influence on adolescent perceptions of healthy and unhealthy dieting practices.
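A minimal sketch of the one-way ANOVA comparing questionnaire scores across dieting groups (hypothetical file and column names, not the trial's analysis code):

```python
# Illustrative only: compare Eating Disorder Examination Questionnaire global
# scores across groups defined by the number of diets previously trialled.
import pandas as pd
from scipy import stats

df = pd.read_csv("fast_track_baseline.csv")  # hypothetical: columns diet_group, edeq_global

groups = [g["edeq_global"].dropna() for _, g in df.groupby("diet_group")]
f_stat, p_value = stats.f_oneway(*groups)
print(f"F = {f_stat:.2f}, p = {p_value:.3f}")
```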
A shift towards including more plant-based wholefoods and a reduction in highly processed and animal-sourced foods is highlighted by the World Health Organization as a relatively simple, yet impactful way consumers can make their diets more sustainable and healthier for the planet(1). Changing the population’s diet has proven difficult, and the threat of obesity and chronic disease development does not appear to have been a strong motivating factor for middle-aged and older adults. Awareness of the facilitators and obstacles that affect healthful dietary choices is important to inform the design of nutrition promotion and behaviour change programs. The aim of this study was to synthesise available evidence on what middle-aged and older adults describe as barriers and enablers to eating foods consistent with sustainable dietary patterns. A convergent integrated mixed methods systematic review was selected as the best method to answer the research question, following the Joanna Briggs Institute method(2). The MEDLINE, CINAHL, Scopus and PsychInfo databases were searched for studies published from January 2009 to February 2024 that reported the beliefs, opinions and attitudes of people aged 45 years and older in high-income countries about food selections and barriers and enablers to eating foods generally consistent with sustainable practices. Twenty-four studies met the inclusion criteria; their data were extracted and quality critically appraised using the Mixed Methods Appraisal Tool(3). Findings were combined in a narrative review. The enablers included an understanding of environmental impacts of food; knowledge about nutritional benefits of plant foods and of sustainable eating patterns; and awareness of the health and environmental concerns about some animal food sources. The reported barriers included: the cost of eating a sustainable diet; a preference for the taste of animal foods; anxieties about the nutritional adequacy of plant-based diets; cultural factors; and a lack of knowledge of diet and sustainability. The findings may be useful in the design of education campaigns and health promotion.
Throughout a three-year undergraduate nutrition degree, students learn the theoretical knowledge and technical skill set required to become a nutritionist. However, nutrition curricula do not always include an opportunity to practise the application of knowledge and skills in a real-world setting or to develop an understanding of the transferable skills required in a workplace; thus, general employability skills and awareness of workplace expectations are not developed(1). A mixed methods research study was undertaken to understand student perspectives of employer expectations and general employability skills. A validated work-ready tool(1) was used to survey undergraduate nutrition students at Australian universities (n = 171); students who participated in an industry placement were interviewed pre/post placement (n = 22); and nutrition industry experts and employers (n = 9) were interviewed and an industry focus group (n = 6) was conducted to develop a deeper understanding of the impact of including curriculum that develops employability skills and provides exposure to workplaces and expectations. Survey data showed 74% of students had an understanding of how to effectively use their skills in the workplace, 75% indicated they knew what was expected of them if they used their skills in the workplace, and 64% understood the steps in the process of using their skills. Students who completed an employability subject at one university were invited to complete the same survey upon completion of the subject, and 100% of these students reported having this knowledge and understanding. Thematic analysis of student interviews revealed multiple benefits of placement, such as industry insights; role clarity; a deeper understanding of workplace expectations; the opportunity to practise the application of knowledge and skills, identify gaps in knowledge and gain experience; networking opportunities; feeling better prepared for interviews and job searching post-degree; and improved confidence across all themes identified. Thematic analysis of industry interviews and focus groups also revealed themes related to the benefits of a placement experience for students. Employers identified placement as an opportunity for students to develop an understanding of graduate and workplace expectations; better understand roles and industry; practise the application of knowledge and skills; and develop networks and confidence. Employers also identified a lack of confidence in some students when asking for help; students’ difficulty translating scientific knowledge for a range of lay audiences; and a limited understanding of how nutrition works with or within other departments, and suggested that employability skills should be developed within courses and/or on placement to better prepare students for the workplace. The results indicate that the inclusion of employability development and/or placements within the curriculum develops an understanding of the employability skills required by nutrition graduates, builds greater awareness of employment expectations, roles, industry and workplaces, and provides an opportunity to practise the application of knowledge and skills, resulting in improved graduate confidence.
Iron and zinc deficiencies are prevalent globally and have been found to co-exist within populations(1). Previous work from our laboratory found dual iron and zinc deficiency alters iron absorption compared with iron deficiency alone (unpublished). Iron deficient (FeD), zinc deficient (ZnD), dual iron and zinc deficient (FeZnD) and healthy Caco-2 cells demonstrated different rates of cell media acidification, indicative of altered cell metabolism. The aim of this study was to understand changes to energy substrate consumption and intracellular concentrations under individual and dual deficient conditions. Iron, zinc and dual iron and zinc deficiencies were induced in Caco-2 cells by Chelex-100 removal of minerals from FBS in the growth media and repleting all but the target mineral. Cell media was changed every second day for the first 9 days post seeding, then every day until day 14 due to the increased rate of media acidification, particularly in iron deficient cells. Glucose (Megazyme, GOPOD), lactate (Megazyme, K-LATE) and protein (Thermo Scientific, SKU# 23225) contents were analysed using commercial kits. Creatine and phosphocreatine were analysed by HPLC. ANOVA was used for statistical analyses, with Tukey’s test for post-hoc comparisons and significance set at p < 0.05. FeZnD cells had significantly higher intracellular glucose (p < 0.001) and lower lactate (p < 0.001) concentrations compared to healthy control cells. Intracellular glucose and lactate concentrations in FeZnD cells were also significantly different from those in FeD and ZnD cells (p < 0.05). Iron deficient and zinc deficient cells were not different to healthy cells (control) for either intracellular glucose (p(FeD) = 0.354, p(ZnD) = 0.996) or lactate (p(FeD) = 0.251, p(ZnD) = 1.000) concentrations. Iron deficient cells did not show a difference in glucose media loss compared to healthy cells (p = 0.715), whereas zinc deficient (p < 0.001) and dual iron and zinc deficient (p < 0.001) cells had significantly lower glucose loss than healthy cells. Zinc deficiency and dual iron and zinc deficiency resulted in a significant reduction in total intracellular creatine (phosphocreatine + creatine) compared to control cells (p < 0.001). In contrast, zinc deficiency and dual iron and zinc deficiency resulted in a significant increase in phosphocreatine compared to control or iron deficient cells (p < 0.05). Changes to glucose, lactate and phosphocreatine found in this study indicate that dual iron and zinc deficient cells do not mirror changes in individual deficiency. How changes in energy substrates affect nutrient absorption is uncertain from this present work. Further research is required to improve understanding, with a view to translating findings into effective treatment and prevention of micronutrient deficiencies.
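An illustrative sketch of the ANOVA with Tukey post-hoc comparisons described above, assuming a long-format table with hypothetical columns condition and glucose:

```python
# Illustrative only: one-way ANOVA then Tukey's HSD across the four cell
# conditions (control, FeD, ZnD, FeZnD) for intracellular glucose.
import pandas as pd
from scipy import stats
from statsmodels.stats.multicomp import pairwise_tukeyhsd

df = pd.read_csv("caco2_glucose.csv")  # hypothetical file

groups = [g["glucose"].values for _, g in df.groupby("condition")]
f_stat, p_value = stats.f_oneway(*groups)
print(f"ANOVA: F = {f_stat:.2f}, p = {p_value:.4f}")

tukey = pairwise_tukeyhsd(endog=df["glucose"], groups=df["condition"], alpha=0.05)
print(tukey.summary())
```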
Shift workers, a subset of Australia’s workforce undertaking critical work for the community, are at greater risk of obesity and related conditions, such as type 2 diabetes and cardiovascular disease(1,2,3). The lifestyle and circadian disruption experienced by night shift workers is currently not addressed in existing dietary guidance for obesity management. The Shifting Weight using Intermittent Fasting in night shift workers study (SWIFt) is a world-first, randomised controlled trial that compares three 24-week weight-loss interventions for night shift workers: continuous energy restriction (CER) and two twice-per-week intermittent fasting (IF) interventions (fasting during a night shift or during the day). This qualitative study aimed to explore the experiences of participants while following the dietary interventions to understand how intervention features and associated behaviour change mechanisms influence engagement. Semi-structured interviews were conducted at baseline and 24 weeks, and audio diaries were collected every two weeks, from participants selected using a maximum variation sampling approach; data were analysed using the five steps of framework analysis(4). Coded text for intervention enablers was mapped to the following behaviour change frameworks: the COM-B model, the Theoretical Domains Framework (TDF), and the Behaviour Change Taxonomy (BCT). Of the 250 participants randomised to the SWIFt study, 47 interviews were conducted with 33 participants and 18 participants completed audio diaries. Three major themes were identified related to intervention factors influencing engagement: 1) Simplicity and ease are important for night shift workers; 2) Support and accountability are needed to change behaviour and to tackle fluctuating motivation; and 3) An individualised approach is sometimes needed. Ten enabler sub-themes were identified: ease and acceptability of provided foods, structured and straightforward approach, flexible approach, easier with time, simplicity and small changes, dietetic support, accountability, self-monitoring, increased nutrition knowledge, and focus on regular eating. The enabler sub-themes were predominantly related to the ‘motivation’ and ‘capability’ domains of the COM-B model, and one sub-theme related to the ‘opportunity’ domain. Mapping to the ‘capability’ COM-B domain was more frequent for the CER intervention compared to the IF interventions. For the TDF, the following domains were the most frequently reported: ‘behavioural regulation’, ‘knowledge’, ‘goals’ and ‘environmental context and resources’. For the BCT, the following domains were the most frequently reported: ‘instruction on how to perform a behaviour’, ‘goal setting (behaviour)’, ‘self-monitoring of outcome(s) of behaviour’, and ‘adding objects to the environment’. This study provides important findings detailing the behaviour change mechanisms perceived to positively influence night shift worker engagement during the weight-loss interventions of the SWIFt study, which will help inform the translation of interventions into non-research settings.
Depression and dementia represent significant public health issues, affecting approximately 1 in 10 and 1 in 12 older Australians, respectively. While current pharmacological treatments are effective in relieving symptoms, they often entail undesirable adverse effects, including gastrointestinal issues and bradycardia(1,2). This highlights the need for primary preventative measures, including food- and nutrition-based approaches. Chronic brain inflammation is believed to interfere with the gut–brain axis(3). Consumption of fermented dairy products rich in beneficial gut microbes may attenuate this inflammation and offer protective health benefits. This study aimed to examine whether fermented dairy intake could mitigate the risk of incident depression and dementia. Utilising data from the Sydney Memory and Ageing Study I of 1037 participants aged 70–90 years, 816 participants (mean age: 76.7 years) were followed from 2005 until 2012 for incident depression, and 974 participants (mean age: 80.7 years) were followed from 2005 until 2014 for incident dementia. Fermented dairy intake was assessed using the Dietary Questionnaire for Epidemiological Studies version 2; yoghurt and regular cheese intakes were categorised into quartiles (Q) and low-fat cheese into consumers/non-consumers, with no consumption as the reference group. Depression diagnoses were assessed via self-reported physician-diagnosed history, medication use, service utilisation, and heavy alcohol use. Dementia diagnoses followed the criteria in the fourth edition of the Diagnostic and Statistical Manual of Mental Disorders. Cox proportional hazards models examined the associations between fermented dairy intake and the risk of incident depression/dementia. Additionally, linear regression models were applied to assess depressive symptom scores (measured by the Geriatric Depression Scale-15) and psychological distress scores (measured by the Kessler Psychological Distress Scale-10). All models were adjusted for sociodemographic and lifestyle factors, and medical histories. Over a median follow-up of 3.9 and 5.8 years, 120 incident depression and 100 incident dementia cases occurred, respectively. High yoghurt intake (Q4: 145.8–437.4 g/day) and low-fat cheese consumption (0.4–103.1 g/day) were associated with a lower risk of incident depression, both compared to non-consumers (yoghurt: adj. HR: 0.38, 95% CI: 0.19–0.77; low-fat cheese: adj. HR: 0.50, 95% CI: 0.29–0.86). They were also associated with lower depressive symptom scores (yoghurt: adj. β = −0.46, 95% CI: −0.84, −0.07; low-fat cheese: adj. β = −0.42, 95% CI: −0.73, −0.11). However, those who consumed a higher intake of regular cheese (Q4: 14.7–86.1 g/day) had an elevated risk of incident depression (adj. HR: 1.88, 95% CI: 1.02, 3.47), and those in Q2 (0.1–7.2 g/day) had significantly higher depressive symptom scores (adj. β = 0.42, 95% CI: 0.05, 0.78). No significant findings were found for psychological distress scores or incident dementia. Our findings in a cohort of older Australians suggest that higher yoghurt and low-fat cheese intakes may reduce the risk of incident depression and depressive symptoms, while a higher intake of regular cheese may increase these risks.
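A minimal sketch of a Cox proportional hazards model for incident depression by yoghurt intake category, using the lifelines package with hypothetical file and column names (not the study's code):

```python
# Illustrative only: Cox model with quartile dummies for yoghurt intake
# (0 = non-consumer reference) and a few covariates; exp(coef) gives adjusted HRs.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("mas_fermented_dairy.csv")  # hypothetical: time_years, depression_event,
                                             # yoghurt_q, age, sex, smoking, ...
model_df = pd.get_dummies(df, columns=["yoghurt_q"], drop_first=True)

cph = CoxPHFitter()
cph.fit(model_df, duration_col="time_years", event_col="depression_event")
cph.print_summary()
```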
Nutritional metabolomics is an emerging objective dietary biomarker method to help characterise dietary intake. Our recent scoping review identified gaps and inconsistencies in both the design features and the level of detail of reported dietary intervention methods in human feeding studies measuring the metabolome(1), and our cross-over feeding study protocol details the dietary information needed to identify metabolites that characterise ‘healthy’ and ‘unhealthy’ (typical) Australian diets(2). The current study aimed to gain consensus on core diet-related item details (DIDs) and recommendations for reporting DIDs, to inform development of a reporting checklist. The aim of this checklist is to guide researchers on reporting dietary information within human feeding studies measuring the dietary metabolome. A two-stage online Delphi was conducted, encompassing five survey rounds (February–July 2024). This study was approved by the University of Newcastle’s Human Research Ethics Committee (HREC; H-2023-0405). Sixty-seven experts were invited, with expertise spanning clinical trial design, feeding study intervention implementation, metabolomics, and/or human biospecimen analyses. Twenty-eight DIDs categorised across five domains underwent consensus development. Stage 1 (two rounds) gained consensus on a core set of DIDs, including phrasing. Stage 2 (three rounds) gained consensus on standard reporting recommendations for each DID and acceptance of the final reporting guideline. The research team convened after every round to discuss consensus-driven results. Experts resided in Australia, New Zealand, the United States, the United Kingdom, Sweden, Israel, Italy and Denmark. Twenty-five experts completed stage 1 and 22 completed stage 2. After stage 1, two DIDs were merged and two new DIDs were identified, totalling 29 core DIDs. At the end of stage 2, round 2, based on expert feedback, all items were organised to determine differing degrees of reporting in the methods section of publications, with additional recommendations collated for other sections, including supplementary files. The reporting guideline (DID-METAB Checklist) was generated and accepted by the expert working group in round 3, with all experts agreeing that relevant journals should include the checklist as a suggested reporting tool for relevant studies or that it be used alongside existing reporting tools. The Delphi process gained consensus on a core set of DIDs and consolidated expert views on the level of detail required when reporting DIDs in research. It generated the reporting guideline (DID-METAB Checklist), which can be implemented independently or as an extension to existing guidelines such as CONSORT (at item 5) or SPIRIT (at item 11) to improve the reproducibility and comparability of feeding studies. Endorsement by scientific societies and journals will be key for the dissemination strategy and optimising the utility of the tool to strengthen the evidence base of nutritional metabolomics. The DID-METAB Checklist will be a key tool to advance reporting of diet-related methodologies in metabolomics for both personalised and precision nutrition interventions in clinical practice.
Carbohydrates are an important source of energy, playing a crucial role in the growth and development of children(1). Carbohydrate intakes in adults are well studied, but the intake and key food sources of carbohydrates during early childhood are poorly understood. Assessing carbohydrate intake and identifying key food sources of carbohydrates and their subtypes will aid dietary monitoring to identify suboptimal intakes and food sources. Thus, the current study aimed to describe the intakes of total carbohydrates and their subtypes (starch and total sugar), identify their main food sources, and examine tracking among young Australian children over the first 5 years of life. Data from children who participated in follow-ups at ages 9 months (n = 393), 18 months (n = 284), 3.5 years (n = 244), and 5 years (n = 240) in the Melbourne InFANT Program(2) were used. Child dietary intake was collected using three 24-hour recalls. Descriptive statistics were used to summarise the total carbohydrate and subtype intakes and their main food sources. Tracking was examined using Pearson’s correlations of the residualised scores across different time points. From ages 9 months to 5 years, total carbohydrate intake increased from 99.7 g/d to 174 g/d. Total sugar and starch intakes (g/d) also increased throughout early childhood. The percentage of energy (%E) from total carbohydrates remained stable over time (48.4–50.5%). However, the %E from total sugar decreased from 29.4% at 9 months to 22.6% at 5 years, while the %E from starch increased from 16.7% at 9 months to 26.0% at 5 years. The primary source of total carbohydrates at 9 months was infant formula. At later ages (18 months, 3.5 years, and 5 years), the key sources of total carbohydrate intake were bread/cereals, fruits, and milk/milk products. The major sources of total sugar intake at all time points were fruits and milk/milk products. However, intakes of total sugar from discretionary foods such as cakes/cookies increased with age. The main food sources of starch intake were consistent across all ages and included breads/cereals, cakes/cookies, and pasta. Weak to moderate tracking of total carbohydrate, total sugar, and starch (g/d) was observed from as early as age 9 months to age 5 years. Given the detrimental effects of discretionary foods on health, our results reinforce the importance of reducing sugar intake from discretionary foods and promoting healthy alternatives, such as wholegrains, from early childhood. The tracking of carbohydrate intake from as early as age 9 months suggests that carbohydrate intakes are established early in life, emphasising the importance of early dietary interventions. Our findings provide in-depth insights into carbohydrate intake trends during early childhood, which may contribute valuable evidence to inform the refinement of carbohydrate intake recommendations for young Australian children.
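A minimal sketch of the tracking analysis (Pearson correlation of residualised scores) with hypothetical column names; here intakes are residualised on total energy intake as an assumed adjustment, which may differ from the original analysis:

```python
# Illustrative only: residualise carbohydrate intake on total energy at each age,
# then correlate the residuals across ages as a tracking coefficient.
import pandas as pd
import statsmodels.formula.api as smf
from scipy.stats import pearsonr

df = pd.read_csv("infant_carb_intakes.csv")  # hypothetical: carb_9m, energy_9m, carb_5y, energy_5y

def residualise(data, intake_col, energy_col):
    """Residuals from regressing intake on total energy intake."""
    return smf.ols(f"{intake_col} ~ {energy_col}", data=data).fit().resid

res_9m = residualise(df, "carb_9m", "energy_9m")
res_5y = residualise(df, "carb_5y", "energy_5y")

r, p = pearsonr(res_9m, res_5y)
print(f"Tracking (9 months to 5 years): r = {r:.2f}, p = {p:.3f}")
```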
The food supply is known to have a substantial impact on greenhouse gas emissions. The Global Warming Potential Star (GWP*) refers to the amount of carbon dioxide equivalents (CO₂e) produced by food items, and can be used to quantify the impact of the food supply on the environment. GWP* values are available for 232 Australian food products(1). To estimate the climate footprint of diets in Australia, GWP* values must be applied to all foods in AUSNUT 2011–13, the most current Australian food composition database. The aim of this study was to systematically apply GWP* values to foods in AUSNUT 2011–13, to facilitate calculation of the climate footprint of Australian dietary data. To create the GWP*-AUSNUT 2011–13 database, all 5740 food and beverage items in AUSNUT 2011–13 were reviewed, and GWP* values were applied via a systematic approach. Initially, GWP* values were matched to AUSNUT 2011–13 foods based on conceptual similarities, where an appropriate GWP* item existed (e.g., Beef, mince, < 5% fat, raw was matched to the GWP* item ‘Beef meat’, with a GWP* value of 16.68 CO₂e/kg). For AUSNUT 2011–13 foods where there was not an appropriate GWP* item, for example composite foods with multiple ingredients, the AUSNUT 2011–13 Recipe File was used to determine constituent ingredients and match these to GWP* items (e.g., Beef, mince, < 5% fat, baked, roasted, fried or stir-fried, grilled or BBQ’d, canola oil was matched to the GWP* items ‘Beef meat’ and ‘Canola oil’ based on the ingredient proportions in the AUSNUT 2011–13 Recipe File). Where an AUSNUT 2011–13 recipe did not exist, constituent ingredients and their proportions were determined based on ingredient descriptions in the AUSNUT 2011–13 database, or on ingredient proportions from a sample of food labels (e.g., Peanut butter, smooth & crunchy, added sugar & salt). In the case of single-ingredient foods without an appropriate GWP* item match, an average of GWP* values for similar foods (using the AUSNUT 2011–13 food classification categories) was calculated (for example, as there was no GWP* item for Mulberry, raw, an average of all GWP* items aligning with foods in the AUSNUT 2011–13 minor food group ‘berry fruit’ was calculated). The systematic process of applying GWP* values to AUSNUT 2011–13 foods was conducted by two researchers independently, with any disagreements resolved via consensus and discussion with the research team. Through application of GWP* values to the AUSNUT 2011–13 food composition database via this systematic process, the GWP*-AUSNUT 2011–13 database was created. This database will allow for the estimation of the climate impact of dietary data collected in Australia, both retrospectively and in future studies, to identify the climate footprint of different dietary patterns or to provide insight into dietary changes required to decrease greenhouse gas emissions.
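A minimal sketch of the recipe-weighted matching step described above; the 'Beef meat' value is taken from the example in the text, while the 'Canola oil' value and the recipe proportions are illustrative placeholders rather than figures from the GWP* dataset or the AUSNUT 2011–13 Recipe File:

```python
# Illustrative only: assign a GWP* value to a composite food as the
# proportion-weighted average of its ingredients' GWP* values.
gwp_star = {
    "Beef meat": 16.68,   # CO2e/kg, from the example in the text
    "Canola oil": 1.20,   # placeholder value for illustration
}

recipe = {"Beef meat": 0.97, "Canola oil": 0.03}  # hypothetical ingredient proportions

def composite_gwp(recipe, gwp_table):
    """Proportion-weighted average of ingredient GWP* values."""
    return sum(prop * gwp_table[ingredient] for ingredient, prop in recipe.items())

print(f"GWP* for composite item: {composite_gwp(recipe, gwp_star):.2f} CO2e/kg")
```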
Excess sodium consumption, mostly from dietary salt, causes high blood pressure and an increased risk of cardiovascular disease(1). In parallel, insufficient potassium intake also contributes to raised blood pressure(2). Switching regular salt for potassium-enriched salt, where a proportion of the sodium chloride is replaced with potassium chloride, is a promising public health intervention to address both these issues(3). However, the supply chain to support increased use of potassium-enriched salt in Australia is not well understood. The objectives of this study were to investigate how the salt supply chain operates in Australia and to obtain food industry stakeholder perspectives on the technical barriers and enablers to increased potassium-enriched salt use. Twelve interviews with industry stakeholders (from food companies, salt manufacturers and trade associations) were conducted and thematically analysed using a template analysis method. Two top-level themes were developed: ‘supply chain practices’ and ‘technical barriers and enablers’. The potassium-enriched salt supply chain was described as less well-established than the low-cost production and distribution of regular salt in Australia. However, food companies reported no difficulty sourcing potassium chloride. For Australian food industry stakeholders, cost, flavour and functionality were perceived as key barriers to increased uptake of potassium-enriched salt as a food ingredient. Stakeholders from food companies were hesitant to use potassium-enriched salt due to concerns about bitter or metallic flavours and uncertainty about whether it would provide the same microbial/shelf-life functions or textural quality as regular salt. However, potassium-enriched salt manufacturers held divergent opinions, stating that potassium-enriched salt was a suitable functional replacement for regular salt and that the flavour differences observed may be due to the incorrect use of potassium chloride rather than use of a purpose-made potassium-enriched salt. Stakeholders identified that establishing a stable and affordable supply of potassium-enriched salt in Australia and increased support for food technology research and development would enable increased use. To improve uptake of potassium-enriched salt by the Australian food industry, future efforts should focus on strengthening potassium-enriched salt supply chains and improving its appeal for use by the food industry in manufacturing and for purchase by consumers. Public health advocacy efforts should ensure that industry is equipped with the latest evidence on the feasibility and benefits of using potassium-enriched salt as an ingredient. Ongoing engagement is critical to ensure that industry is aware of its responsibility and opportunity to offer healthier foods to consumers by switching regular salt to potassium-enriched salt within foods.
Chronic diseases such as cardiovascular disease and diabetes are a leading cause of disability and death in Australia. These conditions are often heralded by biomarkers in blood and urine samples, which indicate the risk or presence of disease. The development of chronic conditions remains influenced by a set of modifiable risk factors, which includes diet. The level of food processing has recently been linked with disease risk factors and poor health outcomes(1), yet there is limited research into the direct associations between food processing and chronic disease biomarkers. This study aimed to investigate the associations between varying levels of ultra-processed food consumption and chronic disease biomarkers. Participants aged ≥ 18 years with biomedical data who participated in both the National Nutrition and Physical Activity Survey and the National Health Measures Survey 2011–2013 were included in this secondary analysis. Chronic disease biomarkers were categorised as normal or abnormal according to cut-off values from the Royal College of Pathologists of Australasia. Dietary intake was classified according to the NOVA system for level of food processing. Data were then stratified into quintiles of daily energy share of ultra-processed foods. Associations between chronic disease biomarkers and quintiles of energy share of ultra-processed foods were examined. A significant positive trend was found between ultra-processed food consumption and high-density lipoprotein (HDL) cholesterol (p < 0.01). An inverse association was observed between ultra-processed food consumption and total cholesterol (p < 0.001). The highest consumers of ultra-processed foods were more likely to be younger, less educated, more disadvantaged, not meeting physical activity guidelines, and currently smoking (all p < 0.001). In conclusion, ultra-processed food consumption was associated with significant differences in total and HDL cholesterol levels. This provides insight into possible interactions at a biochemical level and may help to guide future dietary recommendations on ultra-processed foods for disease management and prevention.
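A minimal sketch of the quintile stratification described above, assuming a long-format intake table with hypothetical columns person_id, nova_group and energy_kj:

```python
# Illustrative only: compute each participant's daily energy share from
# ultra-processed (NOVA group 4) foods and split into quintiles.
import pandas as pd

intake = pd.read_csv("nnpas_food_records.csv")  # hypothetical file

energy = intake.groupby("person_id")["energy_kj"].sum()
upf_energy = (
    intake[intake["nova_group"] == 4]
    .groupby("person_id")["energy_kj"].sum()
    .reindex(energy.index, fill_value=0)
)

upf_share = 100 * upf_energy / energy                          # % of energy from UPF
upf_quintile = pd.qcut(upf_share, 5, labels=[1, 2, 3, 4, 5])   # quintiles of energy share
print(upf_quintile.value_counts().sort_index())
```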
Interest in the consumption of foods containing live microbes (LM) as a component of dietary patterns has accelerated, due to potential positive contributions to health and chronic disease risk, including cardiovascular disease (CVD)(1,2). There are different patterns of LM consumption, including through the intake of probiotics or fermented foods or via a broader spectrum of foods that may harbour microbes, such as raw, unpeeled fruits and vegetables(3). To date, no study has quantitatively assessed potential intake of LM in a sample of Australians. The aim was to quantify the presence of LM in common foods and beverages consumed in Australia, using the Australian Eating Survey® (AES) and AES-Heart®(4,5) food frequency questionnaires as the dietary assessment tools. Quantification of potential live microbial content (per gram) was conducted in accordance with the methodology outlined by Marco et al.(3). Briefly, foods were assigned to categories for level of live microbes, with LM ranges defined as low (Low; < 10⁴ CFU/g), medium (Medium; 10⁴–10⁷ CFU/g), or high (High; > 10⁷ CFU/g)(3). These categories were based on the expected prevalence of viable microorganisms within different food matrices. Specifically, pasteurised food products are characterised as having microbial concentrations in the Low range (< 10⁴ CFU/g). In contrast, fresh fruits and vegetables consumed unpeeled exhibit a microbial range considered Medium (10⁴–10⁷ CFU/g), while unpasteurised fermented foods and probiotic-supplemented foods exhibit substantially higher microbial content (High; > 10⁷ CFU/g). Based on this methodology, the estimated quantities of live microbes in 400 foods and beverages (including individual products and mixed dishes) within the AES and AES-Heart®(4,5) FFQs were determined and summarised across 22 food groups using the 2-digit codes from the 2011–2013 AUSNUT database(6). Preliminary results indicate that the Low group was the most represented: 369 of the 400 foods belonged to this category. The food groups representing the highest percentages in the Low group were vegetable products and dishes (13.8%), followed by meat, poultry, and game products and dishes (13.6%). The Medium group comprised 25 items, with the most representative food group being fruit products and dishes (48%). In the High group, the representative food groups were dairy and meat substitutes (e.g., soy yoghurt; 66.7%) and milk products and dishes (33.3%). The creation of this database will facilitate new research opportunities to investigate relationships between intake of live microbes and health outcomes, including CVD. Future research into how dietary patterns rich in live microbes relate to chronic disease risk factors, such as BMI, blood pressure, plasma lipids and glucose, in the Australian population could offer new insights into risk factor management through LM dietary interventions.
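A minimal sketch of the threshold-based categorisation described above, using the 10⁴ and 10⁷ CFU/g cut-points from Marco et al.; the example foods and CFU estimates are illustrative only:

```python
# Illustrative only: classify foods as Low / Medium / High for live microbes
# based on estimated viable counts (CFU/g).
def live_microbe_category(cfu_per_g: float) -> str:
    if cfu_per_g < 1e4:
        return "Low"
    if cfu_per_g <= 1e7:
        return "Medium"
    return "High"

examples = {                       # CFU/g estimates invented for illustration
    "Pasteurised milk": 1e2,
    "Unpeeled apple": 1e5,
    "Unpasteurised yoghurt": 1e8,
}
for food, cfu in examples.items():
    print(f"{food}: {live_microbe_category(cfu)}")
```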
Optimal nutrition supply to the developing foetus is paramount in achieving appropriate foetal growth and development. The Australian dietary guidelines advise on the amounts and types of foods recommended during pregnancy(1). However, previous studies in women of reproductive age(2) and in pregnant women(3) showed suboptimal adherence to dietary recommendations. There is no evidence on pregnant women’s experience of sourcing the dietary guidelines, nor on their uptake. The aim of this study was to qualitatively explore women’s knowledge and understanding of nutrition information for pregnancy, including the current Australian dietary guidelines for pregnancy. Twelve pregnant women were recruited from a longitudinal study spanning the first to third trimesters of pregnancy. Purposive sampling was adopted with the intention of recruiting women with diverse health information-seeking habits. Semi-structured interviews were conducted with women at different trimesters, transcribed verbatim, and analysed thematically. Three themes were generated regarding information sourcing, uptake and evaluation. (i) Women had limited knowledge about the pregnancy dietary guidelines, leaving them to source pregnancy-related nutrition information elsewhere. (ii) Women described other healthy eating advice that contributed to confusion and potential incompatibility with their dietary beliefs and lifestyle practices. (iii) Women shared that they were capable of seeking and evaluating the identified dietary advice, but the inconsistency across information sources contributed to over-cautious behaviour and dietary restrictions. Our findings suggest there is a general lack of awareness of the official dietary guidelines for pregnancy. To optimise nutritional intake in pregnancy, efforts should be made to increase utilisation of the Australian dietary guidelines for pregnancy and to support uptake of dietary advice among pregnant women.
Claims relating to foods’ nutrition content and potential health benefits have been shown to influence consumer preferences and purchases regardless of the nutritional quality of the product(1). In Australia, permitted claims include nutrition content claims, which refer to the presence or absence of a nutrient, and health claims, which refer to health benefits of foods or nutrients in a product. Health claims include general level health claims, which refer to normal processes and functions, and high level health claims, which refer to a disease or a biomarker of a disease. Products that display a health claim must meet the Nutrient Profiling Scoring Criterion (NPSC); however, this is not required for products making a nutrition content claim. The aim of this study was to examine the use of nutrition content and health claims made on Australian ready meal products and assess the proportion of products displaying claims that meet the NPSC. Analysis of the ready meal category in the 2023 FoodSwitch database, a repository of Australian food packaging images and label data for over 28,000 foods developed by The George Institute for Global Health, was conducted(2,3). Foods in the ready meal category were identified and data from the nutrition information panel were collated to calculate whether they met the NPSC. Nutrition content and health claims were extracted from product images and categorised according to claim type (nutrition or health claim) and claimed nutrient or attribute. The proportion of products meeting the NPSC was then calculated overall and by claim type (nutrition content vs health claims). Data were available for 777 ready meal products. Of these, 682 (87.8%) met the NPSC. In total, 2051 nutrition content or health claims were identified across the ready meal products, with 1909 (93.1%) of these categorised as nutrition content claims. The remaining 142 claims identified were general level health claims, with no high level health claims identified. Almost all (n = 1857, 97.3%) nutrition content claims and all general level health claims were made on products that met the NPSC. The most common claims related to protein, energy and fibre content. The use of claims was prevalent across the ready meal food category in Australia, with claims relating to nutrient content being most common. While most claims were made on products that met the NPSC, there is a need for further research to ensure the NPSC appropriately distinguishes between healthy and less healthy food products. This will ensure consumers are equipped to make informed decisions when purchasing food products.
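For illustration, a short sketch of the claim-type summary described above (hypothetical file and column names):

```python
# Illustrative only: proportion of ready meal products carrying each claim type
# that meet the NPSC.
import pandas as pd

claims = pd.read_csv("ready_meal_claims.csv")  # hypothetical: product_id, claim_type, meets_npsc

summary = (
    claims.groupby("claim_type")["meets_npsc"]
    .agg(n_claims="count", pct_meeting_npsc=lambda s: 100 * s.mean())
    .round(1)
)
print(summary)
```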
Nutrition represents a promising strategy for increasing antioxidants in the brain, with potential implications for mitigating illnesses with oxidative stress-related neuropathology(1). Emerging evidence indicates that bioactive compounds, including phenolics and betalains (responsible for the red, yellow, and purple hues in fruits), may decrease oxidative damage(2). Therefore, research into novel plant-based sources of phenolics and betalains with potential antioxidant properties is needed. This study aimed to examine the neuroprotective effects of key fruit extracts and to correlate effects with their phytochemical and antioxidant profiles. Dragon fruit (DF), queen garnet plum (QGP), jaboticaba (JB), green apple (GA), blueberry (BlueB), blackberry (BlackB), watermelon (WM), and apricot (AP) extracts were analysed for their phenolic, flavonoid, anthocyanin and betalain concentrations, and antioxidant capacity (Oxygen Radical Absorbance Capacity; ORAC). The neuroprotective efficacy of the fruits (10, 25, 50, 100 μg/mL) was then examined in vitro using H₂O₂-induced oxidative stress in SH-SY5Y neuroblastoma-like cells. Cells were treated with the fruit extracts either prior to H₂O₂ administration (to examine protective effects), or after the H₂O₂ stressor (to determine treatment effects), with cell viability examined using MTT assays. Statistical analyses determined differences between fruits and the controls (healthy (untreated) and H₂O₂ controls) using one-way ANOVAs and post-hoc Tukey comparisons. Correlations were examined using Spearman’s correlation tests. QGP and BlueB were significantly higher in phenolics and anthocyanins (p < 0.01), QGP was highest in flavonoids (p < 0.01), and DF was highest in betalains (p < 0.001) and ORAC (p < 0.01), compared with the remaining fruits. Pre-treatment with DF and JB prevented the H₂O₂-induced loss in cell viability, retaining control-like levels (p > 0.05 vs healthy controls). GA pre-treatment also exhibited significant neuroprotective effects (p < 0.01 vs H₂O₂ alone) but could not restore control levels (p < 0.01 vs healthy controls). The ability of DF, JB and GA to treat existing damage to cell viability induced by H₂O₂ was then examined; however, the extracts were ineffective as a treatment (p > 0.001 vs H₂O₂ alone and healthy controls). Interestingly, there were moderate correlations between cell viability and both ORAC (r² = 0.680, p < 0.001) and betalain concentration (r² = 0.446, p < 0.05). This study revealed novel sources of bioactive compounds, including characterisation of betalain concentrations, in these fruits. The results demonstrate an ability of DF and JB to prevent oxidative stress in neuronal-like cells, but not to treat damage after it has occurred. This finding agrees with the evidence that implementing a healthy diet rich in bioactive compounds, such as betalains and phenolics, may support brain health(1). The data also support a link between increased betalains, antioxidant capacity and neuroprotection; however, further research into the mechanisms underpinning the beneficial protective effects of DF and JB is required.
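A brief sketch of the correlation step described above (Spearman correlations between cell viability and antioxidant measures); the numbers are placeholders, not study data:

```python
# Illustrative only: Spearman correlations between post-H2O2 cell viability and
# each extract's ORAC and betalain values (placeholder numbers).
import pandas as pd
from scipy.stats import spearmanr

extracts = pd.DataFrame({
    "viability_pct": [95, 92, 88, 70, 68, 66, 60, 55],
    "orac":          [310, 180, 260, 150, 140, 120, 70, 60],
    "betalain_mg":   [45, 1.0, 2.0, 0.5, 0.3, 0.2, 5.0, 0.1],
})

for measure in ["orac", "betalain_mg"]:
    rho, p = spearmanr(extracts["viability_pct"], extracts[measure])
    print(f"viability vs {measure}: rho = {rho:.2f}, p = {p:.3f}")
```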
Gestational diabetes mellitus (GDM), hypertensive disorders of pregnancy (HDP), intrauterine growth restriction (IUGR) and preterm birth (PTB) are prevalent cardiometabolic pregnancy complications that adversely affect maternal and neonatal health during pregnancy and increase women’s risk of future type 2 diabetes mellitus (T2DM) and cardiovascular disease (CVD)(1–5). Pregnancy and postpartum, including intrapartum periods, are critical windows of opportunity to deliver care to support sustained behaviour change(6). There is currently a gap in lifestyle (diet and physical activity) interventions specific to cardiometabolic disease risk awareness and prevention during and following pregnancy(5,7). These are key life stages where early risk factors for cardiometabolic disease may present, women are actively engaged in the healthcare system, and their health priorities are shifting as they transition into parenthood. Early intervention in pregnancy may enable commencement of pharmacological and/or lifestyle intervention to reduce the risk or severity of cardiometabolic pregnancy complications(8), whereas postpartum intervention may enable commencement of sustainable lifestyle change for reduction of long-term cardiometabolic risks(9). There are a range of settings where pregnant and postpartum women receive healthcare, including hospitals, primary care clinics, community health institutions and online platforms(8,10,11). The optimum timing and setting to deliver an intervention to these high-risk women is not known. Designing interventions to align with the needs and priorities of stakeholders is a critical first step in developing an acceptable intervention. The aims of this research were to explore stakeholder perspectives and prioritise the optimal timing and setting to deliver a lifestyle intervention to improve long-term cardiometabolic health amongst women at high risk of, or diagnosed with, a cardiometabolic pregnancy complication. An embedded mixed-methods research design was utilised. Facilitator-led workshops were used to prioritise the preferred timing (pregnancy or postpartum) and setting (hospital, general practice, community health program, maternal and child health services or online) for an intervention. Women with prior GDM, HDP, IUGR and/or PTB (n = 9) and research partners (n = 15; obstetricians, endocrinologists, community health representatives, researchers, a midwife, a general practitioner and a dietitian) participated. Workshops were audio recorded, transcribed verbatim and thematically analysed using template analysis. Online polls were used to assess participants’ preferred timing and setting for an intervention. Women preferred a postpartum intervention delivered online, whereas research partners preferred a pregnancy intervention delivered via hospital antenatal care. Both groups suggested commencing interventions during pregnancy and continuing postpartum. Participants recommended ensuring interventions consider healthcare system barriers to intervention delivery, equity and sustainment, as well as consumer-specific barriers to intervention engagement and lifestyle change during pregnancy and postpartum. Commencing patient-centred interventions during pregnancy and continuing them postpartum should be considered to support continuity of care and improve health outcomes across both life stages for this high-risk group of women.
Short-term, immersive international placements are common and have been recognised for facilitating cultural learning, intercultural sensitivity, global-mindedness, and critical thinking. However, these outcomes are not guaranteed, and the impact and inclusion of these experiences in an already comprehensive curriculum remain to be carefully evaluated. This study explores the impact of 4-week international placements on the professional and personal development (short-term and enduring) of participating final-year undergraduate and postgraduate dietetic and undergraduate nutrition students. Recent nutrition and dietetic graduates (n = 8) of the program, implemented in underserved communities in the Philippines, Indonesia, and Vanuatu for four weeks during 2021–2022, were interviewed using semi-structured questions until saturation, followed by thematic analysis using a six-step process(1). Graduates were interviewed about their in-country experience and its impact on their professional practice, within 12–18 months of graduating. Findings were further analysed against the national competency standards for dietitians in Australia. Participants expressed significant cultural awareness (appreciation of the strength of other cultures, the stronger communitarian values of hosts), greater critical and systems thinking (about socio-economic determinants of health, systems-level thinking) and a deeper capacity for empathy and compassion (emotional quotient development). When viewed against the national competency standards, participants did not understand how international placements from developed to developing countries can perpetuate colonisation principles, and global-mindedness was not identified as a strong theme. These findings suggest that immersive, international placements result in profound, transformational, and enduring learning that extends into participants’ professional lives, especially respect for cultural safety and the development of systems thinking. There is alignment with the accrediting professional peak body’s competencies for cultural competency, professional practice and collaborative practice, with findings suggesting that areas for strengthening the nutrition and dietetic curriculum include enhancing learning outcomes for global-mindedness and decolonisation principles.