One in five Australian children is food insecure(1), and the majority consume inadequate vegetables and too many energy-dense, nutrient-poor foods(2). A healthy school lunch program provided to all children would ensure fair and equitable access to essential nourishment during the school day, which can improve academic achievement(3), attention, behaviour and concentration(4), and mental health and wellbeing(5). However, prior to implementing a school meal program, the opinions of key stakeholders, including teachers, should be explored. The aim of this study was to explore Victorian primary school teachers’ perceptions and opinions of current lunch practices and school-provided lunch programs. An online survey of primary school teachers in Victoria was administered via Qualtrics. Frequencies and percentages of responses were calculated, and chi-square tests were used to explore associations with demographic variables (gender, school type (government, independent, Catholic, other)). A total of n = 322 Victorian primary school teachers completed the survey (95% female, 81% government schools). All year levels (Prep–grade 6) were represented. Thirty percent agreed and 45% were unsure whether there should be a school-provided lunch program. Most teachers (91%) believed that school-provided lunches would allow children to eat healthy foods, and 73% believed it would be convenient for parents. Perceived barriers included: cost for parents (81%), teachers not wanting to serve food (75%), and the time it would take to serve the food (70%). There was no difference between teachers in government and independent schools in preferences for a school lunch program or perceived potential benefits. However, more teachers from independent schools believed it would take too much time to serve food to all children (81%) compared to teachers from government schools (68%; Pearson’s chi-square test, p = 0.045).
Additionally, more teachers from government schools than from independent schools thought that delivering a school-provided lunch program may reduce teaching time (52% vs 37%, respectively; Pearson’s chi-square test, p = 0.041). While most teachers agreed that school-provided lunches would give children an opportunity to eat healthy food for lunch and would be convenient for parents, less than a third agreed there should be a lunch program and many were unsure. Concerns included the time it would take to serve food and the belief that teachers would need to serve the food themselves. This may be due to a lack of understanding of school-provided lunch programs. Further research can investigate effective ways to mitigate the identified barriers when designing a school meal program.
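The group comparisons above rest on Pearson’s chi-square test of independence. As a minimal, stdlib-only sketch (using hypothetical counts, not the study’s data), the statistic for a 2×2 table and its 1-df p-value can be computed as:

```python
import math

def chi2_2x2(a, b, c, d):
    """Pearson chi-square statistic and p-value (1 df) for the 2x2 table
    [[a, b], [c, d]] of observed counts."""
    n = a + b + c + d
    # shortcut formula, algebraically equal to sum((O - E)^2 / E) over the four cells
    chi2 = n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))
    # chi-square(1) survival function: P(X > x) = erfc(sqrt(x / 2))
    p = math.erfc(math.sqrt(chi2 / 2))
    return chi2, p

# hypothetical agree/disagree counts by school type, NOT the study's data
chi2, p = chi2_2x2(20, 10, 10, 20)
```

The shortcut formula applies only to 2×2 tables; larger tables need the general observed-vs-expected sum and a chi-square distribution with more degrees of freedom.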
Deliberative processes are an antidote to despair about the inadequacies of politics-as-usual, but the “deliberative wave” (OECD 2021) of these initiatives around the globe has the potential, in some contexts, to be the latest face of colonization. In Aotearoa New Zealand, one project has worked since 2019 to design a climate assembly that enacts Te Tiriti o Waitangi (1840) obligations to honour Māori political authority. This article outlines the project's three innovations to the citizens’ assembly design that centre Māori forms of governance and reflect Māori deliberative protocols, and highlights three important distinctions in how a group of tangata Tiriti (people of the Treaty) has worked in partnership with tangata whenua (people of the land). Each feature has been vital to becoming Te Tiriti-led despite a context of ongoing colonization, with this place-based assembly having major implications for deliberation theory and practice worldwide.
While the introductory role of Prov 1:1–7 is well recognised, its relationship to subsequent sections has received less attention. This essay argues that Prov 1:1–7 introduces, not the entire book, but specifically the first collection in chapters 1–9. Building on Arthur Keefer's analysis, it posits that a single audience, ‘the wise’ in v. 5, is exhorted to listen to instruction and thereby acquire a sense of direction, with the expectation that, in doing so, they will be equipped to attain three primary aims: (1) to enhance understanding oriented towards the fear of Yhwh, (2) to cultivate moral virtue and (3) to instruct the next generation to do likewise. The introduction's programmatic function is then demonstrated as these aims are traced throughout the first collection.
To examine the association of posttraumatic headache (PTH) type with postconcussive symptoms (PCS), pain intensity, and fluid cognitive function across recovery after pediatric concussion.
Methods:
This prospective, longitudinal study recruited children (aged 8–16.99 years) within 24 hours of sustaining a concussion or mild orthopedic injury (OI) from two pediatric hospital emergency departments. Based on parent-proxy ratings of pre- and postinjury headache, children were classified as concussion with no PTH (n = 18), new PTH (n = 43), worse PTH (n = 58), or non-worsening chronic PTH (n = 19), and children with OI with no PTH (n = 58). Children and parents rated PCS and children rated pain intensity weekly up to 6 months. Children completed computerized testing of fluid cognition 10 days, 3 months, and 6 months postinjury. Mixed effects models compared groups across time on PCS, pain intensity, and cognition, controlling for preinjury scores and covariates.
Results:
Group differences in PCS decreased over time. Cognitive and somatic PCS were higher in new, chronic, and worse PTH relative to no PTH (up to 8 weeks postinjury; d = 0.34 to 0.87 when significant) and OI (up to 5 weeks postinjury; d = 0.30 to 1.28 when significant). Pain intensity did not differ by group but declined with time postinjury. Fluid cognition was lower across time in chronic PTH versus no PTH (d = −0.76) and OI (d = −0.61) and in new PTH versus no PTH (d = −0.51).
Conclusions:
Onset of PTH was associated with worse PCS up to 8 weeks after pediatric concussion. Chronic PTH and new PTH were associated with moderately poorer fluid cognitive functioning up to 6 months postinjury. Pain declined over time regardless of PTH type.
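The group differences above are reported as Cohen’s d. A minimal sketch of the pooled-standard-deviation form of that effect size (illustrative inputs only, not the study’s data):

```python
import math

def cohens_d(mean1, sd1, n1, mean2, sd2, n2):
    """Cohen's d: standardised mean difference using the pooled SD
    of the two groups (weights each group's variance by n - 1)."""
    pooled_sd = math.sqrt(((n1 - 1) * sd1 ** 2 + (n2 - 1) * sd2 ** 2) / (n1 + n2 - 2))
    return (mean1 - mean2) / pooled_sd
```

By convention, |d| around 0.2 is small, 0.5 medium, and 0.8 large, which is why the reported range of 0.34 to 1.28 spans small-to-large effects.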
Natural disasters can increase the risk of infection by severely disrupting access to basic needs, including clean water and sanitation. Hand hygiene, one of the simplest and most effective ways to prevent infections, often becomes a challenge in such situations. The study focused on individuals living in temporary housing following the earthquakes in Turkey on February 6, 2023.
Objective:
The main objective of this study was to assess the prevalence of hand hygiene practices and the factors affecting these behaviors among individuals affected by disasters.
Methods:
Data were collected from more than 3,600 randomly selected participants living in container cities in four provinces: Adana, Osmaniye, Hatay, and Gaziantep. Both quantitative and qualitative research methods were used to ensure a comprehensive understanding of hand hygiene behaviors. A detailed questionnaire was used to assess factors such as frequency of hand washing, access to water, and use of hygiene products. In addition, focus group discussions were conducted to explore individual and environmental factors influencing hygiene practices.
Results:
The results showed that although most individuals were aware of the importance of hand hygiene, several barriers, such as limited access to clean water, psychological stress, and a lack of hygiene supplies, hindered their ability to maintain proper hygiene. The frequency of hand washing increased slightly after the disaster, but challenges such as forgetfulness, time constraints, and skin irritation from inadequate hygiene products were common.
Conclusion:
This study provides important insights into the prevalence of and factors influencing hand hygiene practices in post-earthquake container cities in Turkey. Findings suggest that although individuals have a basic awareness of the importance of hand hygiene, multiple barriers, including access to water, hygiene supplies, and psychological stress, significantly impact their ability to maintain proper hygiene practices after a disaster. This study highlights the critical need for continued education, improved access to hygiene supplies, and psychosocial support to sustain hygiene behaviors in post-disaster settings. By addressing both physical and psychological barriers, public health interventions can be more effective in reducing the risk of infectious diseases in disaster-affected populations. Furthermore, the study emphasizes the importance of preparedness for future disasters by ensuring hygiene resources are readily available and individuals are equipped with the knowledge and skills to maintain hygiene under adverse conditions.
Unhealthy diet-related behaviour is linked to an increased risk of colorectal cancer (CRC), and therefore people at increased risk of CRC are advised to follow healthy dietary recommendations. Assessing disparities in diet quality based on sociodemographic factors could help to tailor dietary interventions(1). We aimed to determine the relationship between diet quality and sociodemographic factors in people at increased risk of CRC. This was a cross-sectional study including adults at increased risk of CRC due to a prior history of colorectal neoplasia and/or a known significant family history of CRC. Participants completed a survey, including the Australian Eating Survey (AES)(2) and the collection of demographic characteristics including age, gender, education, and socioeconomic indices (SEI), from October 2023 to July 2024. The AES survey was used to calculate diet quality using the Australian Recommended Food Score (ARFS)(2). The ARFS was calculated by summing eight sub-scales, which include vegetables, protein foods, breads/cereals, dairy foods, water, and spreads/sauces. The total ARFS ranges from 0–73, with a higher score indicating higher diet quality. Associations between diet quality and sociodemographic factors were determined using a log Poisson regression model with robust variance estimation. A total of 1940 individuals (52% female) completed the survey. The median age was 67.44 years (IQR: 59.56–72.66), with 11.49% (n = 223) aged under 50 years, 86.0% (n = 1669) aged 50–79 years and 2.5% (n = 48) aged over 79 years. The mean (± SD) ARFS was 28.76 ± 10.48 points. The ARFS did not significantly differ by gender (males: 29.0 ± 10.57; females: 28.6 ± 10.36), family history of CRC (family history: 28.7 ± 10.55; no family history: 28.8 ± 10.44), or SEI (higher tertile: 28.76 ± 10.41; lower tertile: 28.92 ± 10.65) (p > 0.05).
Diet quality was associated with age, with ARFS lower in younger (18–49 y) participants (28.72 ± 10.18) than in older (80–89 y) participants (31.19 ± 8.5) (p < 0.05). Regarding dietary components, dairy intake was lower in females than males (relative risk (RR) = 0.94, 95% confidence interval (CI) 0.90–0.99), individuals in the middle SEI tertile had lower fruit intake than those in the highest tertile (RR = 0.94, 95% CI 0.84–0.99), and those who left school before year 12 had lower vegetable intake than those with tertiary education (RR = 1.07, 95% CI 1.01–1.13). This study has shown that individuals at elevated risk of CRC have poorer diet quality than the general population, with greater disparities seen in younger individuals. Further differences were observed in dairy, vegetable and fruit intakes based on sex, education, and socioeconomic status. There is a need for further promotion of dietary interventions in people at elevated risk of CRC.
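The study derives its RRs and CIs from a covariate-adjusted log Poisson regression with robust variance. As a much cruder stdlib sketch of the same quantity, the unadjusted relative risk with a Wald confidence interval on the log scale (the Katz method; illustrative counts, not the study’s data) can be computed as:

```python
import math

def relative_risk(events1, n1, events2, n2, z=1.96):
    """Unadjusted relative risk of group 1 (events1/n1) vs group 2
    (events2/n2), with a Wald 95% CI on the log scale (Katz method).
    This is NOT the study's adjusted Poisson model -- just the
    underlying ratio-of-proportions arithmetic."""
    rr = (events1 / n1) / (events2 / n2)
    # standard error of log(RR)
    se_log = math.sqrt(1 / events1 - 1 / n1 + 1 / events2 - 1 / n2)
    lo = math.exp(math.log(rr) - z * se_log)
    hi = math.exp(math.log(rr) + z * se_log)
    return rr, lo, hi
```

An RR whose CI excludes 1.0 (like the reported 0.94, 95% CI 0.90–0.99 for dairy intake in females) is the criterion for a statistically significant difference here.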
Iron deficiency is the most common nutritional deficiency globally. Premenopausal women are at particular risk due to increased requirements for iron associated with menstrual blood loss and pregnancy. To prevent iron deficiency, recommended intakes have been developed based on physiological requirements for absorbed iron and iron bioavailability. However, iron bioavailability is difficult to estimate as it depends on the composition of the diet and an individual’s absorptive efficiency. Several algorithms have been proposed to estimate iron bioavailability from diets based on the form of the iron and the presence of absorption modifiers. These algorithms can be complex and often underestimate bioavailability. Recently, a new approach was developed by Dainty et al.(1,2), which is based on calculated iron requirements, total dietary iron intakes, and the distribution of serum ferritin concentration values in the population. This model has been used by the European Food Safety Authority to set recommended iron intakes for adults(3). In contrast, the recommended iron intakes for Australian adults are based on iron bioavailability estimates from the US Institute of Medicine, which were primarily derived from 15 free-living US adults(4). Therefore, the aim of this study was to predict dietary iron absorption in a representative sample of premenopausal Australian women using the model developed by Dainty et al.(1,2). Dietary iron intake and serum ferritin data from the 2011–13 Australian National Nutrition and Physical Activity Survey and National Health Measures Survey were analysed in 503 premenopausal women aged 18–49 years. Women were excluded if they were pregnant or lactating, had elevated C-reactive protein, consumed iron-containing supplements, or misreported energy intake. Dietary iron intake was assessed via two non-consecutive 24-hour recalls. Usual daily iron intake was determined by the Multiple Source Method.
Dietary iron absorption was estimated using the predictive model developed by Dainty et al.(1,2) and the Institute of Medicine’s distribution of individual dietary iron requirements(4). Mean (SD) usual dietary iron intake was 10.4 (2.6) mg/d. The prevalence of serum ferritin < 15 μg/L was 14.1% (95% CI: 10.2%, 19.3%), and < 30 μg/L was 37.0% (95% CI: 31.8%, 42.5%). Predicted dietary iron absorption at serum ferritin concentrations of < 15 μg/L was 29.5%, and at serum ferritin concentrations of < 30 μg/L it was 19%. Our findings do not support the bioavailability assumption of 18% used to develop the Australian recommended iron intakes for premenopausal women based on the need to maintain serum ferritin concentrations of 15 μg/L. Our results may be useful in revising the recommended iron intakes for Australian premenopausal women.
Food is a key lever for human and planetary health(1). Shifting to more plant-based foods supports environmentally sustainable, healthy and affordable diets(1). Taste preferences are formed in early childhood(2), presenting an opportunity for influencing plant-based food intake throughout the lifespan. Early Childhood Education and Care (ECEC) are important food environments due to high attendance rates for long hours(3), where children receive half of their daily nutritional needs(4). This study aimed to understand plant-based vs animal-based protein food provision in ECEC, their contribution to key nutrients, and their costs. Two weeks’ menus and recipes were collected from Victorian ECEC between 2018 and 2019 and entered into Foodworks10 for nutritional analysis. Desktop analysis categorised meals (lunches and snacks) by protein type as animal-based (red meat, white meat, fish, eggs, dairy, processed meat), plant-based (legumes, protein-enriched plant milk, seeds), or combined (both). Recipe items were priced at a metropolitan supermarket in March 2024 to determine cost per child per day and cost per child per lunch meal. A restricted maximum-likelihood mixed-effects model was used to estimate mean differences in lunch meal costs between the different meal protein types, adjusted for serving size. Iron bioavailability was assessed using previously published algorithms. Total daily energy, protein, calcium and iron were compared to 50% of the Australian Recommended Daily Intake for 2–3 year olds(5). Eighteen centres provided menus (n = 180 days, 540 meals). Preliminary findings indicated that 73% of meals contained animal-based protein, 7% a combination of animal and plant, and 4% plant-based protein. Animal-based protein meals most often contained dairy foods (64%, n = 253), followed by red meat (13%, n = 53). Plant-based protein meals mostly contained legumes (85%, n = 17). Mean (± SD) iron provision was below recommendations (2.86 mg ± 1.47 mg). 
Total protein (26 g ± 12 g) and calcium provision (271 mg ± 137.21 mg) were above recommendations. Mean food cost per child per day was AUD 2.46 (± AUD 1.09) and mean lunch meal cost per child was AUD 1.36 (± AUD 0.84). Animal-based lunches were AUD 0.45 more expensive than plant-based (p ≤ 0.01, 95% CI: AUD 0.15–AUD 0.73). These findings highlight very low provision of plant-based proteins in ECEC menus. Low red meat and iron provision suggests that plant-based protein should not displace current red meat on menus. High dairy and more than sufficient calcium may indicate that ‘meat-free meals’ are predominantly dairy-based, providing an opportunity for plant-based proteins in these meals. Plant-based protein lunches were a third cheaper than animal-based counterparts, suggesting an affordable option. Young children attending ECEC settings are currently missing the opportunity for exposure to plant-based proteins as healthy, environmentally sustainable and affordable additions to menus.
Colorectal cancer is a prevalent global health issue. In Australia, it ranks as the third most common newly diagnosed cancer, with around 15,000 new cases annually(1). Despite treatment advances, high incidence and mortality rates highlight the need for effective prevention and new therapies. Polyphenols, abundant in plant-based foods, have shown promise in inhibiting cancer cell growth and inducing apoptosis, offering a dietary approach to reduce cancer risk and improve outcomes. Whole-grain cereals like sorghum are recognised sources of phenolic compounds and can scavenge free radicals(2). This study aimed to evaluate the role of sorghum-derived polyphenols in modulating the major cancer development pathways. The impact of processing techniques (cooking and fermentation) on sorghum polyphenols and their effects on cancer cells was also evaluated. Polyphenols were extracted from raw, cooked, fermented, and fermented-cooked sorghum flour samples(3). The phenolic content was measured using benchtop chemical assays, including the DPPH radical scavenging assay and the ferric-reducing ability of plasma assay. UHPLC analysis coupled with online ABTS detection characterised the polyphenols present in these extracts and provided their antioxidant activities. Using these extracts, a resazurin red cytotoxicity assay was performed on HT-29 colorectal cancer cells to determine the optimal concentrations for the downstream experiments. HT-29 cells were incubated with the black sorghum phenolic extracts (500 μg/mL and 2000 μg/mL) for 12 and 24 hours. Following this, the gene expression of several cancer regulatory genes (APC, KRAS, TTN, GLUT-1, HIF-1a and HIF-1b) was evaluated by qPCR. Treatment of HT-29 cells with raw sorghum phenolic extracts significantly (p < 0.05) upregulated the APC and TTN genes at the 12- and 24-hour time points and the KRAS gene at the 24-hour time point compared to the control.
This indicates an impact of sorghum-derived polyphenols on the genome mutation and instability pathways of cancer development. Treatment at 500 μg/mL also significantly (p < 0.05) upregulated the expression of GLUT-1, suggesting an impact on the dysregulated cellular metabolism pathway of cancer development. Processed sorghum phenolic extracts also significantly regulated KRAS gene expression. Overall, the results from this study showed that sorghum polyphenols modulate the expression of key cancer development pathway-associated genes in HT-29 cells. The findings underscore the potential of dietary polyphenols in cancer prevention and highlight the need for further research to optimise their use and understand their mechanisms of action in vivo.
Pre-school children’s dietary intake in Australia is substandard, with only 18% of children aged 2–3 years meeting the recommended intake for vegetables, and more than one-third of their daily kilojoules coming from energy-dense, nutrient-poor foods. Several child eating behaviour traits (e.g., food fussiness, enjoyment of food, satiety responsiveness and food responsiveness) are associated with the dietary intake of pre-school children(1). However, the associations between child eating behaviour traits and overall dietary quality in pre-school children have not been examined, which is important as children do not consume food groups or nutrients in isolation. It is also important to understand how biological factors such as age may influence child eating behaviours, given that eating behaviour traits such as food fussiness can develop and change with age(2). Therefore, the aims of this study were to examine the associations between pre-school children’s eating behaviour traits and their dietary quality and to examine the moderating effect of children’s age on these associations. Cross-sectional survey data were collected online from mothers of pre-school aged children (2–5 years) from across Australia. The Children’s Eating Behaviour Questionnaire (CEBQ) measured four child eating behaviour traits: food fussiness, enjoyment of food, food responsiveness and satiety responsiveness. A validated thirteen-item food frequency questionnaire measured child dietary quality; 5 items measured healthy foods/behaviours, and 8 measured discretionary foods/unhealthy behaviours, with a maximum score of 65(3). Linear regression assessed associations between child eating behaviour traits and dietary quality, including interactions between child eating behaviour traits and child age. Of the 1367 respondents, half of the children were male (50.2%) and the mean age of the children was 3.3 years (SD = 1.0). The mean child dietary quality score was 51.9 (out of 65, range 21 to 64).
Enjoyment of food was positively associated with dietary quality (B coefficient: 2.51, p < 0.001), whilst food fussiness and satiety responsiveness were inversely associated with dietary quality (B coefficients: −2.59 and −2.25, respectively, p < 0.001), and food responsiveness was not related to dietary quality. Child age moderated the association between food fussiness and dietary quality (B coefficient: −0.38, p = 0.025), but not the associations with the other eating behaviour traits. The difference in dietary quality between lower and higher food fussiness was most pronounced among 5-year-old children. In conclusion, the findings from this study suggest that future interventions targeting the poor dietary quality of pre-school children should consider targeting children with lower food enjoyment or higher food fussiness or satiety responsiveness as possible ways to improve child dietary quality. Future interventions should also focus on strategies to reduce food fussiness in older pre-schoolers, as well as fussiness prevention strategies for younger pre-schoolers.
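The moderation result can be read as a simple-slope calculation: the conditional effect of fussiness at a given age is the main effect plus the interaction coefficient times the (centred) age. The coefficients below are the abstract’s reported values, but the assumption that age was mean-centred in the fitted model is ours, not stated in the abstract:

```python
# Simple-slope reading of the reported fussiness x age moderation.
B_FUSSINESS = -2.59    # main effect of food fussiness on dietary quality (from the abstract)
B_INTERACTION = -0.38  # fussiness x age interaction, per year of age (from the abstract)
MEAN_AGE = 3.3         # sample mean child age in years (from the abstract)

def fussiness_slope(age_years):
    """Conditional effect of food fussiness on dietary quality at a given
    child age, ASSUMING age was mean-centred in the model."""
    return B_FUSSINESS + B_INTERACTION * (age_years - MEAN_AGE)
```

Under this assumption the slope steepens from about −2.1 at age 2 to about −3.2 at age 5, consistent with the fussiness–quality gap being most pronounced among 5-year-olds.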
Diabetes-related foot ulcers (DFU) are common, with 56,000 Australians presenting with DFUs every year(1). Optimal nutrition is critical in wound healing; however, limited data are available on nutritional status and healing outcomes in DFU. Therefore, the aim was to summarise the work leading to the development of a co-designed intervention for individuals with DFU and to determine the dietary intake of individuals with DFU. Three separate studies were conducted: (1) qualitative interviews with individuals with DFU, (2) qualitative interviews with health care professionals working with people with DFU, and (3) comparison of current dietary intakes of individuals with DFU against international guidelines. To explore the perspective of individuals with DFU, a qualitative study using a reflexive thematic approach with conversational-style interviews was undertaken. A targeted heterogeneous sample of individuals with an active or recent history of DFU was recruited from a high-risk foot service in New South Wales, Australia. To gain an understanding of the perspectives of Australian health professionals involved in the care of individuals with DFU regarding nutrition assessment and management, semi-structured interviews were conducted nation-wide. To compare dietary intake, descriptive analysis was conducted. Dietary intake was collected using the Australian Eating Survey food frequency questionnaire and used to generate nutrient intake data. Nineteen interviews with individuals with DFU identified negative experiences with dietitians, who were seen as judgemental. Dietary misconceptions were common, with many participants holding unhealthy perceptions of food, and no participants had previously been given personalised dietary advice for wound healing. However, it was evident that participants were willing to do anything to improve their wound healing, including a dietary intervention. Participants expressed a strong preference for personalised, face-to-face dietary advice.
A total of 19 health professionals participated in interviews. Major barriers to implementation of nutrition assessment and management were identified: inadequate time, lack of knowledge and lack of clinical guidance. Facilitators included: professional development, a standardised clinical pathway and screening tool, and a resource addressing wound healing and diabetes management. One hundred and fifteen participants with DFU were included in the dietary analysis. Most individuals with DFU did not meet current consensus guidelines for optimal dietary intake for wound healing. Inadequate intake of protein, vitamin A, vitamin C, vitamin E and zinc was identified in 46%, 45%, 26%, 86%, and 37% of participants, respectively. This body of work highlighted that individuals with DFU are interested in receiving personalised medical nutrition therapy; however, they are not currently meeting the higher nutrient requirements for wound healing. Furthermore, health care professionals are not confident in supporting these individuals. These findings suggest that interventions for individuals with DFU should be co-designed from both the patient and clinician perspectives to increase the acceptability of the nutrition intervention.
Higher-quality Australian diets are reported to taste more bitter(1), have healthier nutritional profiles and align more closely with the recommendations of the Australian Dietary Guidelines(2). Greater consumption of bitter foods may benefit health, but most research has focused on green leafy vegetables(3). However, there are other foods and beverages (F&Bs) that taste bitter and could increase the bitterness of diets if consumed in greater amounts(2). Yet, strategies to increase bitter F&B consumption and enhance the bitterness of diets remain largely underexplored. An online cross-sectional survey of Australian adults was conducted (in July and August 2023) to explore barriers, facilitators, and strategies associated with willingness to try or increase consumption of bitter F&Bs. Eight non-discretionary bitter F&Bs available in the Australian market (coffee, tea, soda water, Brussels sprouts, rocket, grapefruit, walnuts, and eggplant) were selected. The design of survey questions was guided by conceptual models of food choice. Respondents were asked about their familiarity with and consumption habits of bitter F&Bs and their willingness to incorporate more bitter F&Bs into their diets. Respondents were grouped into those who had never tried bitter F&Bs, non-consumers or consumers, with consumers further categorised as low-, moderate- or high-consumers. This analysis focused on respondents with low bitter F&B consumption, non-consumers, and individuals who had never tried bitter F&Bs, as the potential to increase consumption was greatest. This study enrolled 879 respondents across Australia. Respondents had previously tried an average of six of the eight bitter F&Bs (median = 6). Most respondents (85.4%) were willing to increase their consumption of bitter F&Bs. While the bitter taste was consistently reported as the main barrier to greater consumption, the reported facilitators and strategies varied between consumer groups and the different F&Bs.
More than half of the respondents (61.1%) had never tried bitter vegetables (i.e., Brussels sprouts and rocket). For this group, ‘nutrition education’ (selected by 34%) and ‘appealing presentation’ (selected by 25.9%) were the most commonly selected facilitator and preferred strategy, respectively. Non-consumers of other bitter foods in the survey reported ‘price’ (selected by 43.8%) and ‘convenience’ (selected by 16.5%) as the most important facilitator and strategy, respectively. While ‘food availability’ (selected by 39.3%) was the common facilitator among low-consumers of bitter beverages, ‘easier preparation’ and ‘altering the taste’ (selected by 19.9% and 17.3%, respectively) were the most preferred strategies. This study provides valuable insights into the acceptability of bitter F&Bs among Australian adults. These findings could help tailor dietary interventions to groups of individuals based on their consumption habits of particular bitter F&Bs to support increased consumption. Further research is needed to understand whether improving bitter F&B consumption increases the bitterness of diets overall and whether this is associated with improved health outcomes.
Low vitamin D status (circulating 25-hydroxyvitamin D [25(OH)D] concentration < 50 nmol/L) affects nearly one in four Australian adults(1). The primary source of vitamin D is sun exposure; however, a safe level of sun exposure for optimal vitamin D production has not been established. As supplement use is uneven, increasing vitamin D in food is the logical option for improving vitamin D status at a population level. The dietary supply of vitamin D is low since few foods are naturally rich in vitamin D. While there is no Australia-specific estimated average requirement (EAR) for vitamin D, the Institute of Medicine recommends an EAR of 10 μg/day for all ages. Vitamin D intake is low in Australia, with mean usual intake ranging from 1.8–3.2 μg/day across sex/age groups(2), suggesting a need for data-driven nutrition policy to improve the dietary supply of vitamin D. Food fortification has proven effective in other countries. We aimed to model four potential vitamin D fortification scenarios to determine an optimal strategy for Australia. We used food consumption data for people aged ≥ 2 years (n = 12,153) from the 2011–2012 National Nutrition and Physical Activity Survey, and analytical food composition data for vitamin D3, 25(OH)D3, vitamin D2 and 25(OH)D2(3). Certain foods are permitted for mandatory or voluntary fortification in Australia. As industry uptake of the voluntary option is low, Scenario 1 simulated addition of the maximum permitted amount of vitamin D to all foods permitted under the Australia New Zealand Food Standards Code (dairy products/plant-based alternatives, edible oil spreads, formulated beverages and permitted ready-to-eat breakfast cereals (RTEBC)). 
Scenarios 2–4 modelled higher concentrations than those permitted for fluid milk/alternatives (1 μg/100 mL) and edible oil spreads (20 μg/100 g) within an expanding list of food vehicles: Scenario 2—dairy products/alternatives, edible oil spreads, formulated beverages; Scenario 3—Scenario 2 plus RTEBC; Scenario 4—Scenario 3 plus bread (which is not permitted for vitamin D fortification in Australia). Usual intake was modelled for the four scenarios across sex and age groups using the National Cancer Institute Method(4). Assuming equal bioactivity of the D vitamers, the range of mean usual vitamin D intake across age groups for males for Scenarios 1 to 4, respectively, was 7.2–8.8, 6.9–8.3, 8.0–9.7 and 9.3–11.3 μg/day; the respective values for females were 5.8–7.5, 5.8–7.2, 6.4–8.3 and 7.5–9.5 μg/day. No participant exceeded the upper level of intake (80 μg/day) under any scenario. Systematic fortification of all foods permitted for vitamin D fortification could substantially improve vitamin D intake across the population. However, the optimal strategy would require permissions for bread as a food vehicle, and addition of higher than permitted concentrations of vitamin D to fluid milks/alternatives and edible oil spreads.
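The core arithmetic of such scenario modelling can be sketched as follows. This is a minimal illustration, not the National Cancer Institute Method used in the study (which estimates usual-intake distributions from repeated survey days): each food record contributes amount × (natural + fortificant concentration)/100. All foods, amounts and concentrations below are hypothetical and do not reproduce values from the survey or the Food Standards Code.

```python
# Minimal sketch of scenario modelling: add a scenario's fortificant
# concentration (μg per 100 g or 100 mL) to each food's natural vitamin D
# content, then sum per person. All values here are hypothetical.

# one day of intake records per person: (food, grams, natural vit D μg/100 g)
intakes = {
    "p1": [("fluid milk", 250, 0.1), ("bread", 70, 0.0), ("spread", 10, 1.0)],
    "p2": [("fluid milk", 500, 0.1), ("cereal", 40, 0.0)],
}

# a Scenario 4-style permission list (hypothetical concentrations, μg/100 g)
fortification = {"fluid milk": 1.0, "spread": 20.0, "cereal": 2.0, "bread": 1.0}

def daily_vitamin_d(records, scenario):
    """Total vitamin D (μg/day): natural content plus scenario fortificant."""
    total = 0.0
    for food, grams, natural_per_100g in records:
        added_per_100g = scenario.get(food, 0.0)  # 0 if food is not a vehicle
        total += grams * (natural_per_100g + added_per_100g) / 100
    return total

scenario_totals = {p: round(daily_vitamin_d(r, fortification), 2)
                   for p, r in intakes.items()}
```

In the study this per-person total would then be fed into the usual-intake modelling step rather than reported directly.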
This paper is a continuation of a project to determine which skew polynomial algebras $S = R[\theta; \alpha]$ satisfy property $(\diamond)$, namely that the injective hull of every simple $S$-module is locally Artinian, where $k$ is a field, $R$ is a commutative Noetherian $k$-algebra and $\alpha$ is a $k$-algebra automorphism of $R$. Earlier work (which we review) and further analysis done here lead us to focus on the case where $S$ is a primitive domain and $R$ has Krull dimension 1 and contains an uncountable field. Then we show first that if $|\mathrm{Spec}(R)|$ is infinite then $S$ does not satisfy $(\diamond)$. Secondly, we show that when $R = k[X]_{\langle X \rangle}$ and $\alpha(X) = qX$, where $q \in k \setminus \{0\}$ is not a root of unity, then $S$ does not satisfy $(\diamond)$. This is in complete contrast to our earlier result that, when $R = k[[X]]$ and $\alpha$ is an arbitrary $k$-algebra automorphism of infinite order, $S$ satisfies $(\diamond)$. A number of open questions are stated.
This paper introduces a distributed online learning coverage control algorithm based on sparse Gaussian process regression to address the problem of multi-robot area coverage and source localization in unknown environments. Given the limitations of traditional Gaussian process regression in handling large datasets, this study employs multiple robots to explore the task area, gather environmental information and approximate the posterior distribution of the model using variational free energy methods, which serves as the input to the centroidal Voronoi tessellation algorithm. Additionally, to account for localization errors and the impact of obstacles, buffer factors and a centroidal Voronoi tessellation algorithm with separating hyperplanes are introduced for dynamic robot task-area planning, ultimately achieving autonomous online decision-making and optimal coverage. Simulation results demonstrate that the proposed algorithm ensures the safety of multi-robot formations, exhibits higher iteration speed and improves source localization accuracy, highlighting the effectiveness of the model enhancements.
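As a rough illustration of the coverage mechanism this abstract builds on, the sketch below runs a plain Lloyd-style centroidal Voronoi iteration over a discretised unit square, with a fixed Gaussian "source" density standing in for the sparse-GP posterior mean. The variational free-energy approximation, buffer factors, obstacles and separating hyperplanes from the paper are all omitted; the source location and robot start positions are hypothetical.

```python
# Lloyd-style centroidal Voronoi coverage sketch: robots repeatedly move to
# the density-weighted centroids of their Voronoi cells, so they concentrate
# where the estimated environmental density is high.
import math

GRID = [(x / 20, y / 20) for x in range(21) for y in range(21)]  # unit square

def density(p):
    """Hypothetical environment estimate: a Gaussian bump ("source") at (0.7, 0.3).
    In the paper this role is played by the sparse-GP posterior mean."""
    return math.exp(-20 * ((p[0] - 0.7) ** 2 + (p[1] - 0.3) ** 2))

def lloyd_step(robots):
    """One update: assign each grid point to its nearest robot (Voronoi cell),
    then move each robot to its cell's density-weighted centroid."""
    sums = [[0.0, 0.0, 0.0] for _ in robots]  # accumulators: [w*x, w*y, w]
    for p in GRID:
        i = min(range(len(robots)),
                key=lambda k: (p[0] - robots[k][0]) ** 2 + (p[1] - robots[k][1]) ** 2)
        w = density(p)
        sums[i][0] += w * p[0]
        sums[i][1] += w * p[1]
        sums[i][2] += w
    return [(sx / sw, sy / sw) if sw > 0 else r
            for (sx, sy, sw), r in zip(sums, robots)]

robots = [(0.1, 0.9), (0.9, 0.9), (0.5, 0.1)]
for _ in range(30):
    robots = lloyd_step(robots)
# the formation drifts toward and settles around the high-density region
```

The paper's contribution sits on top of this loop: the density is learned online by sparse GP regression, and the cell boundaries are modified by buffers and separating hyperplanes for safety.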
In Hot Mess: Mothering Through a Code Red Climate Emergency, Sarah Marie Wiebe delves into the important but often overlooked intersection of motherhood and the ongoing climate emergency. The book offers a thoughtful exploration of how new mothers experience the climate crisis and emphasizes the necessity of centring care in the discourse on environmental justice. Wiebe's main argument revolves around the need for pushing past individualism to truly centre care and community at the heart of climate crisis mitigation efforts. She draws from her own experiences as a new mother, as well as the experiences of other communities in Hawaii and rural settings across Canada, to illustrate the tangible impacts of climate change on maternal health and community well-being.
Nuts are nutrient-rich, energy-dense foods that are associated with better diet quality in children(1), yet intake in Australian children remains low(2). Prospective studies have demonstrated positive associations between nut consumption and cognitive performance in children(3), while randomised controlled trials (RCTs) assessing nut consumption and cognitive performance in adults have reported inconsistent findings(4). This 2-phase cross-over RCT examined the feasibility of Australian children eating an almond-enriched diet (30 g almonds, 5 days per week) compared with a nut-free diet for 8 weeks each. Associated changes in diet quality, lifestyle factors and cognitive performance were also measured. Forty children (48% female, 8–13 years) who were low habitual nut consumers (< 30 g/day) and free from nut allergies and cognitive, behavioural or medical conditions that could affect study outcomes were enrolled. Feasibility outcomes included retention, compliance with study foods and changes in ratings of liking and palatability of almonds. Other outcomes were assessed before and after each 8-week diet phase, separated by a 2-week washout. Parent/guardian–child dyads completed questionnaires about diet (diet quality score), physical activity and sleep behaviour. Sleep quality and length were recorded for 7 nights prior to clinic visits. At each visit, sleepiness was assessed (Karolinska Sleepiness Scale) before children completed a computerised test battery (COMPASS) assessing cognitive performance across attention/concentration, executive function, memory, processing speed and verbal fluency domains. Analyses were performed using SPSS 26.0 software, with statistical significance defined as p < 0.05. Data were analysed using mixed effects models, with diet and time as fixed effects, a random effect of participant ID, controlling for diet order, age, sex and sleepiness. Retention was excellent, with all participants completing the study, and mean compliance with the almond serves was 98%.
Mean liking and palatability ratings declined after 8 weeks (−23 points, p = 0.006) but remained favourable. There were no significant changes in diet quality, physical activity or sleep (behaviour, length or quality) during the study. Changes in cognitive performance over time and between diets ranged from trivial to small (Cohen’s d = 0.01–0.28) for all tests, reaching significance only for simple reaction time (faster response over time, d = −0.1, F(1,115.7) = 4.455, p = 0.037) and Peg and Ball response time (faster after the nut-free diet, d = 0.28, F(1,115.4) = 4.176, p = 0.043). This study demonstrated that it is feasible to conduct an almond-enriched dietary intervention in Australian children, with excellent retention and compliance with study requirements. Whilst few significant changes were observed in the scientific outcomes, the study was not designed or powered to detect them. Rather, these data will be valuable for determining required sample sizes in future studies assessing nut interventions and cognitive performance in children.
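For readers unfamiliar with the effect sizes quoted above, the sketch below computes Cohen's d as a pooled-SD standardised mean difference. The reaction-time values are hypothetical, and the study derived its effects from the mixed models described, not from this simple two-sample formula.

```python
# Cohen's d: difference in means divided by the pooled standard deviation.
# Values of ~0.2 are conventionally "small", ~0.5 "medium", ~0.8 "large".
import math
import statistics

def cohens_d(a, b):
    """Standardised mean difference (pooled SD) between two samples."""
    na, nb = len(a), len(b)
    var_a, var_b = statistics.variance(a), statistics.variance(b)
    pooled_sd = math.sqrt(((na - 1) * var_a + (nb - 1) * var_b) / (na + nb - 2))
    return (statistics.mean(a) - statistics.mean(b)) / pooled_sd

# hypothetical simple-reaction-time scores (ms) under each diet
almond = [412, 398, 405, 420, 391]
nut_free = [418, 402, 411, 426, 399]
effect = cohens_d(almond, nut_free)  # negative here: lower (faster) times on the almond diet
```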
The role of nutrition in diseases such as diabetes, cancer and cardiovascular disease has been widely explored; however, less is known about the role nutrition plays in the development and progression of chronic obstructive pulmonary disease (COPD). Despite limited research, studies have identified favourable associations between diets high in fruits and vegetables and a reduction in COPD incidence and severity. There are several potential mechanisms through which consuming adequate fruits and vegetables may decrease the risk and severity of COPD(1), including protection of the lungs through increased consumption of antioxidants and soluble fibre(1,2). The aim of this systematic review was to synthesise evidence on the effects of fruit and vegetable intake on COPD risk. A systematic search was completed across six databases up to July 2023. Studies reporting COPD risk and fruit and vegetable intake in people with and without COPD were assessed for inclusion. Twenty-six studies met our inclusion criteria and, of these, 21 were eligible for meta-analysis. This review found that both fruit and vegetable consumption are linked to a reduced risk of developing COPD. The meta-analysis confirmed that higher intakes of fruit and vegetables were significantly associated with a greater reduction in COPD incidence. Specifically, individuals who consumed higher amounts of fruit had a 17% lower risk of COPD compared to those with lower fruit intake (odds ratio, OR = 0.83; 95% CI: 0.73–0.94, n = 2107, p = 0.004, I² = 84%). Over a mean follow-up period of 13.07 years, the protective effect of fruit consumption against COPD appeared to be even more pronounced, with a 25% reduction in risk observed (hazard ratio, HR = 0.75; 95% CI: 0.67–0.84, n = 3770, p < 0.00001, I² = 0%).
Vegetable consumption was also associated with a significant reduction in COPD risk, with a 24% lower risk observed in individuals with higher vegetable intake compared to those with lower intake (OR = 0.76; 95% CI: 0.66–0.88, n = 2107, p = 0.0003, I² = 97%). Further analysis focusing on fibre content of fruits and vegetables demonstrated that high fruit fibre intake was associated with a 27% reduction in the risk of developing COPD (HR = 0.73; 95% CI: 0.65–0.83, n = 3901, p < 0.00001, I² = 0%) and higher vegetable fibre intake was associated with a 12% reduction in COPD risk (HR = 0.88; 95% CI: 0.79–0.99, n = 3901, p = 0.03, I² = 62%). The results from this review strongly support the beneficial effects of fruit and vegetable consumption in reducing the risk of COPD. Further work is warranted to understand the mechanisms that lead to these benefits.
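Pooled estimates like "OR = 0.83 (95% CI: 0.73–0.94)" are typically produced by an inverse-variance random-effects combination of log odds ratios. The sketch below implements the DerSimonian–Laird version of that calculation on three hypothetical studies; it is an illustration of the method, not a recomputation of the review's data, and the review's exact pooling model is not stated in the abstract.

```python
# Inverse-variance random-effects (DerSimonian–Laird) pooling of odds ratios.
# Study ORs and CIs below are hypothetical.
import math

# (odds ratio, 95% CI lower bound, upper bound) per hypothetical study
studies = [(0.80, 0.65, 0.98), (0.90, 0.75, 1.08), (0.78, 0.60, 1.01)]

def pooled_or(studies):
    """Return (pooled OR, 95% CI lower, upper) on the exponentiated log scale."""
    y = [math.log(or_) for or_, lo, hi in studies]                # log ORs
    se = [(math.log(hi) - math.log(lo)) / (2 * 1.96)              # SE from CI width
          for _, lo, hi in studies]
    w = [1 / s ** 2 for s in se]                                  # fixed-effect weights
    ybar = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
    q = sum(wi * (yi - ybar) ** 2 for wi, yi in zip(w, y))        # Cochran's Q
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(y) - 1)) / c)                       # between-study variance
    wr = [1 / (s ** 2 + tau2) for s in se]                        # random-effects weights
    mu = sum(wi * yi for wi, yi in zip(wr, y)) / sum(wr)
    se_mu = math.sqrt(1 / sum(wr))
    return (math.exp(mu), math.exp(mu - 1.96 * se_mu), math.exp(mu + 1.96 * se_mu))

or_, lo, hi = pooled_or(studies)
```

The I² values quoted in the abstract summarise how much of the between-study spread exceeds what sampling error alone would produce; it can be derived from the same Q statistic as max(0, (Q − df)/Q).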
Despite the healthful nature of plant-based diets (PBDs), there is potential for nutrient inadequacies(1). This study aimed to compare dietary intakes and nutritional adequacy in Australians following plant-based diets compared with a regular meat-eating diet (RME) in a cross-sectional study of adults (n = 240) aged 30–75 years. Participants had habitually consumed their dietary pattern for ≥ 6 months: vegan, lacto-ovo vegetarian, pesco-vegetarian, semi-vegetarian or RME (n = 48 per group). Dietary intake was assessed using a validated food frequency questionnaire and dietitian-administered diet histories. Multivariable regression was used to adjust for potential lifestyle and demographic confounders. Compared to RMEs, vegans and lacto-ovo vegetarians had significantly lower dietary intakes of protein (percentage of energy intake, EN%), saturated fat, trans fat, cholesterol, vitamin B12, iodine, riboflavin, niacin, sodium and long chain omega-3 polyunsaturated fatty acids (LCn-3PUFA), and higher intakes of carbohydrate (EN%), dietary fibre, vitamin E, folate, magnesium, iron and n-6PUFA, whereas pesco-vegetarians and semi-vegetarians had intermediate intakes. Individuals adhering to PBDs consumed significantly more vegetables, fruit (vegans only) and legumes/nuts, and fewer discretionary choices, compared to RMEs. All dietary patterns met the adequate intake for protein, exceeded it for fat, were below it for carbohydrate (EN%) and had adequate serves of fruit and vegetables, but not grains. Even including plant-based alternatives, vegans, lacto-ovo vegetarians and semi-vegetarians had inadequate serves of ‘meat/poultry/eggs/beans/nuts’, and semi-vegetarians and RMEs had inadequate serves of dairy. Vegans and lacto-ovo vegetarians had nutritional inadequacies in vitamin B12, LCn-3PUFA and iodine (and, among vegans, also calcium); pesco-vegetarians in iodine; and semi-vegetarians and RMEs in LCn-3PUFA.
PBDs, specifically vegan and lacto-ovo vegetarian diets, while significantly higher in beneficial nutrients and wholefood groups than RME diets, may lead to nutritional inadequacies if not planned appropriately.