Natural remission from common mental disorders (CMDs), in the absence of intervention, varies greatly across settings; rates of natural remission in India are unknown.
Aims
This study examined individual, village and primary health centre (PHC)-level determinants for remission across two rural communities in north and south India and reports natural remission rates.
Method
Using pre-intervention trial data from 44 PHCs in Andhra Pradesh and Haryana, adults (≥18 years) were screened for CMDs. Screen-positive people (Patient Health Questionnaire-9 Item (PHQ9) or Generalised Anxiety Disorder-7 Item (GAD7) score ≥10, or a score ≥2 on the PHQ9 self-harm question) were re-screened after a mean of 5–7 months. Remission was defined as scores <5 on both the PHQ9 and GAD7 and a score <2 on the self-harm question. Multilevel Poisson regression models with random effects at the individual, village and PHC levels were developed for each state to identify factors associated with remission. Time to re-screening was included as an offset in the regression models.
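The role of the re-screening interval can be sketched numerically: when remission counts are modelled as Poisson, the log of follow-up time enters the model as an offset (a term with a fixed coefficient of 1), so the regression estimates a rate per unit time rather than a raw count. A minimal illustration with hypothetical counts, not the trial data:

```python
import math

# With remissions ~ Poisson(rate * t), log E[remissions] = log(rate) + log(t),
# so log(t) enters the regression with a fixed coefficient of 1 (the "offset").
def remission_rate_ci(events, person_months):
    """Crude remission rate per person-month with a 95% CI (log-normal approximation)."""
    rate = events / person_months
    se_log = 1 / math.sqrt(events)      # SE of log(rate) for a Poisson count
    lower = rate * math.exp(-1.96 * se_log)
    upper = rate * math.exp(1.96 * se_log)
    return rate, lower, upper

# hypothetical: 400 remissions observed over 3000 person-months of follow-up
rate, lower, upper = remission_rate_ci(400, 3000)
```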
Results
Of 100 013 people in Andhra Pradesh and 69 807 people in Haryana, 2.4% and 7.1%, respectively, were screen positive. At re-screening, remission rate in Andhra Pradesh was 82.3% (95% CI 77.5–87.4%) and 59.4% (95% CI 55.7–63.3%) in Haryana. Being female, increasing age and higher baseline depression and anxiety scores were associated with lower remission rates. None of the considered village- and PHC-level factors were found to be associated with remission rate, after adjusting for individual-level factors.
Conclusion
Natural remission rates for CMDs vary greatly between the two Indian states and are associated with complex, multilevel factors. Further research is recommended to better understand natural remission.
Observational studies suggest that higher intake of cruciferous vegetables (e.g., broccoli, cauliflower, kale) is associated with lower chronic disease risk(1,2). Glucosinolates (GSL) and cysteine sulfoxides such as S-methyl cysteine sulfoxide (SMCSO) are sulfur-containing compounds found in high amounts in these vegetables(3). Currently, no data exist on SMCSO levels in Australian-grown cruciferous vegetables, data on GSL are limited, and the levels of SMCSO and GSL retained after various domestic cooking methods are unknown. This study sought to (1) quantify SMCSO and GSL in Australian-grown cruciferous vegetables and (2) identify the cooking methods that best retain these compounds. Using liquid chromatography–mass spectrometry, we quantified SMCSO and ten GSL in seven cruciferous vegetables before and after steaming, and additionally in broccoli before and after microwaving, stir-frying, and boiling. Each cooking method (steaming, 3 minutes; microwaving, 2 minutes; boiling, 3 minutes; stir-frying, 4 minutes) was chosen so that vegetables remained firm and not overcooked, mimicking healthy cooking recommendations(4). Student’s t-tests were used to compare raw and steamed levels for all vegetables, and analysis of variance with Tukey post-hoc tests assessed differences between raw and cooked broccoli (i.e., steamed, microwaved, boiled, stir-fried). Overall, SMCSO contributed a greater proportion of dry weight (0.6–1.9%) than all GSL combined (0.3–1.2%). SMCSO levels from lowest to highest were Chinese cabbage < white cabbage < cauliflower < kale < red cabbage < broccoli < Brussels sprouts (6–19 mg/g dry weight [DW]), and GSL levels were cauliflower < Chinese cabbage < red cabbage < kale < broccoli < white cabbage < Brussels sprouts (3–12 mg/g DW).
SMCSO increased after steaming (1–24%) in all vegetables except white cabbage (−31%), kale (−18%) and Chinese cabbage (−5%), but the change reached statistical significance only in Brussels sprouts (+16%, p < 0.05). Total GSL increased after steaming in most vegetables (by 1–34%), except in kale (−38%) and Chinese cabbage (−8%). Stir-frying and boiling broccoli led to significant losses in SMCSO (−34% and −50%, respectively) and in the two dominant GSL in broccoli, glucoraphanin (−47% and −52%, respectively) and glucobrassicin (−46% and −51%, respectively) (all p < 0.05). We have quantified SMCSO and GSL levels in a selection of Australian-grown cruciferous vegetables (broccoli, kale, Brussels sprouts, cauliflower, and red, white and Chinese cabbages) before and after cooking. SMCSO and GSL levels were relatively stable after light steaming, and light steaming or microwaving were the most preferable methods for retaining SMCSO and GSL levels in broccoli; boiling and stir-frying were the least favourable. These results have important implications when estimating intake of these beneficial sulfur-containing compounds.
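The retention comparison described above can be sketched as a percent change in mean concentration plus a Student’s t-test across paired replicates. The values below are hypothetical mg/g dry weight, not the measured data:

```python
from scipy import stats

raw     = [18.1, 19.0, 18.5, 18.8]   # hypothetical raw Brussels sprouts replicates (mg/g DW)
steamed = [21.0, 21.8, 21.4, 22.0]   # hypothetical steamed replicates of the same samples

def percent_change(before, after):
    """Percent change in the mean concentration after cooking."""
    mean_before = sum(before) / len(before)
    mean_after = sum(after) / len(after)
    return 100 * (mean_after - mean_before) / mean_before

change = percent_change(raw, steamed)    # ~ +16%, in line with the reported gain
t, p = stats.ttest_rel(raw, steamed)     # paired t-test across replicates
```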
Parents are pivotal in shaping healthy eating, physical activity and screen time behaviours in the early years(1). Early Childhood Education and Care (ECEC) services provide an ideal setting for parent communication initiatives to promote positive lifestyle behaviours in young children(2). This study aimed to determine the feasibility, acceptability and potential efficacy of the Healthy Adventures Book (HAB) pack in increasing parent and carer capacity to support positive dietary intake, physical activity and screen use behaviours of their 3–5-year-old children. ECEC services in western Sydney (n = 136) and families with 3–5-year-old children (n = 258) participated in the study. Families were provided with a HAB pack to take home, consisting of a scrapbook containing health information, a vegetable-shaped toy and a story book. Families were encouraged to read the information and story book, and to support their child to complete the activity in the book. A quasi-experimental mixed-methods design was used. Parents completed pre- and post-intervention questionnaires that included questions on demographics, and readiness and confidence to support behaviour change. Process data were collected from parents and ECEC directors. Semi-structured interviews were conducted with parents post-intervention. Changes in parent readiness and confidence were analysed using Mann–Whitney U tests. Thematic analysis was conducted on parent interview data. There was a significant improvement in parent confidence to support physical activity from pre- to post-intervention (p < 0.01). No significant changes were found for other behaviours. Process evaluation showed high acceptability, with 93% of parents reporting children were excited to use the pack, 91% finding it easy to complete, and 86% finding it useful for learning about healthy behaviours.
All ECEC directors agreed the pack was well-received, easy to implement, appealing to families, and facilitated conversations about health behaviours. Qualitative analysis revealed six key themes: whole family involvement, easy access to relevant health information, reinforcement of key health behaviours, vegetable intake, screen time, and continuation of learning. Parents reported the pack encouraged family engagement, provided useful strategies, and reinforced health messages. However, parents expressed that they would like ongoing support to maintain behaviour changes. Study limitations included a small sample size, no control group, and potential selection bias of already health-conscious families. In conclusion, the HAB pack was feasible and acceptable to both ECECs and families, demonstrating potential as a health promotion tool, particularly for encouraging physical activity. Further, it has the capacity to improve communication between the ECEC setting and home environment to ensure consistency of health messaging to children. More research is needed to determine efficacy and explore strategies for sustained behaviour change.
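The pre/post comparison of parent confidence described above can be sketched with a Mann–Whitney U test on ordinal responses. The 5-point Likert scores below are hypothetical, not the study’s data:

```python
from scipy.stats import mannwhitneyu

pre  = [2, 3, 3, 2, 4, 3, 2, 3, 3, 2]   # hypothetical baseline confidence ratings (1-5)
post = [4, 4, 3, 5, 4, 4, 3, 5, 4, 4]   # hypothetical post-intervention ratings

# rank-based comparison suited to ordinal Likert data
u, p = mannwhitneyu(pre, post, alternative="two-sided")
improved = p < 0.05
```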
A clear understanding of nutrient intake at a national scale is important to ensure food security into the future. National nutrition surveys are expensive and slow, with the latest Australian data from 2011–13. However, nutrients available for consumption in Australia can be easily accessed from the food balance sheets produced by the Food and Agriculture Organization of the United Nations (FAO), with the latest data from 2021. This project compares nutrient intake and nutrient supply data from 2011–2013 to determine whether nutrient supply data are an acceptable surrogate for estimating national nutrient intake. To compare national nutrient intake and supply, data were collected from the Australian Health Survey 2011–13(1), the 2011–2013 FAO food balance sheets(2), and the Nutrient Reference Values (NRVs) for Australia and New Zealand for 23 essential nutrients(3). Nutrient supply data were adjusted for both consumer waste and inedible portions, and for inedible portions alone. Nutrient intake and nutrient supply per capita per day were converted to a relative percentage of the target value from the NRVs. One-sample, two-tailed t-tests were conducted to identify statistical differences between intake and supply of individual nutrients, with p < 0.01 considered significant. For 18 nutrients there was a significant difference between nutrient intake and supply adjusted for inedible portions only; for 15 nutrients there was a significant difference between intake and supply adjusted for both consumer waste and inedible portions. There was no difference between intake and supply adjusted for inedible portions for calcium, dietary fibre, iodine, riboflavin, and long-chain omega-3 fatty acids.
For calcium, magnesium, zinc, iron, iodine, riboflavin, long-chain omega-3 fatty acids, and folate equivalents there was no difference between intake and supply adjusted for consumer waste and inedible portions. Adjusting supply data for both consumer waste and inedible portions reduced the differences between intake and supply, making supply a better representation of nutrient intake data. Nevertheless, this study found that nutrient supply data from the FAO, even after adjustment for inedible portions and consumer waste, were not suitable for estimating nutrient intake of the Australian population. A key limitation was the unavailability of consumer food waste data specific to Australia (the data used were representative of Oceania), which may further reduce the differences between supply and intake data. Assessing national dietary intake is challenging(4,5), but current survey practice cannot easily be replaced for understanding the nutrient intake of Australians.
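One plausible reading of the test above is a one-sample, two-tailed t-test of supply estimates (expressed relative to the intake value) against 100%. The percentages below are hypothetical, not the FAO or Australian Health Survey figures:

```python
from scipy.stats import ttest_1samp

# hypothetical supply estimates for one nutrient, expressed as % of the intake value
supply_pct_of_intake = [118, 124, 121, 130, 126, 119]

# test whether supply differs from intake (i.e. from 100%)
t, p = ttest_1samp(supply_pct_of_intake, popmean=100)
significant = p < 0.01   # the study's significance threshold
```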
Aerodynamic investigations are crucial for the efficient design of Lighter-than-Air (LTA) systems. This study explores the aerodynamic characteristics of conventional and multi-lobed airships, motivated by the growing interest in LTA systems due to advancements in materials science, energy sources, aerodynamics, propulsion technology and control systems. The study employs the k-epsilon turbulence model, which is well-suited for turbulent flow simulations, and the Semi-Implicit Method for Pressure Linked Equations (SIMPLE) algorithm, known for its effectiveness in pressure-velocity coupling in fluid dynamics simulations. The results indicate that multi-lobed airships offer enhanced aerodynamic efficiency over conventional designs. Detailed analyses of lift and drag coefficients provide insights into aerodynamic performance, guiding the optimisation of airship designs for improved efficiency. The findings of this study support the development of more aerodynamically efficient airship designs, which can serve as cost-effective, energy-efficient and quieter alternatives to traditional aircraft, particularly for applications such as surveillance, cargo transport and scientific research.
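The lift and drag coefficients analysed above follow the standard non-dimensionalisation by dynamic pressure; for airships the reference area is often taken as envelope volume to the two-thirds power. A minimal sketch with illustrative numbers (not results from the study’s simulations):

```python
def aero_coefficient(force_n, rho, v, s_ref):
    """Non-dimensional force coefficient: C = F / (q * S_ref), q = 0.5*rho*V^2."""
    q = 0.5 * rho * v ** 2          # dynamic pressure, Pa
    return force_n / (q * s_ref)

volume = 5000.0                     # m^3, hypothetical airship envelope volume
s_ref = volume ** (2 / 3)           # volumetric reference area, ~292 m^2

# hypothetical drag force of 1200 N at sea level, 15 m/s
c_d = aero_coefficient(force_n=1200.0, rho=1.225, v=15.0, s_ref=s_ref)
```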
Micronutrient malnutrition is a public health concern in many developing countries, including Sri Lanka. Rural poor households are more vulnerable to micronutrient malnutrition due to their monotonous rice-based diet, which lacks dietary diversification(1). Despite the potential of home gardens to increase food access and diversity, their contribution to household dietary diversity remains unclear. This study aimed to investigate the impact of home gardens on diet diversity among rural Sri Lankan households. Low-income households with children under five were randomly selected from the Samurdhi beneficiary list, and 450 households with a home garden agreed to be interviewed. We collected information on the types of crops and livestock produced over the past 12 months and their utilisation, together with the socio-demographic characteristics of the households. We measured household dietary diversity using the Household Dietary Diversity Score (HDDS) based on FAO guidelines. Multiple linear regression was used to identify predictors of HDDS. Complete data sets were available for 411 households, which were included in the analysis. The HDDS ranged from 3 to 10 with a mean of 6.4 (±1.37 SD), indicating a moderate level of dietary diversity. However, only 20.4% of households met the adequacy threshold of an HDDS above the third quartile(2). Cereals, and fats and oils, were the only food groups consumed by all households. Although many households produced fruits (67.2%) and reared livestock (48.2%), consumption of these groups was the lowest among the 12 food groups. Predictors of HDDS included monthly household income, which had a strong positive relationship, especially for earnings above 35,000 LKR (β = 1.02; S.E. = 0.246; p < 0.001). Surprisingly, living far from the market was associated with increased HDDS (β = 0.026; S.E. = 0.008; p = 0.004). Conversely, living further from the main road reduced the HDDS (β = −0.133; S.E. = 0.049; p = 0.007).
Growing staples reduced the HDDS (β = −0.395; S.E. = 0.174; p = 0.023), whereas growing leafy vegetables increased diet diversity (β = 0.394; S.E. = 0.154; p = 0.010). Selling homegrown products also increased HDDS (β = 0.276; S.E. = 0.136; p = 0.043). However, other covariates, such as the education level of the female adult, household food security status, home garden yield (kg), and livestock richness, which showed significant correlations in the bivariate analysis, were not significant in the multiple regression analysis. Although all households in this district engage in some form of home gardening, 79.6% of households did not have adequate dietary diversity. There is a need to understand how home gardens can better contribute to dietary diversity.
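The HDDS used above is, per the FAO guideline, a simple count of the food groups a household consumed out of 12. A minimal sketch with a hypothetical household record:

```python
# The 12 FAO food groups used for the Household Dietary Diversity Score
FOOD_GROUPS = [
    "cereals", "roots_tubers", "vegetables", "fruits", "meat", "eggs",
    "fish_seafood", "pulses_legumes_nuts", "milk_products", "oils_fats",
    "sugar_honey", "miscellaneous",
]

def hdds(consumed):
    """Count of the 12 FAO food groups the household consumed (range 0-12)."""
    return sum(1 for group in FOOD_GROUPS if group in consumed)

# hypothetical household consuming six groups
household = {"cereals", "oils_fats", "vegetables", "fish_seafood", "sugar_honey", "miscellaneous"}
score = hdds(household)   # 6, near the reported mean of 6.4
```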
Poor diet is a risk factor for chronic noncommunicable diseases and related mortality(1). The community food environment is one of the determinants of dietary quality(2). In high-income countries, the dietary impact of the community food environment is more pronounced in low socioeconomic areas(3). This study aimed to assess the healthiness of food outlets and its association with the Socioeconomic Index for Areas (SEIFA) and Local Government Area (LGA) in the Illawarra Shoalhaven region, Australia. A desk-based cross-sectional study was conducted using lists of registered food outlets obtained from local government area councils. The Food Environment Score, which classifies food outlets as healthy, less healthy, or unhealthy, was used to assess the healthiness of outlets(4). The Index of Relative Socio-economic Advantage and Disadvantage (IRSAD) at statistical area level two, extracted from the Australian Bureau of Statistics (ABS) 2021 census data, was used to define SEIFA. Logistic regression was conducted to identify associations between the healthiness of food outlets and LGA and SEIFA. Of the 1924 food outlets, 52.4% (n = 1008) were in Wollongong, 14.1% (n = 272) in Shellharbour, 8.3% (n = 160) in Kiama, and 25.2% (n = 484) in Shoalhaven LGA; 281 (14.6%) were categorised as healthy, 790 (41.1%) as less healthy, and 853 (44.3%) as unhealthy. The odds of an outlet being unhealthy, rather than healthy or less healthy, were twice as high in Wollongong as in Shoalhaven (AOR 2.0 (95% CI: 1.5, 2.5)), and 70% higher in both Shellharbour (AOR 1.7 (1.3, 2.3)) and Kiama (AOR 1.7 (1.1, 2.5)) than in Shoalhaven.
Compared with IRSAD quintile 3, the odds of an outlet being unhealthy were 40% lower in IRSAD 5 (AOR 0.6 (0.4, 0.8)) and 50% lower in IRSAD 4 (AOR 0.5 (0.3, 0.8)). A large proportion of food outlets were categorised as unhealthy or less healthy, and there were disparities in the healthiness of food outlets across LGAs and SEIFA. Intervention strategies need to be designed to increase the availability of healthy food outlets and limit unhealthy food outlets, particularly in low socioeconomic areas.
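The adjusted odds ratios above come from logistic regression; a crude (unadjusted) odds ratio with a 95% confidence interval can be computed directly from a 2×2 table with the Woolf log-OR method. The counts below are hypothetical splits, not the Illawarra Shoalhaven data:

```python
import math

def odds_ratio(a, b, c, d):
    """OR and 95% CI for the 2x2 table [[a, b], [c, d]], where a/b are
    unhealthy/other outlets in the exposure area and c/d in the reference
    area (Woolf method on log(OR))."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)   # SE of log(OR)
    lower = math.exp(math.log(or_) - 1.96 * se)
    upper = math.exp(math.log(or_) + 1.96 * se)
    return or_, lower, upper

# hypothetical counts: Wollongong 500 unhealthy / 508 other,
# Shoalhaven 160 unhealthy / 324 other
or_, lower, upper = odds_ratio(500, 508, 160, 324)   # ~2.0
```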
Parents and teachers have expressed concerns about the adequacy of time allocated to eat lunch at primary schools in Australia(1). Short school lunch durations can result in negative outcomes such as insufficient food consumption, resulting in hunger and inadequate energy and nutrition(2). A recent study reported that students consumed fewer fruits and vegetables when given 10 minutes to eat compared with 20 minutes(3), leading to increased food waste. We aimed to explore parents’ perceptions of time-related aspects of school lunch, including the sufficiency of time to eat school lunch, children requesting and parents providing quick-to-eat food, and the perception that healthy food takes longer to eat. Additionally, we aimed to explore whether these time-related perceptions and children’s age are associated with how often children finish their lunches. An online survey was conducted in 2022 to explore Victorian parents’ perceptions of primary school lunches, including the aspects mentioned above. Frequencies and percentages were calculated for all variables. Chi-square tests were used to explore the relationships between parents’ perceptions of time-related aspects and the child’s age, and how often children finish their lunches. Of 359 parents, 29% reported that their child sometimes, rarely, or never finishes their lunch. When asked about reasons for this, 20% chose ‘not enough time is provided to finish lunch’ and 19% chose ‘my child is more interested in playing than eating during lunchtime’. About half of parents (48%) strongly agreed or agreed (SA/A) that the allocated time at their child’s school is not sufficient to eat school lunch. Fifty percent of parents SA/A that their child asks them to pack easy-to-eat food, and 60% SA/A that they provide such food for school lunches.
However, the majority of parents (62%) strongly disagreed or disagreed (SD/D) that healthy foods take longer than less healthy food to eat during school lunchtime. More parents who SA/A with the statement ‘the allocated time at my child’s school is not sufficient to eat school lunch’ reported their child finishes lunch sometimes, rarely or never compared to parents who SD/D with this statement (36% vs 16%, Chi-sq = 11.372, p = 0.003). Parents’ perceptions regarding other time-related aspects were not associated with finishing lunches. More parents of children in prep to grade 2 compared to parents of children in grades 3 to 6 reported their child finishes their lunch sometimes, rarely or never (73% vs 49%, Chi-sq = 16.813, p < 0.001). The findings indicate that parents have concerns about the time allocated to eat lunch at primary schools. Increasing the time allocated to eating school lunches would help to ease these concerns and allow children, especially the younger primary school children, to eat comfortably and finish their lunches if they wish to do so.
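The chi-square association above can be sketched from a 2×2 table; the abstract reports percentages only, so the cell counts below are illustrative, not the survey data:

```python
from scipy.stats import chi2_contingency

# rows: parents who SA/A vs SD/D that lunch time is insufficient
# cols: child finishes lunch "sometimes/rarely/never" vs "always/often"
table = [
    [62, 110],    # hypothetical SA/A group (~36% not finishing)
    [24, 126],    # hypothetical SD/D group (~16% not finishing)
]

chi2, p, dof, expected = chi2_contingency(table)
associated = p < 0.05
```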
Neuropsychological disorders, including anxiety, depression, and dementia, are significant public health problems among older adults. While psychotropics are effective treatments, long-term treatment often has adverse side effects(1). Many patients seek healthy food consumption as an alternative preventive strategy. Dietary fibre has been suggested to have many health benefits, including for cardiometabolic health and inflammation, and may influence neurological health through the gut-brain axis(2). However, fibre’s role in neuropsychological health outcomes in older people is unclear. This study examined the potential role of dietary fibre intake and consumption of fibre-rich foods in neurological health outcomes in older Australians. We utilised data from the Memory and Ageing Study (MAS) of 1,037 participants aged 70–90(3). At baseline, consumption of dietary fibre, whole grains, fresh fruit, vegetables, and nuts and legumes was estimated using the Cancer Council of Victoria food frequency questionnaire. Intakes were further divided into tertiles (T), with T1 the lowest third and T3 the highest third. Depressive symptoms (Geriatric Depression Scale), anxiety symptoms (Goldberg Anxiety Scale), and psychological distress (Kessler Psychological Distress Scale) were assessed. Linear regression models were used to estimate beta coefficients for the cross-sectional associations. Incident dementia was defined using diagnostic criteria, clinical assessments, and consensus panel review; 963 participants were followed up from baseline (2005) until wave 4 (2011) [median: 5.8 (IQR: 3.1–5.9) years; 97 incident cases]. Incident depression was defined as diagnosis by a healthcare professional or treatment for depression; 809 participants were followed up from baseline (2005) until wave 3 (2009) [median: 3.9 (IQR: 1.9–4.0) years; 109 incident cases].
Cox proportional hazards models were used to estimate hazard ratios (95% CIs). All models were adjusted for demographic characteristics, lifestyle factors, and health history. Among 963 participants (mean age: 78.5; 5.8% females) in the cross-sectional analysis, compared with T1, higher vegetable intake was associated with fewer depressive symptoms (T2: β = 0.52; T3: β = −0.53; both p < 0.05), less psychological distress (T2: β = −0.59; T3: β = −1.13; both p < 0.05), and fewer anxiety symptoms (T3: β = −0.37; p = 0.03). Combined intake of vegetables and fruit was inversely associated with psychological distress symptoms (T2: β = −0.55; p = 0.06; T3: β = −1.3; p < 0.05). In the highest tertile, dietary fibre was associated with fewer depressive symptoms (T3: β = −0.47; p = 0.04). In the longitudinal analysis, dietary fibre intake was associated with a 43–56% lower risk of incident dementia (T2 vs T1: adj. HR = 0.57; 95% CI: 0.31–1.03; T3 vs T1: adj. HR = 0.44; 95% CI: 0.19–1.01). Intakes of whole grains, fruit, and nuts and legumes were not associated with the outcomes assessed. In a cohort of older Australians, dietary fibre intake appeared protective against depressive symptoms cross-sectionally and against incident dementia longitudinally. Additionally, vegetable consumption was associated with fewer symptoms of depression, anxiety, and distress cross-sectionally.
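Reading the Cox model output above: a hazard ratio (HR) below 1 maps to a percent risk reduction of 100 × (1 − HR), which is how the reported HRs of 0.57 and 0.44 correspond to the 43–56% figure in the text:

```python
def pct_risk_reduction(hr):
    """Percent risk reduction implied by a hazard ratio below 1."""
    return round(100 * (1 - hr))

t2_vs_t1 = pct_risk_reduction(0.57)   # 43% lower risk
t3_vs_t1 = pct_risk_reduction(0.44)   # 56% lower risk
```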
Telomere length is a biomarker of ageing(1). A shorter telomere length is associated with an increased risk of age-related diseases and mortality. Oxidative stress and inflammation are predominant mechanisms leading to telomere shortening(2). Diets and food groups high in antioxidant and anti-inflammatory properties are shown to be protective against telomere shortening(3). The nut and seed food group is rich in nutrients such as unsaturated fats, vitamins, and minerals, and contains antioxidant and anti-inflammatory phytochemicals. Evidence is emerging on the beneficial effects of nuts and seeds in the prevention and management of age-related chronic conditions. This review aimed to evaluate the role of nut and seed intake in telomere length in humans, using evidence from observational and interventional studies. Four databases (Medline, CINAHL, Embase and Web of Science) were systematically searched from inception to 12 March 2024 for observational and interventional studies that assessed nut or seed intake, or applied nut or seed interventions, and measured telomere length as an outcome in adult participants (age ≥ 18 years). The quality of the included studies was assessed using the Academy of Nutrition and Dietetics Evidence Analysis Library® November 2022: Quality Criteria Checklist. Nine observational and four interventional studies were included. A positive association between nut and seed intake and telomere length was reported in three of the nine observational studies. None of the interventional studies reported a significant positive effect of nuts on telomere length. Three of the observational and interventional studies were classified as high quality; the remaining studies were of neutral quality. Meta-analysis was not warranted due to the high heterogeneity in telomere length measurements across the studies.
The findings are inconsistent across these studies, and the evidence is insufficient to establish a beneficial role of nut and seed intake on telomere length. Larger epidemiological studies and adequately powered long-term randomised controlled trials are needed to establish a positive role of nuts and seeds on telomere length. However, nuts and seeds should continue to be recommended as part of a healthy diet, given their proven benefits against age-related conditions.
Specialised training opportunities in paediatric cardiology are rare for advanced practice providers, creating an educational gap for novice practitioners. Standardised curricula have been cited as a beneficial intervention to optimally prepare these providers for highly specialised fields. We sought to understand the current onboarding practices of advanced practice providers in paediatric acute care cardiology to identify opportunities for curricular improvement.
Materials and methods:
A survey developed by a task force of the Pediatric Acute Care Cardiology Collaborative (PAC3) was distributed across PAC3 programmes in May 2023 to evaluate the onboarding practices of advanced practice providers at paediatric heart centres nationwide.
Results:
Survey responses reflected onboarding practices at 19 paediatric heart centres representing varying programme and team sizes. Of the respondents, 32% felt their current model does not meet all the needs of the new team member. Key elements of successful onboarding included a structured curriculum with goals and objectives, dedicated education time and materials, standardised assessments, and individualised learning in the presence of a supportive team. All respondents agreed that an online curriculum would be beneficial.
Conclusions:
There is no national standardised educational pathway for advanced practice providers entering paediatric acute care cardiology practice. There are opportunities to develop a formalised curriculum with structured learner assessment at a national level, which could be modified at the institution or learner level to enhance current onboarding practices.
This study re-examines the fiscal collapse of late-Qing China by analyzing how the imperial household’s financial practices destabilized the dynasty’s governance equilibrium. Focusing on the post-1853 period, it argues that the Taiping Rebellion’s devastation of salt tax networks and customary revenue streams triggered a systemic rupture in the Qing’s dual patrimonial-bureaucratic fiscal structure. Deprived of traditional income, the Imperial Household Department abandoned its century-old fiscal segregation from the Board of Revenue, initiating coercive fund transfers in 1857 that persisted until 1908. These transfers eroded bureaucratic control over public expenditures while enabling unchecked imperial extraction through semi-privatized channels. Contrary to previous scholarship emphasizing provincial-central tensions, this study highlights how the imperial household’s ultra-bureaucratic prerogatives subverted fiscal discipline, replacing quota-based budgeting with ad hoc requisitions. The resulting institutional dysfunction – marked by path-dependent rent-seeking and stifled fiscal innovation – exacerbated the regime’s inability to reconcile patrimonial demands with bureaucratic rationalization. By exposing the collapse of the Qing’s historic governance dialectic, this study reframes the dynasty’s fiscal disintegration as a crisis of autocratic institutional design rather than mere resource scarcity, offering new insights into late-imperial state failure.
Malnutrition is prevalent in older adults and frequently coexists with sarcopenia(1), a condition characterised by low muscle mass and poor physical performance(2). Malnutrition and sarcopenia are associated with adverse outcomes in older adults, including mortality(3), and thus require early detection. The Mini Nutritional Assessment (MNA) is a validated nutrition screening and assessment tool for older adults(4), but its ability to predict poor muscle mass and physical performance is unclear. This study aimed to determine the association between MNA-determined (risk of) malnutrition and muscle mass and physical performance in community-dwelling older adults. This is a cross-sectional analysis of baseline data from the Capacity of Older Individuals after Nut Supplementation (COINS) study, a randomised controlled trial investigating the effect of peanut butter on functional capacity in older adults. Participants were generally healthy and at risk of falls (simplified fall risk screening score ≥ 2). Participants were screened for malnutrition risk (MNA-Screening score range 0–14; at risk or malnourished if < 11), followed by assessment (MNA-Assessment score range 0–16) to obtain the MNA-Total score (range 0–30; at risk or malnourished if < 23.5). Skeletal muscle mass index (SMMI) was derived from bio-impedance analysis. Physical performance, including muscle strength, gait speed, balance and power, was objectively measured using multiple standard tests. Linear regression analyses were performed and adjusted for age and sex. A total of 120 participants were included (70% female, age 74.8 ± 4.5 years). MNA-Screening, MNA-Assessment and MNA-Total scores were (median [IQR]) 14 [12–14], 14 [13.5–15] and 27.5 [26.0–28.4], respectively. Malnutrition (or risk thereof) was found in 18 (15.0%) and 8 (6.7%) participants according to MNA-Screening and MNA-Total, respectively.
A higher MNA-Screening score was associated with higher knee extension strength [β = 1.36 (Standard Error, SE 0.65) kg, p = 0.039]. A higher MNA-Assessment score was associated with higher gait speed [β = 0.04 (0.01) m/s, p = 0.007] and shorter timed up-and-go test time [β = −0.19 (0.09) seconds, p = 0.035]. The MNA-Total score was not significantly associated with muscle mass or physical performance. Malnutrition status as determined by the MNA-Screening score (but not the MNA-Total score) was associated with lower muscle mass (SMMI [β = −0.52 (0.26) kg/m2, p = 0.043]), but not with strength or physical performance. In summary, the MNA-Screening score was predictive of muscle mass and strength, whereas the MNA-Assessment score predicted physical performance, particularly gait speed, in community-dwelling older adults at risk of falls. Periodic malnutrition screening with the MNA may aid early detection of poor muscle mass and function in generally healthy older adults.
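The "linear regression adjusted for age and sex" above fits outcome ~ b0 + b1·MNA + b2·age + b3·sex, so b1 is the MNA association net of the covariates. A minimal numpy sketch with simulated data (not the COINS measurements; the true coefficient of 1.4 is an assumption of the simulation):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 120
mna = rng.uniform(8, 14, n)              # hypothetical MNA-Screening scores
age = rng.uniform(70, 85, n)             # hypothetical ages, years
sex = rng.integers(0, 2, n)              # 0 = male, 1 = female
# simulated strength with a true adjusted MNA effect of 1.4 kg per point
strength = 20 + 1.4 * mna - 0.2 * age - 3 * sex + rng.normal(0, 2, n)

# ordinary least squares: design matrix [intercept, MNA, age, sex]
X = np.column_stack([np.ones(n), mna, age, sex])
beta, *_ = np.linalg.lstsq(X, strength, rcond=None)
mna_effect = beta[1]                     # adjusted association with MNA score
```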
Chronic obstructive pulmonary disease (COPD) is a heterogeneous lung condition affecting 1 in 7 Australians ≥ 40 years(1) and is the third leading cause of death worldwide(2). COPD exacerbations are the leading cause of preventable hospital admission in Australia(3), with viral and bacterial infections being the primary cause. The gut and lungs share a mucosal immune system known as the gut-lung axis(4). The gut contains the body's largest community of microorganisms, known as the gut microbiota, and disruption to the gut microbiota is implicated in chronic diseases such as inflammatory bowel disease, obesity and type 2 diabetes(5). Dietary intake modulates the gut microbiota via several pathways. Importantly, the gut microbiota produce metabolites during digestion, and these microbial metabolites can modulate the immune system and exert both beneficial and detrimental effects systemically. Therefore, we aimed to assess the potential of gut-derived microbial metabolites to modify immune cell responses to stimuli known to induce COPD exacerbations. Gut microbial metabolites, including seven tryptophan metabolites and one secondary bile acid, were selected by literature review. In control adults (n = 8), peripheral blood mononuclear cells (PBMC) were isolated via Lymphoprep™ density gradient centrifugation and seeded at 2 × 10⁶ cells/mL. Metabolite dose curves (1–100 μM) were generated to determine optimal concentrations. After a 3-hour metabolite incubation, PBMCs were stimulated with either lipopolysaccharide (LPS) (24 hr) or influenza-A (H1N1) (48 hr). PBMC production of the anti-viral and pro-inflammatory cytokines IFN-γ, IL-6, TNF-α and IL-1β was assessed by DuoSet® ELISA. Cell viability post-metabolite incubation was confirmed via MTT assay. Cytokine dose curves following metabolite treatment and stimulation were produced (n = 4/metabolite).
Metabolite concentrations were selected based on reduction of LPS-induced TNF-α and IL-1β secretion, reduced influenza-A-induced IL-6 secretion, and increased influenza-A-induced IFN-γ secretion. Optimal metabolite concentrations included: the secondary bile acid lithocholic acid (1 μM), and the tryptophan metabolites nicotinamide (5 μM), indole-3-acetic acid (1 μM), kynurenic acid (5 μM) and 3-hydroxyanthranilic acid (10 μM). The tryptophan metabolite cinnabarinic acid was dose-dependently pro-inflammatory, while tryptamine and 2-picolinic acid had no effect. MTT assays (n = 3/metabolite) found the metabolites were not cytotoxic at the concentrations tested. These data show that gut-derived microbial metabolites modify immune cell responses to infection. These findings will inform further research examining the effects of microbial metabolites in immune cells from adults with COPD and assessing the role of the gut microbiota in infectious COPD exacerbations.
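The concentration-selection step above can be pictured as a simple rule over a dose curve: pick the concentration giving the strongest suppression of an LPS-induced cytokine relative to vehicle. The `optimal_dose` helper and the cytokine values below are illustrative placeholders, not the study's data.

```python
def optimal_dose(vehicle, dose_response):
    """Pick the concentration with the lowest cytokine level relative
    to the vehicle control.

    vehicle: mean cytokine in stimulated, untreated cells (pg/mL)
    dose_response: {concentration_uM: mean cytokine (pg/mL)}
    """
    return min(dose_response, key=lambda c: dose_response[c] / vehicle)

# Hypothetical LPS-induced TNF-alpha readings across a 1-100 uM dose curve
tnf_lps = {1: 820.0, 5: 640.0, 10: 700.0, 50: 760.0, 100: 900.0}
print(optimal_dose(1000.0, tnf_lps))  # 5 (uM): the largest suppression
```

In practice the abstract's selection balanced several criteria at once (TNF-α, IL-1β and IL-6 suppression plus IFN-γ enhancement), so the real rule is multi-objective rather than this single-cytokine minimum.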
One-fifth of the 101 million stroke survivors worldwide experience another stroke within the following five years. Research indicates that lifestyle risk factors account for 90% of stroke risk (and, similarly, of recurrent stroke risk), and improving diet quality is a promising strategy(1). Synthesised research suggests that adopting a Mediterranean-style diet(2) with reduced sodium intake(3) can enhance cardiovascular health, which may be beneficial for secondary stroke prevention. Our previous ENAbLE trial, a co-designed telehealth diet and exercise intervention, demonstrated improvements in diet quality among stroke survivors. However, post-stroke effects (such as fatigue, hemiplegia, memory issues, aphasia and dysphagia), along with poor culinary nutrition skills, limit full adoption of the Mediterranean-style diet. Culinary nutrition combines cooking skills and nutrition knowledge to help individuals create nutritious and fulfilling meals. A recent scoping review revealed only two culinary nutrition programs designed for stroke survivors, with none co-designed specifically for Australian survivors. This research aims to co-design a Mediterranean-style diet-based culinary nutrition program to enhance post-stroke nutrition. Using an Integrated Knowledge Translation model(4), three lived-experience research partners and six clinical researchers engaged in the co-design process. Preliminary interviews with stakeholders highlighted gaps in current stroke care. We conducted two focus groups with potential end-users, including six stroke survivors and seven clinicians. Data were analysed thematically, and prototypes were developed iteratively with end-users. Consumers (lived-experience research partners and end-users) identified a significant gap in practice during the early post-discharge period, emphasising the need for self-paced remote culinary nutrition resources.
They identified several facilitators (e.g., simple recipes, easy translation of nutrition knowledge, aphasia-friendly design) and barriers (e.g., fatigue, muscle weakness, cooking for one or for a household) in meal preparation. Consumers noted that while recipe books reduce the cognitive load of meal preparation, no stroke-specific resources are currently available. As a result, we developed a co-designed recipe book titled Cook Well After Stroke, incorporating feedback from end-users. The book features Mediterranean-style diet recipes for both single-serving and household/batch cooking. It is written in plain English and requires minimal ingredients, equipment and cooking skills. Recipes are highly adaptable to common Australian ingredients, encouraging users to repeat recipes with the resources they have available. The book includes nutrition principles for creating balanced meals, increasing protein and vegetable intake, and reducing sodium consumption. Each section includes hints and hacks tailored to post-stroke effects, providing strategies for one-handed cooking, softer diet textures and energy conservation. The recipe book will be integrated into a co-designed online cooking program, which will be trialled with stroke survivors.
Dietary intake modulates the gut microbiota by providing fermentation substrates. Both microbiota-accessible nutrients and digestible food components have been shown to modulate microbial abundance and function(1). A range of dietary assessment methods are used to investigate diet-microbe interactions; two commonly used methods are food frequency questionnaires (FFQ), which assess ‘habitual’ dietary intake, and food recalls, which measure recent intake proximal to microbiota sampling. This study aimed to compare diet-microbiome associations identified from habitual and proximal dietary intake aligned with stool microbiota sampling in a healthy adult cohort. Military trainees (n = 35) and non-military personnel (junior doctors during hospital placement; n = 21) self-reported proximal dietary intake using digital (Easy Diet Diary) or paper-based 24-hr recalls. Habitual intake was assessed using the Comprehensive Nutrition Assessment Questionnaire (CNAQ)(2) FFQ. Both measures were assessed at baseline and study completion. Diet recalls matched to the same week as the FFQ were analysed using Foodworks 10(3). Stool samples were collected for metagenomic shotgun sequencing and annotated against the Microba Life Sciences platform. MaAsLin2 identified linear associations between nutrients and microbe abundance, controlling for total energy intake and for individual variation with repeated measures. Thirty dietary variables common to both dietary assessment methods were used in the analysis. Mean daily intakes of total energy and macronutrients were not significantly different between habitual and proximal data. Nutrients that differed between methods were polyols (p < 0.001), sugar (p = 0.006), sodium (p = 0.03), alcohol (p < 0.001), vitamin A equivalents (p < 0.001), β-carotene equivalents (p < 0.001) and dietary fibre (p = 0.01). Associations between nutrient intake and microbes also differed between dietary collection methods.
Most significant associations were found with nutrients measured by 24-hr recall. Mean (M) proximal intake of polyols (M = 0.9 g, standard deviation (SD) = 1.8 g) was significantly associated with increased relative abundance of Akkermansia spp. and CAG460 spp., but habitual intake (M = 3.4 g, SD = 3.2 g) was not. Proximal alcohol intake (M = 2.5 g, SD = 8.8 g) was associated with CAG1427 spp. and Collinsella spp., associations not identified with habitual intake (M = 4.4 g, SD = 6.7 g). In contrast, habitual sugar intake (M = 149 g, SD = 103 g) was associated with Bacteroides spp. and Blautia spp. This association was not evident for proximal intake (M = 112 g, SD = 68 g), suggesting that some diet-microbiota associations may depend on the dietary assessment method used. These findings demonstrate the relevance of considering both habitual diet and proximal intake when conducting diet-microbiome research. Further analysis will investigate the role of these microbes and further associations between these nutrients and the functional capacity of the microbiota.
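MaAsLin2 fits a per-taxon linear model with a random subject effect to handle the repeated baseline/completion measures. A simplified stand-in is sketched below, assuming a within-subject (fixed-effects) estimator in place of the full mixed model; the subject IDs, nutrient values and abundances are toy data, not the study cohort.

```python
import numpy as np

def fe_beta(abundance, nutrient, subject):
    """Within-subject slope of taxon abundance on nutrient intake.

    Demeaning per subject removes each person's baseline level, which is
    a simplified stand-in for the random intercept MaAsLin2 fits per taxon.
    """
    def demean(v):
        out = v.astype(float).copy()
        for s in np.unique(subject):
            mask = subject == s
            out[mask] -= v[mask].mean()
        return out

    a, x = demean(abundance), demean(nutrient)
    return float(x @ a / (x @ x))

# Toy data: 4 subjects, baseline + completion, true slope 0.5,
# plus a constant per-subject offset that the demeaning removes
subject = np.repeat(np.arange(4), 2)
nutrient = np.array([1.0, 3.0, 2.0, 5.0, 0.5, 2.5, 4.0, 6.0])  # g/day
offset = np.repeat([2.0, -1.0, 0.5, 3.0], 2)
abundance = 0.5 * nutrient + offset  # toy log relative abundance

print(fe_beta(abundance, nutrient, subject))  # 0.5
```

The subject offsets cancel exactly after demeaning, so the toy slope of 0.5 is recovered; MaAsLin2 additionally adjusts for covariates such as total energy intake and corrects p-values across the many taxa tested.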
Current clinical guidelines for people at risk of heart disease in Australia recommend nutrition intervention in conjunction with pharmacotherapy(1). However, Australians living in rural and remote regions have less access to medical nutrition therapy (MNT) provided by Accredited Practising Dietitians (APDs) than their urban counterparts(2). The aim of the HealthyRHearts study was to trial the delivery of MNT by APDs using telehealth to eligible patients of General Practitioners (GPs) located in small to large rural towns in the Hunter New England region(3) of New South Wales, Australia. The study design was a 12-month pragmatic randomised controlled trial. The key outcome was reduced total cholesterol. The study was place-based, meaning many of the research team and APDs were based rurally, to ensure the context of the GPs and patients was already known. Eligible participants were those assessed by their GP as being at moderate-to-high risk of cardiovascular disease (CVD). People in the intervention group received five MNT consultations (totalling two hours) delivered via telehealth by APDs, and also answered a personalised nutrition questionnaire to guide their priorities and to support personalised dietary behaviour change during the counselling. Both intervention and control groups received usual care from their GP and were provided access to the Australian Eating Survey (Heart version), a 242-item online food frequency questionnaire with technology-supported personalised nutrition reports that evaluated intake relative to heart-healthy eating principles. Of the 192 people who consented to participate, 132 were eligible due to their moderate-to-high risk. Pre-post participant use of medications with a registered indication(4) for hypercholesterolemia, hypertension and glycaemic control was documented according to class and strength (defined daily dose: DDD)(5).
Nine GP practices (with 91 participants recruited) were randomised to the intervention group and seven practices (41 participants) were randomised to control. Intervention participants attended 4.3 ± 1.4 of the 5 dietetic consultations offered. Of the 132 people with baseline clinical chemistry, 103 also provided a 12-month sample. Mean total cholesterol at baseline was 4.97 ± 1.13 mmol/L for both groups, with a 12-month reduction of 0.26 ± 0.77 mmol/L for intervention and 0.28 ± 0.79 mmol/L for control (p = 0.90, unadjusted). The median (IQR) number of medications for the intervention group was 2 (1–3) at both baseline and 12 months (p = 0.78), compared with 2 (1–3) and 3 (2–3) for the control group, respectively. Combined DDD of all medications was 2.1 (0.5–3.8) and 2.5 (0.75–4.4) at baseline and 12 months (p = 0.77) for the intervention group, and 2.7 (1.5–4.0) and 3.0 (2.0–4.5) for the control group (p = 0.30). Results suggest that medications were a significant contributor to the management of total cholesterol. Further analysis is required to evaluate changes in total cholesterol attributable to medication prescription relative to the MNT counselling received by the intervention group.
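Combined DDD as reported above is conventionally computed by expressing each medication's prescribed daily amount as a multiple of its WHO-assigned defined daily dose and summing per participant. A sketch under that convention is below; the drug list and any resemblance to the trial's actual medication data are illustrative only.

```python
# WHO DDD reference values (mg/day) for a few example drugs; the trial's
# actual drug list and classes are not reproduced here.
WHO_DDD_MG = {"atorvastatin": 20, "ramipril": 2.5, "metformin": 2000}

def combined_ddd(prescriptions):
    """Sum of (prescribed mg/day) / (WHO DDD mg/day) across medications.

    prescriptions: list of (drug_name, prescribed_mg_per_day) tuples.
    """
    return sum(mg / WHO_DDD_MG[drug] for drug, mg in prescriptions)

# A participant on double the reference dose of two drugs
print(combined_ddd([("atorvastatin", 40), ("ramipril", 5)]))  # 4.0
```

A combined DDD of 4.0 here means the participant's total pharmacotherapy equals four reference daily doses, which is how medication "strength" can be compared across drug classes as in the abstract.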
The colonisation of Australia around 250 years ago resulted in significant disruptive changes to the lifestyle and diet of Aboriginal and Torres Strait Islander peoples. Traditional foods high in micronutrients, including vitamin D, have been largely replaced with energy-dense foods(1). Sun exposure, a primary source of vitamin D, may be reduced due to changes in clothing and housing structure(2). Consequently, there is a high prevalence of vitamin D deficiency (serum 25-hydroxyvitamin D concentration < 50 nmol/L) and low vitamin D intake among Aboriginal and Torres Strait Islander peoples(2,3). There is a need for a public health strategy to improve vitamin D status. Since few foods naturally contain vitamin D (e.g., fish, eggs and meat), food fortification could be a suitable public health strategy to increase vitamin D intake without requiring changes to consumption behaviour. In Australia, besides foods mandated for fortification (e.g., edible oil spreads), few foods permitted for voluntary fortification are routinely fortified. We aimed to model vitamin D food fortification scenarios among Aboriginal and Torres Strait Islander peoples. We used nationally representative food consumption data from the 2012–2013 National Aboriginal and Torres Strait Islander Nutrition and Physical Activity Survey (n = 4,109) and analytical vitamin D food composition data(4) to model four food fortification scenarios. Scenario 1 modelled the addition of the maximum permitted amount of vitamin D to all foods permitted for fortification in Australia: i) dairy products and alternatives, ii) butter/margarine/oil spreads, iii) formulated beverages (e.g., water with added sugar, vitamins and minerals), and iv) selected ready-to-eat breakfast cereals. Scenarios 2a–c included vitamin D concentrations higher than permitted in fluid milks/alternatives (1 μg/day) and butter/margarine/oil spreads (20 μg/day).
Scenario 2a: i) dairy products and alternatives, ii) butter/margarine/oil spreads, iii) formulated beverages. Scenario 2b: as per Scenario 2a plus selected ready-to-eat breakfast cereals. Scenario 2c: as per Scenario 2b plus bread (not currently permitted for vitamin D fortification in Australia). Vitamin D fortification of a range of staple foods could potentially increase vitamin D intake among Aboriginal and Torres Strait Islander peoples by ~3–6 μg/day. Scenario 2c showed the highest potential increase in median vitamin D intake, from 2 μg/day at baseline to ~8 μg/day. Across all scenarios, the vitamin D intake of all participants remained below the Australian Tolerable Upper Intake Level of 80 μg/day. Our findings demonstrate that vitamin D fortification of a range of staple foods could potentially increase vitamin D intake among Aboriginal and Torres Strait Islander peoples in Australia. However, the most impactful fortification strategy (Scenario 2c) would require a revision of the Australia New Zealand Food Standards Code to permit the addition of higher amounts of vitamin D than currently allowed and the inclusion of bread as a food vehicle for fortification.
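The scenario modelling described above amounts to adding a fortificant amount to each eligible food's vitamin D composition and re-summing over each person's reported consumption. A minimal sketch of that calculation follows; the food names, composition values and fortification levels are illustrative placeholders, not the survey's or the Food Standards Code's actual figures.

```python
# Hypothetical fortification scenario: extra vitamin D (ug per 100 g)
# added to eligible food vehicles; foods absent from this dict are
# not fortified under the scenario.
FORTIFICATION_UG_PER_100G = {"milk": 1.0, "margarine": 20.0}

def scenario_intake(diary, baseline_vitd_per_100g):
    """Total vitamin D intake (ug/day) under the fortification scenario.

    diary: {food: grams consumed per day}
    baseline_vitd_per_100g: {food: natural vitamin D, ug per 100 g}
    """
    total = 0.0
    for food, grams in diary.items():
        per_100g = baseline_vitd_per_100g.get(food, 0.0)
        per_100g += FORTIFICATION_UG_PER_100G.get(food, 0.0)
        total += grams * per_100g / 100.0
    return total

# One illustrative respondent's daily consumption (grams)
diary = {"milk": 300, "margarine": 20, "bread": 100}
baseline = {"milk": 0.1, "margarine": 5.0, "bread": 0.0}
print(round(scenario_intake(diary, baseline), 2))  # 8.3
```

Running this over every survey respondent and taking the median per scenario reproduces the shape of the analysis reported above, with intakes then checked against the 80 μg/day Tolerable Upper Intake Level.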