The Mental Health Act assessment or interview is a commonplace process in psychiatric services during which significant decisions are made about a person’s care and liberty. Individuals have reported negative experiences of being subjected to these assessments, at times even influencing their ongoing relationship with healthcare and recovery. A 2018 independent review of the Mental Health Act 1983 (MHA) for England and Wales identified numerous areas for practice improvement, including the identification of epistemic injustice as part of current MHA processes. Nevertheless, little attention has been paid to how the assessment itself is conducted, with scant clinical guidance, training or research available on the subject. In this article the authors propose seven principles that assessors can incorporate into interviews to improve the way in which the MHA assessment is conducted. These principles have been drawn from a dialogical and relational approach to psychiatric care called Open Dialogue. A dialogical approach to MHA assessment could improve experiences of being assessed and the information gathered, and, by preserving therapeutic relationships, lead to better longer-term outcomes.
Management of mental health disorders often includes nutritional therapy, and guidelines for monitoring require pathology tests. This includes, but is not limited to, individuals with alcohol and other drug (AOD) issues and weight-control issues in cases of metabolic syndrome (obesity, hypertension, dyslipidaemia and diabetes), which involves cardiology or cardiovascular medicine management. The extent of compliance with evidence-based practice, including laboratory tests(1) such as routine full blood count(2), electrolytes and liver function tests, is a consideration in evaluating nutritional management and monitoring. The primary objective of this review was to determine compliance with guidelines in case studies involving nutritional management. The secondary objective was to evaluate the pathology results against cardiovascular disease management guidelines. A systematic literature review and meta-analysis approach was adopted in identifying and selecting the articles appraised. The search was not limited by year of publication, and the initial search engine was PubMed. The appraisal tool was a simple objective questionnaire based on evidence-based practice in the nutritional management of AOD, using a reference template. An additional grey literature search was conducted to provide nuance to the systematic review. Compliance with evidence-based practice was quantified by calculating the percentage of expected ‘yes’ responses. For pathology tests, the focus was predominantly on coagulation profile, haematology, lipid profile and liver function tests. Of the > 548,000 titles initially identified, only three were selected for critical appraisal, and three additional documents were selected from the grey literature search. All six articles appraised showed 98% compliance with pathology guidelines.
Laboratory evidence-based monitoring was implied in five, of which four were related to cardiology, and four reports indicated or inferred laboratory monitoring of dyslipidaemia only. None of the articles mentioned coagulation profile, haematology or liver function tests. This discourse advances that, for almost 30 years, there has been knowledge of a strong link between nutritional management and cardiovascular disease management, including in mental healthcare, which can be assessed with eWBV from pathology(3). There is excellent compliance with evidence-based practice in research reports involving nutritional management in mental health cases. However, laboratory evidence-based monitoring for cardiovascular medicine appears incomplete. In view of cardiovascular disease management guidelines, this incompleteness may be a matter of discretion.
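The compliance metric described above is simple percentage arithmetic; a minimal sketch (the checklist format here is invented for illustration) could look like:

```python
# Hypothetical sketch: compliance quantified as the percentage of
# expected 'yes' responses on an appraisal checklist.
def compliance_percentage(responses):
    """responses: list of 'yes'/'no' answers to the checklist items."""
    if not responses:
        raise ValueError("empty checklist")
    yes = sum(1 for r in responses if r.lower() == "yes")
    return 100.0 * yes / len(responses)

# e.g. an article answering 'yes' on 49 of 50 items scores 98%
print(compliance_percentage(["yes"] * 49 + ["no"]))
```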
Eggs are a unique food that is high in cholesterol but low in saturated fat. Egg consumption recommendations have fluctuated over time due to the belief that increased intake of dietary cholesterol raises plasma low-density lipoprotein cholesterol (LDL-C) and therefore cardiovascular disease risk(1). Research suggests it is saturated fat, rather than dietary cholesterol, that is implicated in this association, yet controversy over egg consumption remains(1,2). This study aimed to evaluate the independent effects of dietary cholesterol (from eggs) and saturated fat intakes on LDL-C. Sixty-one adults with LDL-C less than 3.5 mmol/L (39 ± 2 years, BMI 25.8 ± 0.8 kg/m²) were enrolled in a randomised controlled, counter-balanced, three-arm cross-over study(3). Participants consumed three isocaloric diets for five weeks each in randomised order: a high-cholesterol (600 mg)/low-saturated fat (6%) diet including two eggs per day (EGG), a low-cholesterol (300 mg)/high-saturated fat (12%) diet without eggs (EGG-FREE), and a control diet (CON) high in both cholesterol (600 mg) and saturated fat (12%) including one egg per week. Each diet phase included eight detailed daily meal plans with recipes, which were used on rotation for the five-week period. Throughout each dietary phase participants attended three diet review consults (via video conference or phone) and received individualised dietary advice from a dietitian. Dietary intake (5-day diaries analysed using Foodworks, Xyris Software, Australia), and lipid and lipoprotein levels were measured at study entry and at the end of each diet phase. Treatment effects were analysed using linear mixed effects models. Results are reported as mean ± standard error. Forty-eight participants completed all three diets, with dietary analyses demonstrating that target cholesterol and saturated fat intakes were generally met for each diet.
Notably, saturated fat intake was 2% higher than target for all diets (CON and EGG 14%; EGG-FREE 8%). Compared to CON, plasma LDL-C concentration was significantly lower following the EGG diet (2.83 ± 0.08 mmol/L vs 2.68 ± 0.08 mmol/L, p = 0.02), but not the EGG-FREE diet (2.75 ± 0.08 mmol/L, p = 0.52). Across all three diets there was a significant within-individual relationship between dietary saturated fat intake and LDL-C concentration (β = 0.35, p = 0.002), but there was no significant relationship with dietary cholesterol intake (β = −0.006, p = 0.42). Our findings indicate that dietary saturated fat, not cholesterol, is responsible for elevating plasma LDL-C concentrations. Consuming two eggs per day within a low-saturated fat diet does not adversely affect plasma LDL-C.
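The within-individual relationship reported above can be illustrated with person-mean centring, a simplified stand-in for the linear mixed effects models the authors used (all data values below are invented):

```python
# Sketch: within-individual slope of LDL-C on saturated-fat intake via
# person-mean centring - a simplification of the linear mixed effects
# models described in the abstract. All data values here are invented.
def within_individual_slope(records):
    """records: list of (participant_id, sat_fat_percent_energy, ldl_mmol_l)."""
    by_person = {}
    for pid, x, y in records:
        by_person.setdefault(pid, []).append((x, y))
    xs, ys = [], []
    for obs in by_person.values():
        # Centre each participant's values on their own means, so the slope
        # reflects within-person change, not between-person differences.
        mx = sum(x for x, _ in obs) / len(obs)
        my = sum(y for _, y in obs) / len(obs)
        for x, y in obs:
            xs.append(x - mx)
            ys.append(y - my)
    # Least-squares slope of centred LDL-C on centred saturated-fat intake.
    return sum(x * y for x, y in zip(xs, ys)) / sum(x * x for x in xs)

data = [  # (participant, saturated fat %E, LDL-C mmol/L) - invented
    (1, 6, 2.6), (1, 12, 2.9), (1, 14, 3.0),
    (2, 8, 2.5), (2, 12, 2.7), (2, 13, 2.8),
]
print(round(within_individual_slope(data), 2))
```

A positive slope here plays the role of the within-individual β for saturated fat; a full mixed model would additionally estimate random intercepts and proper standard errors.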
Within contemporary feminism, perspectives on men being feminist range from those who are “viscerally opposed” to those who argue that engaging with men more systematically is “the most consequential move feminists can make.” While some feminist political scholars have called to expand the feminist agenda to include analyses of men as gendered, resistance to this expansion is significantly entrenched, and men who identify as feminist are frequently regarded with distrust. Yet, if feminist efforts are to transform deeply entrenched gendered power structures, there is good reason to engage fully with the many ways all conceptions and practices of gender work to maintain and/or challenge current power structures. This article offers a relational approach to feministing—that is, an approach grounded in becoming feminist through praxes centred on uncovering points of solidarity across and within gender identity categories, the pursuit of coalition-oriented politics, and the prioritization of accountability through action, not identity.
Nutrition plays a key role in brain development in the first 1000–2000 days of life(1). Fussy eating is broadly defined as the inconsistent rejection and acceptance of both familiar and unfamiliar foods(2). It is reportedly found in 10–15% of 2–3-year-old children, although it typically starts to decrease in prevalence by 4 years of age(3). Despite this decrease in the general population, children with neurodevelopmental disabilities, such as autism spectrum disorder, can experience a protracted continuation of fussy eating and reject approximately 30% more foods than typically developing children(4). Consequently, for some children with neurodevelopmental disabilities, maintaining adequate nutritional intake can be a challenge. Key nutrients for optimal neurodevelopment include iron, omega-3, protein, zinc and folate(1), and underconsumption of these nutrients can lead to poorer developmental outcomes in some children(1). Limited empirical evidence has been published on fussy eating in children with neurodevelopmental disabilities, and no studies have examined the effects of poor diet variety on their emotional regulation. This study aims to systematically review current evidence to determine the association between fussy eating, diet variety and experiences of emotional regulation in children with neurodevelopmental disability. The search strategy was designed with database-specific index phrasing and was modified and tested several times before the formal search day. The systematic literature review was conducted across Medline, Scopus, Embase, CINAHL and Google Scholar. Studies were included if they were published between 2014 and 2024 and involved children aged 1 to 9 years with a diagnosed or suspected neurodevelopmental condition. Articles were excluded if children were following diets restricted by caregivers, such as vegetarian or ketogenic diets.
Studies which focused primarily on a psychological outcome were also excluded as being outside the nutrition-related scope of the research. All articles were stored in EndNote, with duplicates removed before screening. The search yielded 500 articles; following screening, 10 full-text articles met all inclusion criteria. Following data extraction, results illustrated that children with neurodevelopmental disabilities and observed fussy eating behaviours can experience difficulties regulating emotions. Additionally, diet variety was found to consist primarily of processed grains and meats, with minimal wholegrain and vegetable intake. Further research is needed to understand the aetiology of, and causative pathways between, fussy eating, diet variety and emotional regulation in children with neurodevelopmental disability to better inform potential dietary interventions in this population.
Immunoceuticals are natural products used to enhance immunity(1). Lactoferrin (Lf) is an immunoceutical supplement which has been shown to have immunomodulatory properties(2). The immuno-protective functions of Lf are of interest in older adults, as immune function declines with increasing age(3). This study examined the effects of oral Lf supplements on ex vivo immune cell responses to respiratory virus infection, circulating immune cell subsets, and systemic inflammation. Healthy adults (≥ 50 years old, n = 103) were randomised to High dose (600 mg/d) or Low dose (200 mg/d) Lf or placebo, in a 4-week, parallel, double-blinded trial. Ex vivo release of the cytokines interferon (IFN)-α2, IFN-γ, interleukin (IL)-6 and tumour necrosis factor (TNF)-α in isolated peripheral blood mononuclear cells (PBMCs) infected with rhinovirus A-16 (RV-16) and influenza A virus (H1N1), circulating immune cell subsets, and plasma IL-6, C-reactive protein (CRP) and TNF-α were assessed and analysed by multivariate regression models. Analysis included 102 participants at baseline, and 96 participants at follow-up. High dose Lf decreased RV-16-induced IL-6 (p = 0.001 vs placebo), and increased RV-16-induced IFN-α2 (p = 0.041 vs low dose) in PBMCs. H1N1-induced IL-6 decreased following Low dose Lf and placebo (p = 0.009, p = 0.021 vs baseline), while High dose Lf increased H1N1-induced TNF-α (p = 0.023 vs low dose, p = 0.049 vs placebo) and decreased H1N1-induced IFN-γ (p = 0.032 vs baseline) in PBMCs. High dose Lf increased total T cells (p = 0.031), CD4+ T cells (p = 0.028) and BDCA-1 cells (p = 0.016), and decreased γδ T cells (p = 0.046) compared to placebo, while Low dose Lf reduced circulating neutrophils (p = 0.044), natural killer cells (p = 0.045), activated CD8+ T cells (p = 0.031), and γδ T cells (p = 0.031) compared to placebo. High dose Lf decreased plasma IL-6 and CRP compared to Low dose Lf (p = 0.004, p = 0.026), but not placebo.
There was no difference between intervention groups in the number of adverse events. This 4-week trial in healthy, older adults showed both High and Low dose Lf interventions enhanced ex vivo immune cell responses to respiratory virus infection, with decreased pro-inflammatory cytokine and increased anti-viral cytokine release observed. Low dose Lf reduced the frequency of both pro-inflammatory and cytotoxic innate immune cells, while increased T-cell populations following High dose Lf indicate improved cellular adaptive immune responses which provide protection against infection, tumours and chronic disease. Effects on systemic inflammation were only seen following High dose Lf, suggesting higher doses of lactoferrin are required to address this outcome. Oral lactoferrin supplements are generally regarded as safe, and appear to have immunoceutical benefits in healthy, older adults.
With the food system estimated to be responsible for approximately one-third of greenhouse gas emissions(1), there is an urgent need to transition to healthy and more environmentally sustainable diets. Plant-based ‘milks’ are associated with lower greenhouse gas emissions than dairy milks(2) and many Australian consumers are making the substitution(3). The 2013 Australian Dietary Guidelines advise that plant-based ‘milks’ fortified with at least 100 mg of calcium per 100 ml (e.g., soy, rice or other cereal) can replace dairy milk in the diet(4). This study aimed to assess the likely population-wide nutritional implications of replacing dairy milk with the main categories of plant ‘milks’ available in Australian supermarkets in November 2023. We used computer simulation modelling of data from the 2011–12 National Nutrition and Physical Activity Survey (n = 12,153 persons aged 2+ years)(5). Dairy milk (including from hot drinks) was replaced with each category of plant ‘milk’ and the likely impact on usual intake of key nutrients supplied by dairy milk was assessed across eight age groups (National Cancer Institute method). Mean usual protein intake was relatively unchanged when dairy milk was replaced by soy ‘milk’, but replacement by rice ‘milk’ led to reductions of 4–5% in older adults (71+ years), increasing the proportion of older men with an inadequate intake from 14% (95% margin of error 5.1) to 20% (8.1). Nine of the 11 categories of plant ‘milks’ were not fortified with riboflavin. Replacement of dairy milk with these products would likely reduce mean usual riboflavin intake by 11% in older adults, increasing the proportion with an inadequate usual intake from 20% (6.2, 5.8) to 30–31% (9.9, 6.3).
Nine of the 11 plant ‘milk’ categories were not fortified with vitamin B12, and replacement of dairy milk with these products would likely reduce usual intake by 10–49% depending on the population group, increasing the proportion of females aged at least 14 years with an inadequate usual intake of vitamin B12 from between 5% (2.2) and 8% (4.0), depending on age, to between 11% (3.4) and 17% (5.4). No category of plant ‘milk’ was fortified with iodine. As a result, replacement of dairy milk with plant ‘milks’ by females aged at least 14 years would likely reduce mean iodine intake by 7–15% and increase the proportion with an inadequate intake from between 6% (4.2) and 12% (4.7), depending on age, to between 15% (8.1) and 24% (6.0). In conclusion, replacement of dairy milk with most types of plant-based ‘milk’ has the potential to adversely affect protein, riboflavin, vitamin B12 and iodine intakes in the Australian population. Advice about switching to plant-based ‘milks’ needs to consider the population group concerned and a range of nutrients, not just calcium.
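The core substitution logic can be sketched for a single nutrient as follows. This is a naive single-day illustration, not the study’s method (which modelled usual intake with the National Cancer Institute method), and the 2.0 µg/day adult EAR for vitamin B12 is an assumed value taken from the Australian/New Zealand Nutrient Reference Values:

```python
# Hypothetical sketch: swap out each person's dairy-milk vitamin B12
# contribution (unfortified plant 'milks' contribute none) and recount
# who falls below the Estimated Average Requirement (EAR).
# Assumption: adult EAR of 2.0 ug/day (Australian/NZ NRVs).
def inadequate_fraction(total_b12_ug, milk_b12_ug, ear_ug=2.0):
    """total_b12_ug: each person's daily B12 intake (ug); milk_b12_ug:
    the portion contributed by dairy milk (paired by index)."""
    after_swap = [t - m for t, m in zip(total_b12_ug, milk_b12_ug)]
    return sum(1 for x in after_swap if x < ear_ug) / len(after_swap)

# Four invented intakes (ug/day) and their dairy-milk contributions.
print(inadequate_fraction([2.5, 3.0, 2.1, 4.0], [0.8, 0.5, 0.4, 1.0]))
```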
Shift workers in Australia constitute approximately 16% of the workforce, with nearly half working a rotating shift pattern(1). Whilst the poor dietary habits of shift workers have been extensively reported, along with increased risk of metabolic health conditions such as obesity, cardiovascular disease and diabetes compared to non-shift workers(2,3,4), studies of shift-working populations rarely control for individual and lifestyle factors that might influence dietary profiles. While rotating shift work schedules have been linked with higher energy intake than daytime schedules(5), little is known about the impact of different night shift schedules (e.g., fixed night vs rotating schedules) on the diets of shift workers, including differences in 24-hour energy intake and nutrient composition. This observational study investigated the dietary habits of night shift workers with overweight/obesity and compared the impact of rotating and fixed night shift schedules on dietary profiles. We hypothesised that shift workers’ diets overall would deviate from national nutrition recommendations, and that those working rotating shift schedules would have higher energy consumption than those on fixed night schedules. Participants were from the Shifting Weight using Intermittent Fasting in night shift workers (SWIFt) trial, a randomised controlled weight loss trial, and provided 7-day food diaries upon enrolment. Mean energy intakes (EI) and the percentage of EI from macronutrients, fibre, saturated fat, added sugar and alcohol, and the amount of sodium, were evaluated against Australian adult recommendations. Total group and subgroup analyses of fixed night vs rotating schedules’ dietary profiles were conducted, including assessment of plausible and non-plausible energy intake reporters.
Hierarchical regression analyses were conducted on nutrient intakes, controlling for the individual and lifestyle factors of age, gender, BMI, physical activity, shift work exposure, occupation and work schedule. Overall, night shift workers (n = 245) had diets characterised by high fat/saturated fat/sodium content and low carbohydrate/fibre intake compared to nutrition recommendations, regardless of shift schedule type. Rotating shift workers (n = 121) had a higher mean 24-hour EI than fixed night workers (n = 122) (9329 ± 2915 kJ vs 8025 ± 2383 kJ, p < 0.001), with differences remaining when only plausible EI reporters were included (n = 130) (10968 ± 2411 kJ vs 9307 ± 2070 kJ, p < 0.001). These findings highlight poor dietary choices among this population of shift workers, and the higher energy intakes of rotating shift workers, which may contribute to the poor metabolic health outcomes often associated with working night shifts.
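The reported group difference can be sanity-checked from the summary statistics alone with Welch’s t statistic; this is an illustrative recomputation, not the hierarchical regression the authors ran:

```python
import math

# Sketch: Welch's t statistic recomputed from the summary statistics
# reported above (group means, SDs and sizes). Illustrative only - the
# abstract's own analysis adjusted for covariates via regression.
def welch_t(m1, s1, n1, m2, s2, n2):
    # Standard error of the difference between two independent means.
    se = math.sqrt(s1 ** 2 / n1 + s2 ** 2 / n2)
    return (m1 - m2) / se

# Rotating (n = 121) vs fixed night (n = 122) mean 24-h energy intake, kJ.
t = welch_t(9329, 2915, 121, 8025, 2383, 122)
print(round(t, 2))  # a t statistic of this size is consistent with p < 0.001
```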
Obesity affects > 30% of Australian adults and is a rapidly growing health concern, on both national and global scales(1). Obesity is associated with an excess of nutrients in the circulation, particularly saturated fatty acids, which are thought to contribute to chronic low-grade systemic inflammation(2). People with obesity are also known to have an increased risk of severe respiratory viral disease, highlighted by both the recent SARS-CoV-2 pandemic and the 2009 influenza A H1N1 pandemic(3,4). Previously, our group has shown that consuming a meal high in saturated fatty acids can increase activation of the NLRP3 inflammasome in the airways of adults with asthma(5), while others have shown that increased NLRP3 activation is implicated in the pathogenesis of the severe inflammation observed during the peak of influenza A virus (IAV)-induced lung disease(6). We sought to determine the impact of dietary saturated fatty acids on the immune response of airway epithelial cells to influenza A virus, and to examine whether this is a factor in severe respiratory viral disease outcomes. We pre-treated BCI-NS1 cells, an airway epithelial cell line, with either media or physiologically relevant concentrations of the saturated fatty acids palmitic acid (250 μM), stearic acid (1000 μM) or pentadecanoic acid (50 μM) for 3 hours at 37°C (5% CO2). Cells were then washed with phosphate-buffered saline, infected with influenza A (H1N1pdm09) (multiplicity of infection 0.5) and incubated for 48 hours at 37°C (5% CO2). Cell culture supernatants were collected and assayed by enzyme-linked immunosorbent assay for interleukin (IL)-6 and interferon (IFN)-λ.
Pre-treatment with saturated fatty acids reduced IFN-λ production by virus-infected cells compared with cells pre-treated with media alone (42.7 pg/mL ± 14.0 (SD); n = 6): following palmitic acid pre-treatment IFN-λ was 7.9 pg/mL ± 4.5 (SD) (n = 6, p < 0.01), following stearic acid pre-treatment 10.3 pg/mL ± 7.7 (SD) (n = 6, p < 0.01), and following pentadecanoic acid pre-treatment 11.3 pg/mL ± 8.1 (SD) (n = 6, p < 0.01). IL-6 production was unchanged by pre-treatment with saturated fatty acids prior to H1N1pdm09 infection. As previously mentioned, excess saturated fatty acids are correlated with chronic low-grade systemic inflammation(2), which is thought to contribute to the worsened infection-induced outcomes in response to influenza A. We conclude that dietary saturated fatty acids circulating in people with obesity may impair the anti-viral response of airway epithelial cells and further contribute to the severe respiratory viral disease outcomes experienced by those with obesity.
Background:
Canada’s National Microbiology Laboratory offers diagnostic testing for Creutzfeldt-Jakob disease (CJD) and related prion diseases. Since 2016, the highly sensitive and specific end-point quaking-induced conversion assay (EP-QuIC) of CSF samples has been used for antemortem CJD diagnostic testing alongside tests for the surrogate biomarkers 14-3-3 and hTau. To assess EP-QuIC’s utility, we undertook a retrospective study of Canadian CJD diagnostic testing conducted between 2016 and 2024.
Methods:
Using CJD CSF test results collected between 2016 and 2024, we analyzed CJD incidence in Canada, estimated on the basis of positive EP-QuIC tests. Multivariate regression models were used to further evaluate CJD CSF testing across CJD subtypes, sexes, age groups and codon 129 genotypes.
Results:
From 2016 to 2024, the CJD incidence across Canada was estimated at 1.51 cases per million population per year. CJD incidence did not vary significantly across provinces, although a slight increase in CJD incidence was detected in New Brunswick due to increased sampling rates. EP-QuIC offered higher test sensitivity than both surrogate biomarker tests. Analysis of biomarker abundances and test positivity rates across biochemical subtypes revealed significant differences. We also detected variation in CSF test positivity rates across age groups and a trend of increasing biomarker abundance with age within EP-QuIC-negative cases. No significant variation was detected between males and females.
Conclusion:
EP-QuIC exhibits exceptional specificity and sensitivity for antemortem diagnosis of CJD, providing a valuable tool for the diagnosis of human prion diseases and for improved surveillance.
Vegan and vegetarian dietary patterns are known to beneficially modulate risk factors for cardiovascular disease; however, the current literature does not differentiate between various plant-based diets(1). This study aimed to examine the association between various plant-based diets and plasma lipids and glycaemic indices compared to a regular meat-eating diet. A cross-sectional study was conducted of Australian adults (n = 230) aged 30–75 years habitually consuming one of the following diets: vegan, lacto-vegetarian, pesco-vegetarian, semi-vegetarian or regular meat-eater. Multivariable regression analyses were used to adjust for covariates. Compared to regular meat-eaters, vegans had significantly lower total cholesterol (−0.77 mmol/L, 95% CI −1.15, −0.39, p < 0.001), low-density lipoprotein cholesterol (LDL-C, −0.71 mmol/L, 95% CI −1.05, −0.38, p < 0.001), non-high-density lipoprotein cholesterol (non-HDL-C, −0.75 mmol/L, 95% CI −1.11, −0.39, p < 0.001), total cholesterol/HDL-C ratio (−0.49 mmol/L, 95% CI −0.87, −0.11, p = 0.012), fasting blood glucose (FBG, −0.29 mmol/L, 95% CI −0.53, −0.06, p = 0.014), haemoglobin A1c (−1.85 mmol/mol, 95% CI −3.00, −0.71, p = 0.002) and insulin (−1.76 mU/L, 95% CI −3.26, −0.26, p = 0.021) concentrations. Semi-vegetarians had significantly lower LDL-C (−0.41 mmol/L, 95% CI −0.74, −0.08, p = 0.041) and non-HDL-C (−0.40 mmol/L, 95% CI −0.76, −0.05, p = 0.026), and lacto-ovo vegetarians had significantly lower FBG (−0.34 mmol/L, 95% CI −0.56, −0.11, p = 0.003), compared to regular meat-eaters. There were no differences in HDL-C and triglycerides between plant-based and regular-meat diets. Plasma lipaemic and glycaemic measures as a collective were more favourable among vegans, whereas among lacto-ovo vegetarians and semi-vegetarians only some measures were favourable.
Polycystic ovary syndrome (PCOS) is a common endocrine condition(1) associated with an increased risk of developing type 2 diabetes (T2D) and cardiovascular disease. Healthy lifestyle habits are critical in the management of PCOS(1); however, the public health system provides limited support for the lifestyle management of PCOS. Diabetes Victoria delivers a free program, ‘Life!’, to Victorian residents at high risk of developing T2D, including those with PCOS. Life! is currently designed to meet general high-risk population needs and is not specifically designed with PCOS in mind. This study aimed to evaluate the current Life! program’s design, content and delivery against the needs of those with PCOS through co-design workshops with previous Life! participants who have PCOS (n = 14) and program facilitators (n = 5). A series of mixed-methods workshops were used to assess the current program, design an ideal program and prioritise unmet needs. Co-design was informed by the Linking and Amplifying User-Centred Networks through Connected Health (LAUNCH) Roadmap(2) and the TIDieR checklist(3). Online worksheets, polls, open-ended questions and annotation of current material were used to aid participation and input. Four workshops (WS) were conducted online: two 3-hr (WS 1) and two 2-hr (WS 2). All were audio recorded and transcribed. Data were thematically synthesised using template analysis, and findings from WS 1 were used to inform WS 2. Those with PCOS had participated in the Life! program between 2018 and 2023, were aged between 24 and 52 years, and the majority (93%) had a BMI greater than 25 kg/m². Program facilitators included dietitians, diabetes educators and a physiotherapist, with 80% having more than 10 years’ experience. Overall, participants wanted fewer generic and more PCOS-centric topics, and less emphasis on weight loss, with an equal focus on a range of health outcomes.
Recommended topics included PCOS-centric lifestyle advice across diet, physical activity, sleep and mental health with an emphasis on how (practical strategies) and why (mechanistic understanding) healthy lifestyle behaviours should be applied. Participants desired the tone and sentiment used in the program’s language and imagery to be inclusive, gentle, non-stigmatising and positive. Participants desired a flexible approach to program delivery (a mix of in-person, online, one-on-one and group). One-on-one sessions were desired when receiving individualised advice and discussing sensitive topics, while group sessions were preferred for peer support, learning activities and reflections. People with PCOS desire a PCOS-centric lifestyle program with a focus on meaningful health outcomes, reducing the focus on weight loss, blended delivery, tailored and practical strategies with long-term support. Results will inform the development of a tailored lifestyle program that aims to better engage those with PCOS. Future community-based PCOS programs and clinicians are strongly recommended to incorporate these findings to improve engagement and consumer satisfaction.
Improving neonatal piglet survival is a key driver for improving pig production and enhancing animal welfare. Gestational diabetes is a risk factor for neonatal morbidities in humans, such as hypoglycaemia and respiratory distress(1). There is limited knowledge on the association of gestational diabetes with neonatal survival in commercial pigs. An early study suggested that the diabetic condition of late-gestating sows was positively correlated with first-week newborn piglet mortality(2). Genetic selection in recent decades for heavier birth weight may have increased the prevalence or severity of gestational diabetes in pigs, considering the positive correlation between gestational diabetes and birth weight. We hypothesised that the diabetic condition of late-gestating sows positively correlates with the neonatal piglet mortality rate in sows with modern genetics. Mixed-parity sows (1.5 ± 1.6 parity for mean ± standard deviation (SD); Large White × Landrace) from a commercial piggery in Australia were randomly selected to participate in an oral glucose tolerance test (OGTT) during two seasons (118 sows in winter and 118 sows in summer). On day 109 of gestation, sows were fed 3.0 g dextrose per kg of metabolic body weight after fasting overnight. Tail blood glucose concentrations were measured using a glucometer (Accu-Chek®, Roche Diabetes Care Australia Pty) at −10, 0, 10, 20, 30, 40, 50, 60, 70, 80, 90, 105 and 120 minutes relative to dextrose feeding. The glucose increment (2.5 ± 1.29 mM for mean ± SD) during the OGTT was calculated by subtracting the fasting blood glucose concentration from the maximum concentration. The 24-hour piglet mortality rate (5% ± 8.8% for mean ± SD) was calculated on a litter basis as the ratio between the number of piglets that died during the first 24 hours and the total number born alive.
The effects of sow glucose increment, season (winter vs summer), glucose increment × season, number of piglets born alive, and sow parity on the 24-hour piglet mortality rate were analysed using a generalised linear model (SPSS Version 27, IBM SPSS Statistics, Armonk). The 24-hour piglet mortality rate was numerically higher in winter than in summer, although the difference was not statistically significant (5.7% vs 4.2%, p = 0.41). The glucose increment of gestating sows was positively correlated with the 24-hour piglet mortality rate during winter but not summer, as evidenced by a trend towards an interaction between glucose increment and season (p = 0.059). The regression coefficient suggested that every extra unit (mM) of glucose increment during the OGTT corresponded to a 1.4% increase in the 24-hour piglet mortality rate in winter. In conclusion, the diabetic condition of late-gestating sows is a risk factor for neonatal piglet mortality in winter. Developing nutritional strategies to mitigate the diabetic condition of late-gestating sows may benefit neonatal piglet survival.
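The glucose increment calculation described above (maximum post-dose concentration minus fasting concentration) is straightforward; a minimal sketch with invented readings:

```python
# Sketch of the OGTT glucose increment used above: the maximum post-dose
# blood glucose concentration minus the fasting concentration.
# Example readings below are invented, not study data.
def glucose_increment(fasting_mm, post_dose_mm):
    """fasting_mm: fasting blood glucose (mM);
    post_dose_mm: glucose readings (mM) at the timed OGTT samples."""
    return max(post_dose_mm) - fasting_mm

# e.g. fasting 4.1 mM, peak 6.8 mM -> increment of 2.7 mM
print(round(glucose_increment(4.1, [4.3, 5.9, 6.8, 6.2, 5.0]), 1))
```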
Risks associated with sub-optimal dietary intake are among the leading factors contributing to mortality and morbidity worldwide(1). Regular and comprehensive data on the intakes of individuals play a key role in informing the development of public health nutrition policy and programs. In low- and lower-middle-income countries (LLMICs), proxy measures of individual intake, such as food balance sheets, are often used. To date there has been limited application of technology-assisted dietary assessment methods due to resource constraints(2). This study reports on the relative validity of the Voice-Image Solution for Individual Dietary Assessment (VISIDA) system in a sample of Cambodian women and children. The VISIDA system is a dietary assessment system developed specifically for LLMICs. Intake data is collected in the form of image-voice food records (via a smartphone app), where images and voice recordings of food for consumption are collected prior to eating, along with any leftovers at the end of the meal. The image-voice food records are then uploaded to the VISIDA web application and processed by trained analysts to produce estimates of individual nutrient intake. Mothers and one of their children (aged ≤ 5 years) from Siem Reap province, Cambodia were recruited. Intake data was collected for each participant using two dietary assessment methods over three recording periods. The mother used the VISIDA image-voice smartphone app to collect intake data on three days for herself and her child in week 1, with this repeated in week 4. In between VISIDA recording periods, intake data was collected on the mother and her child using interviewer-administered 24-hour recalls conducted on three days. A linear mixed model approach was used to evaluate differences between the estimated nutrient intakes across the recording periods for mothers and children. The nutrient intakes of a total of 119 mothers and 91 children were included in the analysis.
For both mothers and children, intakes of the majority of nutrients were higher from the 24-hour recalls than from the VISIDA system. When 24-hour recall intakes were compared with each VISIDA recording period, mothers had a higher number of nutrients with statistically significant differences than children. In general, the VISIDA system produced lower estimates of nutrient intakes for Cambodian mothers and children than the 24-hour recalls. Further evaluations of the VISIDA system in other LLMICs would provide additional insights into the performance of this method for assessing individual dietary intakes in this context.
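A between-method comparison of this kind can be illustrated with a simplified paired mean difference; the abstract used a linear mixed model, so this sketch is a stand-in for intuition only, and the per-participant energy intakes below are hypothetical, not study data:

```python
from statistics import mean, stdev
from math import sqrt

# Hypothetical per-participant energy intakes (kJ/day) from the two methods.
recall_24h = [7800, 8200, 7500, 9000, 8100, 7700]
visida     = [7200, 7900, 7100, 8400, 7800, 7300]

# Paired differences (24-hour recall minus VISIDA); a positive mean
# difference mirrors the abstract's finding that recalls gave higher intakes.
diffs = [r - v for r, v in zip(recall_24h, visida)]
md = mean(diffs)
se = stdev(diffs) / sqrt(len(diffs))  # standard error of the mean difference
t = md / se                           # paired t statistic (df = n - 1)
print(f"mean difference = {md:.0f} kJ/day, t = {t:.2f}")
```

A mixed model generalises this idea by also accounting for repeated recording periods within each participant.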
Adults living with obesity have a higher risk of eating disorders and disordered eating behaviours such as binge eating(1,2). However, the prevalence of disordered eating and eating disorders in adults presenting for obesity treatment is unknown, and this information is needed to guide service provision. This systematic review aimed to estimate the prevalence of disordered eating and eating disorders in adults presenting for obesity treatment. Embase, MEDLINE and PsycINFO were searched to March 2024. Eligible studies (k) measured disordered eating or eating disorders in adults with overweight/obesity presenting for obesity treatment and included ≥ 325 participants to ensure a representative sample. Prevalence estimates were synthesised using random-effects meta-analysis. Eighty-one studies were included (n = 92,002; 75.9% female; median (IQR) age 44 (6) years; BMI 45 (11) kg/m2). Most studies were conducted in the United States (k = 44) and Italy (k = 15). Most prevalence data related to binge eating disorder or binge eating severity. The pooled prevalence of binge eating disorder, assessed by clinical interview, was 17% (95% CI: 12–22; 95% prediction interval (PI): 0–42; k = 19, n = 13,447, τ2 = 0.01) using DSM-IV criteria and 12% (95% CI: 5–20; 95% PI: 0–40; k = 9, n = 7,680, τ2 = 0.01) using DSM-5 criteria. The pooled prevalence of severe binge eating (Binge Eating Scale score > 25) was 12% (95% CI: 8–16; 95% PI: 0–31; k = 18, n = 12,136, τ2 = 0.01). For binge eating disorder measured by clinical interview, the prevalence ranged from 14.9 to 27.0% for females (k = 12) and from 4.0 to 24.1% for males (k = 3). For moderate to severe binge eating (Binge Eating Scale score ≥ 18), the prevalence ranged from 20.0 to 32.8% for females and from 7.1 to 77.5% for males (k = 2). Three studies reported prevalence by ethnicity.
The prevalence of severe binge eating (Binge Eating Scale score ≥ 27) was 9.5 to 41.7% in white populations (k = 2), 7.5 to 35.8% in black populations (k = 2), and 5.7% in Hispanic populations (k = 1). One study reported binge eating disorder, assessed by clinical interview, for white, black and Hispanic populations, with prevalences of 15.3%, 11.3% and 11.4%, respectively. Overall, there was high variability in the prevalence of binge eating and binge eating disorder in adults presenting for obesity treatment, with available data indicating that prevalence can range up to 42%. It is important to identify which population-level factors drive this heterogeneity to inform service provision. However, the limited data highlight a significant knowledge gap in the reporting of eating disorders in underrepresented populations, which needs to be addressed.
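The random-effects pooling used throughout this review can be sketched with the DerSimonian-Laird estimator. The sketch below works on the raw proportion scale for simplicity (reviews of proportions often use a logit or arcsine transform), and the study prevalences and sample sizes are hypothetical, not those of the included studies:

```python
from math import sqrt

# DerSimonian-Laird random-effects pooling of prevalence estimates.
# Each tuple is (observed proportion, sample size); values are illustrative.
studies = [(0.15, 900), (0.22, 450), (0.10, 1200), (0.19, 600)]

ests = [p for p, n in studies]
vars_ = [p * (1 - p) / n for p, n in studies]  # within-study variance p(1-p)/n

# Fixed-effect (inverse-variance) step
w = [1 / v for v in vars_]
fixed = sum(wi * pi for wi, pi in zip(w, ests)) / sum(w)

# Heterogeneity: Cochran's Q and tau^2 (method of moments)
q = sum(wi * (pi - fixed) ** 2 for wi, pi in zip(w, ests))
df = len(studies) - 1
c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
tau2 = max(0.0, (q - df) / c)

# Random-effects weights add tau^2 to each within-study variance
w_re = [1 / (v + tau2) for v in vars_]
pooled = sum(wi * pi for wi, pi in zip(w_re, ests)) / sum(w_re)
se = sqrt(1 / sum(w_re))
print(f"pooled = {pooled:.3f} (95% CI {pooled - 1.96 * se:.3f} to {pooled + 1.96 * se:.3f})")
```

The wide prediction intervals reported in the review reflect exactly this between-study variance (τ2) component.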
Coventry Cathedral and the Dresden Frauenkirche, both destroyed in the Second World War, are often mentioned in the same breath, treated as architectural, commemorative, and religious equivalents. Nothing could be further from the truth. While the ruins of Coventry Cathedral were transformed into a site of—and memorial to—postwar reconciliation, the Frauenkirche was neither a revered shrine nor an unintentional monument, but simply a gutted structure suspended in limbo for some forty years. It was only in the course of the 1980s, and especially in the aftermath of German reunification, that the Frauenkirche ruins became invested with specific meaning. Support from Britain and, above all, Coventry, was crucial in this process. Methodologically, the article fuses memory studies with church/architectural history and comparative/transnational research.
Climate change is a pressing global issue, with food systems contributing significantly to greenhouse gas (GHG) emissions, biodiversity loss, and freshwater depletion(1). A major challenge is to feed a projected 10 billion people by 2050 whilst minimising environmental impact(2). Numerous factors influence food choices, including convenience, affordability, taste, nutrition, accessibility, cooking skills, and cultural norms. The growing demand for convenience has transformed the food landscape, with availability of ready-to-eat meals and prepackaged products rapidly increasing, potentially impacting health if not integrated into balanced diets. The aim was to quantify and compare the environmental and financial impact of two diets: a heart-healthy Australian diet (HAD) aligned with national dietary guidelines and a typical Australian diet (TAD) reflecting current population intakes. Both plans were designed for convenience, using ready-to-eat meals and minimal preparation options for a randomised, cross-over, feeding trial. The environmental footprint of each dietary pattern was calculated using the Global Warming Potential (GWP*) metric(3), considering individual foods, multi-ingredient foods, and mixed dishes. Prices were obtained from a large Australian supermarket. The study focused on two-week meal plans designed to meet the nutritional needs of a reference 71-year-old male (9000 kJ). Results showed that the HAD produced 23.8% less CO2 equivalents (CO2-e) per day than the TAD (2.16 kg vs 2.83 kg CO2-e). Meat and discretionary foods were the main contributors to the TAD’s environmental footprint, while dairy and vegetables were the most significant contributors to the HAD’s footprint. The potential impact of widespread adoption of the HAD is substantial. 
For example, if half of the adult population switched to the HAD, it could lead to a reduction of approximately 2.6 billion kg of CO2-e emissions annually, equivalent to the annual emissions of 1.2 million passenger cars(4); over 256 million trees would be required to offset this amount of CO2-e(5). However, the HAD was 51% more expensive than the TAD, presenting a significant barrier to adoption. Strategies to reduce the cost of convenient healthy food are needed. Future studies should expand the GWP* database and consider additional environmental dimensions to comprehensively assess the impact of dietary patterns. The current findings have implications for menu planning within feeding trials and for individuals seeking to reduce their carbon footprint while adhering to heart-healthy eating guidelines.
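The population-scale figure follows from simple arithmetic on the per-person daily saving. In the back-of-envelope check below, the two daily footprints are from the abstract, while the number of adults switching is an assumption of roughly 10.5 million people (about half of Australia's adult population):

```python
# Back-of-envelope check of the quoted population-scale CO2-e saving.
tad_daily = 2.83    # kg CO2-e/day, typical Australian diet (abstract)
had_daily = 2.16    # kg CO2-e/day, heart-healthy Australian diet (abstract)
switchers = 10.5e6  # assumed: ~half of Australia's adult population

annual_saving_kg = (tad_daily - had_daily) * 365 * switchers
print(f"{annual_saving_kg / 1e9:.1f} billion kg CO2-e per year")  # ~2.6
```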
Approximately 15% of Australia’s workforce are shift workers, who are at greater risk of obesity and related conditions such as type 2 diabetes and cardiovascular disease(1,2,3). While current guidelines for obesity management prioritise diet-induced weight loss as a treatment option, there are limited weight-loss studies involving night shift workers and no exploration to date of the factors associated with engagement in weight-loss interventions. The Shifting Weight using Intermittent Fasting in night shift workers (SWIFt) study was a randomised controlled trial that compared three 24-week weight-loss interventions: continuous energy restriction (CER), and 500-calorie intermittent fasting (IF) on 2 days per week, undertaken either during the day (IF:2D) or during the night shift (IF:2N). The current study used a convergent mixed-methods experimental design to: 1) explore the relationship between participant characteristics, dietary intervention group and time to dropout from the SWIFt study (quantitative); and 2) understand why some participants were more likely to drop out of the intervention (qualitative). Participant characteristics included age, gender, ethnicity, occupation, shift schedule, number of night shifts per four weeks, number of years in shift work, weight at baseline, weight change at four weeks, and quality of life at baseline. A Cox regression model was used with time to dropout from the intervention as the dependent variable, and purposive selection was used to determine predictors for the model. Semi-structured interviews were conducted at baseline and 24 weeks, and audio diaries were collected every two weeks from participants selected using a maximum variation sampling approach; data were analysed using the five steps of framework analysis(4). A total of 250 participants were randomised to the study between October 2019 and February 2022. Two participants were excluded from analysis due to retrospective ineligibility.
Twenty-nine percent (n = 71) of participants dropped out over the 24-week intervention. Greater weight at baseline, fewer years working in shift work, lower weight change at four weeks, and being a woman (compared with a man) were associated with a significantly increased rate of dropout (p < 0.05). Forty-seven interviews with 33 participants were conducted, and 18 participants completed audio diaries. Lack of time, fatigue and emotional eating were barriers more frequently reported by women. Participants with a higher weight at baseline more frequently reported fatigue and emotional eating as barriers, and limited guidance on non-fasting days as a barrier for the IF interventions. This study provides important considerations for refining shift-worker weight-loss interventions for future implementation, in order to increase engagement and mitigate the adverse health risks experienced by this essential workforce.
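Time-to-dropout data of this kind are often first summarised with a Kaplan-Meier retention curve before fitting a Cox model. The abstract used Cox regression; the sketch below shows only the simpler Kaplan-Meier step, on hypothetical (week, dropped_out) observations where completers are censored at week 24:

```python
# Kaplan-Meier estimate of retention over a 24-week intervention.
# (week, dropped_out) pairs; dropped_out=False means censored (completed).
observations = [(4, True), (8, True), (12, False), (16, True),
                (24, False), (24, False), (24, False), (20, True)]

def km_retention(obs):
    """Return (week, survival_probability) at each dropout time."""
    obs = sorted(obs)
    at_risk = len(obs)
    surv = 1.0
    curve = []
    i = 0
    while i < len(obs):
        week = obs[i][0]
        events = sum(1 for w, d in obs if w == week and d)
        n_here = sum(1 for w, d in obs if w == week)
        if events:
            surv *= 1 - events / at_risk  # KM product-limit update
            curve.append((week, surv))
        at_risk -= n_here
        i += n_here
    return curve

print(km_retention(observations))
```

A Cox model then relates the hazard underlying this curve to covariates such as baseline weight, years in shift work and gender.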
Optimal infant feeding practices in the first 2000 days of life offer protection against chronic diseases in later life(1). A study of Indian-born mothers in Australia found that while they generally followed health-promoting infant feeding practices, such as prolonged breastfeeding and delayed introduction of solids, they also displayed sub-optimal practices, such as early introduction of water, fruit juice, cows’ milk, and formula(2), suggesting non-adherence to the Australian Infant Feeding Guidelines(3) and potentially increasing the risk of overweight/obesity or chronic disease in later life. Therefore, this study explored the cultural beliefs, knowledge, attitudes and perceptions of Indian-born mothers living in Australia that may influence these practices. Thirteen Indian-born mothers with a child aged 1.5–5 years born in Australia were purposively sampled from participants in an Australia-wide online survey of Indian-born mothers. Purposive sampling was based on a mix of variables such as length of stay in Australia, language spoken at home, and education. The mothers took part in semi-structured interviews over Zoom, and transcripts were analysed using a reflexive thematic approach in NVivo to generate themes. All mothers were married, and 85% were aged between 35 and 39 years. All had lived in Australia for at least five years, and 54% for at least ten years. Most mothers (54%) had postgraduate qualifications, and 39% reported an annual household income of more than $156,000. Most mothers were Hindu (92%), and the most common language spoken at home was Marathi (38%). Most fathers were of Indian ethnicity (85%).
Seven themes relating to infant feeding and infant growth were identified: 1) cultural beliefs about breastfeeding positively influenced breastfeeding initiation and duration; 2) maternal beliefs and attitudes negatively influenced formula feeding practices; 3) acculturation positively influenced exposure to cows’ milk, honey, and pre-lacteal feeds; 4) maternal knowledge of feeding guidelines and cultural beliefs positively influenced the introduction of solids and the types of solids introduced; 5) a ‘tiny baby’ perception among mothers, often influenced by healthcare professionals, negatively influenced infant feeding practices; 6) mothers relied on their own mothers and mothers-in-law for feeding advice owing to a cultural disconnect with infant feeding guidelines; and 7) a lack of support after hospital discharge negatively influenced the breastfeeding journey. In conclusion, the Indian immigrant mothers in this study expressed a need for culturally tailored support and consistent advice from healthcare professionals during the introduction of solids, and for better support structures after hospital discharge, to enable an optimal breastfeeding journey.
Previous studies have shown the health benefits of daily total protein intake(1), yet temporal protein patterns in the population have rarely been investigated. The studies available to date have examined associations between total protein intake at eating occasions (EOs) and cardiometabolic(2) and muscular(3) health, but have not accounted for different protein sources. This study aimed to describe temporal patterns of total, plant, and animal protein intake at EOs in Australian adults, and to examine these patterns according to sociodemographic and eating pattern characteristics (e.g., meal and snack frequencies, amount of protein intake). Using data from the 2011–12 Australian National Nutrition and Physical Activity Survey, this study included adults aged ≥ 19 years who completed one 24-hour dietary recall (n = 6741). Total, animal and plant protein intake at self-reported EOs was estimated using the AUSNUT 2011–13 nutrient database and the Australian Dietary Guidelines (ADG) food classification system(4). Plant protein included grains, nuts, and other plant-based, protein-containing foods, while animal protein consisted of meats, dairy, and other animal-source foods. Separate latent variable mixture models were used to identify temporal patterns of total, animal, and plant protein based on the corresponding hourly intakes. Pearson’s Chi-square test (for categorical variables) and one-way analysis of variance (for continuous variables) were used to examine differences in participant characteristics between latent classes of temporal protein patterns. Three latent classes were identified for men’s and women’s intakes of total, animal, and plant protein. Class 1 was characterised by high probabilities of consuming protein at usual Australian mealtimes (e.g., dinner at 18:00–19:00 h), and participants in this class were significantly older than those in the other two classes (all p < 0.001).
Class 2 had a high probability of eating protein an hour later than the mealtimes of Class 1, and the highest protein intake from meals (all p < 0.001), except for men’s total protein and women’s plant protein. Participants in Class 2 for total (all p < 0.001), animal (all p < 0.001), and plant protein (women only, p = 0.02) were characterised by higher income and employment. Participants in Class 3 had the lowest meal frequency (all p < 0.001) and the lowest total, animal, and plant protein intakes from meals (all p < 0.001), but the highest intakes from snacks (p < 0.001), except for women’s animal protein intake. Most adults in Class 3 for total (men only, p < 0.001) and animal protein (all p < 0.001) also had a high education level, lived in urban areas, and were not married. Three temporal protein patterns with distinct characteristics were identified in this study. Future studies should investigate whether these temporal protein intake patterns are associated with health outcomes.
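The latent class idea behind these patterns can be sketched with a toy expectation-maximisation (EM) fit of a two-component Gaussian mixture on a single feature, such as the hour of an eating occasion. The data points, starting values and two-class structure below are illustrative only; the study fitted latent variable mixture models over full hourly intake profiles:

```python
from math import exp, pi, sqrt

# Toy EM for a two-component Gaussian mixture on a 1-D feature
# (hypothetical dinner times in hours); sd is held fixed for simplicity.
data = [18.1, 18.4, 17.9, 19.2, 19.5, 19.1, 18.2, 19.3]

def norm_pdf(x, mu, sd):
    return exp(-((x - mu) ** 2) / (2 * sd ** 2)) / (sd * sqrt(2 * pi))

w, mu1, mu2, sd = 0.5, 18.0, 19.0, 0.3  # illustrative starting values
for _ in range(50):  # EM iterations
    # E-step: responsibility of class 1 for each observation
    r = [w * norm_pdf(x, mu1, sd) /
         (w * norm_pdf(x, mu1, sd) + (1 - w) * norm_pdf(x, mu2, sd))
         for x in data]
    # M-step: update mixing weight and the two class means
    w = sum(r) / len(data)
    mu1 = sum(ri * x for ri, x in zip(r, data)) / sum(r)
    mu2 = sum((1 - ri) * x for ri, x in zip(r, data)) / (len(data) - sum(r))

print(round(mu1, 2), round(mu2, 2))  # two 'mealtime' classes emerge
```

Real latent variable mixture models extend this to many hourly indicators at once and compare solutions with different numbers of classes.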