Characterizing the structure and composition of clay minerals on the surface of Mars is important for reconstructing past aqueous processes and environments. Data from the CheMin X-ray diffraction (XRD) instrument on the Mars Science Laboratory Curiosity rover demonstrate a ubiquitous presence of collapsed smectite (basal spacing of 10 Å) in ~3.6-billion-year-old lacustrine mudstone in Gale crater, except for expanded smectite (basal spacing of 13.5 Å) at the base of the stratigraphic section in a location called Yellowknife Bay. Hypotheses to explain expanded smectite include partial chloritization by Mg(OH)2 or solvation-shell H2O molecules associated with interlayer Mg2+. The objective of this work is to test these hypotheses by measuring partially chloritized and Mg-saturated smectite using laboratory instruments that are analogous to those on Mars rovers and orbiters. This work presents Mars-analog XRD, evolved gas analysis (EGA), and visible/shortwave-infrared (VSWIR) data from three smectite standards that were Mg-saturated and partially and fully chloritized with Mg(OH)2. Laboratory data are compared with XRD and EGA data collected from Yellowknife Bay by the Curiosity rover to examine whether the expanded smectite can be explained by partial chloritization and what this implies about the diagenetic history of Gale crater. Spectral signatures of partial chloritization by hydroxy-Mg are investigated that may allow the identification of partially chloritized smectite in Martian VSWIR reflectance spectra collected from orbit or in situ by the SuperCam instrument suite on the Mars 2020 Perseverance rover. Laboratory XRD and EGA data of partially chloritized saponite are consistent with data collected from Curiosity. 
The presence of partially chloritized (with Mg(OH)2) saponite in Gale crater suggests brief interactions between diagenetic alkaline Mg2+-bearing fluids and some of the mudstone exposed at Yellowknife Bay, but not in other parts of the stratigraphic section. The location of Yellowknife Bay at the base of the stratigraphic section may explain the presence of alkaline Mg2+-bearing fluids here but not in other areas of Gale crater investigated by Curiosity. Early diagenetic fluids may have had a sufficiently long residence time in a closed system to equilibrate with basaltic minerals, creating an elevated pH, whereas diagenetic environments higher in the section may have been in an open system, therefore preventing fluid pH from becoming alkaline.
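As a rough illustration of the XRD relationship underlying the basal-spacing measurements above, Bragg's law converts a d-spacing into the diffraction angle at which the corresponding peak appears. The sketch below assumes a Co Kα source wavelength of ~1.789 Å (an assumption for illustration; the abstract does not state the wavelength):

```python
import math

def two_theta_deg(d_angstrom, wavelength=1.789):
    """First-order Bragg angle 2*theta (degrees) for a given d-spacing.

    wavelength defaults to ~1.789 A (Co K-alpha, assumed here).
    """
    return 2 * math.degrees(math.asin(wavelength / (2 * d_angstrom)))

collapsed = two_theta_deg(10.0)   # collapsed smectite, 10 A basal spacing
expanded = two_theta_deg(13.5)    # expanded smectite at Yellowknife Bay
print(f"10.0 A -> {collapsed:.1f} deg 2-theta; 13.5 A -> {expanded:.1f} deg 2-theta")
```

The larger basal spacing diffracts at a lower angle, which is how collapsed and expanded smectite populations are distinguished in a diffraction pattern.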
Background: Status dystonicus is characterized by frequent or prolonged severe episodes of generalized dystonia. The phenomenology, etiology, and outcomes are heterogeneous and poorly characterized, making a standardized management approach challenging. We characterized the demographics, management patterns, and outcomes of children with status dystonicus admitted to the pediatric intensive care unit (PICU) in British Columbia. Methods: Clinical records at our PICU were searched via ICD-10 codes. We included cases admitted 2014-2024 who had dystonia severity grade 3-5, dystonia worse than baseline, and age >30 days. Results: Seventy-nine records were screened; 41 admissions from 19 unique patients were included. Mean age was 7.6±4.2 years; 53% were female. Most unique patients had a genetic etiology (n=8, 42%). The presenting complaint per admission was often not dystonia (n=24, 59%); infection was the most common trigger (n=23, 56%), followed by pain (n=6, 15%). Patients received several anti-dystonia medications (mean 6.9±2.5), including clonidine, benzodiazepines, and ketamine, among others. Mean PICU stay was 11.0±10.8 days; 37% had multiple PICU admissions. Two patients (4.9%) died from status dystonicus complications. Conclusions: Status dystonicus is a life-threatening emergency commonly triggered by pain and infection in patients with dystonia. Given the considerable morbidity and mortality, multi-disciplinary teams should consider standardized treatment guidelines for these complex patients.
Background: Treatment-resistant obsessive compulsive disorder (trOCD) is a condition characterized by intrusive thoughts (obsessions) and uncontrollable behaviours (compulsions) unresponsive to conventional therapies. Lesioning both anterior limbs of the internal capsule is effective in ablating the circuitry underlying trOCD pathophysiology. The newest capsulotomy method is MR-guided focused ultrasound (MRgFUS). Here we measured neural network changes in trOCD patients after MRgFUS capsulotomy using resting-state functional MRI (rs-fMRI). Methods: Yale-Brown Obsessive-Compulsive Scale (YBOCS) scores and rs-fMRI data were collected in 6 trOCD patients preoperatively and at 3 months and 1 year postoperatively, along with rs-fMRI from 6 age- and sex-matched controls. Independent component analysis, dual regression using the FMRIB Software Library, and node-to-node approaches were used with the CONN Toolbox. We also performed a systematic review of existing studies of trOCD resting-state networks. Results: TrOCD patients demonstrated significant improvement 1 year postoperatively (mean YBOCS reduction of 41 ± 7%). Dual regression analysis 3 months postoperatively showed significantly greater sensorimotor network signal in controls compared to the trOCD group. Node-to-node analysis in trOCD found connectivity changes in networks associated with the cortico-striato-thalamo-cortical loop, particularly the salience and limbic networks, at 1 year postoperatively. Conclusions: TrOCD patients who underwent MRgFUS capsulotomy demonstrated differences in sensorimotor and cortico-striatal connectivity and significant clinical improvement postoperatively.
Cardiovascular diseases (CVDs) are the leading cause of death worldwide(1). As poor diet quality is a major contributor to CVD burden, dietary intervention is recommended as a first-line approach to CVD prevention and management(2). Personalised nutrition (PN) refers to individualised nutrition care based on genetic, phenotypic, medical, and/or behavioural and lifestyle characteristics(3). Medical nutrition therapy by dietitians shares many of these principles and can be categorised as PN(4). PN may be beneficial in improving CVD risk factors and diet; however, this has not previously been systematically reviewed. The aim of this systematic review was to evaluate the effectiveness of PN interventions on CVD risk factors and diet in adults at elevated CVD risk. A comprehensive search was conducted in March 2023 across Embase, Medline, CINAHL, PubMed, Scopus and Cochrane databases, focusing on randomised controlled trials (RCTs) published after 2000 in English. Included studies tested the effect of PN interventions on adults with elevated CVD risk factors (determined by anthropometric measures, clinical indicators, or high overall CVD risk). Risk of bias was assessed using the Academy of Nutrition and Dietetics Quality Criteria checklist. Random-effects meta-analyses were conducted to explore weighted mean differences (WMD) in change or final mean values for studies with comparable data (studies with dietary counselling interventions), for outcomes including blood pressure (BP), blood lipids, and anthropometric measurements. Sixteen articles reporting on 15 unique studies (n = 7676) met inclusion criteria and were extracted. Outcomes of participants (n = 40–564) with CVD risk factors including hyperlipidaemia (n = 5), high blood pressure (n = 3), BMI > 25 kg/m² (n = 1) or multiple factors (n = 7) were reported.
Results found potential benefits of PN on systolic blood pressure (SBP) (WMD −1.91 [95% CI −3.51, −0.31] mmHg), diastolic blood pressure (DBP) (WMD −1.49 [95% CI −2.39, −0.58] mmHg), triglycerides (TG) (WMD −0.18 [95% CI −0.34, −0.03] mmol/L), and dietary intake in individuals at high CVD risk. Results were inconsistent for plasma lipid and anthropometric outcomes. Dietary counselling PN interventions showed promising effects on CVD risk factors in at-risk individuals. Future PN interventions require further evidence for other personalisation methods, improved methodological quality, and longer study durations.
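For readers unfamiliar with how a pooled WMD such as those above is produced, the sketch below implements a DerSimonian–Laird random-effects pool over per-study mean differences. The input numbers are invented for illustration and are not the review's data:

```python
import math

def random_effects_wmd(effects, variances):
    """DerSimonian-Laird random-effects pooled weighted mean difference."""
    w = [1 / v for v in variances]          # inverse-variance (fixed-effect) weights
    sw = sum(w)
    fixed = sum(wi * e for wi, e in zip(w, effects)) / sw
    # Cochran's Q heterogeneity statistic
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, effects))
    k = len(effects)
    # between-study variance tau^2, truncated at zero
    tau2 = max(0.0, (q - (k - 1)) / (sw - sum(wi ** 2 for wi in w) / sw))
    # random-effects weights incorporate tau^2
    wr = [1 / (v + tau2) for v in variances]
    pooled = sum(wi * e for wi, e in zip(wr, effects)) / sum(wr)
    se = math.sqrt(1 / sum(wr))
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)

# hypothetical SBP mean differences (mmHg) and their variances from three trials
wmd, ci = random_effects_wmd([-2.5, -1.0, -2.0], [0.8, 1.2, 1.0])
print(f"WMD {wmd:.2f} mmHg, 95% CI [{ci[0]:.2f}, {ci[1]:.2f}]")
```

With low heterogeneity (Q below its degrees of freedom) tau² truncates to zero and the result coincides with the fixed-effect pool.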
Current clinical guidelines for people at risk of heart disease in Australia recommend nutrition intervention in conjunction with pharmacotherapy(1). However, Australians living in rural and remote regions have less access to medical nutrition therapy (MNT) provided by Accredited Practising Dietitians (APDs) than their urban counterparts(2). The aim of the HealthyRHearts study was to trial the delivery of MNT by APDs using telehealth to eligible patients of General Practitioners (GPs) located in small to large rural towns in the Hunter New England region(3) of New South Wales, Australia. The study design was a 12-month pragmatic randomised controlled trial. The key outcome was reduced total cholesterol. The study was place-based, meaning many of the research team and APDs were based rurally, to ensure the context of the GPs and patients was already known. Eligible participants were those assessed as being at moderate-to-high risk of CVD by their GP. People in the intervention group received five MNT consults (totalling two hours) delivered via telehealth by APDs, and also answered a personalised nutrition questionnaire to guide their priorities and to support personalised dietary behaviour change during the counselling. Both intervention and control groups received usual care from their GP and were provided access to the Australian Eating Survey (Heart version), a 242-item online food frequency questionnaire with technology-supported personalised nutrition reports that evaluated intake relative to heart-healthy eating principles. Of the 192 people who consented to participate, 132 were eligible due to their moderate-to-high risk. Pre-post participant medication use with a registered indication(4) for hypercholesterolaemia, hypertension and glycaemic control was documented according to class and strength (defined daily dose: DDD)(5).
Nine GP practices (with 91 participants recruited) were randomised to the intervention group and seven practices (41 participants) were randomised to control. Intervention participants attended 4.3 ± 1.4 of the 5 dietetic consultations offered. Of the 132 people with baseline clinical chemistry, 103 also provided a 12-month sample. Mean total cholesterol at baseline was 4.97 ± 1.13 mmol/L for both groups, with a 12-month reduction of 0.26 ± 0.77 mmol/L for intervention and 0.28 ± 0.79 mmol/L for control (p = 0.90, unadjusted). The median (IQR) number of medications for the intervention group was 2 (1–3) at both baseline and 12 months (p = 0.78), with 2 (1–3) and 3 (2–3) for the control group, respectively. Combined DDD of all medications was 2.1 (0.5–3.8) and 2.5 (0.75–4.4) at baseline and 12 months (p = 0.77) for the intervention group, and 2.7 (1.5–4.0) and 3.0 (2.0–4.5) for the control group (p = 0.30). Results suggest that medications were a substantial contributor to the management of total cholesterol. Further analysis is required to evaluate changes in total cholesterol attributable to medication prescription relative to the MNT counselling received by the intervention group.
Nutritional metabolomics is an emerging objective dietary biomarker method to help characterise dietary intake. Our recent scoping review identified gaps and inconsistencies in both the design features and the level of detail of reported dietary intervention methods in human feeding studies measuring the metabolome(1), and our cross-over feeding study protocol details dietary information for identification of metabolites that characterise ‘healthy’ and ‘unhealthy’ (typical) Australian diets(2). The current study aimed to gain consensus on core diet-related item details (DIDs) and recommendations for reporting DIDs, to inform development of a reporting checklist. The aim of this checklist is to guide researchers on reporting dietary information within human feeding studies measuring the dietary metabolome. A two-stage online Delphi was conducted encompassing 5 survey rounds (February–July 2024). This study was approved by the University of Newcastle’s Human Research Ethics Committee (HREC; H-2023-0405). Sixty-seven experts were invited, spanning expertise in clinical trial design, feeding study intervention implementation, metabolomics, and/or human biospecimen analyses. Twenty-eight DIDs categorised across five domains underwent consensus development. Stage 1 (2 rounds) gained consensus on a core set of DIDs, including phrasing. Stage 2 (3 rounds) gained consensus on standard reporting recommendations for each DID and acceptance of the final reporting guideline. The research team convened after every round to discuss consensus-driven results. Experts were based in Australia, New Zealand, the United States, the United Kingdom, Sweden, Israel, Italy and Denmark. Twenty-five experts completed stage 1 and 22 completed stage 2. After stage 1, two DIDs were merged and two new DIDs were identified, totalling 29 core DIDs.
At the end of stage 2, round 2, based on expert feedback, all items were organised by the degree of reporting required in the methods section of publications, with additional recommendations collated for other sections, including supplementary files. The reporting guideline (DID-METAB Checklist) was generated and accepted by the expert working group in round 3, with all experts agreeing that relevant journals should include the checklist as a suggested reporting tool for relevant studies or use it alongside existing reporting tools. The Delphi process gained consensus on a core set of DIDs and consolidated expert views on the level of detail required when reporting DIDs in research. It also generated the reporting guideline (DID-METAB Checklist), which can be implemented independently or as an extension to existing guidelines such as CONSORT (at item 5) or SPIRIT (at item 11) to improve the reproducibility and comparability of feeding studies. Endorsement by scientific societies and journals will be key to the dissemination strategy and to optimising the utility of the tool to strengthen the evidence base of nutritional metabolomics. The DID-METAB Checklist will be a key tool to advance reporting of diet-related methodologies in metabolomics for both personalised and precision nutrition interventions in clinical practice.
Interest in the consumption of food containing live microbes (LM) as a component of dietary patterns has accelerated, due to potential positive contributions to health and chronic disease risk, including cardiovascular disease (CVD)(1,2). There are different patterns of LM consumption, including through the intake of probiotics or fermented foods or via a broader spectrum of foods that may harbour microbes, such as raw, unpeeled fruits and vegetables(3). To date, no study has quantitatively assessed potential intake of LM in a sample of Australians. The aim was to quantify the presence of LM in common foods and beverages consumed in Australia, using the Australian Eating Survey® (AES) and AES-Heart®(4,5) food frequency questionnaires as the dietary assessment tool. Quantification of potential live microbial content (per gram) was conducted in accordance with the methodology outlined by Marco et al.(3). Briefly, foods were assigned to categories with LM ranges defined as Low (< 10⁴ CFU/g), Medium (10⁴–10⁷ CFU/g), or High (> 10⁷ CFU/g). These categories were based on the expected prevalence of viable microorganisms within different food matrices. Specifically, pasteurised food products are characterised as having microbial concentrations below 10⁴ CFU/g (Low). In contrast, fresh fruits and vegetables consumed unpeeled exhibit a microbial range considered Medium (10⁴–10⁷ CFU/g), while unpasteurised fermented foods and probiotic-supplemented foods exhibit significantly higher microbial content (High; > 10⁷ CFU/g). Based on this methodology, the estimated quantities of live microbes in 400 foods and beverages (including individual products and mixed dishes) within the AES and AES-Heart®(4,5) FFQs were determined and summarised across 22 food groups using the 2-digit codes from the 2011–2013 AUSNUT database(6).
Preliminary results indicate the Low group was the most represented: 369 of the 400 foods belonged to this category. The food groups representing the highest percentages in the Low group were vegetable products and dishes (13.8%), followed by meat, poultry, and game products and dishes (13.6%). The Medium group comprised 25 items, with the most representative food group being fruit products and dishes (48%). In the High group, the representative food groups were dairy and meat substitutes (e.g., soy yoghurt; 66.7%) and milk products and dishes (33.3%). The creation of this database will facilitate new research opportunities to investigate relationships between intake of live microbes and health outcomes, including CVD. Future research into how dietary patterns rich in live microbes relate to chronic disease risk factors, such as reduced BMI, blood pressure, plasma lipids and glucose, in the Australian population could offer new insights into risk factor management through LM dietary interventions.
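The Marco et al. CFU/g thresholds described above reduce to a simple classification rule. The sketch below is illustrative only and is not the project's actual coding scheme; the example foods and CFU values are assumptions:

```python
def lm_category(cfu_per_gram: float) -> str:
    """Assign a live-microbe level from estimated CFU/g (Marco et al. ranges)."""
    if cfu_per_gram < 1e4:
        return "Low"      # e.g. pasteurised products
    if cfu_per_gram <= 1e7:
        return "Medium"   # e.g. unpeeled fresh fruit and vegetables
    return "High"         # e.g. unpasteurised fermented foods

# hypothetical items, CFU/g values invented for illustration
for food, cfu in [("pasteurised milk", 1e2), ("raw apple", 1e5), ("yoghurt", 1e8)]:
    print(food, lm_category(cfu))
```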
Shift workers in Australia constitute approximately 16% of the workforce, with nearly half working a rotating shift pattern(1). Whilst poor dietary habits of shift workers have been extensively reported, along with increased risk of metabolic health conditions such as obesity, cardiovascular disease and diabetes compared to non-shift workers(2,3,4), studies on shift working populations rarely control for individual and lifestyle factors that might influence dietary profiles. While rotating shift work schedules have been linked with higher energy intake than daytime schedules(5), little is known about the impact of different night shift schedules (e.g., fixed night vs rotating schedules) on the diets of shift workers, including differences in 24-hour energy intake and nutrient composition. This observational study investigated the dietary habits of night shift workers with overweight/obesity and compared the impact of rotating and fixed night shift schedules on dietary profiles. We hypothesised that shift workers’ diets overall would deviate from national nutrition recommendations, and that those working rotating shift schedules would have higher energy consumption than those working fixed night schedules. Participants were from the Shifting Weight using Intermittent Fasting in night shift workers (SWIFt) trial, a randomised controlled weight loss trial, and provided 7-day food diaries upon enrolment. Mean energy intake (EI), the percentage of EI from macronutrients, fibre, saturated fat, added sugar and alcohol, and the amount of sodium were evaluated against Australian adult recommendations. Total group and subgroup analyses of fixed night vs rotating schedules’ dietary profiles were conducted, including assessment of plausible and non-plausible energy intake reporters.
Hierarchical regression analyses were conducted on nutrient intakes, controlling for the individual and lifestyle factors of age, gender, BMI, physical activity, shift work exposure, occupation and work schedule. Overall, night shift workers (n = 245) had diets characterised by high fat/saturated fat/sodium content and low carbohydrate/fibre intake compared to nutrition recommendations, regardless of shift schedule type. Rotating shift workers (n = 121) had a higher mean 24-hour EI than fixed night workers (n = 122) (9329 ± 2915 kJ vs 8025 ± 2383 kJ, p < 0.001), with differences remaining when only plausible EI reporters were included (n = 130) (10968 ± 2411 kJ vs 9307 ± 2070 kJ, p < 0.001). These findings highlight poor dietary choices among this population of shift workers, and the higher energy intakes of rotating shift workers, which may contribute to the poor metabolic health outcomes often associated with working night shift.
Approximately 15% of Australia’s workforce are shift workers, who are at greater risk of obesity and related conditions, such as type 2 diabetes and cardiovascular disease.(1,2,3) While current guidelines for obesity management prioritise diet-induced weight loss as a treatment option, there are limited weight-loss studies involving night shift workers and no current exploration of the factors associated with engagement in weight-loss interventions. The Shifting Weight using Intermittent Fasting in night shift workers (SWIFt) study was a randomised controlled trial that compared three 24-week weight-loss interventions: continuous energy restriction (CER), and 500-calorie intermittent fasting (IF) for 2 days per week, either during the day (IF:2D) or during the night shift (IF:2N). This study used a convergent mixed-methods experimental design to: 1) explore the relationship between participant characteristics, dietary intervention group and time to dropout for the SWIFt study (quantitative); and 2) understand why some participants were more likely to drop out of the intervention (qualitative). Participant characteristics included age, gender, ethnicity, occupation, shift schedule, number of night shifts per four weeks, number of years in shift work, weight at baseline, weight change at four weeks, and quality of life at baseline. A Cox regression model was used to specify time to dropout from the intervention as the dependent variable, and purposive selection was used to determine predictors for the model. Semi-structured interviews at baseline and 24 weeks were conducted and audio diaries every two weeks were collected from participants using a maximum variation sampling approach, and analysed using the five steps of framework analysis.(4) A total of 250 participants were randomised to the study between October 2019 and February 2022. Two participants were excluded from analysis due to retrospective ineligibility.
Twenty-nine percent (n = 71) of participants dropped out of the study over the 24-week intervention. Greater weight at baseline, fewer years working shift work, lower weight change at four weeks, and being a woman (compared to a man) were associated with a significantly increased rate of dropout from the study (p < 0.05). Forty-seven interviews from 33 participants were conducted and 18 participants completed audio diaries. Lack of time, fatigue and emotional eating were barriers more frequently reported by women. Participants with a higher weight at baseline more frequently reported fatigue and emotional eating as barriers, and limited guidance on non-fasting days as a barrier for the IF interventions. This study provides important considerations for refining shift-worker weight-loss interventions for future implementation, in order to increase engagement and mitigate the adverse health risks experienced by this essential workforce.
In recent decades there has been increased interest in the Mediterranean diet’s (MedDiet) protective capacity against age-related diseases. The MedDiet is comprised of wholefoods, with moderate to high dietary fat and a kilojoule intake of approximately 9,300 kJ(1). The Mediterranean Diet Adherence Screener (MEDAS) has allowed for rapid assessment of MedDiet adherence across intervention and cohort studies globally(2). However, well-established reductions in older adults’ energy requirements often present a barrier to full MedDiet adherence(3,4). We sought to create an energy-adjusted MEDAS (E-MEDAS) for use in populations with reduced energy requirements, with a secondary analysis to determine whether the strength of the relationship between E-MEDAS adherence and cardiometabolic biomarkers was diminished by energy adjustment. Baseline data from independently living, 60–90-year-old participants enrolled in the MedWalk clinical trial were used. Estimated energy requirements (EER) were calculated for all participants (n = 161) using gender- and age-specific Schofield equations, multiplied by a physical activity level (PAL) derived from a novel method of calculating PALs from Actigraph and IPAQ-E data. Three distinct energy categories of E-MEDAS criteria were identified, with evenly reduced cutoff criteria across all food components. Participants with a completed MEDAS (n = 157) had their MedDiet adherence re-scored according to the reduced criteria cutoffs. Spearman’s rank correlation coefficient analyses, with 95% confidence intervals constructed by bias-corrected and accelerated (BCa) bootstrapping, were used to determine the strength and direction of association between both MEDAS and E-MEDAS adherence scores and 8 cardiometabolic biomarkers.
The newly calculated E-MEDAS categories included Category 3 (corresponding to the original MEDAS) with a range of 9100–10500 kJ (n = 30), Category 2 with a range of 7700–9100 kJ (n = 81) and Category 1 with a range of 6300–7700 kJ (n = 44). There was a significant (p < 0.05) weak negative correlation between the re-scored E-MEDAS and 5 cardiometabolic biomarkers: BMI (rs = −0.228, BCa 95% CI [−0.388, −0.074]), WHR (rs = −0.189, BCa 95% CI [−0.352, −0.027]), LDL (rs = −0.174, BCa 95% CI [−0.347, 0.009]), Total:HDL ratio (rs = −0.288, BCa 95% CI [−0.429, −0.127]) and triglycerides (rs = −0.235, BCa 95% CI [−0.373, −0.079]). In contrast, the original MEDAS score resulted in a significant (p < 0.05) weak negative correlation in only 3 cardiometabolic biomarkers: WHR (rs = −0.167, BCa 95% CI [−0.317, −0.011]), Total:HDL ratio (rs = −0.205, BCa 95% CI [−0.354, −0.049]) and triglycerides (rs = −0.217, BCa 95% CI [−0.360, −0.054]). Ultimately, we have developed two reduced-energy E-MEDAS categories, using a novel calculation of PALs, for use in individuals with reduced EERs. E-MEDAS scores showed a modest increase in the strength of relationship with five cardiometabolic biomarkers, indicating that reducing serves of individual components, while maintaining the overall dietary pattern, does not negate the protective capacity of a MedDiet.
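The correlation analysis described above can be sketched as follows. For simplicity this uses a plain percentile bootstrap rather than the BCa variant, ignores rank ties, and runs on invented adherence/biomarker values rather than trial data:

```python
import math
import random

def spearman(x, y):
    """Spearman's rho: Pearson correlation of the ranks (ties broken arbitrarily)."""
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0] * len(v)
        for rank, i in enumerate(order):
            r[i] = rank
        return r
    rx, ry = ranks(x), ranks(y)
    n = len(rx)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    var = math.sqrt(sum((a - mx) ** 2 for a in rx) * sum((b - my) ** 2 for b in ry))
    return cov / var

def bootstrap_ci(x, y, n_boot=2000, alpha=0.05, seed=1):
    """Percentile-bootstrap CI for rho, resampling participants (pairs) with replacement."""
    rng = random.Random(seed)
    n = len(x)
    rhos = []
    for _ in range(n_boot):
        idx = [rng.randrange(n) for _ in range(n)]
        rhos.append(spearman([x[i] for i in idx], [y[i] for i in idx]))
    rhos.sort()
    return rhos[int(alpha / 2 * n_boot)], rhos[int((1 - alpha / 2) * n_boot) - 1]

# hypothetical adherence scores and a biomarker that falls with adherence
adherence = list(range(1, 21))
biomarker = [35 - 0.4 * a + (((a * 7) % 5) - 2) * 0.5 for a in adherence]
rho = spearman(adherence, biomarker)
lo, hi = bootstrap_ci(adherence, biomarker)
print(f"rho = {rho:.2f}, 95% CI [{lo:.2f}, {hi:.2f}]")
```

A BCa interval additionally corrects the percentile endpoints for bias and skew in the bootstrap distribution, which is why the study's intervals are not symmetric around rs.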
Objective biomarkers of a healthy and typical Australian diet could enhance dietary assessment and provide insight into how adherence to, or deviations from, dietary guidelines impact health. This study aimed to identify and compare plasma and urinary metabolites in healthy Australian adults in response to a healthy and typical dietary pattern. This was an 8-week randomised, cross-over feeding trial(1). After a two-week run-in period, participants were randomly allocated to follow each diet for two weeks, with a minimum two-week washout period in between. The Healthy Australian Diet adhered to the Australian Dietary Guidelines(2), including a balanced intake of the five food groups and meeting Acceptable Macronutrient Distribution Range targets(3). The Typical Australian Diet was formulated based on apparent consumption patterns in Australia(4). During each feeding phase, all food items were provided to ensure compliance. Both diets included different key indicator foods associated with known metabolites. Comprehensive data collection occurred at four key visits: week 0 (end of run-in; baseline 1), week 2 (post-feeding phase 1), week 4 (end of washout; baseline 2), and week 8 (post-feeding phase 2). Blood samples following a ≥ 8-hour fast were collected by an accredited pathologist, and spot urine samples were self-collected by participants at the morning appointment. Metabolomics data were obtained using Ultra-high Performance Liquid Chromatography-Tandem Mass Spectrometry (UHPLC-MS/MS) through Metabolon Inc.’s (Morrisville, USA) Global Discovery Panel. Metabolite concentrations were log-transformed. Differential changes in metabolites between intervention groups were evaluated using linear mixed-effects models, adjusting for diet sequence and feeding phase, with subject ID as a random effect to account for within-subject correlation. Post-hoc pairwise comparisons were conducted to assess the effects of each diet.
A total of 34 healthy Australian adults (age 38.4 ± 18.1 years, 53% female) completed all study measures. After adjusting for multiple comparisons, significant differences between the Typical Australian Diet (TAD) and Healthy Australian Diet (HAD) groups were observed for 257 plasma and 91 urine metabolites. Of these, 44 known metabolites consistently differed between dietary pattern groups in both biofluids (plasma and urine). Several associations between specific food groups and metabolites were identified, including the externally validated metabolites associated with dark chocolate (theobromine), orange juice (proline betaine), and cruciferous vegetables (S-methylcysteine sulfoxide, S-methylcysteine). Consumption of a dietary pattern aligned with the Australian Dietary Guidelines had a measurable impact on the short-term human metabolome compared to a typical Australian dietary pattern. While some metabolites are established as biomarkers of specific foods, others may represent novel biomarkers requiring validation in future clinical trials and diverse populations. Further research should explore the relationship between these metabolites, the gut microbiome, and clinical outcomes. Additionally, studies are needed to assess the feasibility of using these biomarkers to evaluate diets in real-world settings.
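As a simplified illustration of the within-subject comparison underlying such models: in a balanced two-period crossover, the diet effect estimated by a mixed model with a subject-level random intercept reduces to a paired analysis of per-participant differences. The values below are invented, and the real analysis additionally adjusts for diet sequence, feeding phase, and multiple comparisons:

```python
import math
import statistics

def paired_t(diffs):
    """t-statistic and degrees of freedom for within-subject differences
    (one difference in log metabolite level per participant, period 1 vs period 2)."""
    n = len(diffs)
    se = statistics.stdev(diffs) / math.sqrt(n)
    return statistics.mean(diffs) / se, n - 1

# hypothetical per-participant differences between the two diet periods
diffs = [0.8, 1.1, 0.9, 1.3, 0.7, 1.0]
t, df = paired_t(diffs)
print(f"t = {t:.2f} on {df} df")
```

Because the same participant contributes under both diets, between-subject variation cancels out of the differences, which is the intuition behind treating subject ID as a random effect.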
Emerging research has highlighted a relationship between diet and genetics, suggesting that individuals may benefit more from personalised dietary recommendations based on their genetic risk for cardiovascular disease (CVD)(1,2). This study aims to: (1) measure knowledge of genetics among healthcare professionals (HCPs) working in CVD, (2) identify HCPs’ attitudes to using genetic risk to tailor dietary interventions, and (3) identify perceived barriers and enablers to implementing genetics to tailor dietary interventions. In a mixed-methods study, Australian HCPs (dietitians and AHPRA-registered healthcare professionals) working with people with CVD were invited to complete an anonymous online survey (REDCap) and an optional interview. Recruitment occurred through social media and relevant professional organisations. Survey questions were underpinned by the theoretical domains framework(3) and data were synthesised descriptively. Semi-structured interviews were undertaken via Zoom. Interview responses were analysed thematically following Braun & Clarke’s methodology(4). Survey respondents (n = 63, 89% female, mean age 42 ± 14 years) were primarily dietitians (83%), with ≥ 10 years of experience (56%), and most spent at least 20% of their time working with people with CVD (n = 55, 87%). Approximately half of respondents were aware that genetic testing for CVD exists (n = 36) and always assess family history of CVD (n = 31). Few respondents reported using genetic testing (n = 5, 8%) or felt confident interpreting and using genetic testing (n = 7, 11%) in practice. Respondents were interested in incorporating genetics into their practice to tailor dietary advice (n = 44, 70%). Primary barriers to using genetic testing included financial costs to patients and negative implications for some patients. Almost all respondents agreed genetic testing will allow for more targeted and personalised approaches for prevention and management of CVD (94%).
From the interviews (n = 15, 87% female, 43 ± 17 years, 87% dietitians), three themes were identified: (1) ‘On the periphery of care’—HCPs are aware of the role of genetics in health and are interested in knowing more, but it is not yet part of usual practice; (2) ‘A piece of the puzzle’—genetic testing could be a tool to help personalise, prioritise, and motivate participants; and (3) ‘Whose role is it?’—there is uncertainty about which professionals are responsible for educating patients about genetics. Healthcare professionals are interested in using genetics to tailor dietary advice for CVD, but potential implications for patients need to be considered. Upskilling is required to increase HCPs’ knowledge and confidence in this area, and further clarity regarding HCP roles in patient education is needed before this can be implemented in practice.
The ATG-EMT achieved a significant milestone with its inaugural deployment during the SIDS4 Conference in Antigua. The EMT2030 strategy and the Global Health Emergency Corps (GHEC) approach underscore the importance of collaborative leadership and joint efforts among all networks to provide a comprehensive response.
Objectives:
The primary objective of the deployment was to ensure the health and safety of SIDS4 conference attendees through a coordinated and effective emergency medical response. It also aimed to demonstrate the capability of small island countries to establish and deploy fully operational and self-sufficient EMTs in coordination with other rapid response capacities, fostering a model of collaborative leadership.
Method/Description:
Training programs, conducted in collaboration with PAHO, focused on disaster response, triage, and mass casualty management. PAHO-supported capacity building included the procurement of medical equipment, the establishment of mobile medical units, and the enhancement of communication systems for seamless coordination.
In preparation for deployment, ATG-EMT conducted simulation exercises and drills which involved various stakeholders, including local health authorities, security agencies, prehospital EMS, public health rapid response teams, and community volunteers.
Results/Outcomes:
The successful deployment of ATG-EMT during the SIDS4 Conference demonstrated the team’s capability to provide high-quality medical care and support at a high-profile international event. This contributed to the health and safety of over 4,500 delegates.
Conclusion:
The deployment highlights the importance of continuous training, robust capacity building, and meticulous preparation in developing an effective emergency medical response system, and serves as a model for small island countries aiming to enhance their disaster response capabilities.
We provide an assessment of the Infinity Two fusion pilot plant (FPP) baseline plasma physics design. Infinity Two is a four-field-period, aspect ratio $A = 10$, quasi-isodynamic stellarator with improved confinement, appealing to a max-$J$ approach, elevated plasma density, and high magnetic fields ($\langle B\rangle = 9$ T). Here $J$ denotes the second adiabatic invariant. At the envisioned operating point ($800$ MW deuterium-tritium (DT) fusion), the configuration has robust magnetic surfaces based on magnetohydrodynamic (MHD) equilibrium calculations and is stable to both local and global MHD instabilities. The configuration has excellent confinement properties, with small neoclassical transport and low bootstrap current ($|I_{\mathrm{bootstrap}}| \sim 2$ kA). Calculations of collisional alpha-particle confinement in a DT FPP scenario show small energy losses to the first wall ($<1.5\,\%$) and stable energetic particle/Alfvén eigenmodes at high ion density. Low turbulent transport is produced using a combination of density profile control consistent with pellet fueling and reduced stiffness to turbulent transport via three-dimensional shaping. Transport simulations with the T3D-GX-SFINCS code suite, with self-consistent turbulent and neoclassical transport, predict that the DT fusion power $P_{\mathrm{fus}} = 800$ MW operating point is attainable with high fusion gain ($Q = 40$) at volume-averaged electron densities $n_e \approx 2 \times 10^{20}$ m$^{-3}$, below the Sudo density limit. Additional transport calculations show that an ignited ($Q = \infty$) solution is available at slightly higher density ($2.2 \times 10^{20}$ m$^{-3}$) with $P_{\mathrm{fus}} = 1.5$ GW. The magnetic configuration is defined by a magnetic coil set with sufficient room for an island divertor, as well as shielding and blanket solutions with tritium breeding ratios (TBR) above unity. An optimistic estimate for the gas-cooled solid breeder design (helium-cooled pebble bed) is TBR $\sim 1.3$.
Infinity Two satisfies the physics requirements of a stellarator fusion pilot plant.
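For readers checking the quoted operating point, the fusion gain above presumably follows the standard definition $Q = P_{\mathrm{fus}}/P_{\mathrm{ext}}$ (an assumption here; the abstract does not define $Q$ explicitly). Under that definition, $Q = 40$ at $P_{\mathrm{fus}} = 800$ MW implies roughly 20 MW of external heating:

```latex
Q = \frac{P_{\mathrm{fus}}}{P_{\mathrm{ext}}}
\quad\Longrightarrow\quad
P_{\mathrm{ext}} = \frac{P_{\mathrm{fus}}}{Q} = \frac{800~\mathrm{MW}}{40} = 20~\mathrm{MW}.
```

The ignited case ($Q = \infty$) then corresponds to $P_{\mathrm{ext}} = 0$, i.e. a plasma sustained entirely by alpha-particle self-heating.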
In this work, we present a detailed assessment of fusion-born alpha-particle confinement, the wall loads these particles produce, and the stability of the Alfvén eigenmodes they drive in the Infinity Two Fusion Pilot Plant baseline plasma design, a four-field-period quasi-isodynamic stellarator intended to operate in deuterium–tritium fusion conditions. Using the Monte Carlo codes SIMPLE, ASCOT5, and KORC-T, we study the collisionless and collisional dynamics of guiding-centre and full-orbit alpha particles in the core plasma. We find that core energy losses to the wall are less than 4%. Our simulations show that peak power loads on the wall of this configuration are approximately 2.5 MW m$^{-2}$ and are spatially localised, toroidally and poloidally, in the vicinity of the X-points of the $n/m = 4/5$ magnetic island chain outside the plasma volume. An exploratory analysis using various simplified walls also shows that the shaping and distance of the wall from the plasma volume can help reduce peak power loads. Our stability assessment of Alfvén eigenmodes using the STELLGAP and FAR3d codes shows the absence of unstable modes driven by alpha particles in Infinity Two, owing to the relatively low alpha-particle beta at the envisioned 800 MW operating scenario.
Hospital employees are at risk of severe acute respiratory coronavirus 2 (SARS-CoV-2) infection from patient, coworker, and community interactions. Understanding employees’ perspectives on transmission risks may inform hospital pandemic management strategies.
Design:
Qualitative interviews were conducted with 23 employees to assess factors contributing to perceived transmission risks during patient, coworker, and community interactions and to elicit recommendations. Using a deductive approach, transcripts were coded to identify recurring themes.
Setting:
Tertiary hospital in Boston, Massachusetts.
Participants:
Employees with a positive SARS-CoV-2 PCR test between March 2020 and January 2021, a period before widespread vaccine availability.
Results:
Employees generally reported low concern about transmission risks during patient care. Most patient-related risks, including limited inpatient testing and personal protective equipment availability, were only reported during the early weeks of the pandemic, except for suboptimal masking adherence by patients. Participants reported greater perceived transmission risks from coworkers, due to limited breakroom space, suboptimal coworker masking, and perceptions of inadequate contact tracing. Perceived community risks were related to social gatherings and to household members who also had high SARS-CoV-2 infection risk because they were essential workers. Recommendations included increasing well-ventilated workspaces and breakrooms, increasing support for sick employees, and stronger hospital communication about risks from non-patient-care activities, including the importance of masking adherence with coworkers and in the community.
Conclusions:
To reduce transmission during future pandemics, hospitals may consider improving communication on risk reduction during coworker and community interactions. Societal investments are needed to improve hospital infrastructure (eg, better ventilation and breakroom space) and increase support for sick employees.
Patients with advanced cancer and their caregivers experience a substantial amount of anxiety and distress. The purpose of this study was to assess the feasibility, acceptability, and preliminary effects of an 8-week, remotely delivered Resilient Living Program (RLP) for adult patients with advanced cancer and their caregivers.
Methods
Eligible patients included adults (≥18 years) with advanced cancer. Their caregiver had the option to participate. The RLP components included online modules, a print journal, and 4 video-telehealth-delivered sessions. Content focused on techniques for managing stress and building resilience (mindful presence, uplifting emotions, reframing experiences through practicing principles of gratitude, compassion, acceptance, meaning, and forgiveness). Feasibility and acceptability were assessed quantitatively and with semi-structured interviews conducted with a subset of participants. Effectiveness measures (anxiety, stress, quality of life [QOL], sleep, resiliency, and fatigue) were administered at baseline, week 5, week 9, and week 12.
Results
Of the eligible patients, 33/72 (46%) were enrolled. In all, 15 caregivers enrolled. Thirty participants (21 patients/9 caregivers) completed at least 3 video-telehealth sessions (63% adherence). For patients, there were statistically significant improvements in anxiety and fatigue at week 12 (p = 0.05). Other effectiveness measures (stress, QOL, sleep, resiliency) showed positive trends. Eleven participants were interviewed and qualitative analysis revealed 4 themes: Easy to Use, Learning Key Principles, Practice is Essential, and Examples of Benefits.
Significance of results
Participation in the RLP was feasible and acceptable for patients with advanced cancer and their caregivers. Participants generally indicated that the practices were easy to integrate into their everyday lives and enhanced their ability to focus on the positive, and said they would recommend the RLP to other individuals living with advanced cancer. Preliminary effectiveness data suggest the program may positively impact anxiety, stress, QOL, sleep, resiliency, and fatigue. A larger randomized clinical trial is warranted to confirm these preliminary findings.
Data from an RCT of IAPT Norway (“Prompt Mental Health Care” [PMHC]) were linked to several administrative registers covering up to five years following the intervention. The aims were to (1) examine the effects of PMHC compared with treatment-as-usual (TAU) on work-related outcomes and health care use, (2) estimate the cost–benefit of PMHC, and (3) examine whether clinical outcomes at six-month follow-up explained the effects of PMHC on work- and cost–benefit-related outcomes.
Methods
RCTs with parallel assignment were conducted at two PMHC sites (N = 738) during 2016/2017. Eligible participants were considered for admission due to anxiety and/or depression. We used Bayesian estimation with 90% credibility intervals (CI) and posterior probabilities (PP) of effects in favor of PMHC. Primary outcome years were 2018–2022. The cost–benefit analysis estimated the overall economic gain expressed in terms of a benefit–cost ratio and the differences in overall public sector spending.
Results
The PMHC group was more likely than the TAU group to be in regular work without receiving welfare benefits in 2019–2022 (1.27 ≤ OR ≤ 1.43). Some evidence was found that the PMHC group spent less on health care. The benefit–cost ratio, in terms of economic gain relative to intervention costs, was estimated at 5.26 (90% CI $-1.28$ to $11.8$). The PP of PMHC being cost-beneficial for the economy as a whole was 85.9%. The estimated difference in public sector spending was small. The PMHC effects on work participation and cost–benefit were largely explained by its effects on mental health.
Conclusions
The results support the societal economic benefit of investing in IAPT-like services.