Winter Conference 2023, 5-6 December 2023, Diet and lifestyle strategies for prevention and management of multimorbidity
Abstract
A high-fungi diet differentially attenuates the gut mycobiota relative to a high meat diet; consequences for chronic disease risk?
- D.N Farsi, A. Nelson, G. Koutsidis, D.M Commane
- Published online by Cambridge University Press: 03 July 2024, E197
The fungal cell wall elicits an immune response and may be involved in intestinal immune training (1,2). It is also fermentable by the gut microbiome; thus, consumption of fungi changes gut microbial ecology (3). Yet, the specific effects of consuming fungal foods on the gut mycobiota (i.e., gut fungi) have not been well studied. An interesting case study is mycoprotein, a fungal-based protein produced from Fusarium venenatum (4). We have previously reported that mycoprotein consumption attenuates faecal water genotoxicity, a surrogate marker of colorectal cancer risk, and modulates faecal metabolite excretion and gut bacterial composition (5). Here, we aimed to evaluate the impact of consuming a diet high in mycoprotein on gut mycobial ecology, and to explore relationships between mycobial composition and faecal genotoxicity.
Here we leveraged stool samples from Mycomeat, a randomised controlled crossover trial in which 20 healthy male adults adhered to 2-week diets comprising 240 g/day of mycoprotein-based foods or red and processed meat, separated by a 4-week washout. Internal transcribed spacer (ITS) sequencing was performed to characterise the mycobiota. Alpha diversity before and after each study phase was compared using Wilcoxon tests. Beta diversity was compared by permutational multivariate analysis of variance (PERMANOVA) based on Bray-Curtis dissimilarities. Differences in mycobial taxa within and between study phases were compared using Wilcoxon tests. Changes in mycobiota composition were then regressed against faecal excretion of metabolites using mixed-effects models to understand the impact of myco-ecology on the wider colonic environment. Finally, given the abundance of mycobial genotoxins in nature, we regressed mycobial taxa against faecal water genotoxicity.
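The Bray-Curtis dissimilarity underlying the PERMANOVA step can be sketched in a few lines; the abundance vectors below are hypothetical and this is not the study's actual pipeline:

```python
def bray_curtis(u, v):
    """Bray-Curtis dissimilarity between two taxon-abundance vectors.

    0 = identical communities, 1 = no shared taxa.
    """
    if len(u) != len(v):
        raise ValueError("vectors must be the same length")
    num = sum(abs(a - b) for a, b in zip(u, v))
    den = sum(a + b for a, b in zip(u, v))
    return num / den if den else 0.0

# Hypothetical taxon counts before and after a diet phase
before = [10, 0, 5, 3]
after = [4, 6, 5, 1]
print(round(bray_curtis(before, after), 3))  # → 0.412
```

A matrix of such pairwise dissimilarities, computed across all samples, is what PERMANOVA then partitions by diet phase.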
There were significant shifts in the abundance of several taxa following both diets. Notably, mycoprotein consumption was associated with an increase in the abundance of Malasseziales sp. (P = 0.02) and a reduction in Candida albicans (P = 0.01). Meat consumption was associated with increases in Phaeoacremonium tuscanum (P = 0.01) and Rhodotorula mucilaginosa (P = 0.008), and a reduction in Penicillium commune (P = 0.02). In addition, Aspergillus caesiellus was associated with lower faecal genotoxicity (P = 0.04), whereas Penicillium commune (P = 0.04) and Penicillium olsonii (P = 0.03) were both associated with higher genotoxicity. Regressing mycobial taxa against faecal metabolites revealed a number of significant associations, including between Penicillium commune and austdiol, a putative mycotoxin (P < 0.001), as well as 1-methyladenine, a methylated DNA base with cytotoxic properties (P = 0.001).
The gut mycobiota is malleable through a fungi-rich diet, alongside changes within the wider microbiome. Members of the gut mycobiota predicted faecal genotoxicity and faecal excretion of toxins in this model, and there may be value in further exploring the gut mycobiota and the contribution of mycotoxins to gut health and colorectal cancer risk.
Low glycaemic index diet in pregnancy and child asthma and eczema: follow-up of the ROLO trial
- S. Callanan, M. Talaei, A. Delahunt, S.O Shaheen, F.M McAuliffe
- Published online by Cambridge University Press: 03 July 2024, E198
Atopic diseases, including asthma and eczema, represent a substantial public health problem in children and adolescents globally; asthma is the commonest chronic disorder of childhood(1). Research suggests that the origins of childhood asthma lie in utero, and several components of the maternal diet during pregnancy have been investigated in relation to atopic outcomes in children. Epidemiological evidence suggests that a higher intake of sugar during pregnancy is associated with a higher risk of childhood asthma and atopy(2,3). However, randomised trial evidence supporting such a link is lacking.
Aims
1. To examine whether a low glycaemic index (GI) dietary intervention during pregnancy decreases the risk of asthma and eczema in childhood.
2. To assess observationally whether maternal intake of sugar during pregnancy is positively associated with asthma and eczema in childhood.
This is a secondary analysis of children from the ROLO trial. Healthy women were randomised to receive an intervention of low GI dietary advice or routine antenatal care from early pregnancy. All women completed a 3-day food diary in each trimester of pregnancy. Estimates of maternal intake of sugar in each trimester were averaged to provide mean intakes during pregnancy. Mothers reported current doctor-diagnosed eczema in their children at 2 years of age (n=271), and current doctor-diagnosed asthma and eczema in their children at 5 (n=357) and 9-11 years (n=391) of age. Multivariable logistic regression models were used a) to test the effect of the intervention on child outcomes, overall and stratified by maternal education level (with, versus without, a complete tertiary education), and b) in observational analyses, to examine the relation between sugar and carbohydrate intake in pregnancy and child outcomes.
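The adjusted odds ratios these logistic models report can be illustrated with the unadjusted calculation from a 2x2 table with a Wald confidence interval; the counts below are hypothetical, not taken from the trial:

```python
import math

def odds_ratio(a, b, c, d, z=1.96):
    """Odds ratio with a Wald 95% CI from a 2x2 table.

    a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases.
    """
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: asthma cases among intervention vs usual-care children
print(odds_ratio(8, 160, 17, 150))
```

Multivariable adjustment (as in the study) requires fitting the full regression; this sketch only shows what the reported OR and CI represent.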
There was weak evidence overall for a reduction in asthma at 5-years of age in children whose mothers received the low GI dietary intervention during pregnancy compared to usual care [adjusted odds ratio (OR) 0.43 (95% CI 0.18, 1.03); P=0.06]. However, in stratified analyses the intervention was associated with a marked reduction in risk of asthma at 5-years of age in children born to mothers with lower educational attainment [adjusted OR 0.16 (0.03, 0.85); P=0.032]. Intake of sugar during pregnancy was positively associated with the development of asthma at any time point in childhood [adjusted OR per quartile of mean sugar intake 1.40 (0.99, 1.97), P-trend=0.048] and at 5-years of age [adjusted OR per quartile 1.55 (1.00, 2.40), P-trend=0.046]. No associations with eczema outcomes were found.
This novel study provides stronger evidence that higher sugar intake during pregnancy is associated with an increased risk of asthma among offspring. An intervention to reduce sugar intake in pregnancy may have potential as a primary prevention strategy, particularly amongst children born to mothers with lower educational attainment.
Mechanisms contributing to lactose and sucrose-induced postprandial lipaemia
- J. Gonzalez, S. Carter, B. Spellanzon, L. Bradshaw, E. Johnson, F. Koumanov, J. A. Betts, D. Thompson, L. Hodson
- Published online by Cambridge University Press: 03 July 2024, E199
Fructose-containing sugars can exaggerate postprandial lipaemia and stimulate hepatic de novo lipogenesis (DNL) when compared to glucose-based carbohydrates(1). Galactose has recently been shown to increase postprandial lipaemia compared to glucose(2), but mechanisms remain uncharacterised. The aim of this study was to assess the effect and mechanisms of lactose-induced lipaemia.
Twenty-four non-obese adults (12 male and 12 female) completed three trials in a randomised, crossover design (28 ± 7-day washout). During trials, participants consumed test drinks containing 50 g fat with 100 g of carbohydrate. The control carbohydrate was a glucose polymer (maltodextrin), the experimental carbohydrate was a galactose-containing carbohydrate (lactose) and the active comparator was a fructose-containing carbohydrate (sucrose). Hepatic DNL was assessed by the ²H₂O method and [U-¹³C]palmitate was added to the test drink to trace the fate of the ingested fat. Blood and breath samples were taken to determine plasma metabolite and hormone concentrations, in addition to plasma and breath ²H and ¹³C enrichments. Data were converted into incremental area under the curve (iAUC) values and were checked for normality by visual inspection of residuals. Differences between trials were assessed by one-way ANOVA. Where a main effect of trial was detected, post-hoc t-tests were performed to determine which trials differed from lactose according to the principle of closed-loop testing.
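The iAUC conversion described above amounts to trapezoidal integration of the rise above the baseline concentration; a minimal sketch with hypothetical values (the trial's exact iAUC convention, e.g. the handling of negative increments, may differ):

```python
def iauc(times, conc):
    """Incremental area under the curve: trapezoidal AUC of the
    rise above the baseline (first) concentration.  Negative
    increments are retained, a simple net-incremental rule.
    """
    base = conc[0]
    total = 0.0
    for i in range(1, len(times)):
        dt = times[i] - times[i - 1]
        # trapezoid on the baseline-subtracted curve
        total += ((conc[i] - base) + (conc[i - 1] - base)) / 2 * dt
    return total

# Hypothetical plasma triacylglycerol (mmol/L) over 0-360 min
t = [0, 60, 120, 240, 360]
tg = [1.0, 1.4, 1.8, 1.5, 1.1]
print(round(iauc(t, tg), 1))  # → 162.0 (mmol/L·360 min scale)
```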
The plasma triacylglycerol iAUC (mean ± SD) in response to maltodextrin was 51 ± 68 mmol/L·360 min. Following lactose ingestion, plasma triacylglycerol iAUC increased to 98 ± 88 mmol/L·360 min (p<0.001 vs maltodextrin), which was comparable to sucrose [90 ± 95 mmol/L·360 min (p=0.41 vs lactose)]. Hepatic DNL in response to maltodextrin was 6.6 ± 3.0%. Following ingestion of lactose, hepatic DNL increased to 12.4 ± 6.9% (p=0.02 vs maltodextrin), which was comparable to sucrose [12.2 ± 6.9% (p=0.96 vs lactose)]. Exhaled ¹³CO₂ in response to maltodextrin was 10.4 ± 4.1 mmol/kg FFM·360 min. Following ingestion of lactose, exhaled ¹³CO₂ was 8.8 ± 4.9 mmol/kg FFM·360 min (p=0.09 vs maltodextrin), which was lower than sucrose [11.1 ± 3.9 mmol/kg FFM·360 min (p=0.01 vs lactose)].
These data are consistent with the hypothesis that hepatic de novo lipogenesis contributes to both lactose and sucrose-induced lipaemia and provide a rationale to investigate the longer-term effects of lactose and sucrose on metabolism.
The association between selenium status and cognitive decline in very old adults: The Newcastle 85+ Study
- G. Perri, J. C Mathers, C. Martin-Ruiz, C. Parker, K. Demircan, T. S. Chillon, L. Schomburg, L. Robinson, E. J Stevenson, G. Terrera, F. F Sniehotta, C. Ritchie, A. Adamson, A. Burns, A.M Minihane, O. Shannon, T.R Hill
- Published online by Cambridge University Press: 03 July 2024, E200
The trace element selenium protects against oxidative damage, which contributes to cognitive impairment with ageing (1,2). The aim of this study was to explore the association between selenium status (serum selenium and selenoprotein P (SELENOP)) and global cognitive performance at baseline and after 5 years in 85-year-olds living in the Northeast of England.
Serum selenium and SELENOP concentrations were measured at baseline by total reflection X-ray fluorescence (TXRF) and enzyme-linked immunosorbent assay (ELISA), respectively, in 757 participants from the Newcastle 85+ study. Global cognitive performance was assessed using the Standardized Mini-Mental State Examination (SMMSE) where scores ≤25 out of 30 indicated cognitive impairment. Logistic regressions explored the associations between selenium status and global cognition at baseline. Linear mixed models explored associations between selenium status and global cognition prospectively after 5 years. Covariates included sex, body mass index, physical activity, high sensitivity C-reactive protein, alcohol intake, self-rated health, medications and smoking status.
At baseline, in fully adjusted models, there was no increase in the odds of cognitive impairment with serum selenium (OR 1.004, 95% CI 0.993-1.015, p = 0.512) or with SELENOP (OR 1.006, 95% CI 0.881-1.149, p = 0.930). Likewise, over 5 years, in fully adjusted models there was no association between serum selenium and cognitive impairment (β 7.20E-4 ± 5.57E-4, p = 0.197), or between SELENOP and cognitive impairment (β 3.50E-3 ± 6.85E-3, p = 0.610).
In this UK cohort of very old adults, neither serum selenium nor SELENOP was associated with cognitive impairment at baseline or after 5 years. This was an unexpected finding given SELENOP’s key role in the brain and the associations observed in other studies. Further research is needed to explore the effect of selenium on global cognition in very old adults.
Pro-inflammatory diets are associated with higher C-reactive protein and lower plasma concentrations of vitamins with anti-inflammatory potential, in the EPIC-Norfolk cohort
- A.A Mulligan, M.A.H Lentjes, A.A Welch
- Published online by Cambridge University Press: 03 July 2024, E201
The development of multiple long-term conditions (MLTC) has been shown to be associated with low-grade chronic inflammation(1). The Dietary Inflammatory Index (DII®) is a literature-based dietary score that was developed to measure the potential impact of diet on the inflammatory status of an individual(2). In this study, we aimed to validate the DII® score against biomarkers, including high-sensitivity C-reactive protein (hs-CRP) and plasma concentrations of vitamin C, retinol and α-tocopherol, in European Prospective Investigation into Cancer and Nutrition (EPIC)-Norfolk participants, aged 39–79 years at baseline(3).
The DII® score was calculated using a 130-item Food Frequency Questionnaire collected at baseline, between 1993 and 1997. The dietary intakes were adjusted to a 2000 kcal/day diet, to assess diet quality independently of diet quantity. Non-fasting serum cholesterol, hs-CRP, and plasma α-tocopherol, vitamin C and retinol concentrations were also measured at this time-point. Data collected via a self-administered Health and Lifestyle Questionnaire were used to establish classification of a number of variables. Analyses were conducted on sub-samples with a DII® score and measures of hs-CRP (8,034 men and 9,861 women), and concentrations of vitamin C (9,866 men and 11,702 women), retinol (3,673 men and 3,517 women) and cholesterol-adjusted α-tocopherol (3,623 men and 3,476 women). Analysis of covariance and linear regression were used to study associations across sex-specific quintiles of the DII® score (adjusted for age, BMI, smoking status, physical activity, social class and educational level), where a higher score indicates a more pro-inflammatory diet.
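The two preprocessing steps described above, energy adjustment to a 2000 kcal/day diet and assignment to quintiles of the score, can be sketched as follows; the values and the rank-based quintile rule are illustrative assumptions, not the study's exact procedure:

```python
def energy_adjust(intake, energy_kcal, target_kcal=2000):
    """Scale a nutrient intake to a 2000 kcal/day diet (density method)."""
    return intake * target_kcal / energy_kcal

def quintile(value, sorted_values):
    """1-based quintile of `value` within a sorted reference distribution."""
    below = sum(v < value for v in sorted_values)
    return min(below * 5 // len(sorted_values) + 1, 5)

# Hypothetical DII-style scores for 10 participants of one sex
scores = sorted([-2.1, -1.0, -0.3, 0.4, 1.2, 1.9, 2.5, 3.1, 3.8, 4.4])
print(quintile(3.9, scores))   # a pro-inflammatory score lands in Q5
print(energy_adjust(80, 2500)) # 80 g on a 2500 kcal diet → 64 g per 2000 kcal
```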
Mean concentrations in men and women were 2.99 and 3.09 mg/L for hs-CRP, 47 and 59 μmol/L for vitamin C, 53 and 50 μg/dL for retinol and 4.34 and 4.42 μmol/mmol for cholesterol-adjusted α-tocopherol, respectively. In both men and women, mean hs-CRP was higher if the diet was more pro-inflammatory (p-trend = 0.02 in men and 0.07 in women), while concentrations of vitamin C, retinol and α-tocopherol were significantly lower (p-trend < 0.001). Positive associations for hs-CRP, but negative associations for plasma concentrations of vitamin C, retinol and α-tocopherol, were evident in both men and women after adjustment for covariates (p-trend < 0.001). The differences between Q1 and Q5 adjusted means for hs-CRP, vitamin C, retinol and α-tocopherol were +9.4%, -22.1%, -3.9% and -8.6% in men and +7.9%, -17.5%, -4.8% and -7.6% in women, respectively.
We observed statistically significant positive associations between the DII® score and hs-CRP, a well-known inflammatory biomarker, whilst significant negative associations were found for circulating concentrations of three anti-inflammatory vitamins, after adjustment for covariates. These findings indicate that the DII® score is a valid measure of the inflammatory potential of diet in these middle-aged and older adults, making it possible to study the inflammatory role of diet in MLTC development.
Assessing adherence to plant-rich dietary patterns using metabolic signatures of plant food metabolites
- Y. Li, Y. Xu, M. Le Sayec, T. D Spector, C. Menni, R. Gibson, A. Rodriguez-Mateos
- Published online by Cambridge University Press: 03 July 2024, E202
Diet is an important modifiable lifestyle factor for human health, and plant-rich dietary patterns are associated with a lower risk of non-communicable diseases in numerous studies. However, objective assessment of plant-rich dietary exposure in nutritional epidemiology studies remains challenging. This study aimed to develop and evaluate metabolic signatures of the most widely used plant-rich dietary patterns using a targeted metabolomics method comprising 108 plant food metabolites.
A total of 218 healthy participants from the POLYNTAKE cohort were included, aged 51.5 ± 17.7 years, with 24h urine samples measured using ultra-high-performance liquid chromatography–mass spectrometry. The validation dataset employed three sample types to test the robustness of the signatures: 24h urine (ABP cohort, n = 88), plasma (POLYNTAKE cohort, n = 195), and spot urine (TwinsUK cohort, n = 198). Adherence to the plant-rich diet was assessed using a priori plant-rich dietary patterns. A combination of metabolites that evaluates the adherence and metabolic response to a specific diet was identified as the metabolic signature. We applied linear regression analysis to select the metabolites significantly associated with dietary patterns (adjusting for energy intake), and ridge regression to estimate penalised weights for each candidate metabolite. The correlation between each metabolic signature and its dietary pattern was assessed by Spearman analysis (FDR < 0.05).
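The Spearman step used to assess signature-pattern correlation is Pearson correlation computed on ranks; a self-contained sketch with hypothetical scores (the study's FDR correction is omitted):

```python
def rank(xs):
    """Average ranks (1-based), handling ties."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    ranks = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1          # average rank across the tie group
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(x, y):
    """Spearman rank correlation (Pearson on ranks)."""
    rx, ry = rank(x), rank(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    vx = sum((a - mx) ** 2 for a in rx)
    vy = sum((b - my) ** 2 for b in ry)
    return cov / (vx * vy) ** 0.5

# Hypothetical signature scores vs diet-pattern scores for 5 participants
sig = [0.2, 0.9, 0.4, 1.3, 0.7]
diet = [3, 8, 6, 9, 5]
print(round(spearman(sig, diet), 2))  # → 0.9
```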
Metabolic signatures consisting of 42, 22, 35, 15, 33, and 33 predictive metabolites across different subclasses were found to be associated with adherence to the Amended Mediterranean Score (A-MED), Original MED (O-MED), Dietary Approaches to Stop Hypertension (DASH), Mediterranean-DASH Intervention for Neurodegenerative Delay (MIND), healthy Plant-based Diet Index (hPDI) and unhealthy PDI (uPDI), respectively. The overlapping and distinct predictive metabolites across the six dietary patterns predominantly consisted of phenolic acids (n = 38), including 14 cinnamic acids, 14 hydroxybenzoic acids, seven phenylacetic acids, and three hippuric acids. Six metabolites were included in all signatures: two lignans (enterolactone-glucuronide and enterolactone-sulfate) and four phenolic acids (cinnamic acid, cinnamic acid-4'-sulfate, 2'-hydroxycinnamic acid, and 4-methoxybenzoic acid-3-sulfate). The established signatures were robustly correlated with dietary patterns in the validation dataset (r = 0.13 - 0.40, FDR < 0.05).
We developed and evaluated a set of metabolic signatures that robustly reflected the adherence and metabolic response to plant-rich dietary patterns, suggesting the potential of these signatures to serve as an objective assessment of free-living eating habits.
Beans, peas and pulses for improved public and planetary health: Changing UK consumption patterns
- L. Lane, R. Wells, C. Reynolds
- Published online by Cambridge University Press: 03 July 2024, E203
Beans, peas and pulses offer significant nutritional, health and environmental benefits(1,2,3,4) and the FAO states that including pulses in agrifood systems is key to achieving the Sustainable Development Goals(5). Recommended intakes vary across national food-based dietary guidelines(6), but higher intakes are associated with benefits including increased satiety, reduced blood pressure, lower risk of cardiovascular disease and improved gut microbiota composition and activity(7). Worldwide, the average consumption of pulses is 21g per person per day(8), but published analysis of UK intakes is scarce. The aim of this review was to analyse consumption trends using two UK government datasets.
The Family Food module of the Living Costs and Food Survey details food and drink purchases from approximately 5000 households per year(9). The ‘UK – household purchases’ data include the average (mean) quantities purchased per person per week. Categories relating to beans, peas and pulses were identified, and data were presented as graphs of purchasing trends (1974–2021).
The National Diet and Nutrition Survey (NDNS)(10) assesses the nutritional status of 1000 participants (1.5 years and over) annually. Personal-level dietary data (2008-2019) were evaluated for the subgroups ‘baked beans’ and ‘beans and pulses including ready meal and homemade dishes’, including fresh, frozen and canned beans and pulses, and recipes containing them. Peas/green beans were excluded because of the nature of the data aggregation. Food-level dietary data (2018-19) were assessed for the frequency of consumption of different types of peas, beans and pulses.
Our analysis of the Family Food datasets shows that, at 28g per person per day, the current average (mean) consumption of beans, peas and pulses in the UK is suboptimal. Our parallel analysis of NDNS data showed that more than 40% of participants were not eating any beans and pulses (excluding peas).
Dietary trends are shifting. The overall consumption of beans, peas and pulses has been falling steadily since the late 1980s. This is mostly due to the drop in consumption of peas and baked beans, though these are still the most frequently consumed legume categories in the UK diet. Canned pea purchases fell from 88g per person per week in 1974 to 14g per person per week in 2020-21. Baked bean purchases peaked at 133g per person per week in 1986, dropping to 78g in 2020-21.
Purchases of other canned beans and pulses (excluding baked beans) have increased noticeably in the last decade, from 17g per person per week 2015-16 to 32g in 2020-21. Purchases of dried pulses have remained consistent, averaging 11g per person per week in 2020-21.
This analysis indicates significant scope to deliver affordable, accessible health and environmental benefits through increased consumption of beans, peas and pulses in the UK.
Doctors’ and nurses’ eating practices during shift work: Findings from a qualitative study
- K. Sum, A. Cheshire, D. Ridge, D. Sengupta, S. Deb
- Published online by Cambridge University Press: 03 July 2024, E204
Doctors’ and nurses’ (DNs) wellbeing in the National Health Service is important for the safe delivery of healthcare to those in need. However, their demanding duties, including irregular shift work, can significantly impact their health. Irregular working patterns are associated with higher sickness rates and stress among healthcare professionals due to the inherent challenges of the work(1,2). For example, shift work disrupts sleep and impairs cognitive function and performance, contributing to poorer physiological and cardiovascular health(3); it is also linked to workforce shortages and difficulties adapting to a consistently demanding workload, which can impact patient care delivery(4). Despite the importance of workplace health and nutrition for DNs, our understanding of their dietary practices during shift work remains limited. Therefore, gaining insights into DNs’ eating habits during shifts is imperative to supporting their health. Our research aimed to understand DNs’ eating practices during their work, including the types of food consumed throughout the day.
Online semi-structured interviews (n=16) were conducted with a convenience sample of current practising medical doctors (n=11) and nurses (n=5) in England. This provided an opportunity to compare and contrast the research data between DNs on workplace nutrition. All participants did shift work, encompassing varied working patterns, including day and night shifts, short and long days and weekends. Following Braun and Clarke’s(5) approach, an inductive thematic analysis presented the findings.
Results elucidate six areas of DNs’ eating practices and dietary intake: before and during shifts, on long shifts, after shifts, during night shifts, and on non-working days. Our data suggest that DNs prioritise their clinical responsibilities over their dietary intake at work. Consequently, they often miss eating opportunities and consume caffeine to stay alert during their shifts. Furthermore, DNs viewed night shifts as involving less healthy food choices. While participants expressed an intention to eat healthily during their shifts, their clinical responsibilities made maintaining regular and nutritious dietary practices throughout the day challenging. Nevertheless, DNs value the meal after a shift as the most important, as this may be the only meal they eat all day.
Our results suggest that DNs’ eating practices and dietary intake fall short of recommended dietary guidelines. They also suggest that eating practices are varied and individualised, shaped by the many environmental and occupational factors contributing to DNs’ nutritional behaviours. Therefore, dietary workplace interventions are recommended to improve DNs’ dietary behaviours at work. Future research should explore DNs’ eating practices through follow-up interviews at various time points. This approach will provide valuable insights into DNs’ dietary and nutritional behaviours during shift work, helping to uncover additional barriers and challenges beyond DNs’ daily experiences.
Dietitians’ perspectives on clinical pathways and practice in relation to the dietary management of irritable bowel syndrome in the UK: A qualitative study
- K. Belogianni, P. Khandige, S. Silverio, S. Windgassen, R. Moss-Morris, M.C.E Lomer
- Published online by Cambridge University Press: 03 July 2024, E205
Irritable bowel syndrome (IBS) is a chronic and relapsing gastrointestinal condition which negatively impacts quality of life(1). Dietary triggers are common and dietary management is central to the IBS treatment pathway with dietitians being the main education providers for patients(2). The aim of this study was to explore the perceptions of dietitians towards current practices in IBS services in clinical settings across the UK.
Qualitative semi-structured interviews were undertaken to explore current practices, barriers, and facilitators to dietetic practice and expected treatment outcomes. Eligible participants were dietitians specialising in IBS and working in the National Health System (NHS) in the UK. Interviews were held virtually. Audio was recorded and transcribed following intelligent transcription. Data were analysed using template analysis (3).
Thirteen dietitians (n=12 female) specialising in gastroenterology consented to participate in the study. Dietitians were working in various NHS Trusts across the country (Southeast England n=3; Southwest England n=3; Northwest England n=2; Northeast England n=1; West Midlands n=1; Southwest Wales n=1 and Southcentral Scotland n=2). Ten of the 13 dietitians had more than five years of experience in IBS management. Three main themes emerged: 1) dietetic services as part of IBS referral pathways; 2) practices in relation to dietetic services; and 3) implications of services for patients’ expectations and feelings. Each main theme had subthemes to facilitate the description and interpretation of data. Participants reported the increasing number of IBS referrals to dietitians and the need for accurate and timely IBS diagnosis and for specialist IBS dietitians, alongside the use of digital innovation to facilitate practice and access to dietetic care. The use of the internet as a source of (mis)information by patients and the limited time available for educating patients were identified as potential barriers to dietetic practice. Dietitians follow a patient-centred approach to dietary counselling and recognise the negative implications of patients’ perceived IBS-related stigma for their feelings and treatment expectations.
The study identified areas and practices which can facilitate access to dietetic services and patient- centred care in IBS management as outlined in guidelines (4).
A systematic scoping review characterising studies investigating workplace nutritional interventions in male employees
- L. Schinnenburg, R. Gibson
- Published online by Cambridge University Press: 03 July 2024, E206
Non-communicable diseases (NCDs) are the predominant cause of death in the UK(1) and place an economic burden on societies(2). An unhealthy diet is one of the four main behavioural risk factors for NCDs(3) and thus, interventions targeting dietary behaviour are of particular interest in the prevention of NCDs. The workplace may be a valuable setting for these interventions as employees represent a large proportion of the population in the UK(4). In the male population, NCDs and several risk factors typically manifest at a younger age(5) and, additionally, males participate less in health-promoting activities(6).
The aim of this scoping review was to identify and characterise the evidence base to determine if a future full systematic review on nutrition interventions in the workplace to improve health and well- being in males is feasible.
The review was conducted adhering to the PRISMA guideline for scoping reviews(7). Three electronic databases (Ovid, PubMed, and The Cochrane Library) were systematically searched from inception for publications relevant to the research question. No restrictions were applied in the search and all study types were eligible. Articles that were not available in English were excluded from the review. Eligible studies were reviewed using a pre-defined data extraction form and references were hand-searched for relevant publications. Data synthesis focused on describing application-oriented aspects, and outcome analyses were restricted to anthropometric outcomes.
Of the 1,224 publications from the initial database search, 46 were included in the review, with an additional 15 studies identified from hand-searching, resulting in 61 included reports on 57 interventions. Four main approaches to nutrition interventions at the workplace were identified: educational, environmental, individual counselling, and meal provision/replacement. Most interventions used multicomponent approaches. One of the 61 included reports followed a qualitative design. Anthropometric outcomes were reported in the majority (83.6%) of studies, followed by bioclinical outcomes (45.9%); other outcomes were food intake (34.4%), nutrient intake (22.9%) and smoking habits (14.8%), and one study reported on quality of life. Of the studies reporting anthropometric outcomes, 69% reported improvements in body weight, 47.8% in BMI and 54.5% in waist circumference. No determinants of successful interventions, such as type, duration or workplace participation, were identified.
This review suggests that nutrition interventions at the workplace are effective in improving several anthropometric outcomes. A future full systematic review is feasible but should consider narrowing the research question to account for limitations in the current evidence base as differences in reporting of design, population, intervention, and outcomes severely limited data analyses. Furthermore, to enable high-quality research, the development of a reporting tool, such as the TIDieR checklist (8) is recommended.
Characterising dietary protein intake in Irish adults on the island of Ireland
- H. Griffin, A.P Nugent, B. A McNulty, D. Wright, L. Brennan
- Published online by Cambridge University Press: 03 July 2024, E207
Shifting dietary protein intakes from animal to plant-based sources is suggested as a path to sustain the world’s food consumption and maintain planetary resources (1). However, to facilitate change, it is important to characterise baseline dietary protein patterns. This study aimed to examine dietary protein intakes on the island of Ireland in order to determine population characteristics and food sources influencing protein intake.
Analyses were performed on the Northern Ireland sub-cohort of the UK National Diet and Nutrition Survey (NDNS 2016-2019)(2) and the Irish National Adult Nutrition Survey (2008-2010)(3). Both surveys used a four-day food diary, and a final sample of 1484 adults aged 18-64 years was extracted (NANS, n=1274; NI NDNS, n=210). Mean daily intakes (MDI) of protein (% total energy, TE) for the total population were calculated and the population was divided into three tertile groups of low, medium and high protein intake (%TE). Differences in population characteristics, energy MDI, key nutrients (%TE or per 10MJ) and contributing food sources were examined across these tertiles, using chi-square tests and one-way ANOVA with covariates (age and BMI) and correcting for multiple comparisons as appropriate (P<0.005).
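The tertile grouping described above can be sketched as a rank-based three-way split; the intakes and the tie-handling rule here are hypothetical, not the survey analysis's exact procedure:

```python
def tertiles(values):
    """Split participants into low/medium/high tertile groups by value.

    Returns a list of group labels aligned with `values`.
    Rank-based cut points; equal thirds when len(values) % 3 == 0.
    """
    order = sorted(range(len(values)), key=lambda i: values[i])
    n = len(values)
    labels = [""] * n
    for pos, idx in enumerate(order):
        labels[idx] = ("low", "medium", "high")[min(pos * 3 // n, 2)]
    return labels

# Hypothetical protein intakes as % of total energy for 6 adults
pct_te = [13.4, 21.2, 17.1, 15.0, 19.0, 16.2]
print(tertiles(pct_te))  # → ['low', 'high', 'medium', 'low', 'high', 'medium']
```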
Overall, 17.1% of TE was obtained from protein, and 77% of participants met their protein DRV based on the EFSA recommendation of 0.83 g/kg body weight (4). The difference in protein intake between the highest and lowest tertiles was 7.8% TE (21.2% TE vs 13.4% TE), with high protein consumers reporting lower energy intakes (1734 ± 564 kcal) than low consumers (2185 ± 661 kcal). High protein consumers were older (42.5 ± 12.8 years) and had a higher BMI (27.7 ± 6.0 kg/m2). They also had higher MDI of dietary fibre, calcium, zinc, sodium, iron, folate and vitamins A, C, D and B12 (per 10 MJ) (p<0.001) and lower MDI of carbohydrates, fat and saturated fat (%TE) in comparison to low consumers (p<0.001). The percentage contributions of ‘chicken, turkey and dishes’ (18.3%), ‘beef, veal and dishes’ (12.8%) and ‘fish and fish products’ (7.0%) to protein intake were significantly higher in the high versus the low consumption group (10%, 7.4% and 4.4% respectively; P<0.001). In contrast, those in the lowest protein intake group obtained significantly more of their protein from sources including ‘burgers, sausages and meat products’ (9.9 vs 5.9%), ‘white bread and rolls’ (6.9 vs 3.9%), ‘potatoes (including chips)’ (4.1 vs 2.9%) and ‘cakes, pastries, buns and fruit pies’ (1.7 vs 0.8%) compared to high consumers.
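The EFSA benchmark applied here is a simple per-kilogram threshold; as an illustrative sketch (the function name and example values are hypothetical, not study data), checking an individual's intake against the 0.83 g/kg body weight DRV looks like:

```python
def meets_protein_drv(protein_g_per_day: float, body_weight_kg: float,
                      drv_g_per_kg: float = 0.83) -> bool:
    """Return True if daily protein intake meets the EFSA DRV (0.83 g/kg body weight)."""
    requirement_g = drv_g_per_kg * body_weight_kg
    return protein_g_per_day >= requirement_g

# Hypothetical example: a 70 kg adult requires 0.83 * 70 = 58.1 g/day
print(meets_protein_drv(75.0, 70.0))  # → True
```

In the survey analysis this check would be applied per participant against their reported four-day mean intake.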
In general, animal protein sources contributed more to total daily protein intake than plant sources; however, the pattern of protein foods differed according to the level of protein intake. These findings will aid the development of strategies to diversify protein intakes on the island of Ireland.
Evaluation of the use of telehealth in Dietetics’ practice during the COVID-19 pandemic in the Kingdom of Saudi Arabia
- N.M Al-Mana, S.A Khalil, A.A Qari, M. Eldigire, W. Alshehri, L. Baabdullah
- Published online by Cambridge University Press:
- 03 July 2024, E208
Telehealth, the remote delivery of healthcare services through virtual technologies, has been shown to have benefits such as reducing hospitalisations and length of stay for patients (1,2). The global COVID-19 pandemic greatly accelerated the adoption of telehealth among clinical nutritionists. For instance, in the USA, telehealth usage for nutritional care increased from 37% to 78% among clinical nutritionists (3), while in Italy, the adoption of telenutrition services by Registered Dietitians (RDNs) rose from 16% to 63% (4). These statistics highlight the rapid integration of telehealth into dietetics practice in response to the global health crisis, reflecting a growing trend towards virtual care delivery as an effective method for providing nutritional care. While telehealth has made progress in Saudi Arabia (5), there is a research gap regarding its prevalence and effectiveness in dietetics practice. This study aimed to evaluate the implementation of telehealth in dietetics practice during the COVID-19 pandemic in Saudi Arabia.
In this cross-sectional study, a web-based online survey was administered from mid-December 2022 to mid-May 2023. The survey was distributed across several Saudi Arabian regions (central, western, eastern, southern and northern) and was completed by 306 clinical Registered Dietitians (RDNs) in public and private healthcare facilities who met the study's inclusion criteria. The survey consisted of 28 questions divided into four sections, covering sociodemographic information, past or current experiences, obstacles and challenges of telehealth usage, and the future prospects of telehealth. Participants were asked to reflect on their current observations and previous experiences of using telehealth in the field of dietetics. To ensure the validity and relevance of the survey, it was reviewed by a panel of experienced dietitians in Saudi Arabia, whose feedback was incorporated before data collection.
Our findings reveal that 76% of RDNs in Saudi Arabia utilise telehealth in their practice. The most common obstacles reported by RDNs using telehealth were internet connectivity issues (21.9%), patient disengagement and lack of enthusiasm (21.3%), and difficulties in coordinating with patients (21%). Telehealth interventions used by RDNs primarily involved diet recall (33.7%), weight-related measurements (30.6%) and laboratory findings (26%), with only 8% reporting the use of telehealth for vital signs. A majority of participants (69.4%) believed that telehealth could improve patient accessibility and help reduce no-show rates (68.9%). Additionally, over 70% of participants agreed that telehealth offers them flexibility in patient consultations.
In conclusion, telehealth is widely utilised among RDNs in Saudi Arabia, with potential benefits such as decreased no-show rates. Further research is needed to gain a better understanding of telehealth usage among RDNs in Saudi Arabia.
Assessment of diet composition of Pakistani ethnic groups in the UK – does dietary pattern change between 1st and 2nd generations?
- M.E.B. Syeda, A.C. Hauge-Evans
- Published online by Cambridge University Press:
- 03 July 2024, E209
Dietary acculturation in immigrant groups can impact health and may increase the risk of conditions such as diabetes and cardiovascular disease, as observed among East Asian immigrants adopting host-country eating patterns (1). Pakistani immigration to the UK has resulted in a dynamic cultural exchange, including modifications in culinary preferences and practices among Pakistani ethnic groups. Prior research indicates that second-generation adults exhibit more signs of acculturation in their dietary choices than their first-generation counterparts (2), yet there is limited understanding of the food habits of this group or the effects of acculturation upon it. The primary purpose of this study was to investigate intergenerational disparities in food preferences among Pakistani immigrants and the impact of dietary acculturation.
This cross-sectional study examined the food habits and acculturation experiences of 51 first-generation (1G) and 51 second-generation (2G) participants from the Pakistani community living in London. Data were collected using survey questionnaires modified from previous studies (3,4). Traditional foods such as paratha and samosa were compared with Western options such as fish and chips through a set of questions, from which a dietary score was calculated (global scale) (5). Food acculturation was measured on a 5-point scale, with higher scores indicating greater Western influence and lower scores less Western acculturation. Total scores ranged from 6 to 30 and were categorised as low, moderate, high or very high. Data were analysed using SPSS (version 28.0); chi-square and t-tests were applied to identify differences between groups, with significance set at p<0.05.
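The abstract reports total acculturation scores of 6-30 binned into four categories but does not state the cut-points; the equal-width bins in the sketch below are purely illustrative assumptions, not the study's actual thresholds:

```python
def acculturation_category(score: int) -> str:
    """Map a 6-30 acculturation score to a category label.
    NOTE: the cut-points are not reported in the abstract; the
    equal-width bins used here are illustrative assumptions only."""
    if not 6 <= score <= 30:
        raise ValueError("score must be between 6 and 30")
    if score <= 12:
        return "low"
    elif score <= 18:
        return "moderate"
    elif score <= 24:
        return "high"
    return "very high"

print(acculturation_category(22))  # → high
```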
Most participants were male (67%), with 38.8% aged 36-45 years. Urdu ethnicity predominated in both generations (64.8%). Significant differences between first and second generations emerged in dietary restrictions for health conditions (p = .008), language (p = .001), consumption frequency of traditional Pakistani cuisine (p = .001), desserts/sweets (p = .001), chai/lassi (p = .017), popular UK meals, fizzy drinks, and inclusion of rice/flatbread (p = .003). Health-related dietary behaviours differed in fruit/vegetable, dairy and meat consumption (p = .001). ‘Traditional’ and ‘Western’ dietary scores differed significantly between generations (‘Traditional’: 1G: 17.15 ± 3.52 vs 2G: 13.68 ± 4.71, p = .001; ‘Western’: 1G: 16.29 ± 1.98 vs 2G: 18.21 ± 3.84, p = .001). The results demonstrated a preference for traditional eating patterns among 1G, whereas a nuanced move towards Western food preferences was observed among 2G participants, who fell into the high category of the global scale. 1G participants cited language acquisition, time constraints and financial issues as factors affecting their dietary changes.
This study found significant differences in dietary habits and preferences between first- and second-generation Pakistani immigrants in the United Kingdom. The second generation displayed a more pronounced shift towards Western food patterns, possibly because of socio-cultural variables, language fluency, and adaptability to the local environment.
Effects of dietary nitrate supplementation on markers of oral health: A systematic review
- S. Alhulaefi, A. Watson, S.E Ramsay, N. Jakubovics, J. Matu, A. Griffiths, R. Kimble, K. Brandt, OM. Shannon
- Published online by Cambridge University Press:
- 03 July 2024, E210
The oral cavity is a vital part of the digestive system. Poor oral health can impair an individual’s ability to eat and has been associated with an increased risk of non-communicable diseases and reduced longevity. Conversely, positive oral health has been associated with improved cardiometabolic, cognitive and systemic health and greater longevity. Consumption of dietary nitrate, which is processed in the mouth into nitrite and subsequently converted into nitric oxide (NO) in the body (1), has been demonstrated to reduce blood pressure, improve endothelial function and enhance exercise performance. Interestingly, recent studies suggest that nitrate consumption could also positively modulate markers of oral health (2). To our knowledge, no systematic review has been published examining the effect of inorganic dietary nitrate on oral health, yet such a review would be valuable to summarise the current state of knowledge, identify effect modifiers and highlight gaps for future research. Therefore, this systematic review aimed to investigate the effects of dietary nitrate supplementation on markers of oral health in vivo in randomised controlled trials (RCTs).
This study was pre-registered with PROSPERO (CRD42023411159). Five databases (PubMed, The Cochrane Library, CINAHL, MEDLINE, and SPORTDiscus) were searched from inception until March 2023 to identify studies that met the following criteria: adult participants (≥ 18 years) and RCTs investigating the effects of oral dietary nitrate versus placebo on markers of oral health. A narrative synthesis of data was conducted. Risk of bias was assessed using the Cochrane Risk of Bias 2 tool.
Nine articles reporting data on 284 participants were included. Nitrate was provided via beetroot juice (six studies), a beetroot-derived supplement dissolved in mineral water (one study), and lettuce juice (two studies). The duration of the interventions ranged from one day to a maximum of six weeks. Dietary nitrate supplementation increased the relative abundance of several individual bacterial genera including Neisseria (increased in three studies) and Rothia (increased in three studies). Dietary nitrate supplementation increased salivary pH (increased in two studies) and decreased salivary acidification resulting from the consumption of a sugar-sweetened beverage (decreased in two studies). Furthermore, dietary nitrate supplementation resulted in a decrease in the gingival inflammation index in one study. Overall, the risk of bias in studies was mixed. One study had a low risk of bias, while the rest were rated as having some concerns. No study was considered to have a high risk of bias.
These results suggest that dietary nitrate supplementation is a nutritional strategy that could benefit oral health by modifying the oral microbiome, altering salivary pH, and minimising gingival inflammation.
Impact of date-based energy bar intake on postprandial appetite, metabolism and thermogenesis
- H. Alfheeaid, D. Malkova, A. Alsalamah, H. Barakat
- Published online by Cambridge University Press:
- 03 July 2024, E211
Research suggests that date-palm (Phoenix dactylifera L.) fruits provide superior nutritional and health benefits compared to other fruits (1). They are a rich source of many essential nutrients, including carbohydrates, dietary fibre, vitamins, minerals, phytochemicals and antioxidants. Date palm fruits are produced in many countries around the world, and about 10-15% of total production is lost or sold at extremely low prices (2,3). Despite this, date fruits have rarely been used as an ingredient in commercially available energy bars. The aim of this study was to investigate the impact of a newly formulated date-based energy bar (DBEB) (4) on subjective appetite, postprandial metabolism, energy substrate oxidation and diet-induced thermogenesis (DIT).
Twenty-seven healthy male adults (mean ± SD: age 20.8 ± 3.5 years; body weight 66 ± 8 kg) participated in a randomised crossover study. Each participant completed two experimental arms and was investigated before (baseline) and for three hours after consumption of either the date-based energy bar (DBEB) or an isocaloric, macronutrient-matched mixed fruit-based energy bar (FBEB) as the control arm. The DBEB contained significantly less fructose and glucose, but more sucrose and fibre, than the FBEB. Both experimental arms involved blood sampling, subjective appetite assessment and indirect calorimetry measurements. At the end of each experiment, an ad libitum buffet meal was provided. Data analysis used descriptive statistics, paired t-tests and two-way ANOVA.
Time-averaged composite appetite and satiety scores were not significantly different between the control (FBEB) and DBEB arms. Energy intake during the ad libitum buffet was also not different between arms. Metabolic rate measured at baseline and post-ingestion of the bars did not differ significantly between the FBEB and DBEB arms (arm effect, P>0.05). The thermic effect of the bars, calculated as the percentage increase in metabolic rate above RMR over the whole postprandial period, was (mean ± SE) 9.5 ± 1.6% in the DBEB arm and 8.7 ± 1.3% in the FBEB arm (arm effect, P>0.05). Rates of carbohydrate and fat oxidation were also not different between the two arms (arm effect, P>0.05). Time-averaged concentrations of blood glucose, insulin and triglycerides were similar between the study arms.
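The thermic effect reported here is the percentage rise in metabolic rate above RMR averaged over the postprandial period; a minimal sketch with hypothetical indirect-calorimetry readings (not trial values) is:

```python
def thermic_effect_percent(rmr_kcal_min: float,
                           postprandial_kcal_min: list[float]) -> float:
    """Thermic effect of food: mean percentage increase in metabolic rate
    above resting metabolic rate across the postprandial measurements."""
    mean_pp = sum(postprandial_kcal_min) / len(postprandial_kcal_min)
    return (mean_pp - rmr_kcal_min) / rmr_kcal_min * 100

# Hypothetical: RMR 1.20 kcal/min; postprandial readings average 1.31 kcal/min
print(round(thermic_effect_percent(1.20, [1.28, 1.32, 1.33]), 1))  # → 9.2
```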
These results suggest that energy bars based on dates or mixed fruits produce similar effects on postprandial appetite, fat and carbohydrate oxidation, thermic effect and cardiometabolic risk factors. Date fruits can therefore be used as a rich source of carbohydrate and energy. Future research should investigate the impact of date-based energy bars on antioxidant capacity and other health-related markers.
Degree of hydrolysis of chicken versus plant-based chicken analogues: An in vitro digestion comparison
- M. Saleh Alotaibi, S. Eldeghaidy, M. Muleya, C. Hoad, A. Salter
- Published online by Cambridge University Press:
- 03 July 2024, E212
The adoption of plant-based meat analogues as an alternative to meat products is increasing among consumers because they offer a more ecologically friendly and sustainable source of protein while alleviating the ethical concerns related to livestock rearing and slaughter (1). However, there are concerns regarding the nutritional quality of plant-based meat analogues, particularly their protein digestibility. This study aimed to compare chicken and plant-based chicken analogues in terms of nutritional composition and degree of protein hydrolysis.
Proximate analyses were performed on raw and cooked samples to assess protein, fat and energy concentrations in two chicken samples (breast and thigh) and four commercial plant-based chicken products (P-C1 [wheat protein 37%, pea protein 10%], P-C2 [soya protein 63%], P-C3 [soya and wheat protein 83%] and P-C4 [soya protein 30%, pea protein 2%]). As a first step, proximate composition was assessed on averaged samples; product-cooking interactions were then assessed using one-way ANOVA followed by Tukey's test (p<0.05). In vitro digestion of the cooked samples was performed following the INFOGEST harmonised static in vitro digestion model (2). After digestion, an o-phthaldialdehyde (OPA) assay was carried out to measure the degree of protein hydrolysis for each sample, and a two-way ANOVA was performed.
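The OPA assay quantifies free amino groups released during digestion, and degree of hydrolysis is conventionally expressed as hydrolysed peptide bonds over total peptide bonds. A hedged sketch follows; h_tot is protein-specific and must be determined per sample, and the values in the example are hypothetical, not study data:

```python
def degree_of_hydrolysis(h_meq_per_g: float, h_tot_meq_per_g: float) -> float:
    """DH (%) = h / h_tot * 100, where h is the hydrolysed peptide bonds
    (free amino groups measured by the OPA assay) and h_tot is the total
    number of peptide bonds per gram of protein."""
    return h_meq_per_g / h_tot_meq_per_g * 100

# Hypothetical: 2.0 meq/g free amino groups against an assumed h_tot of 8.0 meq/g
print(degree_of_hydrolysis(2.0, 8.0))  # → 25.0
```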
Protein content was higher in chicken than in plant-based chicken, whereas fat content and energy concentration were higher in plant-based chicken. The protein content of chicken was higher for both raw and cooked samples (raw: 19.8 ± 0.38 g/100 g; cooked: 30.55 ± 4 g/100 g) compared with plant-based chicken (raw: 13.8 ± 5.3 g/100 g; cooked: 23.4 ± 4.5 g/100 g). Plant-based chicken had higher fat content and energy concentration for raw and cooked samples (fat: raw 6.52 ± 1.5 g/100 g, cooked 9.6 ± 3.17 g/100 g; energy: raw 189.4 ± 28.1 kcal/100 g, cooked 291.3 ± 48.1 kcal/100 g) compared with chicken (fat: raw 4.6 ± 2.7 g/100 g, cooked 6.1 ± 4 g/100 g; energy: raw 150.9 ± 25.4 kcal/100 g, cooked 228.3 ± 20.7 kcal/100 g). Product-cooking interactions showed a significant increase in protein content (P≤0.0001; raw: 15.6 ± 2.3 g/100 g to 19.8 ± 0.74 g/100 g; cooked: 21.4 ± 0.55 g/100 g to 33.06 ± 0.71 g/100 g) and fat content (P<0.001; raw: 2.7 ± 0.06 g/100 g to 8.3 ± 0.27 g/100 g; cooked: 3.29 ± 0.19 g/100 g to 13.22 ± 0.33 g/100 g) across the chicken and plant-based chicken samples. No significant product-cooking interaction on energy content was found. Two-way ANOVA of the OPA degree-of-hydrolysis data demonstrated a significantly higher degree of hydrolysis in chicken samples compared with plant-based chicken (P<0.0001).
The degree of hydrolysis and digestibility of chicken and chicken analogues were influenced by protein type, nutrient composition and processing. These findings provide substantial information for the development of plant-based chicken products with enhanced nutritional profiles. This work will be extended to investigate the availability and digestibility of individual amino acids.
The health impact of substituting meat with plant-based meat alternatives: findings from a Systematic Review
- L. Lindberg, R. Reid-McCann, J. Woodside, A. Nugent
- Published online by Cambridge University Press:
- 03 July 2024, E213
Sales of plant-based meat alternatives (PBMAs) are increasing(1). While these products are becoming more popular, little is known about their impact on health(2). Therefore, the aim of this work was to systematically review the evidence on PBMA consumption and associated health outcomes.
A wider systematic review covering the environmental impact, ingredient composition, nutritional impact and health outcomes associated with PBMAs was conducted. The search strategy combined the terms “meat alternatives” AND “environment” OR “ingredients” OR “nutrition” OR “health.” Five databases were searched (MEDLINE, EMBASE, Web of Science, Scopus and GreenFILE), as well as the reference lists of relevant articles. All study designs reporting primary data were included, except for animal and in vitro studies. Non-English studies and studies published before 2011 were excluded (PROSPERO Registration Number: CRD42021250541).
In total, 2184 papers were identified; 1802 remained after duplicates were removed, 1536 were excluded at title and abstract screening, 266 full texts were assessed for eligibility, and 54 papers were included in the analysis across all outcomes investigated. Ten studies examined the impact of PBMA vs. meat consumption on health outcomes: three measured postprandial responses to single test meals(3–5); four were longer-term RCTs(6–8), with two further separate publications reporting different outcomes for the same RCT(9,10); one was a prospective cohort(11) and one a cross-sectional study(12).
Of the single test meal studies, no significant differences were observed for glucose levels in 2/2 studies (100%)(3,5), PYY and GLP-1 levels in 2/2 studies (100%)(3,4) and self-reported hunger/fullness in 3/3 studies (100%)(3–5). Significantly lower insulin concentrations and subsequent energy intakes were both reported in 1/2 studies (50%) following consumption of mycoprotein vs. chicken meals(3).
Longer-term full and partial replacement of meat with PBMAs resulted in significantly lower body weight (kg) in 2/2 studies (100%)(6,10), significantly lower saturated fat intakes in 2/2 studies (100%) (6,10), significantly higher fibre intakes in 2/3 studies (67%)(6,7), improvements in plasma lipid profile in 2/3 studies (67%)(7,10) and positive changes in gut microbiota in 1/2 studies (50%)(8) compared to meat diet phases/control groups with no restrictions on meat intakes. There were no significant differences in protein intakes in 3/3 studies (100%)(6,7,10), energy, total fat and carbohydrate intakes in 2/2 studies (100%) (6,7), blood pressure in 2/2 studies (100%)(6,10), glucose levels in 2/2 studies (100%) (7,10) and insulin levels in 2/2 studies (100%)(7,10).
No definitive conclusions can be drawn on the impact of PBMAs on health outcomes due to the small number of studies and variation in study designs, outcomes measured and the types of PBMA used. From the limited evidence available, no negative health effects of PBMA consumption were observed; however, further longer-term RCTs are needed to confirm this.
Understanding the nature and scale of low-intake dehydration on ‘Medicine for Older People’ wards at University Hospital Southampton: A mixed-methods study
- S. Alsanie, K. Ibrahim, S. Lim, S. Wootton
- Published online by Cambridge University Press:
- 03 July 2024, E214
Dehydration during hospital stays is a significant concern, particularly for older adults (1). Age-related pathological changes and conditions such as dementia make older adults especially vulnerable to both chronic and acute dehydration (1,2). Recent studies indicate that conventional signs and symptoms of low-intake dehydration may not consistently indicate its presence in older inpatients, leading to missed or incorrect assessments. Low-intake dehydration can result in significant morbidity through falls, constipation, delirium, respiratory and urinary tract disorders, and even death (2,3). Diagnosing it at an early stage is challenging, leading to treatment delays that further compound its negative consequences (1,3). There is a need to determine the scope and practice of detecting and managing low-intake dehydration in ‘Medicine for Older People’ (MOP) wards at University Hospital Southampton (UHS). The primary aim of this study was to explore current practices and challenges in detecting and managing low-intake dehydration in older inpatients within the MOP wards at UHS.
Using a sequential, explanatory mixed-methods design, a prospective chart review (phase 1) was conducted across all MOP wards at UHS over one month. The study included 50 adults aged 65 and above who had been admitted to the medical wards for various reasons and were deemed ‘Medically Optimised for Discharge’ (MOFD). The quantitative component involved reviewing completion of a local hydration assessment tool and the documentation of hydration and fluid balance charts for at-risk patients. The qualitative component (phase 2) consisted of semi-structured interviews with 10 participants (four doctors and six nurses, with 4 to 12 years of experience) to understand hospital staff perceptions of hydration care and its barriers and facilitators.
The quantitative phase found that all patients were at risk of dehydration and underwent the hydration risk assessment, with hydration chart reviews during early, late and night shifts; however, 20% did not have a hydration assessment within 24 h of admission, and some reviews were missing across shifts. Most 24-hour fluid balance sheets were not completed for patients in the red category (for whom a 24-hour fluid balance chart should be started). Qualitative findings revealed that staff had (1) experiential knowledge of hydration and an understanding of the risks of dehydration in older adults, but faced (2) difficulty in dehydration assessment and diagnosis due to limited resources, (3) challenges related to staffing levels and skills, and (4) patient attributes contributing to difficulty in dehydration assessment.
The mixed-methods study underscores the importance of addressing low-intake dehydration in older inpatients on MOP wards and highlights gaps in current practices. The findings emphasise the need for improved training, awareness, and standardised protocols to prioritise hydration care among healthcare professionals and provide optimal hydration care for older inpatients.
The effect of long chain n-3 fatty acid supplementation on muscle strength in older adults: A systematic review and meta-analysis
- M. Timraz, A. Binmahfoz, T. J Quinn, E. Combet, S. Gary
- Published online by Cambridge University Press:
- 03 July 2024, E215
Muscle strength and mass decline with age, typically from around 35–40 years, and this decline can eventually lead to sarcopenia(1). There are currently no effective drug treatments for either the prevention or treatment of this condition(2), and whilst resistance exercise has efficacy(3), its effectiveness is limited by issues with uptake and adherence(4). However, emerging research suggests that nutrition may offer a potentially effective approach to delay the age-related decline in muscle mass and function in older individuals(5), with LCn-3 PUFA emerging as a strong candidate.
The main objective of the current study was to perform a systematic literature review with the purpose of exploring the impact of long-chain n-3 polyunsaturated fatty acid (LCn-3 PUFA) relative to control oil supplementation on muscle strength, with secondary outcomes of muscle mass and physical function in older individuals under conditions of habitual physical activity/exercise.
The review protocol was registered with PROSPERO (CRD42021267011) and followed the guidelines outlined in the Preferred Reporting Items for Systematic Review and Meta-Analysis (PRISMA) statement(6). The search for relevant studies was performed utilizing databases such as PubMed, EMBASE, CINAHL, Scopus, Web of Science, and the Cochrane Central Register of Controlled Trials (CENTRAL) up to June 2023. Randomized controlled trials (RCTs) in older adults comparing the effects of LCn-3 PUFA with a control oil supplement on muscle strength were included.
Five studies(7,8,9,10,11) involving a total of 488 participants (348 females, 140 males) met the inclusion criteria and were included. Pooled analysis showed that supplementation with LCn-3 PUFA did not have a significant effect on grip strength (standardised mean difference (SMD) 0.61, 95% confidence interval [−0.05, 1.27]; p = 0.07) compared with control, although there was considerable heterogeneity among the studies (I2 = 90%; p < 0.001). As secondary outcomes were measured in only a few studies, with significant methodological heterogeneity, meta-analyses of muscle mass and functional abilities were not performed. Papers measuring knee extensor muscle mass (n = 3) found increases with LCn-3 PUFA supplementation, but studies measuring whole-body lean/muscle mass (n = 2) and functional abilities (n = 4) reported mixed results.
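The heterogeneity statistic quoted here (I2 = 90%) is Higgins' I2, derived from Cochran's Q; a minimal sketch follows (the Q value in the example is hypothetical, chosen only to reproduce an I2 of 90% with five studies):

```python
def i_squared(q_statistic: float, n_studies: int) -> float:
    """Higgins' I^2 (%) from Cochran's Q:
    I^2 = max(0, (Q - df) / Q) * 100, with df = n_studies - 1."""
    df = n_studies - 1
    if q_statistic <= 0:
        return 0.0
    return max(0.0, (q_statistic - df) / q_statistic) * 100

# Hypothetical: Q = 40 across five studies (df = 4) gives I^2 = (36/40)*100
print(i_squared(40.0, 5))  # → 90.0
```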
Based on a limited number of studies, our data indicate that LCn-3 PUFA supplementation has no effect on muscle strength or functional abilities in older adults but may increase muscle mass; however, given the few studies and considerable heterogeneity, further work is needed to confirm these findings.
The effect of LCn-3 PUFA supplementation on body weight, body composition, and muscle function during alternate-day fasting (ADF)
- M. Alblaji, S.R Gray, T. Almesbehi, H. Miller, A. Gonzalo, D. Malkova
- Published online by Cambridge University Press:
- 03 July 2024, E216
During weight loss, the reduction in body mass reflects not only a loss of body fat but also a decrease in fat-free mass (FFM), related to reductions in muscle mass and function(1). Supplementation with long-chain n-3 fatty acids (LCn-3 PUFA), in the absence of caloric restriction, results in a significant decrease in fat mass and an increase in FFM(2), along with improvements in muscle mass and strength(3). However, the impact of LCn-3 PUFA supplementation during weight loss remains unknown. Therefore, the aim of this study was to explore the effects of LCn-3 PUFA supplementation, in the form of krill oil (KO), during alternate-day fasting (ADF) on body weight, fat mass loss, FFM and muscle function in healthy overweight and obese adults.
A total of 41 men and women (age: 39.35 ± 10.4 years; BMI: 31.05 ± 4.2 kg/m2) completed the study (NCT06001632), in which they were randomised into either a KO or a placebo (PL) group. Both groups carried out 8 weeks of ADF combined with intake of 4 g/day of the corresponding supplement. ADF involved consuming no more than 500 kcal on each 'fast day' and eating ad libitum on each 'feed day'. Body weight and body composition (TBF-300, Tanita, Manchester, UK), handgrip strength (handheld hydraulic dynamometer, Vernier Jamar; England, UK) and the time to complete five repetitions of a chair-rising test were measured pre- and post-intervention. Changes from baseline within groups were assessed using paired-samples t-tests. Mixed analysis of variance (mixed ANOVA) was used to test two-way time-by-group interactions to identify differences between groups. All statistical analyses were conducted using IBM SPSS Statistics 28.0.
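The within-group changes were assessed with paired-samples t-tests; a stdlib-only sketch of the t statistic is below (the function and data are hypothetical illustrations, not trial values):

```python
import math

def paired_t(pre: list[float], post: list[float]) -> float:
    """Paired t statistic: mean of the pairwise differences divided by its
    standard error, as used for the within-group pre/post comparisons."""
    diffs = [a - b for a, b in zip(pre, post)]
    n = len(diffs)
    mean_d = sum(diffs) / n
    var_d = sum((d - mean_d) ** 2 for d in diffs) / (n - 1)
    return mean_d / math.sqrt(var_d / n)

# Hypothetical pre/post values: diffs [2, 1, 3], t = 2 / sqrt(1/3) ≈ 3.46
print(round(paired_t([3.0, 3.0, 5.0], [1.0, 2.0, 2.0]), 2))  # → 3.46
```

In practice the corresponding p-value would be read from the t distribution with n − 1 degrees of freedom (e.g. via `scipy.stats.ttest_rel`).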
In both groups, body mass decreased significantly (KO: −4.7 ± 0.4 kg, p<0.001; PL: −4.5 ± 0.4 kg, p<0.001), along with significant reductions in fat mass (KO: −2.4 ± 0.5 kg, p<0.001; PL: −2.3 ± 0.5 kg, p<0.001) and FFM (KO: −0.6 ± 0.2 kg, p<0.001; PL: −0.7 ± 0.2 kg, p<0.001), with no differences between groups. In the PL group there was a reduction in handgrip strength (−0.9 ± 0.7 kg, p<0.001), while there was no change in the KO group (−0.2 ± 0.5 kg, p=0.1), with a significant difference between groups (p<0.001). In the KO group there was a significant reduction in the time to complete the chair-rising test (−1.8 ± 0.9 s, p<0.05), with no change in the PL group (−0.3 ± 1.3 s, p=0.2), and a significant difference between groups (p<0.001).
Supplementation with LCn-3 PUFA (4 g/day) during 8 weeks of ADF in individuals living with overweight and obesity does not facilitate body or fat mass loss and does not diminish the reduction in FFM; however, it attenuated the decline in muscle function.