Parkinson’s disease (PD) is an age-related neurological disorder characterized by bradykinesia, tremor, and postural instability. Early weight loss following diagnosis is linked to a poorer prognosis. In older adults, loss of skeletal muscle increases the risk of falls and related injuries, making body composition measurements, such as muscle and fat mass, critical in PD, where the risk of falls is high. Traditional body composition analysis equipment is bulky, expensive, and primarily limited to research settings(1). This study evaluated the reliability of the portable SECA mBCA 525 device, which has not been validated in PD populations.
Nineteen participants with PD and 11 household controls were recruited from Movement Disorder Clinics. Participants underwent body composition analysis using the SECA mBCA 525 device. A mild electrical current is passed between adhesive electrodes applied to hands and feet, and the impedance is measured. Proprietary algorithms use the impedance data and manually entered data, such as age, weight, height, waist circumference and reported activity level, to estimate the fat, lean, and water mass (in kilograms). Measurements were repeated at two visits one month apart to assess test-retest reliability.
Data were collected from 30 participants: 19 with PD and 11 controls. Five PD participants experienced data collection failures; all exhibited rest tremor, although their tremor amplitude was similar to the PD group average (mean 1.6, standard deviation 1.9 vs. mean 1.6, standard deviation 1.2).
Two sets of complete data were collected for 14 participants with PD and 10 controls. No significant differences in lean or fat mass estimates were observed between trials 1 and 2 (Bland-Altman plot and linear regression, p>0.05).
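As a hedged illustration, the test-retest comparison described above can be sketched as a Bland-Altman computation: the bias (mean difference between visits) and 95% limits of agreement. The function and the sample values below are hypothetical, not the study's data or code.

```python
from statistics import mean, stdev

def bland_altman(trial1, trial2):
    """Return the mean difference (bias) and 95% limits of agreement
    between paired test-retest measurements."""
    diffs = [a - b for a, b in zip(trial1, trial2)]
    bias = mean(diffs)
    sd = stdev(diffs)  # sample standard deviation of the differences
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Hypothetical lean-mass estimates (kg) from two visits, not study data
visit1 = [45.2, 50.1, 38.7, 55.0, 42.3]
visit2 = [45.8, 49.5, 39.1, 54.2, 42.9]
bias, (lower, upper) = bland_altman(visit1, visit2)
```

Good test-retest reliability corresponds to a bias near zero with narrow limits of agreement that bracket zero, as reported here for the SECA device.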
The SECA mBCA 525 portable bio-impedance analysis device demonstrated good test-retest reliability for assessing lean and fat mass in individuals with and without PD. However, data collection failures, potentially caused by limb tremor, limit its applicability in PD studies.
Belief network analysis (BNA) has enabled major advances in the study of belief systems, capturing Converse’s understanding of the interdependence among multiple beliefs (i.e., constraint) more intuitively than many conventional statistics. However, BNA struggles with representing political divisions that follow a spatial logic, such as the “left–right” or “liberal-conservative” ideological divide. We argue that Response Item Networks (ResINs) have important advantages for modeling political cleavage lines as they organically capture belief systems in a latent ideological space. In addition to retaining many desirable properties inherent to BNA, ResIN can uncover ideological polarization in a visually intuitive, theoretically grounded, and statistically robust fashion. We demonstrate the advantages of ResIN by analyzing ideological polarization with regard to five hot-button issues from 2000 to 2020 using the American National Election Studies (ANES), and by comparing it against an equivalent procedure using BNA. We further introduce system-level and attitude-level polarization measures afforded by ResIN and discuss their potential to enrich the analysis of ideological polarization. Our analysis shows that ResIN allows us to observe much more detailed dynamics of polarization than classic BNA approaches.
Older adult hip fracture patients are at high risk of malnutrition(1), but research exploring postoperative dietary intake remains limited. Previous studies have relied on visual estimation methods, which may lack precision(2). Resting Energy Expenditure (REE) can be used to calculate the nutritional intake of adults in an acute illness state(3). The aim of this study was to observe if postoperative dietary intake of older adult hip fracture patients met individual REE needs. Secondary aims were to examine associations between dietary intake, length of hospital stay (LOS), comorbidities, and postoperative complications.
This was a prospective cohort study of older adult hip fracture patients (> 60yrs) requiring surgical repair recruited over four weeks. Data were collected on patient characteristics, and food diaries were recorded using a digital weighing scale for all hospital meals from the day of surgery to postoperative day (POD) three inclusive. Descriptive data were analysed in Microsoft Excel, and we used a statistical package for regression analysis to explore variables of interest(4). REE was calculated according to current guidance(3) and adjusted according to acute factors regarding hip fracture surgery as appropriate. The study was registered with ClinicalTrials.gov (NCT06451679).
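The comparison of intake against REE can be sketched as below. Note that the specific guideline-based equation and acute-illness adjustments the study used are not given in the abstract; the Mifflin-St Jeor equation shown here is one widely used REE estimate, included purely for illustration, and the patient values are hypothetical.

```python
def ree_mifflin_st_jeor(weight_kg, height_cm, age_years, male):
    """Mifflin-St Jeor estimate of resting energy expenditure (kcal/day).
    Illustrative only; the study followed its cited guidance, which may
    use a different equation and acute-illness adjustments."""
    base = 10 * weight_kg + 6.25 * height_cm - 5 * age_years
    return base + (5 if male else -161)

def percent_ree_met(intake_kcal, ree_kcal):
    """Share of resting energy expenditure covered by recorded intake."""
    return 100 * intake_kcal / ree_kcal

# Hypothetical 80-year-old male patient, 70 kg, 165 cm
ree = ree_mifflin_st_jeor(70, 165, 80, male=True)
coverage = percent_ree_met(1000, ree)  # 1000 kcal recorded on POD one
```

A patient meets their REE needs when coverage reaches 100%; in this study only one participant did so across the diary period.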
Twenty-one patients were consecutively recruited (80.5 ± 8.9 years), and fourteen completed the food diary. Energy intake for completed food diaries ranged from 645.6 kcal (2701.2 kJ) to 6419.8 kcal (26860.4 kJ). In relation to the primary aim, one participant (male) met their REE needs across the food diary period. Regarding secondary aims, linear regression of the percentage of REE met against LOS demonstrated that lower dietary intake on POD one was associated with a longer LOS (95% CI, –0.11 to –0.02, p=0.01). An increased ASA (American Society of Anaesthesiologists) grade was associated with reduced dietary intake, but this was not statistically significant. In relation to postoperative complications, a reduced appetite was the most reported complication across all participants regardless of completing the food diary (n = 13) and was reported more frequently by females (57.1%) than males (4.8%).
Only one patient achieved their REE over the course of the food diary which may indicate that current nutritional strategies require further development to meet the needs of this patient group. The observed association between dietary intake on POD one and a longer LOS indicates targeted support for older adult hip fracture patients would be beneficial.
Rehabilitation with removable complete dentures (RCDs) involves navigating public dental systems that often present barriers like long wait times and limited access. While clinical outcomes are often known, patient experiences with service delivery remain underexplored. Understanding these experiences is key to improving denture care in public settings.
Objectives:
This study aimed to explore and gain a comprehensive understanding of the service delivery experiences of patients rehabilitated with RCDs through the public dental health system in the State of Tasmania in Australia with the goal of informing improvements to the patient journey and overall service quality.
Methods:
A qualitative study using a Constructivist Grounded Theory (CGT) approach was undertaken. Twenty-five adult participants who received RCDs between 2017 and 2022 were purposively selected from public dental clinics. Data were collected through in-depth, semi-structured, face-to-face interviews. Analysis followed CGT principles, including iterative coding, constant comparison, memo writing, and the co-construction of meaning with participants.
Results:
Participants reported emotional distress linked to prolonged waiting times and limited continuity of care. Despite valuing the professionalism and empathy of individual practitioners, many expressed a need for improved communication, more coordinated interdisciplinary care, and greater system responsiveness. Good service was characterised by accessibility, affordability, approachability, friendly staff, and high-quality care. Suggestions for improvement included moving services close to patients, better integration with other health sectors, and the use of visual aids to support understanding and self-management.
Conclusions:
Patient narratives reveal a pressing need to address delays, communication gaps, and fragmented care in the public denture service pathway. System-level changes that adopt a more holistic, patient-centred approach, improve interprofessional collaboration, decentralise service provision, and enhance health communication may significantly improve the denture rehabilitation experience and patient outcomes.
Fulminant myocarditis is a life-threatening event that can present as cardiogenic shock. Human metapneumovirus (hMPV)–associated myocarditis is exceptionally uncommon, particularly in the pediatric population. Treatment may require mechanical ventilation, inotropic agents, vasopressors, and advanced life support systems. In this article, we report an 18-month-old previously healthy infant who presented with severe metabolic acidosis, elevated lactate, and profound biventricular systolic dysfunction secondary to hMPV infection. Despite mechanical ventilation, inotropic support, and initial immunomodulatory therapy with intravenous immunoglobulin, high-dose methylprednisolone, and anakinra, the patient’s clinical condition deteriorated rapidly, requiring venoarterial extracorporeal membrane oxygenation (VA-ECMO) within the first 12 hours of admission. Given the absence of a patent foramen ovale and significant ventricular distention risk, surgical left ventricular decompression via a cannula inserted through the right upper pulmonary vein was performed. Hemoadsorption was additionally incorporated to mitigate hyperinflammation. Laboratory findings fulfilled macrophage activation syndrome criteria, and interferon-gamma blockade with emapalumab was initiated due to refractory cytokine storm physiology. Antioxidant therapy (nicotinamide adenine dinucleotide, coenzyme Q10, quercetin) was used as supportive treatment. Progressive improvement in ventricular function was observed under this comprehensive life support–based regimen. By day 12 of ECMO support, biventricular systolic function had normalized, and the patient was successfully weaned and discharged with full recovery. This case underscores the importance of early recognition, advanced immunomodulation, and effective ventricular unloading in managing fulminant hMPV myocarditis in children.
Conflict-affected Palestinian communities experience profound mental health challenges. This systematic review assesses the evidence for mental health interventions in these contexts, focusing on the theoretical alignment of narrative therapy with cultural assets like sumud (steadfastness) and hikaye (storytelling).
Methods
Following PRISMA guidelines, we searched nine databases and grey literature up to December 2023. We included studies on mental health interventions for conflict-affected Palestinians, with a primary focus on narrative therapy and a secondary analysis of other approaches.
Results
Of 847 records screened for narrative therapy, no studies met the inclusion criteria. A broader search identified 23 intervention studies, revealing a predominant focus on cognitive-behavioral therapy (CBT; n = 11) and Narrative Exposure Therapy (n = 4), with limited therapeutic diversity. Analysis showed insufficient Palestinian researcher leadership and superficial cultural adaptation of interventions.
Conclusions
This review reveals a dual gap: a complete absence of narrative therapy research despite its theoretical relevance, and a broader pattern of limited intervention diversity. The predominance of Western-centric models reflects systemic biases in research funding. Addressing this requires community-led participatory research, shifts in funding priorities, and investment in culturally-grounded methodologies.
In the past five years the UK food system has been severely impacted by a combination of events including the pandemic, geopolitical conflicts, and the cost-of-living crisis. The resulting escalation of food prices has increased levels of food insecurity; however, much research has focused on families. Food aid providers have reported increasing numbers of older adults seeking support to access food(1). Food insecurity in older adults is affected by an accumulation of factors that can amplify the impact of financial threats and exacerbate their susceptibility to food poverty, including decline in functional ability and reduced social networks(2). Traditionally, services like Meals on Wheels or lunch clubs safeguarded the food security of older adults; however, austerity policies have led to their decline, further compromising older people’s ability to access nutritious food. Prolonged food insecurity increases the risk of malnutrition/undernutrition. This scoping review aimed to map current academic knowledge of pensioner-age experiences of using food aid in the UK and other OECD member countries.
Four databases (Scopus, PubMed, Cochrane Library, and CINAHL) were searched using predefined search terms, identifying 4762 papers. Screening of abstracts and full text papers resulted in 30 academic articles that met the inclusion/exclusion criteria and were included in the review.
Most of the studies (16 studies) were undertaken in the USA, 8 in the UK, 5 in Australia, and 1 in the Netherlands. Most used qualitative methods (n=22), two were quantitative, and six used mixed methods. The included studies identified the following four themes: (1) Social and Emotional Benefits (12 studies): many individuals use food aid services not just to access food but also for social support and interaction; (2) Stigma (16 studies): stigma and shame were barriers for older people using food aid, and lack of knowledge about available services was a significant barrier to accessing support; (3) Quality and Suitability of Food (8 studies): there are concerns about the quality and suitability of food provided, particularly for those with dietary restrictions or cultural preferences; and (4) Challenges Accessing Food Support (13 studies): challenges to accessing food aid include transport, especially in rural areas; mobility limitations affecting the ability to carry food home; and language barriers affecting engagement with support services.
The scoping review highlights the challenges PAH face when accessing food aid services. In the short term, better meeting the nutritional needs of the increasing numbers of PAH needing support to address food insecurity requires food aid services to respond to their specific needs. Longer term, policy makers need to address the issues driving PAH to food aid. Further research is needed to explore the wider cost of food insecurity across the food, health and social care systems.
Archaeological sediments can be used to retrieve evidence for parasites that infected past populations, giving evidence for disease, diet, sanitation, and migration in the past. To increase our understanding of parasite infections in Roman Britain and determine which parasites may have infected people living at Vindolanda, sediment samples were collected from a drain connected to a latrine at the bath complex of Vindolanda. These samples were used to look for preserved parasite eggs and cysts deposited in the drain with the faeces of people who used the latrine. Microscopic analysis was used to identify eggs of helminths, and enzyme-linked immunosorbent assay (ELISA) was used to look for protozoan parasites that can cause severe diarrhoea. Eggs of Ascaris sp. (roundworm) and Trichuris sp. (whipworm) were found by microscopy and Giardia duodenalis was detected using ELISA. All of these parasites are transmitted by the faecal-oral route, usually through contaminated food and water. This is the first evidence for G. duodenalis in Roman Britain. A range of zoonotic and faecal-oral parasites have been found at other sites in Roman Britain, yet the drain studied from Vindolanda only contained faecal-oral parasites that can be transmitted directly between humans. This predominance of faecal-oral parasites is similar to a pattern found in large urban sites in the Roman Mediterranean and other military sites in the empire. In contrast, sites from larger urban cities in Roman Britain, such as London and York, appear to have a more diverse range of parasites.
Food and nutrition security has been heavily threatened by the onset of the COVID-19 pandemic, and Pakistan is no exception. The most vulnerable segments, including women and low-wage workers, rely mainly on free or subsidized meals served by public and private sector managed food distribution networks (FDNs). These FDNs rely mainly on wheat flour, which can serve as a vehicle for fortification with zinc, iron, folic acid, and vitamin B12 during the post-COVID period. The aim of this project was to ensure that wheat flour procured by selected FDNs was replaced with high-quality fortified wheat flour in daily distribution meals, so that more nutritious foods would reach the most vulnerable segments in Pakistan.
This pilot project was executed through different private sector managed food distribution networks (FDNs), industrial distribution networks (IDNs), and ration distribution networks (RDNs), whose regular wheat flour was replaced with quality fortified wheat flour. After selection of, and agreement with, flour mills and FDNs, their employees were trained to produce and serve quality fortified flour. Alongside the provision and monitoring of quality fortified wheat flour supplied to FDNs, fortification compliance was also assessed through analysis of added iron content.
Agreements were signed with 11 flour mills, 11 FDNs, 6 RDNs, and 6 IDNs in four cities of Punjab (Faisalabad, Lahore, Multan, and Gujranwala) and one city of Sindh (Karachi). In total, 858 people from these flour mills, FDNs, and provincial regulatory authorities were trained to provide quality fortified flour. During the project period (June to December 2021), around 1,722 tons of quality fortified flour was produced by the selected flour mills and 8.6 million fortified meals were served to vulnerable consumers (61.50% males and 38.50% females). Furthermore, the analysis revealed that the average added iron content of all flour mills complied with the recommended fortification standard of the Punjab Food Authority (≥15 mg/kg).
Providing micronutrient-fortified wheat flour to vulnerable populations through these FDNs is one of the best strategies for both government and the private sector to adopt to complement the basic nutrition of vulnerable segments.
Expert groups have recommended that adults over 65 years should aim for 1.0-1.2 grams (g) of protein per kilogram of body weight per day (g/kg/d) to support health and functionality(1). However, data from the National Diet and Nutrition Survey (2016-2019) showed 50.2% of over 65-year-olds consumed below 1.0 g/kg of adjusted body weight per day(2). Identifying barriers and facilitators to optimising protein intake in older adults is crucial for dietary intervention. This systematic review thus aimed to evaluate modifiable determinants of protein intake amongst community dwelling adults aged over 65 years.
The protocol was registered on the PROSPERO database (CRD42023399243). Eligible studies included community dwelling older adults within Organisation for Economic Co-operation and Development countries. Comprehensive searches were conducted in CINAHL, AMED, MEDLINE, PsycINFO and EMBASE, followed by backward and forward citation chasing and a search for grey literature. The latest search date was July 2024. Searches identified 2316 publications for screening once duplicates were removed. These were screened by two independent reviewers and quality assessed using the Mixed Methods Appraisal tool (version 2018). A meta-analysis was not possible due to substantial heterogeneity in the reporting of both protein consumption and the measurement of determinants.
Sixty publications were included in the final analysis, reporting 54 quantitative studies. The majority of studies were cross-sectional and provided lower-quality evidence. The age range of participants was 65-103 years. Protein intake was reported as the amount consumed per day, or a criterion was applied to determine low intake, which varied from 0.75 to 1.1 g/kg/d. There was evidence from multiple lesser-quality cross-sectional studies that lower financial income (n=4), increased physical limitations (including an increased number of limitations in activities of daily living, reduced mobility or being housebound, n=5), dental problems (n=8) and poor appetite (n=7) are associated with suboptimal or lower protein intake. For men only, living alone appeared to determine lower protein consumption (n=2). In contrast, higher self-perceived health status (n=3), increased physical activity (n=7), lunch club attendance (n=3) and increased dietary knowledge (n=3) were associated with a higher protein intake. Poor appetite in older adults may be expected to reduce protein intake as food consumption decreases; however, evidence suggested that the diet also became less protein dense.
In conclusion, determinants of protein consumption in community dwelling older adults are multifactorial, ranging across influences from socioeconomic, physical health, lifestyle and knowledge domains. These areas are likely to interact with one another, and all should be considered for future research, clinical practice and public health. There is a need for practical dietary interventions that consider affordability and accessibility for optimal uptake, additionally these need to be underpinned by increased dietary knowledge specific to older adults.
Flavour enhancement has been identified as a potential strategy to counteract the loss of taste in pureed foods(1,2), making them more palatable and recognisable for older people(3). However, it is unclear whether flavour enhancement with protein fortification of pureed diets can influence appetite and dietary intake. Thus, the aim of this study was to explore the influence of flavour-enhanced and protein-enriched pureed meals on perceived appetite, palatability, and intake in older people.
Forty-one healthy older people were recruited in Reading, UK to take part in a randomised, crossover study involving three study days. Participants consumed a standard breakfast, followed by one of three ad-libitum pureed lunches (high protein (HP), high-protein-with-aroma (HP-Aro) or low-protein (LP)) and, three hours later, an ad-libitum pureed buffet-style meal. Intake of the ad-libitum meals was measured by weighing food leftovers, perceived appetite and palatability were rated using a visual analogue scale, and remaining intake for the day was assessed using food diaries.
SPSS was used to carry out all statistical analyses. Differences in energy and macronutrient intake between the three ad-libitum pureed lunch meals were examined using repeated measures analysis of variance (RM-ANOVA) with pair-wise comparisons (Bonferroni corrected). This test was also applied to assess the effects of these meals on subsequent intake at the ad-libitum buffet meal and total day intake. Post-lunch appetite sensations were analysed using Friedman’s test with Wilcoxon signed-rank tests. Changes in self-reported palatability ratings between the three pureed lunch meals were evaluated using RM-ANOVA with pair-wise comparisons (Bonferroni corrected). Statistical significance was accepted at p<0.05 in all analyses.
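The Friedman test used for the appetite ratings can be sketched in a few lines. The statistic ranks each participant's scores across the k conditions and compares the rank sums; the ratings below are hypothetical, and a real analysis (as in SPSS here) would also handle tied ranks and compute a p-value against the chi-square distribution.

```python
def friedman_statistic(data):
    """Friedman chi-square statistic for k related conditions measured on
    n subjects; data is a list of per-subject score lists (this sketch
    assumes no tied scores within a subject)."""
    n = len(data)
    k = len(data[0])
    rank_sums = [0.0] * k
    for row in data:
        # rank this subject's scores across the k conditions (1 = smallest)
        order = sorted(range(k), key=lambda j: row[j])
        for rank, j in enumerate(order, start=1):
            rank_sums[j] += rank
    return 12.0 / (n * k * (k + 1)) * sum(r * r for r in rank_sums) - 3 * n * (k + 1)

# Hypothetical satiety ratings (mm VAS) for 3 participants x 3 meal conditions
ratings = [[20, 35, 50], [22, 30, 55], [18, 33, 48]]
q_stat = friedman_statistic(ratings)
```

A larger statistic indicates more consistent ordering of conditions across participants; here every participant ranks the conditions identically, giving the maximum value for n=3, k=3.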
Participants’ intake of the ad-libitum pureed lunch was significantly higher when they consumed the HP-Aro pureed meal compared to the HP and LP pureed meals (p<0.001). Protein fortification with or without flavour enhancement significantly reduced energy intake (EI) and macronutrient intake at the subsequent ad-libitum meal (p<0.001). Even though protein fortification did not affect EI for the rest of the day, it led to greater total protein intake (p<0.001). However, participants had higher total EI, carbohydrate and fat intake across the study day when they had the LP pureed lunch compared to when they had the HP pureed lunch (p<0.001), but no difference occurred between the LP and HP-Aro conditions (p=0.72).
Protein fortification led to significantly greater satiety (p<0.001) and fullness (p=0.04) post lunch. Palatability ratings were significantly improved with flavour enhancement (p<0.05).
The study suggested that intake and liking of pureed meals are greater with the combination of flavour enhancement and protein fortification among older people. Also, although protein fortification could suppress appetite, which might lead to reduced EI across the whole day, this may be mitigated by the combined addition of the flavour enhancer.
High intakes of ultra-processed foods (UPFs) have been associated with adverse health outcomes including weight gain, cardiometabolic disease and cancer(1,2). However, the limitations of NOVA highlight the need for a novel classification which captures food properties and features of processing that drive these health outcomes. Proposed mechanisms underlying the associations between UPF and health include energy density and hyperpalatability(3,4). Within the US food system, the prevalence of both energy dense and hyperpalatable foods is increasing, and there is moderate overlap between foods identified by each definition(5). The aim of this study was to characterise the energy density and hyperpalatability of foods and habitual diets consumed by UK adults.
Logged dietary data, baseline characteristics and body composition were analysed from ZOE PREDICT 1 UK participants (n= 1001; NCT03479866)(6). Weighted logged dietary data, including more than 12,000 foods (beverages excluded), from participants with ≥2 free-living days (n=891) were included. Energy density (kcal/g) for all foods and participant diets was calculated. Hyperpalatable foods (HPF) were classified using a data-defined definition(7) and percentage energy intake (%EI) from HPFs were calculated. Differences in energy density and hyperpalatability were examined across top and bottom tertiles of energy intake.
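The two food-level measures described above reduce to simple arithmetic: energy density is kcal divided by weight, and the HPF contribution is the share of total energy from foods flagged as hyperpalatable. The functions and the one-day log below are a minimal sketch with hypothetical values, not the PREDICT 1 pipeline or its HPF definition(7).

```python
def energy_density(kcal, grams):
    """Energy density in kcal per gram."""
    return kcal / grams

def percent_energy_from_hpf(foods):
    """Percentage of total energy intake contributed by foods flagged as
    hyperpalatable; foods is a list of (kcal, is_hpf) pairs."""
    total = sum(kcal for kcal, _ in foods)
    hpf = sum(kcal for kcal, is_hpf in foods if is_hpf)
    return 100 * hpf / total

# Hypothetical one-day food log: (kcal, hyperpalatable?)
day = [(550, True), (400, False), (300, True), (250, False)]
hpf_share = percent_energy_from_hpf(day)
is_energy_dense = energy_density(550, 150) > 2  # study's >2 kcal/g threshold
```

Applying these per-food measures across all logged foods, then averaging per participant, yields the diet-level figures reported in the results (2.1 kcal/g and 48 %EI).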
In the PREDICT 1 cohort (n=891; 73% female; mean±SD age 46±12 years; BMI 26±4.9 kg/m²), 55% of foods were high in energy density (>2 kcal/g), 49% were defined as HPFs and 35% were both energy dense and hyperpalatable. Across participants’ habitual diets (beverages excluded), the average energy density was 2.1±1.0 kcal/g and the proportion of energy consumed from HPFs was 48±16 %EI. There was a significant difference across tertiles of energy intake (low versus high) in the energy density of diets (kcal/g) and the contribution of energy from HPFs (%EI) (P<0.05 for all).
This analysis provides valuable insights into the energy density and hyperpalatability of habitual diets in the PREDICT 1 cohort, demonstrating overlap between these features. Further work is required to understand the independent and combined role of these factors, and additional proposed mechanisms underlying the negative health effects associated with processing, such as food texture, energy-intake-rate, and additives and non-culinary ingredients. A novel classification should build on NOVA by encompassing the food properties and features of processing that drive adverse health outcomes. Policy makers must consider the role of these factors to ensure nutritionally beneficial and affordable foods are not penalised without appropriate scientific justification.
With the global population set to reach 9.8 billion by 2050, including a doubling of adults aged ≥60 years old(1,2), sustainable dietary transitions are increasingly urgent. One postulated solution is controlled environment agriculture (CEA) vertical farming (VF), defined as plant factories in which environmental conditions are intricately controllable(3).
VF affords opportunities to reduce climate impact, such as reductions in water and land use(4). Further, it allows for tailoring the nutritional profiles of food through methods such as light manipulation and agronomic biofortification(5,6). This is especially relevant to older adults, with their inherent specific nutritional needs and the role of diet in healthy ageing(7). However, older adults have high rates of food neophobia(8) that may hinder acceptance of novel VF produce. This study assessed the acceptability of VF in UK older adults.
An online survey within a UK older adult cohort (≥60 years old) (n=680) assessed knowledge, sentiment scores (SS), and attitudes (Likert scales) towards VF. Willingness to pay (WTP) and purchasing likelihood for VF products were also assessed. Sentiment scores were generated using text highlighting(9). Data were analysed across genders and the index of multiple deprivation (IMD). Mann-Whitney and Kruskal-Wallis tests were used for non-normally distributed data. Two-way repeated measures ANOVA and two-factor MANOVA were used where applicable. Purchasing likelihood was evaluated using ordinal regression.
Knowledge of VF was low (44%), with fewer participants (23.6%) understanding CEA VF. Total SS were ambivalent (8.8), with no significant gender or IMD differences. However, sentences concerning sustainability benefits, e.g. yield, land and water use, were positive (range 14-23). Females showed more positive SS for tailoring VF to support older adults’ nutritional needs (p=0.012). Attitudes (from Likert scales) were mostly neutral, with females slightly more favourable towards environmental concerns (p=0.008) and food fortification (p=0.03). WTP for VF produce was slightly higher (£0.016) than for conventionally produced foods (p<0.001), with no significant differences between genders and IMD. Forty-three percent of participants were more likely to purchase VF produce, while 10% were less likely. Lower-income households (OR 0.59; 0.40–0.88, p=0.008), those lacking knowledge of food fortification (OR 0.62; 0.42–0.89, p=0.011), and those with more negative SS (Q1: OR 0.45; 0.29–0.71, p=0.046; Q2: OR 0.65; 0.43–0.99, p<0.001) were significantly less likely to purchase VF produce.
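For intuition about the odds ratios reported above, the sketch below shows the simpler 2x2-table odds ratio with a Wald confidence interval; the ordinal regression actually used in the study adjusts for covariates and models ordered outcome categories, so this is an illustrative simplification only, and the counts are hypothetical.

```python
from math import exp, log, sqrt

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table:
    a = exposed & outcome, b = exposed & no outcome,
    c = unexposed & outcome, d = unexposed & no outcome."""
    or_ = (a * d) / (b * c)
    se = sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log odds ratio
    return or_, (exp(log(or_) - z * se), exp(log(or_) + z * se))

# Hypothetical counts: lower-income households likely vs unlikely to buy,
# against higher-income households
or_est, (lo, hi) = odds_ratio_ci(30, 70, 50, 70)
```

An odds ratio below 1 with a confidence interval excluding 1 would indicate a significantly lower likelihood of purchasing, as for the groups reported above.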
Low knowledge and negative sentiment towards VF, coupled with low WTP, pose significant barriers to adopting VF food products in older adults. Socio-economic factors like income also affect purchasing likelihood, potentially exacerbating current health gaps. Increasing VF awareness and improving attitudes whilst reducing VF production costs could help promote its adoption in this demographic.
In their article, Pope and colleagues examine the ethical, legal, and practical complexities associated with the use of advance directives (ADs) to pursue voluntarily stopping eating and drinking (VSED) in the context of patients with advanced dementia. The authors detail the shortcomings of current VSED ADs, and they review a new VSED AD that they argue addresses these shortcomings and provides a better solution to the complexities associated with implementing VSED ADs. While the authors make a robust argument for the overall need and supportability of VSED ADs, this commentary highlights a problem that still persists even with the new VSED AD, specifically the absence of a robust ethical justification for continuing to honor the VSED AD even when the advanced dementia patient is requesting to eat or drink in the context of distress and suffering. Without this robust ethical justification, moral distress of family and caregivers is likely to occur, which could jeopardize the larger implementation of the VSED AD. Rather than pursue this more extreme measure without robust ethical justification, the commentary argues that the authors’ alternative proposal of minimal comfort feeding is the more practical and ethical strategy that best balances the patient’s longer- and shorter-term interests.
The American Law Institute (ALI) recently approved its first-ever Restatement of Medical Malpractice law. One Restatement subsection embraces the position that an authoritative clinical practice guideline, if admissible and sufficiently relevant, can be prima facie evidence of a provider’s compliance with the standard of care. This article responds to a forceful critique of that position by two of the Restatement’s advisors, who are nationally esteemed members of the plaintiffs’ bar. They argue that caselaw does not support this provision, that it is unsound public policy, and that the provision is unfair because it does not afford the same prima facie proof status to plaintiffs’ use of practice guidelines.
This article addresses each of those critiques. It starts with the observation that this opposition is fundamentally at odds with the primary governing principle of professional liability, namely, that professional standards have greater force in medical liability cases than do industry standards in general negligence cases. Because professional standards determine professional negligence, a relevant clinical practice guideline that speaks with authority for a relevant segment of medical professionals, if admissible, should be sufficient to support a jury finding of non-negligence for a doctor who complies.
The same conclusion does not apply, however, when a plaintiff presents a single relevant guideline that a physician failed to follow, for the simple reason that more than one approach can often reasonably apply to a given clinical situation. Also, guidelines often set ideal rather than minimal standards. Thus, this provision’s differential effect is not fundamentally unfair. Instead, it flows directly from a plaintiff’s burden of establishing professional negligence, much like numerous other conventional legal rules can affect opposing sides of a case differently.
As accomplished advocates, these authors have unsurprisingly made the strongest possible case against any enhanced legal status for defensive use of exculpatory practice guidelines. Thoughtful inspection and reflection reveal, though, that their analysis significantly misstates or overstates the Restatement’s position, and so their attacks are mostly misdirected. If this Restatement’s position were truly as radical or poorly considered as these authors portray, the ALI likely would not have adopted it. Nevertheless, engaging with this critique can better elucidate reasons that courts might view this Restatement provision as well-considered.
Progression towards elevated blood pressure (BP) may begin as early as adolescence. In low- and middle-income countries (LMICs), consumption of ultra-processed foods (UPFs), which are linked to poor cardiometabolic health, is often highest in adolescence. We examined sex- and age-specific associations of systolic and diastolic BP (SBP and DBP) with concurrent and lagged UPF intake from age 15 to 25 in a Filipino cohort. We used data from the 1998–2009 waves of the Cebu Longitudinal Health and Nutrition Survey (n 2124, 52 % male); participants were 15, 18, 21 and 25 years old. UPFs (% daily kilocalories) were classified using NOVA. Linear mixed-effects models estimated differences in SBP and DBP associated with a 5-percentage point difference in concurrent and lagged UPF intake (3–4 years earlier). Mean UPF intake was 10–11 % of total energy intake among males and 14–17 % among females over the study period. At age 21, intake of ultra-processed meats and fish was positively associated with DBP (β = 0·48 (95 % CI: 0·02, 0·94)) among males and intake of ultra-processed sugary beverages was positively associated with SBP (0·80 (0·13, 1·48)) and DBP (0·93 (0·34, 1·51)) among females. Among females only, SBP at age 18 was positively associated with total UPF intake at age 15 (0·25 (0·00, 0·50)). In this cohort, there were modest, positive associations between BP and UPF intake, which varied by sex and age. UPF intake during the transition to adulthood may be linked to higher BP, supporting efforts to limit adolescents’ intake in LMICs.