This study aimed to evaluate the effectiveness of the SmartNav in detecting tip fold-over during cochlear implantation and to compare angular insertion depth measurements obtained via SmartNav and transorbital X-ray imaging.
Methods
This retrospective multicentre study included patients with normal cochlear anatomy, comprising 163 individuals and 213 ears who underwent cochlear implantation using Nucleus CI522 and CI622 systems at Gazi University Faculty of Medicine and Gaziantep City Hospital.
Results
Of the 213 cochlear implantations, tip fold-over was detected in 4 implantations (1.88 per cent) intra-operatively with SmartNav. One case (0.47 per cent) of tip fold-over was not detected by SmartNav and was identified post-operatively through X-ray imaging. SmartNav showed a sensitivity of 80 per cent and a specificity of 100 per cent. A strong correlation was found between SmartNav and X-ray angular insertion depth measurements (p < 0.001).
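The reported sensitivity and specificity follow directly from the counts above. A minimal sketch, assuming 5 true fold-overs in total (4 detected intra-operatively, 1 missed) and no false positives among the remaining 208 implantations:

```python
# Reconstruction of the diagnostic metrics from the reported counts
# (assumed: 213 implantations, 5 true tip fold-overs, 0 false alarms).
true_positives = 4        # fold-overs detected intra-operatively by SmartNav
false_negatives = 1       # fold-over missed, found on post-operative X-ray
true_negatives = 213 - 5  # implantations without fold-over, none flagged
false_positives = 0

sensitivity = true_positives / (true_positives + false_negatives)
specificity = true_negatives / (true_negatives + false_positives)

print(f"sensitivity = {sensitivity:.0%}")  # 80%
print(f"specificity = {specificity:.0%}")  # 100%
```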
Conclusion
The SmartNav is a reliable tool for the intra-operative detection of tip fold-over and the assessment of angular insertion depth in patients with normal cochlear anatomy.
TEM analyses of germinal elements in miracidia and 6-day-old mother sporocysts of Echinostoma caproni were performed. Germinal elements in miracidia are represented by undifferentiated cells and germinal cells. They are localised in the posterior half of the body and form the primordium of the germinal mass, which plays the role of the gonad. In mother sporocysts the germinal mass is located caudally and plays the dual role of the gonad and the uterus. In addition to the undifferentiated cells and germinal cells, it contains embryos, which develop there up to the stage of germinal balls and then move into the sporocyst’s schizocoel, which plays the role of the brood chamber. New germinal cells are formed only by division of undifferentiated cells. No differences between undifferentiated and germinal cells in miracidia and those in sporocysts were found.
UK food system transformation is urgently needed, but to date, minimal research has investigated ‘blue foods’, probably because they are ethically nuanced. There exists a paradox whereby materially deprived communities should be eating more fish to meet nutritional requirements, yet there is a global ‘red flag’ around overfishing. New collaborative and creative solutions are, therefore, needed to tackle such food system inequities. By working together, all voices can be equally heard when decisions are being made to improve the system. Similarly, innovation and disruption of established supply chains will enable better access to healthy, affordable and tasty food that will support better nutrition, health and wellbeing. This review paper will present a critique of ‘The Plymouth Fish Finger’ as a collaborative social innovation case study. Part of the FoodSEqual research project, this exploratory pilot project championed ‘co-production’ approaches to achieve multiple (potential) impacts. This review will critically explore how this social innovation case study has exemplified the complex interplay between factors driving distortions in access to and availability of fish within the local food system. Through collaborative multi-stakeholder (transdisciplinary) processes, using participatory creative methods, new strategies and recommendations for research, practice, action and policy are informed, all of which offer great potential for progressive and transformative systemic (blue) food system change.
The mental health risk factors for primary healthcare workers (PHWs) following the Coronavirus Disease 2019 pandemic and the differences by urbanicity remain unclear. In this study, we aimed to identify key factors of anxiety and depression among PHWs in urban and rural settings in China.
Methods
This cross-sectional study was conducted in all 31 provinces in mainland China between 1 May and 31 October 2022. A total of 3,769 PHWs, including family physicians, nurses, public health professionals, pharmacists, and other medical staff, were recruited from 44 urban community health service centers and 27 rural township hospitals. A Bayesian Additive Regression Tree model was employed to identify risk factors for anxiety and depression.
Results
Among 3,769 PHWs, 1,006 (26.7%) worked in urban areas and 2,763 (73.3%) in rural areas. Occupational satisfaction significantly influenced anxiety in both urban and rural practitioners. For urban PHWs, living with family (odds ratio (OR): 0.42, 95% confidence interval (CI): 0.28–0.62) and self-rated health (fair: OR: 0.31, 95% CI: 0.23–0.42; good: OR: 0.13, 95% CI: 0.09–0.20) were key factors associated with anxiety. For rural PHWs, after-work exercise (rarely: OR: 0.28, 95% CI: 0.11–0.76; frequently: OR: 0.15, 95% CI: 0.05–0.44) played a critical role. Depression was associated with after-work exercise, self-rated health, and occupational satisfaction for all PHWs. Additionally, living with family (OR: 0.51, 95% CI: 0.34–0.75) and organizational support satisfaction (satisfied: OR: 0.28, 95% CI: 0.19–0.42) were significant for urban practitioners.
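As an illustration of how such odds ratios are read, here is a minimal sketch with purely hypothetical 2×2 counts (chosen only to land near the reported OR of 0.42 for living with family); the study’s actual estimates come from the Bayesian Additive Regression Tree analysis, not from raw tables:

```python
def odds_ratio(exposed_cases, exposed_noncases,
               unexposed_cases, unexposed_noncases):
    """Odds ratio from a 2x2 table: (a/b) / (c/d)."""
    return (exposed_cases / exposed_noncases) / (unexposed_cases / unexposed_noncases)

# Hypothetical counts: anxiety among PHWs living with family vs. not.
or_family = odds_ratio(30, 170, 60, 140)
print(round(or_family, 2))  # 0.41 — an OR below 1 suggests a protective association
```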
Conclusions
Risk factors such as occupational satisfaction, health, and family relations significantly influence PHW mental health in China, with notable differences by urbanicity. Tailored mental health interventions are recommended to address urban–rural disparities.
Myasthenia gravis (MG) is an autoimmune neuromuscular disorder characterized by fatigable weakness and increased perioperative vulnerability. Postoperative myasthenic crisis, defined as respiratory failure requiring prolonged ventilation or re-intubation, remains a feared complication after surgical procedures such as thymectomy. The efficacy of preoperative interventions such as intravenous immunoglobulin (IVIg) and plasmapheresis remains uncertain. This review examines the evidence supporting risk stratification tools and immunomodulatory strategies to prevent postoperative myasthenic crisis. A comprehensive literature review was conducted focusing on studies evaluating the incidence, risk factors and preventive strategies for postoperative myasthenic crisis in MG patients. Particular emphasis was placed on clinical predictive models and randomized trials assessing preoperative IVIg and plasmapheresis. Recent data suggest the incidence of postoperative myasthenic crisis has declined to below 10%, largely due to advances in surgical technique and perioperative care. Established risk factors include bulbar involvement, reduced pulmonary function and prior crises. Risk prediction models such as the Leuzzi and Kanai scores offer clinically useful stratification. While older retrospective studies favored preoperative plasmapheresis, meta-analyses and randomized trials have yielded mixed results. Randomized trials of IVIg have shown no significant benefit in well-controlled patients, and both interventions carry notable risks and costs. Current evidence does not support the routine use of IVIg or plasmapheresis prior to surgery in all MG patients. A targeted, risk-based approach guided by validated predictive models is recommended to minimize unnecessary interventions and health care system costs.
Persons living with dementia (PLWD) and their caregivers (CG) face a complex disease trajectory, which includes a multitude of challenges related to identifying credible health resources, accessing services, and securing emotional support. Scalable, sustainable interventions that guide recently diagnosed PLWD and CG are desperately needed to minimize unnecessary burden and improve quality of life. This article describes the feasibility and acceptability of an early virtual palliative care intervention (SUPPORT-D™) for use among PLWD with mild cognitive impairment or Alzheimer’s disease and their CG.
Methods
Using a quasi-experimental design, this 6-week prospective feasibility study was conducted among 28 (PLWD/CG) dyads and 2 individual CG. Eligibility criteria for PLWD included a diagnosis of mild cognitive impairment (FAST score ≥4). SUPPORT-D™ comprises 4 main areas of guided support: 1) understanding the disease, 2) caring for myself, 3) caring for the caregiver, and 4) planning for the future. Outcome data were collected pre/post and during the intervention. Semi-structured interviews were conducted post intervention with 10 dyads. This study was approved by the Medical University of South Carolina IRB, and data were collected from January 2023 to March 2024.
Results
Seventy-six percent (23/30) of enrolled dyads successfully completed the study. PLWD and CG scores on validated measures of acceptability, appropriateness, and feasibility indicated SUPPORT-D™ was acceptable, appropriate, and feasible. Post-intervention interview feedback further evidenced the acceptability, appropriateness, and feasibility of SUPPORT-D™.
Significance of results
Delivery of this virtual nurse-led early palliative care intervention (A Program of SUPPORT-D™) was feasible for both PLWD and their CG. A Program of SUPPORT-D™ has potential as a feasible intervention to provide anticipatory guidance to community-dwelling PLWD and CG. Participants endorsed inclusion of additional content specific to physical activity, stress management, and social support as helpful refinements for future delivery.
To assess differences in SARS-CoV-2 infection rates between patients receiving hemodialysis in outpatient centers (in-center) and those receiving dialysis in their homes (hemodialysis and peritoneal dialysis) from December 29, 2020, through May 9, 2023.
Design:
Retrospective cohort study.
Setting:
Outpatient dialysis facilities in the United States reporting to the Centers for Disease Control and Prevention’s National Healthcare Safety Network.
Patients:
Maintenance dialysis patients who received hemodialysis treatment at, or were affiliated with, outpatient dialysis facilities.
Methods:
SARS-CoV-2 infection rates were assessed by dialysis setting (in-center and home). Weeks were categorized as surge (rate of infection > median) and non-surge (rate of infection ≤ median) and by variant predominance. A negative binomial regression model with generalized estimating equations was constructed to examine differences in rates of infection among patients.
Results:
A total of 7,974 dialysis facilities reported 171,338 SARS-CoV-2 infections among patients. In-center hemodialysis patients had a higher average rate of SARS-CoV-2 infection (2.85 infections per 1000 patient-weeks) than home patients (1.69 infections per 1000 patient-weeks). During surge weeks, the differences in rates of infection between in-center and home patients were more pronounced than during non-surge weeks for all variant predominance categories: Delta (relative rate ratio (RRR) = 1.20, CI: 1.09–1.32), B.1 and Other (RRR = 1.11, CI: 1.02–1.22), and Omicron (RRR = 1.07, CI: 1.01–1.12).
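The headline comparison is simple rate arithmetic; a sketch using the reported rates (the infection counts and patient-week denominators below are hypothetical, chosen only to reproduce those rates — the relative rate ratios themselves come from the negative binomial GEE model):

```python
def rate_per_1000(infections, patient_weeks):
    """Infection rate per 1000 patient-weeks."""
    return infections / patient_weeks * 1000

# Hypothetical denominators reproducing the reported average rates.
in_center = rate_per_1000(2850, 1_000_000)  # 2.85 per 1000 patient-weeks
home = rate_per_1000(1690, 1_000_000)       # 1.69 per 1000 patient-weeks

rate_ratio = in_center / home
print(round(rate_ratio, 2))  # 1.69: in-center rates roughly 69% higher on average
```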
Conclusion:
Rates of SARS-CoV-2 infection among patients receiving outpatient hemodialysis were persistently higher than rates among patients receiving dialysis treatments at home; these differences were more pronounced during surge weeks.
Poor food consumption remains one of the most common challenges among older adults in the UK, with at least 10% in community settings and up to 45% in care homes affected by malnutrition, which is strongly associated with frailty and with functional and health decline. Tracking and understanding the impact of diet is not easy: there are problems with dietary monitoring and malnutrition screening, such as difficulty remembering, lack of time, or needing a dietician to interpret the results. Computerised tailored education may be a positive solution to these issues. Due to the rise in smartphone ownership, the use of technology to monitor diet is becoming more popular. This review paper will examine the issues with current methods of dietary monitoring, particularly in older adults, and will present the benefits and barriers of using technology to monitor food intake. It will discuss how a photo food monitoring app was developed to address the current issues with technology and how it was tested with older adults living in community and care settings. The prototype was co-developed, incorporated automated food classification to monitor dietary intake and food preferences, and was tested with older adults. The prototype was usable by both older adults and care workers, and feedback on how to improve its use was collected. Key design improvements to make it quicker and more accurate were suggested for future testing in this population. With adaptations, this prototype could be beneficial to older adults living in both community and care settings.
Growing demand for social care and resource constraints compel decision-makers to decide how to allocate public resources to social care. Such decisions may result in differences in access to social care between groups in society. In this study we conducted a secondary analysis of articles included in a systematic review on the underpinnings of resource allocation decisions in social care, extending that work to examine the potential consequences of such decisions. We conducted the review in accordance with the PRISMA framework. Through a thematic framework analysis of 37 of the 42 articles included in the parent review, we identified five groups in society that may be disproportionately affected by the consequences of resource allocation decisions on social care: (1) individuals with long-term social care needs, (2) informal caregivers, (3) lower socio-economic groups, (4) individuals with limited health literacy skills, and (5) individuals living across different regions. Our findings highlight that allocation decisions in social care particularly affect women and individuals facing language barriers and may create local variation in provision of social care. These findings suggest potential for inequitable access to social care in society and underscore the need for decision-makers to consider the consequences of their allocation decisions.
Ensuring adequate food intake among older people is essential for maintaining health and preventing malnutrition. This review explores strategies to enhance dietary intake in this population group. Several key interventions are highlighted, including offering high-energy and protein-fortified meals and snacks, optimising the visual appeal and presentation of foods, enhancing flavours, and providing finger foods or modified textures to support consumption. Familiarity with fortified foods may encourage acceptance and increase intake, while improving food aesthetics and incorporating varied flavours can enhance enjoyment and promote consumption. Flavour enhancement may help compensate for decline in smell and taste sensitivity often experienced by older people, helping to sustain interest in food and promote greater intake. Finger foods present a practical solution for older adults with physical impairments, allowing for easier handling and self-feeding. Additionally, for individuals with dysphagia or chewing difficulties, texture-modified diets tailored to their needs support safe food intake. Research suggests that refining food presentation through techniques such as moulding and 3D printing may improve palatability and appeal, potentially boosting consumption among older adults. Addressing sensory preferences and physical challenges associated with eating is critical to ensuring adequate nutrition and promoting overall wellbeing in the elderly population. This review underscores the importance of multifaceted dietary strategies, advocating for personalised interventions that align with older individuals’ needs and preferences to enhance food intake and nutritional status.
To evaluate the healthfulness of the food/beverage products featured by TikTok influencers whose audiences include millions of adolescents.
Design:
In a cross-sectional study, we collected up to 100 videos (the maximum available) from each of the top 100 TikTok influencers in the USA – based on views, likes, comments and shares – in July 2022. For each video, we identified the most prominent food/beverage product featured. We used the Nutrient Profile Index (NPI) to classify food products as healthy/unhealthy. We grouped beverages by category.
Setting:
TikTok
Participants:
N/A
Results:
Our sample included 8871 videos, 1360 (15·3 %) of which featured at least one food (n 755, 55·5 %), beverage (n 580, 42·6 %) or dietary supplement (n 25, 1·8 %). Mean NPI score for foods was 54·73 (sd 19·95). Most foods (58 %) were considered unhealthy, with a 20-percentage-point difference between branded (70·8 %) and unbranded (50·8 %) foods. Alcohol (n 154, 26·6 %) and energy drinks (n 149, 25·7 %) were the most featured beverages overall. Among branded beverages, energy drinks were the largest category (n 148, 38·9 %). Among unbranded beverages, alcoholic drinks were the largest category (n 73, 36·5 %).
Conclusions:
More than half of the foods promoted by TikTok influencers were considered unhealthy, and most beverages featured were alcoholic and energy drinks. Many foods and a large share of alcoholic beverages were unbranded, either reflecting genuine influencer preferences or potentially masking the true extent of commercial marketing. Given the reach of influencers, including millions of adolescents, stronger regulations are needed for social media platforms, influencers and brands to protect consumers from undue harm from food/beverage marketing.
Ascaridia galli and Heterakis gallinarum, the most prevalent nematodes of chickens, inhabit the small intestine and caeca, respectively, and often co-occur. Current excreta egg count (EEC) methods do not differentiate between their eggs, and although chickens produce two distinct excreta types – intestinal excreta (IE) and caecal excreta (CE) – the distribution of eggs of these species across them remains poorly understood. Forty Hy-Line Brown laying hens (40 weeks, mean body weight (BW) 2·07 ± 0·02 kg), cleared of prior nematode infection and artificially infected with A. galli (n = 20) or H. gallinarum (n = 20), were housed in separate floor pens and monitored for 26 weeks. Assessments included clinical signs, EECs from IE, CE and mixed excreta (ME), and worm recovery from subsets of birds at 8, 14, 20 and 26 weeks. Neither infection resulted in clinical signs, but BW gain was slightly lower in A. galli-infected hens (0·5 g/week/hen) than in H. gallinarum-infected hens (2·8 g/week/hen). Egg detection aligned with worm predilection sites: A. galli eggs were predominantly found in IE, while H. gallinarum eggs were largely confined to CE. In ME samples, egg counts were reduced by 45% relative to IE for A. galli and 60% relative to CE for H. gallinarum. EECs showed a negative but non-significant association with excreta moisture content. Natural re-infection produced a stable adult worm population in both infections. These findings demonstrate that analysing IE and CE separately provides a practical, non-lethal approach for differentiating these infections, while ME appears to have limited diagnostic utility. Further studies should evaluate these patterns across broader conditions and individual variation.
Amerotyphlops brongersmianus (Vanzolini, 1976) is the only representative of its family in Argentina, and to date, there have been no records of its parasites. Between 2013 and 2018, 46 specimens of A. brongersmianus were collected in Corrientes province and investigated for helminths. Eighty-three specimens of Serpentirhabdias aff. vellardi were collected from the lungs of nineteen hosts. Most nematodes (69%) were recovered from the vascular lung; overall prevalence was 41.3%, with a mean intensity of 3.74 worms. Adult snakes had higher prevalence and mean abundance than juveniles; the mean intensity was similar between sexes and sexual maturity groups. The association between sexual maturity and the presence of lungworms was statistically significant. Lungworm abundance, weight, and length of adult snakes showed weak to strong positive correlations, with stronger correlations in males; however, these were not statistically significant. Nematodes followed a negative binomial distribution. Seasonal differences in parasitological descriptors and mean body length of lungworms were not statistically significant. Nonetheless, a significant negative correlation was observed between lungworm abundance and body length in spring, suggesting a clustering effect. Our results are discussed based on host phenology, ecology, biology, and anatomy. The life cycles of lungworms, the abundance of potential transport hosts, as well as abiotic factors (rainfall and temperature), are also considered. This is the first report of lungworms in A. brongersmianus throughout its range, the second global report of helminths in a Typhlopidae species in the 21st century, and the first in South America and Argentina, providing ecological data.
Installation of a percutaneous gastrostomy tube is often needed for patients with amyotrophic lateral sclerosis (ALS) who develop severe dysphagia. However, there is uncertainty regarding the optimal timing for this procedure, especially with regard to the decline in respiratory function. Several guidelines suggest that gastrostomy should be placed before the forced vital capacity (FVC) drops below 50%, since the procedural risks are heightened. However, multiple studies argue that this procedure could be safe in patients with an FVC of less than 50%.
Methods:
In this retrospective study, we reviewed the medical records of all patients with ALS who had a gastrostomy at our center between 2010 and 2023. Our primary objective was to identify the 30-day mortality rate and the incidence of complications after this procedure. Also, we investigated whether predictive factors of adverse outcomes could be identified, particularly to evaluate if there was an association with pulmonary function.
Results:
We included 54 patients. The 30-day mortality rate was 9.3%, and the incidence of major complications was 16.7%. There was no statistical difference in complications between percutaneous endoscopic and radiologically inserted gastrostomy procedures. Predictive factors for complications were pre-existing pulmonary disease, pre-procedural CO2 levels and, although not statistically significant, diabetes. There was no association between FVC and the occurrence of adverse outcomes, although only 70% of patients had a measure of pulmonary function.
Conclusion:
In our study, there was no correlation between FVC and the occurrence of adverse events from the gastrostomy procedure. This suggests that the traditional cutoff of 50% FVC level should be re-examined and explored further in future studies.
Economic evaluation supports public funding decisions about the use of health technologies within the Portuguese National Health System (NHS). The methods guide for economic evaluation in Portugal serves both companies preparing economic evaluation submissions and the independent commission appraising the evidence submitted.
Methods
This article presents the revised methods guide for economic evaluation in Portugal. The revisions reflect advances in economic evaluation, updates to regulatory policies, and responses to the evolving economic context. The paper highlights the most significant changes to the guidance, comparing the new Portuguese guidelines to those from the United Kingdom and Canada. The discussion is framed around key comments received during public consultation.
Results
The updated guidelines recommend cost-effectiveness analyses based on quality-adjusted life years and advocate for long-term modelling, a 4 percent discount rate, and a focus on NHS costs. New features include guidance on the identification and management of uncertainty within a dynamic appraisal process with regular contract negotiations (which can trigger reappraisals). The guide also covers how cost-effectiveness models, typically centrally developed, should be adapted to the Portuguese context. It highlights the key role of structured expert elicitation to address uncertainties in evidence, including those related to model adaptation.
Conclusions
The revision was developed through stakeholder consultations and aligns with international best practices, offering more explicit and transparent methods to support health resource allocation decisions.
To determine values for the digestible indispensable amino acid score (DIAAS), it is recommended that ileal amino acid (AA) digestibility values obtained in growing pigs are used to characterise protein quality in different foods. Therefore, an experiment was conducted to determine the standardised ileal digestibility (SID) of AA in eight energy ingredients (barley, sorghum, wheat, brown rice, rice bran, wheat bran, cassava and paddy rice) fed to pigs, where SID values in pigs can be used to calculate approximate DIAAS values in humans. Among the data obtained for all energy ingredients, significant variations (P < 0·01) in CP and AA composition were observed. Rice bran and wheat bran had the highest CP (16·43 % and 18·16 %, respectively) and DIAAS values of 81–88 for adults, qualifying them as ‘good’ protein sources (> 75). Cassava, with the lowest CP (2·74 %), was limited by sulphur amino acids (score of 54). Lysine (Lys) was the first-limiting AA in barley (74), sorghum (51) and wheat (49), with SID values lowest in wheat (71·04 %). Brown rice and paddy rice showed higher SID of Lys (87·51 % and 78·13 %, respectively). These findings highlight the potential of bran-based ingredients and Lys fortification to improve protein quality in grain-dependent diets, providing the scientific basis to combat protein malnutrition.
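The DIAAS values above follow the standard FAO (2013) calculation: for each indispensable AA, the digestible content per gram of protein is divided by the corresponding value in a reference scoring pattern, and the lowest ratio (× 100) becomes the score. A hedged sketch of that calculation; the amino acid contents, SID values, and reference pattern below are illustrative assumptions, not the study’s data:

```python
# Assumed adult reference pattern (mg indispensable AA per g protein);
# only four AAs are shown for brevity.
reference_pattern = {"Lys": 48, "SAA": 23, "Thr": 25, "Trp": 6.6}

def diaas(aa_mg_per_g_protein, sid):
    """DIAAS = 100 * min over AAs of (digestible AA / reference AA).

    aa_mg_per_g_protein: AA content per g protein (mg)
    sid: standardised ileal digestibility per AA (fraction)
    Returns (score, first-limiting AA).
    """
    ratios = {
        aa: (aa_mg_per_g_protein[aa] * sid[aa]) / reference_pattern[aa]
        for aa in reference_pattern
    }
    limiting = min(ratios, key=ratios.get)
    return round(100 * ratios[limiting]), limiting

# Illustrative wheat-like profile: Lys is first-limiting, as in the abstract.
aa = {"Lys": 28, "SAA": 40, "Thr": 29, "Trp": 12}
sid = {"Lys": 0.71, "SAA": 0.90, "Thr": 0.85, "Trp": 0.90}
score, limiting = diaas(aa, sid)
print(score, limiting)  # 41 Lys
```

The exact score depends on the assumed AA profile and reference pattern; the point is the min-over-ratios structure, which is why a single limiting AA (Lys for the cereals, sulphur AAs for cassava) determines each ingredient’s score.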