Variation between general practices in the rate of consultations for musculoskeletal pain conditions may signal important differences in access to primary care, perceived usefulness, or available alternative sources of care; however, it may simply reflect differences in underlying ‘need’ between practices’ registered populations. In a study of 30 general practices in Staffordshire, we calculated the proportion of adults consulting for a musculoskeletal pain condition, then examined this in relation to selected practice and population characteristics, including the estimated prevalence of self-reported musculoskeletal problems and chronic pain in each practice’s registered population. Between September 2021 and July 2022, 18,388 adults consulted for a musculoskeletal pain condition. After controlling for length of recruitment, time of year, and age–sex structure, the proportion consulting varied up to two-fold between practices but was not strongly associated with the prevalence of self-reported long-term musculoskeletal problems, chronic pain, or high-impact chronic pain.
The marketing of unhealthy foods has been implicated in poor diet and rising levels of obesity. Rapid developments in the digital food marketing ecosystem and associated research mean that a contemporary review of the evidence is warranted. This preregistered (CRD420212337091) systematic review and meta-analysis aimed to provide an updated synthesis of the evidence for behavioural and health impacts of food marketing on both children and adults, using the 4Ps framework (Promotion, Product, Price, Place). Ten databases were searched from 2014 to 2021 for primary data articles of quantitative or mixed design, reporting on one or more outcomes of interest following food marketing exposure compared with a relevant control. Reviews, abstracts, letters/editorials and qualitative studies were excluded. Eighty-two studies were included in the narrative review and twenty-three in the meta-analyses. Study quality (RoB2/Newcastle–Ottawa scale) was mixed. Studies examined ‘promotion’ (n 55), ‘product’ (n 17), ‘price’ (n 15) and ‘place’ (n 2), with some studies spanning more than one category. There is evidence of impacts of food marketing in multiple media and settings on outcomes, including increased purchase intention, purchase requests, purchase, preference, choice, and consumption in children and adults. Meta-analysis demonstrated a significant impact of food marketing on increased choice of unhealthy foods (OR = 2·45 (95 % CI 1·41, 4·27), Z = 3·18, P = 0·002, I2 = 93·1 %) and increased food consumption (standardised mean difference = 0·311 (95 % CI 0·185, 0·437), Z = 4·83, P < 0·001, I2 = 53·0 %). Evidence gaps were identified for the impact of brand-only and outdoor streetscape food marketing, and for data on the extent to which food marketing may contribute to health inequalities which, if available, would support UK and international public health policy development.
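The pooled statistics reported above (an odds ratio with confidence interval, a Z value and an I² heterogeneity percentage) come from a random-effects meta-analysis. As a hedged illustration of the mechanics only — the DerSimonian–Laird estimator is one common choice, and the input numbers below are invented, not the studies in this review — the pooling can be sketched as:

```python
import math

def pool_random_effects(odds_ratios, ci_lowers, ci_uppers):
    """DerSimonian-Laird random-effects pooling of study odds ratios.

    Study-level standard errors are recovered from the reported 95%
    confidence intervals on the log-odds scale.
    """
    logs = [math.log(o) for o in odds_ratios]
    ses = [(math.log(u) - math.log(l)) / (2 * 1.96)
           for l, u in zip(ci_lowers, ci_uppers)]
    w = [1.0 / se**2 for se in ses]                    # fixed-effect weights
    fixed = sum(wi * yi for wi, yi in zip(w, logs)) / sum(w)
    q = sum(wi * (yi - fixed)**2 for wi, yi in zip(w, logs))
    df = len(logs) - 1
    c = sum(w) - sum(wi**2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                      # between-study variance
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0
    w_re = [1.0 / (se**2 + tau2) for se in ses]        # random-effects weights
    pooled = sum(wi * yi for wi, yi in zip(w_re, logs)) / sum(w_re)
    se_pooled = math.sqrt(1.0 / sum(w_re))
    return {
        "OR": math.exp(pooled),
        "CI": (math.exp(pooled - 1.96 * se_pooled),
               math.exp(pooled + 1.96 * se_pooled)),
        "Z": pooled / se_pooled,
        "I2": i2,
    }
```

The high I² reported for the choice outcome (93·1 %) corresponds to `q` far exceeding its degrees of freedom, i.e. variation between studies well beyond sampling error.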
The UK’s Health and Care Act (2022; paused until 2025) includes a globally novel ban on paid-for online advertising of food and beverage products high in saturated fat, salt and sugar (HFSS), to address growing concerns about the scale of digital marketing and its impact in particular on children’s food and beverage preferences, purchases and consumption. This study aimed to understand the potential impact of the novel ban (as proposed in 2020) on specified forms of online HFSS advertising, through the lens of interdisciplinary expertise. We conducted semi-structured interviews via videoconference with eight purposively selected UK and global digital marketing, food and privacy experts. We identified deductive and inductive themes addressing the policy’s scope, design, implementation, monitoring and enforcement through iterative, consensual thematic analyses. Experts felt this novel ‘breakthrough’ policy has potential to substantially impact global marketing by establishing the principle of no HFSS advertising online to consumers of all ages, but they also identified substantive limitations that could potentially render it ‘entirely ineffective’: for example, the exclusion of common forms of digital marketing, especially brand marketing and marketing integrated within entertainment content; the likelihood of rapid growth of digital food marketing in virtual/augmented reality and ‘advertainment’; and technical digital media issues that raise significant barriers to effective monitoring and compliance. Experts recommended well-defined regulations with strong enforcement mechanisms. These findings contribute insights for effective design and implementation of global initiatives to limit online HFSS food marketing, including the need for government regulations in place of voluntary industry restrictions.
In acute ischemic stroke, a longer time from onset to endovascular treatment (EVT) is associated with worse clinical outcome. We investigated the association of clinical outcome with time from last known well to arrival at the EVT hospital and time from hospital arrival to arterial access for anterior circulation large vessel occlusion patients treated > 6 hours from last known well.
Methods:
Retrospective analysis of the prospective, multicenter cohort study ESCAPE-LATE. Patients presenting > 6 hours after last known well with anterior circulation large vessel occlusion undergoing EVT were included. The primary outcome was the modified Rankin Scale (mRS) score at 90 days. Secondary outcomes were good (mRS 0–2) and poor clinical outcomes (mRS 5–6) at 90 days, as well as the National Institutes of Health Stroke Scale at 24 hours. Associations of time intervals with outcomes were assessed with univariable and multivariable logistic regression.
Results:
Two hundred patients were included in the analysis, of whom 85 (43%) were female. The 90-day mRS score was available for 141 patients. Of the 150 patients, 135 (90%) had moderate-to-good collaterals, and the median Alberta Stroke Program Early CT Score (ASPECTS) was 8 (IQR = 7–10). No association between ordinal mRS and time from last known well to arrival at the EVT hospital (odds ratio [OR] = 1.01, 95% CI = 1.00–1.02) or time from hospital arrival to arterial access (OR = -0.01, 95% CI = -0.02 to 0.00) was seen in adjusted regression models.
Conclusion:
No relationship was observed between pre-hospital or in-hospital workflow times and clinical outcomes. Baseline ASPECTS and collateral status were favorable in the majority of patients, suggesting that physicians may have chosen to predominantly treat slow progressors in the late time window, in whom prolonged workflow times have less impact on outcomes.
Although pediatric cancer often causes significant stress for families, most childhood cancer survivors are resilient and do not exhibit severe or lasting psychopathology. Research demonstrates some survivors may report benefit-finding or positive outcomes following this stressful life event. However, considerably less research has included families of children who are unlikely to survive their illness. Thus, this study investigated benefit-finding among parents and their children with advanced cancer, as well as associated demographic and medical factors.
Methods
Families (N = 72) of children with advanced cancer (ages 5–25) were recruited from a large pediatric hospital. Advanced cancer was defined as relapsed or refractory disease, an estimated prognosis of <60%, or referral to end-of-life care. Participants completed a demographic survey and the Benefit Finding Scale at enrollment.
Results
Children, mothers, and fathers reported moderate to high benefit-finding scores. Correlations between family members were weak and non-significant. Children reported significantly higher benefit-finding than fathers. Demographic and medical factors were not associated with benefit-finding in children, mothers, or fathers.
Significance of results
Families of children with advanced cancer reported moderate to high benefit-finding regardless of background or medical factors. Children identified benefits of their cancer experience independent of the experiences of their mothers and fathers. Larger studies should continue to examine factors associated with positive and negative outcomes in the context of childhood cancer to inform interventions.
This study examines how psychological aspects of vestibular disorders are currently addressed, highlighting any national variation.
Method
An online survey was completed by 101 UK healthcare professionals treating vestibular disorders. The survey covered service configurations, attitudes towards psychological aspects and current clinical practice.
Results
Ninety-six per cent of respondents thought there was a psychological component to vestibular disorders. There was a discrepancy between the perceived importance of addressing psychological aspects and low confidence to undertake this. Those with more experience felt more confident addressing psychological aspects. History taking and questionnaires containing one or two psychological items were the most common assessment approaches. Discussing symptoms and signposting were the most frequent management approaches. Qualitative responses highlighted the interdependence of psychological and vestibular disorders, which requires timely intervention. Barriers included limited referral pathways, resources and interdisciplinary expertise.
Conclusion
Although psychological distress is frequently identified, suitable psychological treatment is not routinely offered in the UK.
To assess cost-effectiveness of late time-window endovascular treatment (EVT) in a clinical trial setting and a “real-world” setting.
Methods:
Data are from the randomized ESCAPE trial and a prospective cohort study (ESCAPE-LATE). Anterior circulation large vessel occlusion patients presenting > 6 hours from last known well were included; collateral status was an inclusion criterion for ESCAPE but not ESCAPE-LATE. A Markov state transition model was built to estimate lifetime costs and quality-adjusted life-years (QALYs) for EVT in addition to best medical care vs. best medical care only in a clinical trial setting (comparing ESCAPE-EVT to ESCAPE control arm patients) and a “real-world” setting (comparing ESCAPE-LATE to ESCAPE control arm patients). We performed an unadjusted analysis, using 90-day modified Rankin Scale (mRS) scores as model input, and an analysis adjusted for baseline factors. Acceptability of EVT was calculated using upper and lower willingness-to-pay thresholds of 100,000 USD and 50,000 USD per QALY.
Results:
Two hundred and forty-nine patients were included (ESCAPE-LATE: n = 200, ESCAPE EVT arm: n = 29, ESCAPE control arm: n = 20). Late EVT in addition to best medical care was cost-effective in the unadjusted analysis both in the clinical trial and the real-world setting, with acceptability of 96.6%–99.0%. After adjusting for differences in baseline variables between the groups, late EVT was marginally cost-effective in the clinical trial setting (acceptability: 49.9%–61.6%), but not the “real-world” setting (acceptability: 32.9%–42.6%).
Conclusion:
EVT for large vessel occlusion patients presenting beyond 6 hours was cost-effective in both the clinical trial and “real-world” settings, although this was largely driven by baseline patient differences favoring the “real-world” EVT group. After adjusting for these, the benefit of EVT was reduced in the trial setting and absent in the real-world setting.
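The Markov state transition model described in the Methods tracks a cohort of patients through health states over repeated cycles, accumulating discounted costs and QALYs. As a minimal sketch of that idea only — the states, transition probabilities, utilities and costs below are illustrative placeholders, not the values used in this study — such a model can be written as:

```python
import numpy as np

def markov_cohort(start_dist, transition, utilities, costs,
                  cycles=40, discount=0.03):
    """Run a simple Markov cohort model and return discounted
    lifetime QALYs and costs per patient.

    start_dist : initial distribution over health states (sums to 1)
    transition : row-stochastic per-cycle transition matrix
    utilities  : per-state utility weight accrued each cycle
    costs      : per-state cost accrued each cycle
    """
    dist = np.asarray(start_dist, dtype=float)
    P = np.asarray(transition, dtype=float)
    u = np.asarray(utilities, dtype=float)
    c = np.asarray(costs, dtype=float)
    total_qalys = total_costs = 0.0
    for t in range(cycles):
        d = 1.0 / (1.0 + discount) ** t   # discount factor for cycle t
        total_qalys += d * (dist @ u)
        total_costs += d * (dist @ c)
        dist = dist @ P                   # advance cohort one cycle
    return total_qalys, total_costs
```

Running the model once with the EVT arm's 90-day outcome distribution and once with the control arm's, then dividing the cost difference by the QALY difference, yields the incremental cost-effectiveness ratio that is compared against the 50,000–100,000 USD/QALY willingness-to-pay thresholds.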
The complex structural canopy of tropical forests is extremely important for the survival and continued presence of arboreal primates. The destruction and degradation of tropical rainforest on the Indonesian island of Sumatra is causing significant declines in the endemic gibbon species residing in these shrinking habitats. This chapter compares recent density estimates of the lar gibbon (Hylobates lar) and the siamang (Symphalangus syndactylus) in a historically logged area of lowland forest, Sikundur, north Sumatra, to range-wide densities of both species and the ecologically similar agile gibbon (Hylobates agilis) across the island. Density estimates for Sumatran gibbon species are largely influenced by altitude and habitat preference. Siamang densities in Sikundur were similar to previously obtained range-wide densities, whereas lar gibbon densities were lower than their reported natural density range. Sikundur’s degraded forest, consisting of reduced tree heights and low tree connectivity, has potentially impeded the ability of the lar gibbon to attain higher densities. However, the presence of these small apes in this degraded lowland forest, albeit at lower densities, demonstrates that these areas can still be important habitats for gibbons, and emphasises the importance of ongoing regeneration of previously degraded forest for the future survival of these species.
Research on Palaeolithic hunter-gatherer diet has focused on the consumption of animals. Evidence for the use of plant foods is comparatively limited but is rapidly expanding. The authors present an analysis of carbonised macro-remains of processed plants from Franchthi Cave in the Aegean Basin and Shanidar Cave in the north-west Zagros Mountains. Microscopic examination of the charred food remains reveals the use of pounded pulses as a common ingredient in cooked plant foods. The results are discussed in the context of the regional archaeobotanical literature, leading the authors to argue that plants with bitter and astringent tastes were key ingredients of Palaeolithic cuisines in South-west Asia and the Eastern Mediterranean.
Due to continuing pressures on the UK National Health Service’s mental health services, there has been increased interest in the development of brief psychological interventions (BPIs). These interventions are usually defined as including selected components of established psychological interventions, delivered over fewer sessions, and by staff with less specialised training (paraprofessionals). Cognitive behavioural therapy (CBT)-based BPIs for anxiety and depression have been found to be helpful for clients with mild to moderate mental health problems. This project evaluates the introduction of BPIs for anxiety and depression in a secondary care adult mental health service, with clients experiencing moderate to severe mental health difficulties. The service developed CBT-based manuals for anxiety (anxiety management) and depression (behavioural activation) BPIs. The BPIs were delivered by mental health workers without core therapeutic training, who were offered training and group supervision by psychologists in the team. Measures of anxiety (GAD-7), depression (PHQ-9), wellbeing (SWEMWBS) and functioning (WSAS) were completed at the start and end of treatment. The data reported from a 2-year period suggest that BPIs are associated with reductions in symptoms of anxiety and low mood, and improvements in wellbeing and functioning. Whilst this is a small-scale initial evaluation, the results are promising for the potential benefit of BPIs for clients in secondary care settings. Given that this new way of working has possible additional benefits such as improving access to psychological treatment and cost-effectiveness, further research in the area is warranted and encouraged.
Key learning aims
(1) To overview the current evidence for BPIs.
(2) To outline a possible model for offering BPIs in secondary care.
(3) To illustrate the potential positive effects of BPIs within a secondary care population.
(4) To consider the need for future research and development of BPIs.
It is a cliché of self-help advice that there are no problems, only opportunities. The rationale and actions of the BSHS in creating its Global Digital History of Science Festival may be a rare genuine confirmation of this mantra. The global COVID-19 pandemic of 2020 meant that the society's usual annual conference – like everyone else's – had to be cancelled. Once the society decided to go digital, we had a hundred days to organize and deliver our first online festival. In the hope that this will help, inspire and warn colleagues around the world who are also trying to move online, we here detail the considerations, conversations and thinking behind the organizing team's decisions.
Distress intolerance has been suggested to be a maintaining factor in several mental health conditions. Distress tolerance skills training has been found to be beneficial in emotionally unstable personality disorder (EUPD) and post-traumatic stress disorder (PTSD). Short-term targeted interventions are increasingly being implemented in response to demand. This study investigates the efficacy of a distress tolerance brief psychological intervention (DT BPI) delivered by non-psychologists within an adult secondary care mental health service. Questionnaire data (pre and post) are reported from 43 participants who completed the intervention. Results suggest that the intervention was associated with significant improvements in distress tolerance, mood, anxiety and wellbeing. This indicates that a DT BPI can be effective when delivered by non-psychologists to real-world adult secondary care clients. The findings offer promising evidence that DT BPI could be a beneficial, cost-effective intervention and warrants further large-scale investigation.
Key learning aims
(1) To enhance practitioners’ awareness of distress intolerance as a potential maintaining factor and therefore treatment target.
(2) To outline a transdiagnostic distress tolerance brief psychological intervention.
(3) To illustrate the potential of this distress tolerance brief psychological intervention to produce positive reliable change with real-world clients when delivered by non-psychologists.
The review aimed to identify factors influencing opioid prescribing as regular pain-management medication for older people.
Background:
Chronic pain occurs in 45%–85% of older people, but appears to be under-recognised and under-treated. However, strong opioid prescribing is more prevalent in older people, increasing at the fastest rate in this age group.
Methods:
This review included all study types, published 1990–2017, which focused on opioid prescribing for pain management among older adults. Arksey and O’Malley’s framework was used to scope the literature. PubMed, EBSCO Host, the UK Drug Database, and Google Scholar were searched. Data extraction, carried out by two researchers, included factors explaining opioid prescribing patterns and prescribing trends.
Findings:
A total of 613 papers were identified and 53 were included in the final review consisting of 35 research papers, 10 opinion pieces and 8 grey literature sources. Factors associated with prescribing patterns were categorised according to whether they were patient-related, prescriber-driven, or system-driven. Patient factors included age, gender, race, and cognition; prescriber factors included attitudes towards opioids and judgements about ‘normal’ pain; and policy/system factors related to the changing policy landscape over the last three decades, particularly in the USA.
Conclusions:
A large number of context-dependent factors appeared to influence opioid prescribing for chronic pain management in older adults, but the findings were inconsistent. There is a gap in the literature relating to the UK healthcare system; the prescriber and the patient perspective; and within the context of multi-morbidity and treatment burden.
Characterizing non-lethal damage within dry seeds may allow us to detect early signs of ageing and accurately predict longevity. We compared RNA degradation and viability loss in seeds exposed to stressful conditions to quantify relationships between degradation rates and stress intensity or duration. We subjected recently harvested (‘fresh’) ‘Williams 82’ soya bean seeds to moisture, temperature and oxidative stresses, and measured time to 50% viability (P50) and rate of RNA degradation, the former using standard germination assays and the latter using RNA Integrity Number (RIN). RIN values from fresh seeds were also compared with those from accessions of the same cultivar harvested in the 1980s and 1990s and stored in the refrigerator (5°C), freezer (−18°C) or in vapour above liquid nitrogen (−176°C). Rates of viability loss (P50⁻¹) and RNA degradation (RIN·d⁻¹) were highly correlated in soya bean seeds that were exposed to a broad range of temperatures [holding relative humidity (RH) constant at about 30%]. However, the correlation weakened when fresh seeds were maintained at high RH (holding temperature constant at 35°C) or exposed to oxidizing agents. Both the P50⁻¹ and RIN·d⁻¹ parameters exhibited breaks in Arrhenius behaviour near 50°C, suggesting that constrained molecular mobility regulates degradation kinetics of dry systems. We conclude that the kinetics of ageing reactions at RH near 30% can be simulated by temperatures up to 50°C and that RNA degradation can indicate ageing prior to and independent of seed death.
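The "break in Arrhenius behaviour near 50°C" refers to a change in the slope of ln(rate) plotted against reciprocal absolute temperature; fitting that line on either side of the break gives an apparent activation energy in each regime. As a hedged sketch of the fit only — the rates and activation energy below are invented, not the study's data — the estimation can be done with an ordinary least-squares regression:

```python
import math

R = 8.314  # gas constant, J mol^-1 K^-1

def activation_energy(temps_c, rates):
    """Least-squares Arrhenius fit, ln k = ln A - Ea/(R*T).

    temps_c : temperatures in degrees Celsius
    rates   : degradation rates measured at those temperatures
    Returns the apparent activation energy Ea in J/mol.
    """
    x = [1.0 / (t + 273.15) for t in temps_c]   # 1/T in K^-1
    y = [math.log(k) for k in rates]
    n = len(x)
    xbar, ybar = sum(x) / n, sum(y) / n
    slope = (sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
             / sum((xi - xbar) ** 2 for xi in x))
    return -slope * R   # slope of ln k vs 1/T is -Ea/R
```

A markedly different Ea estimate from data above ~50°C versus below it is what the abstract means by a break: a single straight line no longer fits the whole temperature range.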
With ageing there is a reduction in muscle mass and strength, termed sarcopenia. A further consequence of ageing is a reduction in appetite, which can result in reduced energy intake and malnutrition. Increased dietary protein intake may reduce the risk of sarcopenia; however, protein is particularly satiating. It is therefore desirable to increase protein intake in the older adult population without reducing overall energy intake or appetite. The primary aim of this study was to investigate the effect of protein supplementation on dietary intake and appetite. A further aim was to explore whether the time of consumption (morning vs evening) modified the impact of protein on energy intake and appetite.
Materials and methods
Twenty-four middle-to-older aged (50–75 years) participants were recruited to a randomised cross-over trial. In phase 1 (pre-supplementation), participants completed a 3-day food diary and were asked to report hunger and appetite using visual analogue scale questionnaires. In the second and third phases, participants consumed a whey protein gel (containing 20 g protein and 376 kJ of energy) for 4 days, either in the evening (before bed) or in the morning (after breakfast), and completed the same tasks as in phase 1. There was a 1-week wash-out period before crossing over to the alternative time point. Repeated-measures ANOVA was used to analyse the data.
Results
There was no significant difference in average daily energy and macronutrient intake provided by the habitual diet in the pre-supplementation phase compared to the whey protein supplementation phases, irrespective of timing (p > 0.05). Similarly, no significant differences were observed in reported feelings of hunger and appetite (p > 0.05).
Discussion
Contrary to expectations, the addition of a 20 g/day whey protein supplement did not alter subsequent energy and macronutrient intake when consumed over a 4-day period in this middle-to-older adult population. This may be due to the low energy content of the supplement, or the timing of the intake. This research helps to inform protein delivery strategies; however, different product formulations need to be explored, and studies of longer duration are required to understand the impact of prolonged supplementation on eating behaviour.
In 2003, the Convention for the Safeguarding of Intangible Cultural Heritage (UNESCO ICH Convention) formalized provision for forms of heritage not solely rooted in the material world. This expanded the scope and accessibility of cultural heritage rights for communities and groups. To much commentary and critique, the United Kingdom (UK) infamously decided not to ratify the UNESCO ICH Convention. This article examines the implications of the UK’s decision not to ratify the Convention for the cultural heritage and human rights of an asylum-seeking group in Glasgow, Scotland, namely, the Glasgow Bajuni campaigners, members of a minority Somali clan. Based on participatory ethnographic fieldwork with the group and analysis of their asylum cases, this article makes two observations: first, that the UK’s absence from the Convention establishes a precedent in which other state actors (that is, immigration authorities) are emboldened to advance skepticism over matters involving intangible cultural heritage and, second, that despite this, limitations in current provisions in the UNESCO ICH Convention would provide the group with little additional protection than they currently have. Developing these observations, we critique current UK approaches to intangible cultural heritage as complicit in the maintenance of hierarchies and the border. Finally, we consider the extent to which the current provisions of the UNESCO ICH Convention might be improved to include migrant and asylum-seeking groups.
Depression and borderline personality disorder (BPD) are both thought to be accompanied by alterations in the subjective experience of environmental rewards. We evaluated responses in women to sweet, bitter and neutral tastes (juice, quinine and water): 29 with depression, 17 with BPD and 27 healthy controls. The BPD group gave lower pleasantness and higher disgust ratings for quinine and juice compared with the control group; the depression group did not differ significantly from the control group. Juice disgust ratings were related to self-disgust in BPD, suggesting close links between abnormal sensory processing and self-identity in BPD.
Computers are encountered increasingly in the clinical setting, including during aphasia rehabilitation. However, currently we do not know what people with aphasia think about using computers in therapy and daily life, or to what extent people with aphasia use computers in their everyday life. The present study explored: (1) the use of computers by people with aphasia; and (2) the perceptions of people with aphasia towards computers and computer-based therapy. Thirty-four people with aphasia completed an aphasia-friendly paper-based survey about their use of computers before and after the onset of their aphasia, and their attitudes towards computer-based aphasia therapy. There was a high level of computer usage by people with aphasia both before and after the onset of their aphasia. However, the nature of the computer use changed following aphasia onset, with a move away from work-based usage. The majority of the cohort used computers for aphasia therapy and liked using computer-based aphasia therapy, provided that the programs were perceived as appropriate for their individual needs. The results highlight the importance of exposing people with aphasia to computer-based aphasia therapy in a supported clinical environment, and the need to ensure that computer-based therapy is individualised for each client. It should be noted, however, that while the majority of participants reported positive experiences with using computers, this does not mean that the computer-based therapy software used was necessarily an effective treatment for aphasia.
The short-interval fires required to promote grazing for large herbivores within the Cape Floristic Region World Heritage Site are detrimental to plant diversity. At the same time, longer interval fires significantly reduce graze quality. Conservation managers thus face an enormous challenge when the herbivores are also a conservation priority, since the competing conservation objectives are difficult to reconcile. Population growth rates of genetically important populations of endangered Cape mountain zebra (Equus zebra zebra) are low or declining following management focused on their fynbos habitat. Investigation of spatial and temporal habitat use and the diet of Cape mountain zebra, focusing on the use of land historically converted to agricultural grassland within fynbos in De Hoop Nature Reserve (South Africa), determined factors limiting populations and facilitated development of management strategies. Zebras selected grassland over other habitat types, despite grassland accounting for only a small proportion of the reserve. Grasses also made up the greatest proportion of diet for zebras throughout the year. Time spent on grasslands increased with grass height and was likely to have been influenced by grass protein levels. It is likely that grazing resources are a limiting factor for zebra, and so options for improving and/or increasing grassland at De Hoop should be considered. Translocation of surplus males to other conservation areas, reductions in other herbivore populations and targeted burns to increase grassland availability all offer short-term solutions. However, the acquisition of agricultural grassland adjacent to reserves is likely to be a viable long-term management strategy for this and other genetically important Cape mountain zebra populations. Low conservation priority habitats, such as farmland, should be considered for other management conflicts, as they have the potential to play a vital role in conservation.