The migratory phase is a critical time for Fasciola hepatica, as it must locate, penetrate and migrate through the alimentary tract to the liver parenchyma whilst under attack from the host immune response. Here, scanning and transmission electron microscopy were used to monitor the in vitro effects of sera (with and without complement depletion) on F. hepatica newly excysted juveniles (NEJs) and flukes recovered at 7, 35, 70 and 98 days post infection (dpi) from the liver and bile ducts of rats. Test sera were from these F. hepatica-infected rats. An F. hepatica NEJ-specific rabbit antiserum was also used. All fluke stages demonstrated release of the tegumental glycocalyx and microvesicles, and intense activity within the tegumental syncytium characterized by eccrine secretion of T-0/T-1/T-2 secretory bodies with subsequent microvillar formation and shedding of microvesicles from the apical plasma membrane. Exposure of both NEJs and 35 dpi flukes to 35 and 70 dpi rat sera produced significant amounts of eccrine-derived secretory material and putative attached immunocomplex. Rabbit anti-F. hepatica NEJ-specific antiserum produced similar responses at the NEJ tegument, including binding of putative immunocomplex to the surface, but with additional blistering of some regions of the apical plasma membrane. Our data suggest that immune sera stimulate multiple interrelated secretory mechanisms to maintain the integrity of the tegumental barrier in response to immune attack. Concurrent release of microvesicles may also serve both to divert the immune response away from the fluke itself and to permit delivery of immunomodulatory cargo to immune effector cells.
Vaccines have revolutionised the field of medicine, eradicating and controlling many diseases. Recent pandemic vaccine successes have highlighted the accelerated pace of vaccine development and deployment. Leveraging this momentum, attention has shifted to cancer vaccines and personalised cancer vaccines, aimed at targeting individual tumour-specific abnormalities. The UK, now recognised for its vaccine capabilities, is an ideal nation for pioneering cancer vaccine trials. For this article, experts were convened to share insights and approaches to navigating the challenges of cancer vaccine development, with personalised or precision cancer vaccines as well as fixed vaccines. Emphasising partnership and proactive strategies, this article outlines the ambition to harness national and local system capabilities in the UK; to work in collaboration with potential pharmaceutical partners; and to seize the opportunity to deliver rapid advances in cancer vaccine technology.
Referrals for adult autism assessment to the Cambridgeshire Lifespan Autism Spectrum Service (CLASS) have increased from 430 in 2019 to 887 in 2023, with demand exceeding capacity. The team enrolled in the Royal College of Psychiatrists’ Quality Improvement (QI) Demand, Capacity and Flow (DCF) Collaborative. The agreed aim was to increase the number of diagnostic assessments by 51% per month.
Methods
Participants included the CLASS multi-disciplinary team (MDT), referrers, the provider improvement advisor and an autistic adult. Using the NHS Quality Service Improvement and Redesign (QSIR) six-step approach, a process map identified five key stages of the CLASS pathway. A project driver diagram was then used to identify change ideas in the referral, screening, pre-assessment, assessment and post-diagnostic stages.
Change ideas in the screening and assessment stages were prioritised and two Plan-Do-Study-Act (PDSA) cycles designed: PDSA 1) To reduce screening time by removing the first screening of referrals; PDSA 2) To increase the number of assessments conducted and completed in a single face-to-face appointment.
Data collected for PDSA 1 included: number of working days from date of referral to date added to waiting list and total screening time (minutes) per referral. Data were compared in a sample of 133 referrals from the two-stage screening process and 68 referrals from the one-stage process. Data collected for PDSA 2 included: average assessment time (minutes), average duration of open assessments, and the number of assessments completed within the same month. The data at Time 1 (before introducing PDSA 2) were compared with Time 2 (after PDSA 2) in a sample of 10 and nine referrals, respectively.
Results
PDSA 1) Statistical Process Control (SPC) charts show a reduction in mean working days from 160 to 30 working days. The mean screening time per referral reduced from 33 minutes to 23 minutes. PDSA 2) SPC charts show that between Time 1 and Time 2 there was (i) a reduction in clinician time in minutes per assessment (m = 236.8 to m = 210), (ii) a reduction in working days assessment remained open (m = 39.4 to m = 6.4), (iii) a reduction in number of assessments involving multiple appointments (6 of 10 to 3 of 9), (iv) an increase in the number of assessments completed in the same month (3 of 10 to 7 of 9).
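The SPC charts referenced above are typically individuals (XmR) charts, whose centre line and control limits come from the mean of the data and the mean moving range. A minimal sketch of that arithmetic follows; the function name and the sample values are illustrative assumptions, not the study's data or tooling.

```python
def xmr_limits(values):
    """Centre line and control limits for an XmR (individuals) SPC chart.

    Uses the standard constant 2.66 applied to the mean moving range.
    """
    if len(values) < 2:
        raise ValueError("need at least two observations")
    centre = sum(values) / len(values)
    moving_ranges = [abs(b - a) for a, b in zip(values, values[1:])]
    mr_bar = sum(moving_ranges) / len(moving_ranges)
    ucl = centre + 2.66 * mr_bar
    lcl = max(0.0, centre - 2.66 * mr_bar)  # screening times cannot be negative
    return centre, lcl, ucl

# Hypothetical screening times (minutes per referral) over successive weeks
centre, lcl, ucl = xmr_limits([33, 30, 28, 25, 23])
```

A sustained run of points below the old centre line (as in the reduction from 33 to 23 minutes) is what signals genuine improvement rather than noise.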
Conclusion
These results show promise towards increasing DCF across the pathway, but further PDSAs (e.g., digitalising reporting, refining the post-diagnostic pathway) need to be implemented to achieve the overall aim.
Suicidal behaviors are prevalent among college students; however, students remain reluctant to seek support. We developed a predictive algorithm to identify students at risk of suicidal behavior and used telehealth to reduce subsequent risk.
Methods
Data come from several waves of a prospective cohort study (2016–2022) of college students (n = 5454). All first-year students were invited to participate as volunteers (response rates ranged from 16.00% to 19.93%). A stepped-care approach was implemented: (i) all students received a comprehensive list of services; (ii) those reporting past 12-month suicidal ideation were directed to a safety planning application; (iii) those identified by the algorithm as at high risk of suicidal behavior, or reporting a 12-month suicide attempt, were contacted via telephone within 24 h of survey completion. The intervention focused on support, safety planning and referral to services for this high-risk group.
Results
In total, 5454 students aged 17–36 years (s.d. = 5.346) participated; 65% were female. The algorithm identified 77% of students reporting subsequent suicidal behavior within the top 15% of predicted probabilities (sensitivity = 26.26 [95% CI 17.93–36.07]; specificity = 97.46 [95% CI 96.21–98.38]; PPV = 53.06 [95% CI 40.16–65.56]; AUC range: 0.895 [95% CI 0.872–0.917] to 0.966 [95% CI 0.939–0.994]). High-risk students in the Intervention Cohort showed a 41.7% reduction in the probability of suicidal behavior at 12-month follow-up compared with high-risk students in the Control Cohort.
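The sensitivity, specificity and PPV figures quoted above are all derived from the same 2 × 2 confusion matrix at a given risk cut-off. A minimal sketch of that arithmetic, with hypothetical counts rather than the study's actual data:

```python
def classification_metrics(tp, fp, tn, fn):
    """Sensitivity, specificity and positive predictive value, as percentages."""
    sensitivity = 100 * tp / (tp + fn)   # detected positives among all true positives
    specificity = 100 * tn / (tn + fp)   # correct negatives among all true negatives
    ppv = 100 * tp / (tp + fp)           # true positives among all flagged as high risk
    return sensitivity, specificity, ppv

# Hypothetical counts at a high-risk cut-off
sens, spec, ppv = classification_metrics(tp=26, fp=23, tn=880, fn=71)
```

The pattern above (high specificity, modest sensitivity) is typical when a rare outcome is screened with a strict threshold: few false alarms, but a substantial share of cases fall below the cut-off.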
Conclusions
Predictive risk algorithms embedded into universal screening, coupled with telehealth intervention, offer significant potential as a suicide prevention approach for students.
Foodborne trematodes (FBTs) have a worldwide distribution (with particular prevalence in south-east Asia) and are believed to infect almost 75 million people, with millions more living at risk of infection. Although mortality due to trematodiasis is low, these infections cause considerable morbidity and some species are associated with the development of cancer in hyperendemic regions. Despite this, FBTs are often side-lined in terms of research funding and have been dubbed neglected tropical diseases by the World Health Organisation. Thus, the aim of this special issue was to provide an update of our understanding of FBT infections, to shine a light on current work in the field and to highlight some research priorities for the future. With contributions from leading researchers, many from endemic regions, we review the major FBT species. In doing so we revisit some old foes, uncover emerging infections and discover how outbreaks are being dealt with as a result of new approaches to parasite control. We also report advances in our understanding of the interactions of FBTs with their mammalian hosts and uncover new interplay between trematodes and host microbiome components. We hope that this article collection will stimulate discussion and further research on the FBTs and help raise them from their neglected status.
OBJECTIVES/GOALS: Neonatal hypoxic-ischemic encephalopathy (HIE) is an acute neurologic syndrome in which decreased blood flow and oxygen to the brain cause acute and chronic brain dysfunction. The only proven neuroprotective intervention for HIE is hypothermia treatment started within 6 hours of birth, and 50% of survivors have long-term deficits. METHODS/STUDY POPULATION: Pre-clinical adult stroke studies demonstrated that vagus nerve stimulation (VNS) has anti-inflammatory effects and attenuates brain damage. Transcutaneous auricular VNS (taVNS) is safe and feasible in infants and may improve the motor skill of bottle feeding. We hypothesize that a combined hypothermia-taVNS treatment shortly after HIE birth will have neuroprotective effects, improving motor function and attenuating infarct volume and inflammation compared with hypothermia alone. The HIE model includes ligation of the right common carotid artery in postnatal day 7 (P7) rats followed by 90 min hypoxia (8% oxygen) and 2 hr hypothermia. taVNS or sham taVNS was administered using a bipolar electrode placed on the auricular concha region for 30 min (30 sec trains, 0.5 msec duration, 20 Hz frequency, followed by 4.5 min breaks). RESULTS/ANTICIPATED RESULTS: Experimental groups include +HIE/+taVNS, +HIE/-taVNS, and -HIE/-taVNS. To assess motor function, grasping reflex and forelimb grip strength tasks were assessed from prior to surgery through P10. Infarct volume was assessed at 72 h after injury by staining coronal sections with cresyl violet. Thirty-four rat pups underwent surgery, with an 8.82% mortality rate. taVNS was well tolerated by the P7 rats when delivered below the perceptual threshold (0.4-1.1 mA). There was no difference in elementary motor function or infarct volume between any groups. DISCUSSION/SIGNIFICANCE: Future studies will include 2.5 hr hypoxia for a more severe brain injury and a -HIE/+taVNS control group.
These initial pre-clinical studies in neonates are important in determining whether taVNS may translate as a treatment to improve outcomes after neonatal HIE.
This book provides a rigorous and challenging review of recent research in the realms of communication and cultural diversity. Focusing on health communication interventions concerning service users who may lack fluency in English, it shows that meeting the needs of all health service users depends on both structures and processes of communication.
Glyphosate’s efficacy is influenced by the amount absorbed and translocated throughout the plant to inhibit 5-enolpyruvyl shikimate-3-phosphate synthase (EPSPS). Glyphosate resistance can be due to target-site (TS) or non–target site (NTS) resistance mechanisms. TS resistance includes an altered target site and gene overexpression, while NTS resistance includes reduced absorption, reduced translocation, enhanced metabolism, and exclusion/sequestration. The goal of this research was to elucidate the mechanism(s) of glyphosate resistance in common ragweed (Ambrosia artemisiifolia L.) from Ontario, Canada. The resistance factor for this glyphosate-resistant (GR) A. artemisiifolia biotype is 5.1. No amino acid substitutions were found at positions 102 or 106 of the EPSPS enzyme in this A. artemisiifolia biotype. Based on [14C]glyphosate studies, there was no difference in glyphosate absorption or translocation between glyphosate-susceptible (GS) and GR A. artemisiifolia biotypes. Radio-labeled glyphosate metabolites were similar for GS and GR A. artemisiifolia 96 h after application. Glyphosate resistance in this A. artemisiifolia biotype is not due to an altered target site due to amino acid substitutions at positions 102 and 106 in the EPSPS and is not due to the NTS mechanisms of reduced absorption, reduced translocation, or enhanced metabolism.
Data from rock shelters in southern Belize show evidence of tool making, hunting, and aquatic resource exploitation by 10,500 cal b.c.; the shelters functioned as mortuary sites between 7600 and 2000 cal b.c. Early Holocene contexts contain stemmed and barbed bifaces as part of a tradition found broadly throughout the neotropics. After around 6000 cal b.c., bifacial tools largely disappear from the record, likely reflecting a shift to increasing reliance on plant foods, around the same time that the earliest domesticates appear in the archaeological record in the neotropics. We suggest that people living in southern Belize maintained close ties with neighbors to the south during the Early Holocene, but lagged behind in innovating new crops and farming technologies during the Middle Holocene. Maize farming in Belize intensified between 2750–2050 cal b.c. as maize became a dietary staple, 1000–1300 years later than in South America. Overall, we argue from multiple lines of data that the Neotropics of Central and South America were an area of shared information and technologies that heavily influenced cultural developments in southeastern Mesoamerica during the Early and Middle Holocene.
Feedback loops are a key characteristic of engineering design processes that increase complexity, time to market, and costs. However, some feedback loops, arising from design iteration, have a positive impact on design outcomes (i.e., the quality of the final design) and so are worth the time and costs incurred. Other loops, resulting from rework, also have a positive impact on the final design, but because of their urgency, and the interruption this causes, their impact on current projects is high. Drawing on the socio-technical systems literature, some feedback loops are thus virtuous circles with a positive overall impact, whereas others are vicious circles with a negative one. In this paper, we report early work exploring these interplays between rework and design iteration through the development of process simulation models.
The British Society for Parasitology (BSP) holds a biannual symposium devoted to the kinetoplastids, which seeks to cover the full gamut of research into these important organisms and alternates with the Woods Hole Kinetoplastid Molecular Cell Biology meeting serving a similar community. While normally embedded within the main BSP Spring meeting, on several occasions the symposium has enjoyed the opportunity of being hosted on mainland Europe. In 2020, the BSP was fortunate to spend some time in Granada, Spain, where a superb meeting with excellent science in a spectacular setting was overshadowed by news of an emerging novel coronavirus. In this editorial, we hope to have captured some of that excellent science, to highlight aspects of the many great papers and reviews in this special issue, and to provide a few images from the meeting, which we hope will bring back fond memories for those who attended.
To evaluate the role of procalcitonin (PCT) results in antibiotic decisions for COVID-19 patients at hospital presentation.
Design, setting, and participants:
Multicenter retrospective observational study of patients ≥18 years hospitalized due to COVID-19 in the Johns Hopkins Health System. Patients who were transferred from another facility after a stay of >24 hours and patients who died within 48 hours of hospitalization were excluded.
Methods:
Elevated PCT values were determined based on each hospital’s definition. Antibiotic therapy and PCT results were evaluated for patients with no evidence of bacterial community-acquired pneumonia (bCAP) and patients with confirmed, probable, or possible bCAP. The added value of PCT testing to clinical criteria in detecting bCAP was evaluated using receiver operating characteristic (ROC) curve analysis.
Results:
Of 962 patients, 611 (64%) received a PCT test. ROC curves for clinical criteria and clinical criteria plus PCT were similar (at PCT thresholds of 0.5 ng/mL and 0.25 ng/mL). By bCAP group, median initial PCT values were 0.58 ng/mL (interquartile range [IQR], 0.24–1.14), 0.23 ng/mL (IQR, 0.1–0.63), and 0.15 ng/mL (IQR, 0.09–0.35) for the proven/probable, possible, and no bCAP groups, respectively. Among patients without bCAP, an elevated PCT level was associated with 1.8 additional days of CAP therapy (95% CI, 1.01–2.75; P < .01) compared with patients with a negative PCT result, after adjusting for potential confounders. Duration of CAP therapy was similar between patients without a PCT test ordered and those with a low PCT level, in both the no bCAP and possible bCAP groups.
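The ROC comparison above asks whether adding PCT to clinical criteria improves discrimination of bCAP. The AUC being compared is the probability that a randomly chosen bCAP case scores higher than a randomly chosen non-case, which a rank-based sketch makes explicit; the scores below are hypothetical, not the study's data.

```python
def auc(case_scores, control_scores):
    """Mann-Whitney estimate of the area under the ROC curve:
    the fraction of (case, control) pairs ranked correctly by the score,
    counting ties as half a win.
    """
    wins = 0.0
    for c in case_scores:
        for k in control_scores:
            if c > k:
                wins += 1.0
            elif c == k:
                wins += 0.5
    return wins / (len(case_scores) * len(control_scores))

# Hypothetical PCT values (ng/mL) for bCAP cases v. no-bCAP controls
discrimination = auc([0.58, 1.14, 0.24], [0.15, 0.09, 0.35])
```

Two near-identical ROC curves, as reported, mean PCT adds few correctly ranked pairs beyond what the clinical criteria already achieve.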
Conclusions:
PCT results may be abnormal in COVID-19 patients without bCAP and may result in receipt of unnecessary antibiotics.
To establish how real-world evidence (RWE) has been used to inform single technology appraisals (STAs) of cancer drugs conducted by the National Institute for Health and Care Excellence (NICE).
Methods
STAs published by NICE from April 2011 to October 2018 that evaluated cancer treatments were reviewed. Information regarding the use of RWE to directly inform the company-submitted cost-effectiveness analysis was extracted and categorized by topic. Summary statistics were used to describe emergent themes, and a narrative summary was provided for key case studies.
Results
Materials for a total of 113 relevant STAs were identified and analyzed, of which nearly all (96 percent) included some form of RWE within the company-submitted cost-effectiveness analysis. The most common categories of RWE use concerned the health-related quality of life of patients (71 percent), costs (46 percent), and medical resource utilization (40 percent). While sources of RWE were routinely criticized as part of the appraisal process, we identified only two cases where the use of RWE was overtly rejected; hence, in the majority of cases, RWE was accepted in cancer drug submissions to NICE.
Discussion
RWE has been used extensively in cancer submissions to NICE. Key criticisms of RWE in submissions to NICE are seldom regarding the use of RWE in general; instead, these are typically concerned with specific data sources and the applicability of these to the decision problem. Within an appropriate context, RWE constitutes an extremely valuable source of information to inform decision making; yet the development of best practice guidelines may improve current reporting standards.
We examined demographic, clinical, and psychological characteristics of a large cohort (n = 368) of adults with dissociative seizures (DS) recruited to the CODES randomised controlled trial (RCT) and explored differences associated with age at onset of DS, gender, and DS semiology.
Methods
Prior to randomisation within the CODES RCT, we collected demographic and clinical data on 368 participants. We assessed psychiatric comorbidity using the Mini-International Neuropsychiatric Interview (M.I.N.I.) and a screening measure of personality disorder and measured anxiety, depression, psychological distress, somatic symptom burden, emotional expression, functional impact of DS, avoidance behaviour, and quality of life. We undertook comparisons based on reported age at DS onset (<40 v. ⩾40), gender (male v. female), and DS semiology (predominantly hyperkinetic v. hypokinetic).
Results
Our cohort was predominantly female (72%) and characterised by high levels of socio-economic deprivation. Two-thirds had predominantly hyperkinetic DS. Of the total, 69% had ⩾1 comorbid M.I.N.I. diagnosis (median number = 2), with agoraphobia being the most common concurrent diagnosis. Clinical levels of distress were reported by 86% and characteristics associated with maladaptive personality traits by 60%. Moderate-to-severe functional impairment, high levels of somatic symptoms, and impaired quality of life were also reported. Women had a younger age at DS onset than men.
Conclusions
Our study highlights the burden of psychopathology and socio-economic deprivation in a large, heterogeneous cohort of patients with DS. The lack of clear differences based on gender, DS semiology and age at onset suggests these factors do not add substantially to the heterogeneity of the cohort.
We discuss the factors influencing the relationship between government policy-makers and scientists and how they affect the use of science in policy. We highlight issues related to context, values, culture, timeframes, communication and interpersonal relationships, providing insights from policy-makers and scientists. A spectrum of working strategies is given with examples of practical mechanisms that improve the effective use of science in policy. The shared governance model is a relatively mature approach with the potential to overcome many of the barriers discussed. At its core, shared governance, or co-production, invites policy-makers and scientists to develop and manage research priorities collaboratively. We explore the primary features of a successful shared governance arrangement, exemplified by the collaborative working model between the Australian Government Department of Agriculture and the Centre of Excellence for Biosecurity Risk Analysis. We conclude by outlining the advantages and disadvantages of the co-production of research priorities by scientists and policy-makers and present the learnings from its implementation in the biosecurity sector in Australia.
The Must Farm pile-dwelling site is an extraordinarily well-preserved Late Bronze Age settlement in Cambridgeshire, UK. The authors present the site's contextual setting, tracing its construction, occupation and subsequent destruction by fire, all in relatively quick succession. A slow-flowing watercourse beneath the pile-dwellings provided a benign burial environment for preserving the debris of construction, use and collapse, while the catastrophic manner of destruction introduced a definitive timeframe. The scale of its occupation speaks to the site's exceptional nature, enabling the authors to deduce the everyday flow and use of things in a prehistoric domestic setting.
We assess Mercury’s geologic history, focusing on the distribution and origin of terrain types and an overview of Mercury’s evolution from the pre-Tolstojan through the Kuiperian Period. We review evidence for the nature of Mercury’s early crust, including the possibility that a substantial portion formed by the global eruption of lavas generated by partial melting during and after overturn of the crystalline products of magma ocean cooling, whereas a much smaller fraction of the crust may have been derived from crystal flotation in such a magma ocean. The early history of Mercury may thus have been similar to that of the other terrestrial planets, with much of the crust formed through volcanism, in contrast to the flotation-dominated crust of the Moon. Small portions of Mercury’s early crust may still be exposed in a heavily modified and brecciated form; the majority of the surface is dominated by intercrater plains (Pre-Tolstojan and Tolstojan in age) and smooth plains (Tolstojan and Calorian) that formed through a combination of volcanism and impact events. As effusive volcanism waned in the Calorian, explosive volcanism continued at least through the Mansurian Period; the Kuiperian Period was dominated by impact events and the formation of hollows.
Schizophrenia affects 1% of the population. Clozapine is the only medication licensed for treatment-resistant schizophrenia and is intensively monitored to prevent harm from neutropenia. Clozapine is also associated with increased risk of pneumonia although the mechanism is poorly understood.
Aims
To investigate the potential association between clozapine and antibody deficiency.
Methods
Patients taking clozapine and clozapine-naive patients receiving alternative antipsychotics were recruited and completed a lifestyle, medication and infection-burden questionnaire. Serum total immunoglobulins (immunoglobulin (Ig)G, IgA and IgM), specific IgG antibodies to Haemophilus influenzae type b and tetanus, and IgG, IgA and IgM antibodies to pneumococcus were measured.
Results
Immunoglobulins were all significantly reduced in the clozapine-treated group (n = 123) compared with the clozapine-naive group (n = 111). Odds ratios (ORs) for immunoglobulin values below the fifth percentile (clozapine v. control) were: IgG, OR = 6.00 (95% CI 1.31–27.44); IgA, OR = 16.75 (95% CI 2.18–128.60); and IgM, OR = 3.26 (95% CI 1.75–6.08). These findings remained significant despite exclusion of other potential causes of hypogammaglobulinaemia. In addition, duration on clozapine was associated with a decline in IgG. A higher proportion of the clozapine-treated group reported taking more than five courses of antibiotics in the preceding year (5.3% (n = 5) versus 1% (n = 1)).
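ORs and 95% CIs of the kind reported above come from 2 × 2 tables (below the fifth percentile or not, by clozapine v. control), with the interval usually computed on the log-odds scale. A minimal Wald-interval sketch, using hypothetical counts rather than the study's data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI for a 2 x 2 table:
        exposed:   a events, b non-events
        unexposed: c events, d non-events
    """
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lower = math.exp(math.log(or_) - z * se_log)
    upper = math.exp(math.log(or_) + z * se_log)
    return or_, lower, upper

# Hypothetical counts: low IgG in 10/100 clozapine v. 5/100 control patients
or_, lo, hi = odds_ratio_ci(a=10, b=90, c=5, d=95)
```

The very wide intervals reported (e.g. 2.18–128.60 for IgA) are characteristic of this formula when one cell of the table is small, since each reciprocal count inflates the standard error of the log odds.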
Conclusions
Clozapine use was associated with significantly reduced immunoglobulin levels and an increased proportion of patients using more than five antibiotic courses in a year. Antibody testing is not included in existing clozapine monitoring programmes but may represent a mechanistic explanation and modifiable risk factor for the increased rates of pneumonia and sepsis-related mortality previously reported in this vulnerable cohort.
Declaration of interest
S.J. has received support from CSL Behring, Shire, LFB, Biotest, Binding Site, Sanofi, GSK, UCB Pharma, Grifols, BPL SOBI, Weatherden, Zarodex and Octapharma for projects, advisory boards, meetings, studies, speaker and clinical trials.
Increasing longevity and the strain on state and occupational pensions have brought into question long-held assumptions about the age of retirement, and raised the prospect of a workplace populated by ageing workers. In the United Kingdom the default retirement age has gone, incremental increases in state pension age are being implemented and ageism has been added to workplace anti-discrimination laws. These changes are yet to bring about the anticipated transformation in workplace demographics, but it is coming, making it timely to ask if the workplace is ready for the ageing worker and how the extension of working life will be managed. We report findings from qualitative case studies of five large organisations located in the United Kingdom. Interviews and focus groups were conducted with employees, line managers, occupational health staff and human resources managers. Our findings reveal a high degree of uncertainty and ambivalence among workers and managers regarding the desirability and feasibility of extending working life; wide variations in how older workers are managed within workplaces; a gap between policies and practices; and evidence that while casualisation might be experienced negatively by younger workers, it may be viewed positively by financially secure older workers seeking flexibility. We conclude with a discussion of the challenges facing employers and policy makers in making the modern workplace fit for the ageing worker.