Cannabis use severely affects the outcomes of people with psychotic disorders, yet effective treatments are lacking. To address this, the National Health Service (NHS) Cannabis Clinic for Psychosis (CCP) was established in 2019 to support adults with psychosis in reducing and/or stopping their cannabis use.
Aims
Examine outcome data from the first 46 individuals to complete the CCP's intervention.
Method
The sample (N = 46) consisted of adults (aged ≥ 18) with psychosis under the care of the South London and Maudsley NHS Foundation Trust, referred to the CCP between January 2020 and February 2023, who completed their intervention by September 2023. Clinical and functional measures were collected before (T0) and after (T1) the CCP intervention (one-to-one sessions and peer group attendance). Primary outcomes were changes in the Cannabis Use Disorders Identification Test-Revised (CUDIT-R) score and pattern of cannabis use. Secondary outcomes included T0–T1 changes in measures of delusions, paranoia, depression, anxiety and functioning.
Results
A reduction in the mean CUDIT-R score was observed from T0 to T1 (mean difference = 17.10, 95% CI 15.54–18.67), with 73.91% of participants achieving abstinence and 26.09% reducing the frequency and potency of their use. Significant improvements in all clinical and functional outcomes were observed, with 90.70% of participants in work or education at T1 compared with 8.70% at T0. Variation in CUDIT-R scores explained between 34% and 64% of the variance in the secondary outcome measures.
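To make the summary statistics above concrete, the short sketch below (hypothetical data and SciPy-based code, not the study's data or analysis scripts) shows how a paired T0–T1 mean difference with a 95% confidence interval and the proportion of variance explained (R²) in a secondary measure are typically computed.

```python
# Illustrative sketch (hypothetical data, not the study's): paired mean
# difference with 95% CI, and variance explained (R^2) in a secondary measure.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
cudit_t0 = rng.normal(25, 4, 46)                         # hypothetical baseline CUDIT-R scores
cudit_t1 = np.clip(cudit_t0 - rng.normal(17, 5, 46), 0, None)  # hypothetical T1 scores

diff = cudit_t0 - cudit_t1                               # paired T0 - T1 change per participant
mean_diff = diff.mean()
sem = stats.sem(diff)
ci_low, ci_high = stats.t.interval(0.95, df=len(diff) - 1, loc=mean_diff, scale=sem)
print(f"mean difference = {mean_diff:.2f}, 95% CI {ci_low:.2f}-{ci_high:.2f}")

# Variance in a (hypothetical) secondary outcome explained by CUDIT-R change
secondary_change = 0.6 * diff + rng.normal(0, 3, 46)
r, _ = stats.pearsonr(diff, secondary_change)
print(f"variance explained (R^2) = {r**2:.2f}")
```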
Conclusions
The CCP intervention is a feasible strategy to support cannabis use cessation/reduction and improve clinical and functional outcomes of people with psychotic disorders.
Anaemia is characterised by low haemoglobin (Hb) concentration. Despite being a public health concern in Ethiopia, the role of micronutrients and non-nutritional factors as determinants of Hb concentration has been inadequately explored. This study assessed serum micronutrient and Hb concentrations and a range of non-nutritional factors to evaluate their associations with the risk of anaemia among the Ethiopian population (n 2046). It also explored the mediating effect of Zn on the relation between Se and Hb. Bivariate and multivariate regression analyses were performed to identify the relationships of serum micronutrient concentrations, inflammation biomarkers, nutritional status, presence of parasitic infection and socio-demographic factors with Hb concentration (n 2046). The Sobel–Goodman test was applied to investigate the mediation by Zn of the relation between serum Se and Hb. In total, 18·6 % of participants were anaemic, 5·8 % had iron deficiency (ID), 2·6 % had ID anaemia and 0·6 % had tissue ID. Younger age, household head illiteracy and low serum concentrations of ferritin, Co, Cu and folate were associated with anaemia. Serum Se had an indirect effect on Hb that was mediated by Zn, with a significant effect of Se on Zn (P < 0·001) and of Zn on Hb (P < 0·001). The findings suggest the need to design multi-sectoral interventions to address anaemia, tailored to demographic group.
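As context for the mediation analysis above, the Sobel–Goodman approach tests whether the indirect path is statistically significant. A standard formulation (the general form of the test, not reproduced from this study) takes a as the coefficient for the effect of serum Se on Zn, b as the coefficient for the effect of Zn on Hb adjusted for Se, and s_a, s_b as their standard errors:

\[ z = \frac{a\,b}{\sqrt{b^{2} s_a^{2} + a^{2} s_b^{2}}} \]

The indirect effect of Se on Hb via Zn is the product ab, and z is compared against the standard normal distribution.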
The mental health consequences of health emergencies and disasters have the potential to be sustained and severe. In recognition of this, the 2018 Kobe Expert Meeting on Health Emergency and Disaster Risk Management (Health EDRM) prioritized mental health as one of the key research areas of Health EDRM, to be addressed in a multi-country research project supported by WHO (Kayano et al., 2019). As climate change, growing urbanization, population density and viral transmission generate increasingly severe hazards, attention to mental health will be critical.
Method:
The Asia Pacific Disaster Mental Health Network was established in 2020 to foster advancements in mental health research and policy in the region. Building connections between researchers, practitioners and policy makers, the Network includes broad representation from interdisciplinary scholars and organizations across eight Asian and Pacific nations. A research agenda was designed in early meetings, and collaborative research projects were established.
Results:
The Network has supported the development of innovative disaster mental health research investigating community engagement in recovery, psychosocial interventions, and evaluation frameworks. A recent multilingual systematic review of more than 200 longitudinal studies identified the long-term trajectories of post-traumatic stress symptoms, depression and anxiety following disasters and pandemics (Newnham et al., 2022). Synthesized evidence on risks related to age, gender and disaster type was used to inform intervention targets.
Conclusion:
The Asia Pacific Disaster Mental Health Network established a platform for scholarly connection, intervention planning and knowledge dissemination. This presentation will provide an overview of the Network’s activities and research highlights that have identified targeted points for policy and practice.
Current treatment guidelines advise that deprescribing of antidepressants should occur around 6 months after remission of symptoms. However, this is not routinely occurring in clinical practice, with between 30% and 50% of antidepressant users potentially continuing treatment with no clinical benefit. To support patients to deprescribe antidepressant treatment when clinically appropriate, it is important to understand what matters to patients when deciding to reduce or cease antidepressants in a naturalistic setting.
Aim:
The current study aimed to describe the self-reported reasons primary care patients have for reducing or stopping their antidepressant medication.
Methods:
Three hundred and seven participants in the diamond longitudinal study reported taking an SSRI/SNRI over the life of the study. Of the 307, 179 reported stopping or tapering their antidepressant during computer-assisted telephone interviews and provided a reason for doing so. A collective case study approach was used to collate the reasons for stopping or tapering.
Findings:
Reflexive thematic analysis of patient-reported factors revealed five overarching themes: (1) depression; (2) medication; (3) healthcare system; (4) psychosocial; and (5) financial. These findings are used to inform suggestions for the development and implementation of antidepressant deprescribing discussions in clinical practice.
Patient- and proxy-reported outcomes (PROs) are an important indicator of healthcare quality and can be used to inform treatment. Despite the wide-scale use of PROs in adult cardiology, they are underutilised in paediatric cardiac care. This study describes a six-centre feasibility and pilot experience implementing PROs in the paediatric and young adult ventricular assist device population.
Methods:
The Advanced Cardiac Therapies Improving Outcomes Network (ACTION) is a collaborative learning network comprising 55 centres focused on improving clinical outcomes and the patient/family experience for children with heart failure and those supported by ventricular assist devices. The development of ACTION’s PRO programme via engagement with patient and parent stakeholders is described. Pilot feasibility, patient/parent and clinician feedback, and initial PRO findings of patients and families receiving paediatric ventricular assist support across six centres are detailed.
Results:
Thirty of the thirty-five eligible patients (85.7%) were enrolled in the PRO programme during the pilot study period. Clinicians and participating patients/parents reported positive experiences with the PRO pilot programme. The most common symptoms reported by patients/parents in the first month post-implant included limitations in activities, dressing change distress and post-operative pain. Poor sleep, dressing change distress, sadness and fatigue were the most common symptoms endorsed more than 30 days post-implant. Parental sadness and worry were notable throughout the post-implant experience.
Conclusions:
This multi-centre, ACTION learning network-based PRO programme demonstrated initial success in this six-centre pilot study and yields important next steps for larger-scale PRO collection, research and clinical intervention.
We developed an agent-based model using a trial emulation approach to quantify effect measure modification of the spillover effects of pre-exposure prophylaxis (PrEP) for HIV among men who have sex with men (MSM) in the Atlanta-Sandy Springs-Roswell metropolitan area, Georgia. PrEP may impact not only the individual prescribed it, but also their partners and beyond; this is known as spillover. We simulated a two-stage randomised trial in which eligible components (≥3 agents with ≥1 HIV+ agent) were first randomised to intervention or control (no PrEP). Within intervention components, agents were randomised to PrEP with 70% coverage, providing insight into a high PrEP coverage strategy. We evaluated effect modification by component-level characteristics and estimated spillover effects on HIV incidence using an extension of randomisation-based estimators. We observed an attenuation of the spillover effect when agents were in components with a higher prevalence of either drug use or bridging potential (an agent acting as a mediator between ≥2 connected groups of agents). The estimated spillover effects were larger in magnitude among components with either higher HIV prevalence or greater density (the number of existing partnerships relative to all possible partnerships). Consideration of effect modification is important when evaluating the spillover of PrEP among MSM.
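As an illustration of the two-stage design described above, the sketch below (hypothetical component structure and parameter names, not the authors' agent-based model code) randomises components to intervention or control and then randomises agents within intervention components to PrEP at 70% coverage.

```python
# Minimal sketch of two-stage randomisation (hypothetical data structures,
# not the authors' simulation). Components are randomised first, then agents
# within intervention components are randomised to PrEP.
import random

random.seed(42)

# Each component is a list of agent ids; eligibility (>=3 agents, >=1 HIV+)
# is assumed to have been applied already.
components = {f"c{i}": [f"a{i}_{j}" for j in range(5)] for i in range(10)}

PREP_COVERAGE = 0.70  # coverage within intervention components

assignment = {}
for comp_id, agents in components.items():
    # Stage 1: randomise the component to intervention or control (no PrEP)
    arm = random.choice(["intervention", "control"])
    # Stage 2: within intervention components, randomise agents to PrEP
    n_prep = round(PREP_COVERAGE * len(agents)) if arm == "intervention" else 0
    on_prep = set(random.sample(agents, n_prep))
    assignment[comp_id] = {"arm": arm,
                           "agents": {a: (a in on_prep) for a in agents}}

# Spillover contrasts would then compare HIV incidence among agents *not* on
# PrEP in intervention components vs. agents in control components.
print(assignment["c0"])
```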
Multiple micronutrient deficiencies are widespread in Ethiopia. However, the distribution of Se and Zn deficiency risks has previously shown evidence of spatially dependent variability, warranting exploration of this aspect for a wider range of micronutrients. Here, blood serum concentrations of Ca, Mg, Co, Cu and Mo were measured (n 3102) on samples from the Ethiopian National Micronutrient Survey. Geostatistical modelling was used to test for spatial variation of these micronutrients among women of reproductive age, who represent the largest demographic group surveyed (n 1290). Median serum concentrations were 8·6 mg dl−1 for Ca, 1·9 mg dl−1 for Mg, 0·4 µg l−1 for Co, 98·8 µg dl−1 for Cu and 0·2 µg dl−1 for Mo. The prevalence of Ca, Mg and Co deficiency was 41·6 %, 29·2 % and 15·9 %, respectively; Cu and Mo deficiency prevalence was 7·6 % and 0·3 %, respectively. A higher prevalence of Ca, Cu and Mo deficiency was observed in north-western parts of Ethiopia, of Co deficiency in central parts and of Mg deficiency in north-eastern parts. Serum Ca, Mg and Mo concentrations showed spatial dependencies up to 140–500 km, whereas there was no evidence of spatial correlation for serum Co and Cu concentrations. These new data indicate the scale of multiple mineral micronutrient deficiencies in Ethiopia, and the geographical differences in the prevalence of deficiencies suggest the need to consider targeted responses during the planning of nutrition intervention programmes.
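For readers unfamiliar with the geostatistical approach referenced above, the sketch below (synthetic coordinates and values, not the survey data) computes an empirical semivariogram; spatial dependence appears as semivariance that increases with lag distance and levels off at a range (reported here as roughly 140–500 km for serum Ca, Mg and Mo).

```python
# Illustrative empirical semivariogram on synthetic data (not the survey data).
# Spatial dependence is indicated when semivariance rises with lag distance
# and then levels off at a range.
import numpy as np

rng = np.random.default_rng(1)
n = 300
coords = rng.uniform(0, 1000, size=(n, 2))                    # km easting/northing
values = np.sin(coords[:, 0] / 200) + rng.normal(0, 0.3, n)   # synthetic serum values

# Pairwise distances and squared differences between all sample locations
d = np.sqrt(((coords[:, None, :] - coords[None, :, :]) ** 2).sum(-1))
sq_diff = (values[:, None] - values[None, :]) ** 2
iu = np.triu_indices(n, k=1)                                  # unique pairs only

bins = np.arange(0, 600, 50)                                  # 50 km lag bins
for lo, hi in zip(bins[:-1], bins[1:]):
    mask = (d[iu] >= lo) & (d[iu] < hi)
    if mask.any():
        gamma = 0.5 * sq_diff[iu][mask].mean()                # semivariance for this lag
        print(f"{lo:>3}-{hi:<3} km: gamma = {gamma:.3f}")
```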
While it is known that patients with schizophrenia recognise facial emotions, particularly negative emotions, less accurately, little is known about how they misattribute these emotions to other emotions and whether such misattribution biases are associated with symptoms, course of the disorder, or certain cognitive functions.
Method
Outpatients with schizophrenia or schizoaffective disorder (n = 73) and healthy controls (n = 30) performed a computerised Facial Emotion Attribution Test and Wisconsin Card Sorting Test (WCST). Patients were also rated on the Positive and Negative Syndrome Scale (PANSS).
Results
Patients were poor at recognising fearful and angry emotions, tending to misattribute fearful expressions as angry and angry expressions as neutral. Fear-as-anger misattributions were independently predicted by a longer duration of illness and by WCST perseverative errors.
Conclusion
The findings show a bias towards misattributing fearful and angry facial emotions. The propensity for fear-as-anger misattribution increases with longer duration of illness and with a more rigid style of information processing. This may, at least in part, be perpetuated by subtle fearfulness expressed by others when interacting with people with schizophrenia.
Malnutrition remains a leading contributor to the morbidity and mortality of children under the age of 5 years and can weaken the immune system and increase the severity of concurrent infections. Livestock milk with the protective properties of human milk is a potential therapeutic to modulate the intestinal microbiota and improve outcomes. The aim of this study was to develop an infection model of childhood malnutrition in the pig to investigate the clinical, intestinal and microbiota changes associated with malnutrition and enterotoxigenic Escherichia coli (ETEC) infection, and to test the ability of goat milk and milk from genetically engineered goats expressing the antimicrobial human lysozyme (hLZ) to mitigate these effects. Pigs were weaned onto a protein–energy-restricted diet and, after 3 weeks, were supplemented daily with goat milk, hLZ milk or no milk for a further 2 weeks and then challenged with ETEC. The restricted diet enriched the faecal microbiota in Proteobacteria, as seen in stunted children. Before infection, hLZ milk supplementation improved barrier function and villous height to a greater extent than goat milk. Both goat and hLZ milk enriched for taxa (Ruminococcaceae) associated with weight gain. Post-ETEC infection, pigs supplemented with hLZ milk weighed more, had improved Z-scores, had longer villi and showed more stable bacterial populations during ETEC challenge than both the goat milk and no milk groups. This model of childhood disease was developed to test the confounding effects of malnutrition and infection and demonstrated the potential use of hLZ goat milk to mitigate the impacts of both.
To use next-generation sequencing (NGS) analysis to enhance epidemiological information in identifying and resolving a Clostridium difficile outbreak, and to evaluate its effectiveness beyond the capacity of current standard PCR ribotyping.
METHODS
NGS analysis was performed as part of prospective surveillance of all detected C. difficile isolates at a university hospital. An outbreak of a novel C. difficile sequence type, ST-295, was identified in a hospital and a community hostel for homeless adults. Phylogenetic analysis was performed on all ST-295 isolates and the closest ST-2 isolates. Epidemiological details were obtained from hospital records and the public health review of the community hostel.
RESULTS
We identified 7 patients with C. difficile ST-295 infections between June 2013 and April 2015. Of these patients, 3 had nosocomial exposure to this infection and 3 had possible hostel exposure. Current Society for Healthcare Epidemiology of America (SHEA)–Infectious Diseases Society of America (IDSA) surveillance definitions (2010) were considered in light of our NGS findings. The initial transmission was not detectable using current criteria because 16 weeks elapsed between ST-295 exposure and symptom onset. We included 3 patients with hostel exposure who met surveillance criteria for hospital-acquired infection because of their hospital admissions.
CONCLUSION
NGS analysis enhanced epidemiological information and helped identify and resolve an outbreak beyond the capacity of standard PCR ribotyping. In this cluster of cases, NGS was used to identify a hostel as the likely source of community-based C. difficile transmission.
Toxoplasma gondii and Sarcocystis neurona are protozoan parasites with terrestrial definitive hosts, and both pathogens can cause fatal disease in a wide range of marine animals. Close monitoring of threatened southern sea otters (Enhydra lutris nereis) in California allowed for the diagnosis of dual transplacental transmission of T. gondii and S. neurona in a wild female otter that was chronically infected with both parasites. Congenital infection resulted in late-term abortion due to disseminated toxoplasmosis. Toxoplasma gondii and S. neurona DNA was amplified from placental tissue culture, as well as from fetal lung tissue. Molecular characterization of T. gondii revealed a Type X genotype in isolates derived from placenta and fetal brain, as well as in all tested fetal organs (brain, lung, spleen, liver and thymus). This report provides the first evidence for transplacental transmission of T. gondii in a chronically infected wild sea otter, and the first molecular and immunohistochemical confirmation of concurrent transplacental transmission of T. gondii and S. neurona in any species. Repeated fetal and/or neonatal losses in the sea otter dam also suggested that T. gondii has the potential to reduce fecundity in chronically infected marine mammals through parasite recrudescence and repeated fetal infection.
In line with the aims of Part Six, this chapter considers ways in which understanding of ethics can be embedded into the thinking and practice of professionals in training (see Figure 1.1). To illustrate this, the case study of an educational psychology professional training programme is presented. The authors take a positive ethics approach and highlight the use of moral theory for ethical decision making.
The ability to be alert to the ethical dimensions of practice and to acquire professional knowledge and skills to make informed ethical decisions is a core competency of educational psychologists. This chapter considers approaches to teaching and learning about ethics and ethical practice that may assist student educational psychologists. Specifically, it reports on the authors’ reflections about approaches adopted in one professional training programme, although it is anticipated that some of the issues and lessons learned will have resonance for other professional training programmes in the UK and further afield. In reviewing teaching and learning approaches, the authors have considered professional guidance (for example, codes of ethics), ethical theories and research in moral development and ethical behaviour. They undertook an exploratory investigation into the ethical perspectives of a group of student educational psychologists at the beginning of their professional training. The findings from this study, which resulted in a re-evaluation of approaches to teaching and learning about ethics and ethical practice, will be used in an illustrative manner.
Professionals’ moral development and ethical behaviour
An ecological perspective (Bronfenbrenner, 1979) is helpful in conceptualising the dynamic interplay between different influences on professionals’ ethical behaviour. An individual entering a profession carries with them their psychological characteristics, life experiences and values (for example, cultural values and religious beliefs), which will interact with the professional context to both shape and be shaped by that context, in line with an interactionist perspective. Lindsay (2009) referred to the influence of contextual factors on the development of psychologists’ values that underpin their ethical behaviour; hence the importance of ethical codes reflecting societal values. Thus, new entrants to a profession carry with them views on what is ‘right’ and ‘wrong’ that will have an impact on their ethical behaviour. During training, individuals learn and experience the values and cultural norms of their chosen profession.
Lactoferrin and lysozyme are antimicrobial and immunomodulatory proteins produced in high quantities in human milk that aid gastrointestinal (GI) health and have beneficial effects when supplemented separately and in combination in human and animal diets. Ruminants produce low levels of lactoferrin and lysozyme; however, genetically engineered cattle and goats exist that secrete recombinant human lactoferrin (rhLF-milk) and human lysozyme (hLZ-milk), respectively, in their milk. The effects of consumption of rhLF-milk, hLZ-milk and a combination of rhLF- and hLZ-milk were tested in young pigs as an animal model for the GI tract of children. Compared with control milk-fed pigs, pigs fed a combination of rhLF and hLZ milk (rhLF+hLZ) had significantly deeper intestinal crypts and a thinner lamina propria layer. Pigs fed hLZ-milk, rhLF-milk and rhLF+hLZ milk had significantly reduced mean corpuscular volume (MCV); red blood cell (RBC) counts were significantly increased in pigs fed hLZ-milk and rhLF-milk and tended to be increased in rhLF+hLZ-fed pigs, indicating more mature RBCs. These results support previous research demonstrating that pigs fed milk containing rhLF or hLZ had decreased intestinal inflammation, and suggest that for some parameters the combination of lactoferrin and lysozyme has additive effects, in contrast to the synergistic effects reported when using in vitro models.
This Short Review critically evaluates three hypotheses about the effects of emotion on memory: First, emotion usually enhances memory. Second, when emotion does not enhance memory, this can be understood by the magnitude of physiological arousal elicited, with arousal benefiting memory to a point but then having a detrimental influence. Third, when emotion facilitates the processing of information, this also facilitates the retention of that same information. For each of these hypotheses, we summarize the evidence consistent with it, present counter-evidence suggesting boundary conditions for the effect, and discuss the implications for future research. (JINS, 2013, 19, 1–9)
This paper presents an overview of the results of two brief excavation seasons (2008 and 2010) at Foxhole Cave, Gower, south Wales, placing them in the wider context of mid-Holocene Britain. No prehistoric pottery was found, and the few pieces of worked flint recovered are diagnostic of the Mesolithic period. As is typical for the Carboniferous limestone caves of Gower, however, bone was well preserved, and though much of the material in the heavily disturbed upper metre or so of the deposits was modern sheep and rabbit, scattered fragments representing the remains of at least six humans were also recovered, of which two have been directly radiocarbon-dated using accelerator mass spectrometry (AMS 14C) to the Late Mesolithic and two to the earlier Neolithic (the remaining two providing Romano-British and medieval dates). Their associated stable carbon and nitrogen isotope values indicate a significant difference in diet between the two periods (contrary to the results from an earlier excavation in 1997), with marine foods contributing around half of the protein for the Mesolithic individuals and little or none for the Neolithic individuals. The new results are consistent with those from Caldey Island, Pembrokeshire, some 30 km to the west. The floor of the cave has still not been reached at around 2 m depth; limited investigation of the lowermost levels has yielded a Pleistocene fauna (including reindeer, aurochs or bison, and collared lemming) with dates back to approximately 33,500 cal BC, though with no definite evidence for human activity so far. A small, dark-stained fragment of human cranium was recovered from what may be pre-Holocene levels, but this failed to produce sufficient collagen for dating. In addition to a marked dietary shift, the combined stable isotope and dating programme provides further support for an equally striking temporal gap of some two millennia between the Mesolithic and Neolithic use of caves for burial.
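For context on how such dietary proportions are typically inferred, a simple two-endmember linear mixing model on δ¹³C is commonly used (a general approach, assumed here rather than taken from the paper), with the marine and terrestrial endmember values derived from reference fauna:

\[ f_{\text{marine}} = \frac{\delta^{13}\mathrm{C}_{\text{sample}} - \delta^{13}\mathrm{C}_{\text{terrestrial}}}{\delta^{13}\mathrm{C}_{\text{marine}} - \delta^{13}\mathrm{C}_{\text{terrestrial}}} \]

A value of f_marine around 0.5 corresponds to marine foods contributing about half of the dietary protein, as reported for the Mesolithic individuals.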