Distinguishing early domesticates from their wild progenitors presents a significant obstacle for understanding human-mediated effects in the past. The origin of dogs is particularly controversial because potential early dog remains often lack corroborating evidence that can provide secure links between proposed dog remains and human activity. The Tumat Puppies, two permafrost-preserved Late Pleistocene canids, have been hypothesized to have been littermates and early domesticates due to a physical association with putatively butchered mammoth bones. Through a combination of osteometry, stable isotope analysis, plant macrofossil analysis, and genomic and metagenomic analyses, this study exploits the unique properties of the naturally mummified Tumat Puppies to examine their familial relationship and to determine whether dietary information links them to human activities. The multifaceted analysis reveals that the 14,965–14,046 cal yr BP Tumat Puppies were littermates who inhabited a dry and relatively mild environment with heterogeneous vegetation and consumed a diverse diet, including woolly rhinoceros in their final days. However, because there is no evidence of mammoth consumption, these data do not establish a link between the canids and ancient humans.
Edited by
David Mabey, London School of Hygiene and Tropical Medicine; Martin W. Weber, World Health Organization; Moffat Nyirenda, London School of Hygiene and Tropical Medicine; Dorothy Yeboah-Manu, Noguchi Memorial Institute for Medical Research, University of Ghana; Jackson Orem, Uganda Cancer Institute, Kampala; Laura Benjamin, University College London; Michael Marks, London School of Hygiene and Tropical Medicine; Nicholas A. Feasey, Liverpool School of Tropical Medicine
Crises, including natural or man-made disasters and complex emergencies, are a source of significant morbidity and mortality in low- and middle-income countries, including on the African continent. This chapter aims to present an approach to understanding and mitigating the health impacts of crises, whether those impacts are direct, through illness, injury or death, or indirect, due to forced displacement, loss of livelihoods, loss of health-care infrastructure or otherwise. This chapter outlines key definitions relevant to humanitarian crises and forced displacement and proposes a rights-based approach to priority setting and programming for humanitarian assistance in these contexts (Box 3.1).
To evaluate the impact of implementing a multi-step Clostridioides difficile infection (CDI) testing algorithm on hospital-onset (HO)-CDI rates and clinical outcomes.
Setting:
Two academic hospitals in Pittsburgh, Pennsylvania.
Methods:
In the pre-intervention period, a standalone polymerase chain reaction (PCR) assay was used for diagnosing CDI. In the post-intervention period, positive PCR assays were reflexed to a glutamate dehydrogenase antigen test and an enzyme immunoassay for toxin A/B.
Results:
The implementation of a multi-step testing algorithm resulted in a significant reduction in HO-CDI cases per 10,000 patient days from 5.92 to 2.36 (P < 0.001). Despite the decrease in reportable HO-CDI cases, there were no significant differences in clinical outcomes such as hospital length of stay, intensive care unit admissions, and treatment courses. In addition, there was a significant reduction in all-cause 30-day readmissions in the post-intervention group, though CDI-related readmissions remained similar.
Conclusions:
The multi-step testing algorithm significantly reduced HO-CDI rates without compromising clinical outcomes. The study supports the use of a multi-step CDI testing algorithm to assist healthcare providers with CDI management decisions and potentially to reduce financial penalties imposed on healthcare systems.
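The reflex logic described in this abstract can be sketched as a small decision function. The test names (PCR, GDH antigen, toxin A/B EIA) follow the abstract, but the result labels and the colonization interpretation below are illustrative assumptions, not the hospitals' actual reporting rules.

```python
def classify_cdi(pcr_positive: bool, gdh_positive: bool = False,
                 toxin_eia_positive: bool = False) -> str:
    """Interpret a C. difficile work-up under a multi-step reflex algorithm.

    Pre-intervention, any positive PCR was reportable; post-intervention,
    positive PCRs reflex to a GDH antigen test and a toxin A/B EIA.
    Result labels here are illustrative, not official reporting categories.
    """
    if not pcr_positive:
        return "negative"
    # Reflex testing: only antigen- and toxin-positive results are reported.
    if gdh_positive and toxin_eia_positive:
        return "CDI (toxin-positive)"
    return "possible colonization (clinical correlation advised)"
```

A PCR-positive but toxin-negative result no longer counts as a reportable HO-CDI case under this scheme, which is the mechanism behind the rate reduction reported above.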
Poor diets and food insecurity during adolescence can have long-lasting effects, and Métis youth may be at higher risk. This study, as part of the Food and Nutrition Security for Manitoba Youth study, examines dietary intakes, food behaviours and health indicators of Métis compared with non-Métis youth.
Design:
This observational cross-sectional study involved a cohort of adolescents who completed a self-administered web-based survey on demographics, dietary intake (24-h recall), food behaviours, food security and select health indicators.
Setting:
Manitoba, Canada
Participants:
Participants included 1587 Manitoba grade nine students, with 135 (8·5 %) self-identifying as Métis, a distinct Indigenous nation living in Canada.
Results:
Median intake of sugar was significantly higher in Métis (89·2 g) compared with non-Métis (76·3 g) participants. Percent energy intake of saturated fat was also significantly higher in Métis (12·4 %) than non-Métis (11·6 %) participants. Median intakes of grain products and meat and alternatives servings were significantly lower among Métis than non-Métis (6·0 v. 7·0 and 1·8 v. 2·0, respectively) participants. Intake of other foods was significantly higher in Métis (4·0) than non-Métis (3·0). Significantly more Métis participants were food insecure (33·1 %) compared with non-Métis participants (19·1 %). Significantly more Métis participants ate family dinners and breakfast less often than non-Métis participants and had lower self-reported health. Significantly more Métis participants had a BMI classified as obese compared with non-Métis participants (12·6 % v. 7·1 %).
Conclusions:
The dietary intakes observed in this study, both among Métis and non-Métis youth, are concerning. Many have dietary patterns that put them at risk for developing health issues in the future.
Substance use and substance use disorders run in families. While it has long been recognized that the etiology of substance use behaviors and disorders involves a combination of genetic and environmental factors, two key questions remain largely unanswered: (1) how genetic predispositions are transmitted from parents to children, and (2) what molecular mechanisms link genetic variants to substance use behaviors and disorders. This article aims to provide a comprehensive conceptual framework and methodological approach for investigating the intergenerational transmission of substance use behaviors and disorders, by integrating genetic nurture analysis, gene expression imputation, and weighted gene co-expression network analysis. We also describe two longitudinal cohorts: the Brisbane Longitudinal Twin Study in Australia and the Lifelines Cohort Study in the Netherlands. By applying the methodological framework to these two unique datasets, our future research will explore the complex interplay between genetic factors, gene expression, and environmental influences on substance use behaviors and disorders across different life stages and populations.
Following an outbreak of Salmonella Typhimurium in Wales in July 2021 associated with sheep meat and offal, further genetically related cases were detected across the UK. Cases were UK residents with laboratory-confirmed Salmonella Typhimurium in the same 5-single-nucleotide polymorphism (SNP) single-linkage cluster with specimen date between 01/08/2021 and 31/12/2022. We described cases using routine (UK) and enhanced (Wales only) surveillance data. Exposures in cases in Wales were compared with non-Typhimurium Salmonella case–controls. Environmental Health Practitioners and the Food Standards Agency investigated supply chains of food premises reported by ≥2 cases. Animal, carcass, and environmental samples taken for diagnostic or monitoring purposes for gastrointestinal pathogens were included in microbiological investigations. We identified 142 cases: 75% in England, 23% in Wales and 3% in Scotland. Median age was 32 years, and 59% were male. Direct contact with sheep was associated with becoming a case (aOR: 14, 95% CI: 1.4–145) but reported by few (6/32 cases). No single food item, premises, or supplier linked all cases. Multi-agency collaboration enabled the identification of isolates in the same 5-SNP single-linkage cluster from a sheep carcass at an English abattoir and in ruminant, wildlife, poultry, and environmental samples, suggesting multiple vehicles and pathways of infection.
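The 5-SNP single-linkage cluster definition used above can be illustrated with a short union-find sketch: isolates joined by any chain of pairwise links of ≤5 SNPs fall into the same cluster. The isolate names and distances below are hypothetical, purely for illustration.

```python
def snp_clusters(distances, threshold=5):
    """Single-linkage clustering of isolates at a SNP-distance threshold.

    `distances` maps (isolate_a, isolate_b) tuples to pairwise SNP counts;
    isolates joined by any chain of links <= threshold share a cluster.
    """
    isolates = sorted({i for pair in distances for i in pair})
    parent = {i: i for i in isolates}

    def find(x):
        # Walk to the root, compressing the path as we go.
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    for (a, b), d in distances.items():
        if d <= threshold:
            parent[find(b)] = find(a)  # merge the two components

    clusters = {}
    for i in isolates:
        clusters.setdefault(find(i), []).append(i)
    return sorted(clusters.values())

# Hypothetical pairwise SNP distances between isolates.
example = {
    ("human_1", "human_2"): 2,
    ("human_2", "sheep_1"): 4,
    ("human_1", "poultry_1"): 30,
}
```

Running `snp_clusters(example)` links the two human isolates and the sheep isolate into one cluster (each link ≤5 SNPs) while the distant poultry isolate stays separate — the chaining behavior that lets human, carcass, and environmental isolates be tied to one outbreak.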
To assess the safety and efficacy of a novel beta-lactam allergy assessment algorithm managed by an antimicrobial stewardship program (ASP) team.
Design:
Retrospective analysis.
Setting:
One quaternary referral teaching hospital and one tertiary care teaching hospital in a large western Pennsylvania health network.
Patients or participants:
Patients who received a beta-lactam challenge dose under the beta-lactam allergy assessment algorithm.
Interventions:
A beta-lactam allergy assessment protocol was designed and implemented by an ASP team. The protocol risk stratified patients’ reported allergies to identify patients appropriate for a challenge with a beta-lactam antibiotic. This retrospective analysis assessed the safety and efficacy of this protocol among patients receiving a challenge dose from November 2017 to July 2021.
Results:
Over a 45-month period, 119 total patients with either penicillin or cephalosporin allergies entered the protocol. Following a challenge dose, 106 (89.1%) patients were treated with a beta-lactam. Eleven patients had adverse reactions to a challenge dose, one of which required escalation of care to the intensive care unit. Of the patients with an unknown or low-risk reported allergy, 7/66 (10.6%) had an observed adverse reaction, compared with 3/42 (7.1%) of those with a reported high-risk or anaphylactic allergy.
Conclusions:
Our implemented protocol was safe and effective, with over 90% of patients tolerating the challenge without incident and many going on to receive indicated beta-lactam therapy. This protocol may serve as a framework for other inpatient ASP teams to implement a low-barrier, ASP-led allergy assessment.
This paper draws on new data regarding judicial decisions involving religious and anti-religious expression to map the political beneficiaries of judicial empowerment. In particular, the paper assesses the extent to which free-expression decisions issued by the U.S. Supreme Court and European Court of Human Rights have favored claimants who are religious majorities, religious minorities, or secular elites. We find the U.S. doctrine relatively more libertarian and the European Court of Human Rights doctrine relatively more secularist, but both bodies of case law extend regular and substantial rights protection to religious minorities.
To explore the effect of hindsight bias on retrospective reviews of clinical decision making prior to adverse incidents to inform future approaches to incident investigations.
Methods
We undertook focus groups with doctors of varying grades across the North West of England and North Wales. A vignette based on a real-life case from the publicly available NHS England Homicide Independent Investigation report database was presented to each group; the vignette initially made no reference to the outcome, and participants were asked to comment on matters of risk and risk management. Using a semi-structured interview approach, the facilitators encouraged participants to reflect on issues relating to risk and risk management. Halfway through the discussion, one of three outcomes (suicide, homicide, or no adverse incident) was disclosed, and further group discussion was held. The recorded interviews were transcribed and thematic analysis was undertaken using an adapted Framework Method.
Results
Preliminary results (n = 10) indicate that participants identified the potential for significant harm, particularly to others, and identified evidence of key psychopathological and historical correlates to support assertive management of risk and admission to hospital.
Whilst knowledge of the outcome did not lead to participants changing their favoured management plans, it did alter how they appraised the case and led to participants constructing “narrative” explanations for the outcome given. The level of conviction participants held for their management plan reduced when their expectations about the outcome were confounded.
Participants presented with the suicide outcome vignette described their difficulties appraising risk to others and their over-sensitivity to that risk. Participants faced with the ‘no adverse outcome’ vignette perceived the original management plan far more favourably in hindsight. The groups that were presented with the homicide outcome vignette initially focused on both risks to self and others as well as the perceived need for further information. Following knowledge of the outcome, there was a tendency to highlight parts of the letter pertaining to risk to others which they previously had not given as much attention.
Conclusion
The initial analysis of our data confirms the findings from previous studies that hindsight colours the appraisal of adverse events. However, this study is novel in that it describes the nature of the thought processes underpinning the influence of hindsight on appraisals of risk.
Over the years of studying the magical powers of words in the Burmese Buddhist context, I spent much of my research on the inner workings of the letter and phrase combinations and how the correct construction, combination and usage of these words, known in Burmese as inn, aing, and sama made for a potent prophylactic against a host of maladies. So focused had I been on the words that I had not taken the time to consider the individual letters, in and of themselves, before they went on to be combined into esoteric phrases and diagrams. Moreover, I had devoted my research to examining the ways Burmese words were predominantly seen to protect, purify, and even attack, but had never considered other ways in which letters may ‘do’ things, as Fox points out: namely, ‘represent cultural identity’; ‘embody and transmit knowledge’; ‘animate and enable’; ‘render things usable and so nameable’; ‘turn on their user’; and ‘both incur and pay debts’. While I cannot address each of these points in this short essay, I would like to discuss how Fox's book helped me to discover new ways of interpreting how letters and words may transfer their powers to people and things (‘embody and transmit knowledge’), as well as encouraging me to look into concepts of how, and if, letters can be considered ‘alive’ (‘animate and enable’) in the Burmese context.
Many mental disorders, including depression, bipolar disorder and schizophrenia, are associated with poor dietary quality and nutrient intake. There is, however, a deficit of research looking at the relationship between obsessive–compulsive disorder (OCD) severity, nutrient intake and dietary quality.
Aims
This study aims to explore the relationship between OCD severity, nutrient intake and dietary quality.
Method
A post hoc regression analysis was conducted with data combined from two separate clinical trials that included 85 adults with diagnosed OCD, using the Structured Clinical Interview for DSM-5. Nutrient intakes were calculated from the Dietary Questionnaire for Epidemiological Studies version 3.2, and dietary quality was scored with the Healthy Eating Index for Australian Adults – 2013.
Results
Nutrient intake in the sample largely aligned with Australian dietary guidelines. Linear regression models adjusted for gender, age and total energy intake showed no significant associations between OCD severity, nutrient intake and dietary quality (all P > 0.05). However, OCD severity was inversely associated with caffeine (β = −15.50, 95% CI −28.88 to −2.11, P = 0.024) and magnesium (β = −6.63, 95% CI −12.72 to −0.53, P = 0.034) intake after adjusting for OCD treatment resistance.
Conclusions
This study showed OCD severity had little effect on nutrient intake and dietary quality. Dietary quality scores were higher than prior studies with healthy samples, but limitations must be noted regarding comparability. Future studies employing larger sample sizes, control groups and more accurate dietary intake measures will further elucidate the relationship between nutrient intake and dietary quality in patients with OCD.
The topic of patients recording healthcare consultations has been previously debated in the literature, but little consideration has been given to the risks and benefits of such recordings in the context of mental health assessments and treatment. This issue is of growing importance given the increasing use of technology in healthcare and the recent increase in online healthcare services, largely accelerated by the COVID-19 pandemic. We discuss the clinical, ethical and legal considerations relevant to audio or visual recordings of mental health consultations by patients, with reference to existing UK guidance and the inclusion of a patient's perspective.
The invasion of waterhemp into northern sugarbeet growing regions has prompted producers to re-integrate inter-row cultivation into weed management programs, as no currently registered herbicides can control glyphosate-resistant waterhemp POST in crop. Inter-row cultivation was a common weed control practice in sugarbeet until the release of glyphosate-resistant sugarbeet cultivars in 2008 made the use of inter-row cultivation unnecessary. In the late 2010s, producers began again to use inter-row cultivation to remove weeds that glyphosate did not control, but producers need information on the effectiveness and safety of inter-row cultivation when used with soil-residual herbicide programs. Efficacy and tolerance field experiments were conducted in Minnesota and North Dakota from 2017 to 2019. Results from the efficacy experiment demonstrated that cultivation improved waterhemp control by 11% and 12% at 14 and 28 d after treatment, respectively. Waterhemp response to cultivation was dependent on crop canopy and precipitation after cultivation. Cultivation had minimal effect on waterhemp density in three environments, but in one environment, near Galchutt, ND, in 2019, waterhemp density increased 600% and 196% at 14 and 28 d after treatment, respectively. Climate data indicated that in 2019 Galchutt, ND, received 105 mm of precipitation in the 14 d following cultivation and had an open crop canopy, which probably contributed to further weed emergence. Results from the tolerance experiment demonstrated that root yield and recoverable sucrose were not affected by cultivation timing or number of cultivations. In one environment, cultivating reduced sucrose content by 0.8% regardless of date or cultivation number, but no differences were found in four environments. Damage to leaf tissue from in-season cultivation is probably responsible for the reduction in sucrose content. Results indicate that cultivation can be a valuable tool to control weeds that herbicides cannot, but excessive rainfall and an open crop canopy following cultivation can create an environment conducive to further weed emergence.
OBJECTIVES/GOALS: This study’s goal is to examine the feasibility and acceptability of using VRM to impact the APP of adults in the inpatient setting. Aims include examining the: 1) feasibility of VRM for APP management; 2) acceptability of using VRM for APP management; and 3) experience of VRM for APP management. METHODS/STUDY POPULATION: To comprehensively examine participants’ experience of using VRM for APP, this study will employ a convergent mixed-methods design in which living kidney donors (N = 45) will be recruited to serially use VRM during their hospital stay. Feasibility and acceptability will be evaluated using descriptive and inferential statistics evaluating patient-reported outcome (PRO) measures taken pre-, post- and 1-hour post-VRM, PRO measures extracted from the participant’s electronic health record and data on VRM use. Semi-structured interviews will allow formulation of inferences based on participants’ experience of VRM for APP management and their insights on content, deployment, and clinical use of VRM. RESULTS/ANTICIPATED RESULTS: This in-process study expects: 1) an adequate sample of participants undergoing living kidney donor surgery who agree to enroll with retention of >90% of participants (Aim 1); 2) participants to report VRM as an acceptable and suitable treatment, feel “present” and interested in the VR environment, and feel comfortable using VRM in the hospital (Aim 2); and 3) to provide insight into participants’ experience of VRM for APP, understanding of extended VRM use for APP analgesia, examination of key variables affecting participants’ experience of VRM for APP and feedback about VRM procedures and protocol to inform future VRM use for APP management (Aim 3). DISCUSSION/SIGNIFICANCE OF IMPACT: Results of the proposed study will inform future clinical testing and deployment of VRM, guide future use of VRM as an adjunct for inpatient APP management, and provide insight into inpatients’ experience of VRM for APP analgesia.
Exposure to marketing for foods high in fat, salt or sugar (HFSS) reportedly influences consumption, nutritional knowledge and diet-related health among adolescents. In 2018/2019, the UK government held two consultations about introducing new restrictions on marketing for HFSS foods. To reinforce why these restrictions are needed, we examined adolescents’ awareness of marketing for HFSS foods, and the association between past month awareness and weekly HFSS food consumption.
Design:
Cross-sectional survey that measured past month awareness of ten marketing activities for HFSS foods (1 = every day; 6 = not in last month). Frequencies were converted into aggregate past month awareness across marketing activities and grouped into three categories (low/medium/high). Consumption was self-reported for fifteen foods (twelve HFSS) (1 = few times/d; 9 = never). For each food, frequency was divided into higher/lower weekly consumption.
Setting:
United Kingdom.
Participants:
11–19-year-olds (n 3348).
Results:
Most adolescents (90·8 %) reported awareness of at least one marketing activity for HFSS foods, and at least half reported seeing ≥70 instances in the past month. Television, social media and price offers were the marketing activities most frequently reported. Awareness was associated with higher weekly consumption for ten of the twelve HFSS foods. For example, those reporting medium marketing awareness were 1·5 times more likely to report higher weekly consumption of cakes/biscuits compared with those reporting low awareness (AOR = 1·51, P = 0·012). The likelihood of higher weekly HFSS food consumption increased relative to the level of marketing awareness.
Conclusions:
Assuming there is a causal relationship between marketing awareness and consumption, the restrictions proposed by the UK government are likely to help reduce HFSS consumption.
Emergency Medical Services (EMS) systems have developed protocols for prehospital activation of the cardiac catheterization laboratory for patients with suspected ST-elevation myocardial infarction (STEMI) to decrease first-medical-contact-to-balloon time (FMC2B). The rate of “false positive” prehospital activations is high. In order to decrease this rate and expedite care for patients with true STEMI, the American Heart Association (AHA; Dallas, Texas USA) developed the Mission Lifeline PreAct STEMI algorithm, which was implemented in Los Angeles County (LAC; California USA) in 2015. The hypothesis of this study was that implementation of the PreAct algorithm would increase the positive predictive value (PPV) of prehospital activation.
Methods:
This is an observational pre-/post-study of the effect of the implementation of the PreAct algorithm for patients with suspected STEMI transported to one of five STEMI Receiving Centers (SRCs) within the LAC Regional System. The primary outcome was the PPV of cardiac catheterization laboratory activation for percutaneous coronary intervention (PCI) or coronary artery bypass graft (CABG). The secondary outcome was FMC2B.
Results:
A total of 1,877 patients were analyzed for the primary outcome in the pre-intervention period and 405 patients in the post-intervention period. There was an overall decrease in cardiac catheterization laboratory activations, from 67% in the pre-intervention period to 49% in the post-intervention period (95% CI for the difference, -22% to -14%). The overall rate of cardiac catheterization declined in the post-intervention period compared with the pre-intervention period, from 34% to 30% (95% CI for the difference, -7.6% to 0.4%), but actually increased for subjects who had activation (48% versus 58%; 95% CI, 4.6%-15.0%). Implementation of the PreAct algorithm was associated with an increase in the PPV of activation for PCI or CABG from 37.9% to 48.6%. The overall odds ratio (OR) associated with the intervention was 1.4 (95% CI, 1.1-1.8). The effect of the intervention was to decrease variability between medical centers. There was no associated change in average FMC2B.
Conclusions:
The implementation of the PreAct algorithm in the LAC EMS system was associated with an overall increase in the PPV of cardiac catheterization laboratory activation.
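The PPV reported above is simply the share of cath lab activations subsequently confirmed as true STEMI by PCI or CABG. A minimal sketch of the arithmetic follows; the counts are hypothetical, chosen only to reproduce the quoted percentages, and are not the study's actual data.

```python
def ppv(confirmed: int, activations: int) -> float:
    """Positive predictive value: confirmed STEMI (PCI or CABG) per activation."""
    return confirmed / activations

# Hypothetical counts chosen only to reproduce the abstract's percentages.
pre_ppv = ppv(379, 1000)    # ~37.9% before PreAct implementation
post_ppv = ppv(486, 1000)   # ~48.6% after PreAct implementation
```

Fewer activations with a higher confirmation fraction is how the algorithm can raise PPV even while the overall catheterization rate falls.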
The iconic Plaza Tree of Pueblo Bonito is widely believed to have been a majestic pine standing in the west courtyard of the monumental great house during the peak of the Chaco Phenomenon (AD 850–1140). The ponderosa pine (Pinus ponderosa) log was discovered in 1924, and since then, it has been included in “birth” and “life” narratives of Pueblo Bonito, although these ideas have not been rigorously tested. We evaluate three potential growth origins of the tree (JPB-99): Pueblo Bonito, Chaco Canyon, or a distant mountain range. Based on converging lines of evidence—documentary records, strontium isotopes (87Sr/86Sr), and tree-ring provenance testing—we present a new origin for the Plaza Tree. It did not grow in Pueblo Bonito or even nearby in Chaco Canyon. Rather, JPB-99 originated from the Chuska Mountains, over 50 km west of Chaco Canyon. The tree was likely carried to Pueblo Bonito sometime between AD 1100 and 1130, although why it was left in the west courtyard, what it meant, and how it might have been used remain mysteries. The origin of the Plaza Tree of Pueblo Bonito underscores deep cultural and material ties between the Chaco Canyon great houses and the Chuska landscape.