Delirium frequently occurs among hospital in-patients, with significant attributable healthcare costs. It is associated with long-term adverse outcomes, including an eightfold increased risk of subsequent dementia. The purpose of this article is to inform clinicians of the best practices for spotting, stopping and treating delirium and provide guidance on common challenging clinical dilemmas. For spotting delirium, suggested screening tools are the 4 ‘A's Test (in general medical settings) and the Confusion Assessment Method for the Intensive Care Unit (CAM-ICU). Prevention is best achieved with multicomponent interventions and targeted strategies focusing on: (a) avoiding iatrogenic causes; (b) brain optimisation by ensuring smooth bodily functioning; (c) maintaining social interactions and normality. Non-pharmacological approaches are the first line for treatment; they largely mirror prevention strategies, but the focus of empirical evidence is on prevention. Although sufficient evidence is lacking for most pharmacological approaches, an antipsychotic at low doses for short durations may be of utility for highly distressing or high-risk situations, particularly in hyperactive delirium, but only as a last resort.
Among nursing home outbreaks of coronavirus disease 2019 (COVID-19) with ≥3 breakthrough infections occurring when the severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) δ (delta) variant was predominant, fully vaccinated residents were 28% less likely to be infected than unvaccinated residents. Once infected, they had approximately half the risk of all-cause hospitalization and all-cause death compared with unvaccinated infected residents.
Emergency medical teams (EMTs) have helped to provide surgical care in many recent sudden onset disasters (SODs), especially in low- and middle-income countries (LMICs). General surgical training in Australia has undergone considerable change in recent years, and it is not known whether the new generation of general surgeons is equipped with the broad surgical skills needed to operate as part of EMTs.
Aim:
To analyze the differences between the procedures performed by contemporary Australian general surgeons during training and those performed by EMTs responding to SODs in LMICs.
Methods:
General surgical trainee logbooks between February 2008 and January 2017 were obtained from General Surgeons Australia. Operating theatre logs from EMTs working during the 2010 earthquake in Haiti, 2014 typhoon in the Philippines, and 2015 earthquake in Nepal were also obtained. These caseloads were collated and compared.
Results:
A total of 1,396,383 procedures were performed by Australian general surgical trainees in the study period. The most common procedure categories were abdominal wall hernia procedures (12.7%), cholecystectomy (11.7%), and specialist colorectal procedures (11.5%). Of note, Caesarean sections, hysterectomy, fracture repair, specialist neurosurgical, and specialist pediatric surgical procedures all made up <1% of procedures each. There were a total of 3,542 procedures recorded in the EMT case logs. The most common procedures were wound debridement (31.5%), other trauma (13.3%), and Caesarean section (12.5%). Specialist colorectal, hepato-pancreaticobiliary, upper gastrointestinal, urological, vascular, neurosurgical, and pediatric surgical procedures all made up <1% each.
Discussion:
Australian general surgical trainees receive limited exposure to the obstetric, gynecological, and orthopedic procedures that are common during EMT responses to SODs. However, they gain considerable exposure to soft tissue wound management and abdominal procedures.
Aim:
To describe the types of surgical procedures performed by emergency medical teams (EMTs) with general surgical capability in the aftermath of sudden-onset disasters (SODs) in low- and middle-income countries (LMICs).
Methods:
A search of electronic databases (PubMed, MEDLINE, and EMBASE) was carried out to identify articles published between 1990 and 2018 that describe the type of surgical procedures performed by EMTs in the impact and post-impact phases of a SOD. Further relevant articles were obtained by hand-searching reference lists.
Results:
Sixteen articles met the inclusion criteria, reporting on EMTs from a number of different countries responding to a variety of disasters. There was a high prevalence of procedures for extremity soft tissue injuries (46.8%) and fractures (28.3%). However, a significant number of genitourinary/obstetric procedures were also reported.
Discussion:
Knowledge of the types of surgical procedures most frequently performed by EMTs may help further determine the necessary prerequisite surgical skills required for the recruitment of surgeons for EMTs. Experience in basic plastic, orthopedic, urological, and obstetric surgery would seem desirable for surgeons and surgical teams wishing to participate in an EMT.
We report daptomycin minimum inhibitory concentrations (MICs) for vancomycin-resistant Enterococcus faecium isolated from bloodstream infections over a 4-year period. The daptomycin MIC increased over time hospital-wide for initial isolates and increased over time within patients, culminating in 40% of patients having daptomycin-nonsusceptible isolates in the final year of the study.
Herbicide resistance is the heritable ability of a weed biotype or population to survive a herbicide application that would effectively kill a susceptible population of the weed. In the U.K. the most widespread and financially important herbicide-resistant weed is blackgrass. Investigations to elucidate the molecular mechanisms conferring herbicide resistance to blackgrass populations have been ongoing for two decades. Although the identification of target site–resistant populations has proved to be relatively straightforward (using, for example, target site assays in vitro), the study and understanding of resistance mechanisms involved in enhanced metabolism has proven to be more problematic. Research has focused on the cytochrome P450 monooxygenase and glutathione S-transferase (GST) enzyme families, both of which have been shown to be important in herbicide metabolism in many weed and crop species. GST activity and abundance are greater in a selection of herbicide-resistant blackgrass biotypes, and herbicide treatment of field populations of blackgrass results in the survival of the proportion of the population possessing the greatest GST activity and abundance. In addition, GST activity in the field increases between winter and spring, and this coincides with reduced efficacy of important blackgrass herbicides. GST activities within field populations of blackgrass are highly varied, and this plasticity is discussed in relation to the development of resistant populations in field situations. This article describes research results in blackgrass and compares them with GST studies in other weed species as well as with other mechanisms for enhanced metabolism-based resistance.
XMM-Newton performs a survey of the sky in the 0.2-12 keV X-ray band while slewing between observation targets. The sensitivity in the soft X-ray band is comparable with that of the ROSAT all-sky survey, allowing bright transients to be identified in near real-time by a comparison of the flux in both surveys. Several of the soft X-ray flares are coincident with galaxy nuclei, and five of these have been interpreted as candidate tidal disruption events (TDEs). The first three discovered had a soft X-ray spectrum, consistent with the classical model of TDEs, in which radiation is released during the accretion phase by thermal processes. The remaining two have an additional hard, power-law component, which in only one case was accompanied by radio emission. Overall the flares decay with the classical index of t^(−5/3) but vary greatly in the early phase.
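The classical t^(−5/3) decay law can be made concrete with a short calculation. The function below is an illustrative sketch of the decay law only, not part of the survey's analysis pipeline; the function name and the choice of epochs are assumptions for demonstration:

```python
def flux_ratio(t1, t2):
    """Ratio F(t2)/F(t1) for a light curve following the classical
    tidal-disruption decay law F(t) ∝ t^(-5/3), where t is the time
    since disruption, in the same units for both epochs."""
    return (t2 / t1) ** (-5.0 / 3.0)

# Under a pure t^(-5/3) decline, an eightfold increase in time since
# disruption reduces the flux by a factor of 8^(5/3) = 32:
ratio = flux_ratio(1.0, 8.0)  # ≈ 1/32
```

The steep early decline is why such flares are caught most easily near peak; the large early-phase variations noted above mean this law should be treated as an asymptotic description rather than a fit to the first days of a flare.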
The present study is the first record of twinning in Lagenorhynchus acutus and indeed any Lagenorhynchus sp. Both foetuses were male and located in the left uterine horn, had distinct grossly normal placentas and amniotic sacs, and were therefore likely dizygotic twins. The twins were an incidental finding in an animal that died of a systemic Brucella ceti infection.
Infection caused by parasitic nematodes of humans and livestock can have significant health and economic costs. Treatments aimed at alleviating these costs, such as chemotherapy and vaccination, alter parasite survival and reproduction, the main selective pressures shaping life-history traits such as age to maturity, size and fecundity. Most authors have argued that the life-history evolution prompted by animal and public health programmes would be clinically beneficial, generating smaller, less fecund worms, and several mathematical models support this view. However, using mathematical models of long-lasting interventions, such as vaccination, and regularly repeated short interventions, such as drenching, we show here that the expected outcome actually depends on how mortality rates vary as a function of worm size and developmental status. Interventions which change mortality functions can exert selection pressure to either shorten or extend the time to maturity, and thus increase or decrease worm fecundity and size. The evolutionary trajectory depends critically on the details of the mortality functions with and without the intervention. Earlier optimism that health interventions would always prompt the evolution of smaller, less fecund and hence clinically less damaging worms is premature.
From a Chandra survey of nine interacting galaxy systems, the evolution of X-ray emission during the merger process has been investigated. A comparison of L_X/L_K and L_FIR/L_B shows that the X-ray luminosity peaks ∼300 Myr before nuclear coalescence, even though rapid and increasing star formation is still taking place at this time. It is likely that this drop in X-ray luminosity is a consequence of outflows breaking out of the galactic discs of these systems. At a time ∼1 Gyr after coalescence, the merger remnants in our sample are X-ray dim when compared with typical X-ray luminosities of mature elliptical galaxies. However, given the properties of the 3 Gyr system within our sample, we do see evidence that these systems will start to resemble typical elliptical galaxies at a greater dynamical age, indicating that halo regeneration will take place within low-L_X merger remnants.
I review here the current ideas regarding the origin, evolution, and physical nature of hot diffuse gas in normal, starburst, interacting and merging galaxies, using recent X-ray observations with XMM-Newton and Chandra. Many types of diffuse X-ray structures, including winds, bubbles, halos, chimneys and fountains, can be formed in galaxies, and can enrich the intergalactic medium with mass, energy and metals. This has profound implications as regards galactic formation and evolution, and the enrichment and evolution of galaxy groups and clusters.
By
Andrew F. Read, School of Biological Sciences, University of Edinburgh, Edinburgh EH9 3JT, UK,
Sylvain Gandon, School of Biological Sciences, University of Edinburgh, Edinburgh EH9 3JT, UK,
Sean Nee, School of Biological Sciences, University of Edinburgh, Edinburgh EH9 3JT, UK,
Margaret J. Mackinnon, School of Biological Sciences, University of Edinburgh, Edinburgh EH9 3JT, UK
Pathogen evolution poses the critical challenge for infectious disease management in the twenty-first century. As is already painfully obvious in many parts of the world, the spread of drug-resistant and vaccine-escape (epitope) mutants can impair and even debilitate public and animal health programs. But there may also be another way in which pathogen evolution can erode the effectiveness of medical and veterinary interventions. Virulence- and transmission-related traits are intimately linked to pathogen fitness and are almost always genetically variable in pathogen populations. They can therefore evolve. Moreover, virulence and infectiousness are the target of medical and veterinary interventions. Here, we focus on vaccination and ask whether large-scale immunization programs might impose selection that results in the evolution of more-virulent pathogens.
The word virulence is used in a variety of ways in different disciplines. We take a parasite-centric view as follows. We use “disease severity” (morbidity and/or mortality) to mean the harm to the host following infection. Disease severity is thus a phenotype measured at the whole-organism (host) level that is determined by host genes, parasite genes, environmental effects, and the interaction between those factors. One component of this is virulence, a phenotypic trait of the pathogen whose expression depends on the host. Thus, virulence is the component of disease severity that is due to pathogen genes, and it can be measured only on a given host. We assume no specificity in the interaction between host and pathogen (more-virulent strains are always more virulent, whatever host they infect).
We report on the results of XMM-Newton observations of nearby starburst galaxies that form part of a multi-wavelength study of all phases of extraplanar gas in external galaxies, conducted in order to assess the importance of halos as repositories of a metal-enriched medium and their significance for galactic chemical evolution and possible metal enrichment of the intergalactic medium (IGM). XMM-Newton observations of the starburst galaxy NGC 1511 revealed, for example, the presence of a previously unknown extended hot gaseous phase of its interstellar medium (ISM), which partly extends out of the disk plane. We also present preliminary results based on XMM-Newton observations of NGC 1808, NGC 4666 and NGC 3628.
Disparity-tuned cells in primary visual cortex (V1) are thought to play a significant role in the processing of stereoscopic depth. The disparity-specific responses of these neurons have been previously described by an energy model based on local, feedforward interactions. This model fails to predict the response to binocularly anticorrelated stimuli, in which images presented to left and right eyes have opposite contrasts. The original energy model predicts that anticorrelation should invert the disparity tuning curve (phase difference π), with no change in the amplitude of the response. Experimentally, the amplitude tends to be reduced with anticorrelated stimuli and a spread of phase differences is observed, although phase differences near π are the most common. These experimental observations could potentially reflect a modulation of the V1 signals by feedback from higher visual areas (because anticorrelated stimuli create a weaker or nonexistent stereoscopic depth sensation). This hypothesis could explain the effects on amplitude, but the spread of phase differences is harder to understand. Here, we demonstrate that changes in both amplitude and phase can be explained by a straightforward modification of the energy model that involves only local processing. Input from each eye is passed through a monocular simple cell, incorporating a threshold, before being combined at a binocular simple cell that feeds into the energy computation. Since this local feedforward model can explain the responses of complex cells to both correlated and anticorrelated stimuli, there is no need to invoke any influence of global stereoscopic matching.
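The modification described above — half-wave rectifying each monocular signal with a threshold before the binocular combination — can be sketched numerically. The toy below is an illustration of that scheme, not the authors' implementation: the receptive-field weights, the zero threshold, the 1-D stimuli, and the use of a single binocular unit (rather than a quadrature pair) are all assumptions for demonstration:

```python
import numpy as np

def modified_energy(left, right, rf_left, rf_right, threshold=0.0):
    """Response of one binocular energy unit in which each monocular
    simple-cell signal is thresholded (half-wave rectified) before
    being summed at a binocular simple cell and squared."""
    l = max(np.dot(left, rf_left) - threshold, 0.0)    # monocular simple cell, left eye
    r = max(np.dot(right, rf_right) - threshold, 0.0)  # monocular simple cell, right eye
    return (l + r) ** 2                                # binocular simple cell, squared (energy)

# A 1-D "image" matched to the receptive field in both eyes:
rf = np.array([1.0, -1.0])
img = np.array([1.0, -1.0])

correlated = modified_energy(img, img, rf, rf)       # both eyes, same contrast
anticorrelated = modified_energy(img, -img, rf, rf)  # right eye contrast-inverted

# The monocular threshold silences the inverted eye's signal, so the
# anticorrelated response is attenuated (4.0) relative to the
# correlated response (16.0), rather than merely sign-inverted as in
# the original energy model.
```

A full complex cell would sum several such units in quadrature, but this single unit is enough to show why response amplitude falls under anticorrelation while the computation remains purely local and feedforward.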
By
Andrew F. Read, Institute of Cell, Animal and Population Biology, University of Edinburgh, United Kingdom,
Todd G. Smith, Department of Medicine, University of Toronto, Medical Sciences Building, Ontario, Canada,
Sean Nee, Institute of Cell, Animal and Population Biology, University of Edinburgh, United Kingdom,
Stuart A. West, Institute of Cell, Animal and Population Biology, University of Edinburgh, United Kingdom
We review methods for studying the adaptive basis of sex allocation in the phylum Apicomplexa, a group of parasitic protozoa that includes the aetiological agents of malaria. It is our contention that analysis of apicomplexan sex ratios is not only interesting in its own right, but may actually provide insights into matters of clinical and epidemiological importance. We begin by justifying that position, and then summarize the natural history of these parasites and the sex ratio expectations that flow from that. Broadly speaking, these expectations are supported, but the evidence is scanty relative to that for many multicelled taxa. In the second half of the chapter, we give an overview of the theoretical and empirical methods available to take this work further. Much remains to be done: many key assumptions are currently little more than acts of faith.
Introduction
Almost all work on the evolution of sex allocation is motivated by and tested on multicelled organisms. Yet the causative agents of some of the most serious diseases of humans and livestock have anisogamous sexual stages (Figure 15.1). These are all members of the protozoan phylum Apicomplexa, and include the malaria parasites (Plasmodium spp.). Species in other protozoan phyla can also have anisogamous sexual stages (e.g. some dinoflagellates, volvocidians and perhaps some foraminiferans; Lee et al. 1985) but we are unaware of any analysis of sex allocation in micro-organisms other than the Apicomplexa.
Malaria, a disease caused by protozoan parasites of the genus Plasmodium, can substantially reduce host fitness in wild animals (Atkinson and Van Riper 1991; Schall 1996). In humans, the major disease syndromes – severe anemia, coma, and organ failure, as well as general pathology such as respiratory distress, aches, and nausea – cause considerable mortality and morbidity (Marsh and Snow 1997).
Biomedical research attributes malaria to red cell destruction, infected cell sequestration in vital organs, and the parasite-induced release of cytokines (Marsh and Snow 1997). But mechanistic explanations are just one type of explanation for any biological phenomenon, and, in recent years, evolutionary biologists have become interested in offering evolutionary explanations of infectious disease virulence. This is entirely appropriate (Read 1994). In the context of malaria, for example, the clinical outcome of infection has an important impact on parasite and host fitness and is – at least in part – determined by heritable variation in host and parasite factors (Greenwood et al. 1991). Yet in the recent rush to provide evolutionary explanations of disease, there has been, in our view, too little interaction between the models built by evolutionary biologists and reality. There is unlikely to be a simple, general model of virulence: the causes of disease and the fitness consequences for host and parasite are too variable. Instead, different models, and even different frameworks, will be relevant in different contexts.