Following the recent report of strongyloidiasis caused by Strongyloides fuelleborni within a semi-captive colony of baboons in a UK safari park, we investigated the genetic relationships of this isolate with other Strongyloides isolates across the world. Whole-genome sequencing data were generated, followed by phylogenetic analysis of mitochondrial (mt) cytochrome oxidase subunit 1 (cox1) and nuclear ribosomal 18S sequences against 300 published Strongyloides reference genotypes. The putative African origin of the UK S. fuelleborni was confirmed, and full-length mt genome sequences were assembled to facilitate a more detailed phylogenetic analysis of 14 mt coding regions against all available Strongyloides species. Our analyses demonstrated that the UK isolate represented a novel African lineage not previously described. Additional complete mt genomes were assembled for several individual UK safari park worms, revealing a slightly altered mt genome gene arrangement that allows clear separation from Asian S. fuelleborni. Furthermore, these UK worms possessed expanded intergenic regions of unknown function that increase their mt genome size to approximately 24 kilobases (kb), compared with some 16 kb for Asian S. fuelleborni; this may have arisen from unique population founder and genetic drift effects set within the peculiar mixed-species baboon and drill ancestry of this semi-captive primate colony. A maximum likelihood phylogeny constructed from the 14 mt coding regions also supported an evolutionary distinction between Asian and African S. fuelleborni.
Objective:
We aimed to determine whether implementation of universal nasal decolonization with daily chlorhexidine bathing would decrease bloodstream infections (BSI) in patients undergoing extracorporeal membrane oxygenation (ECMO).
Design:
Retrospective cohort study.
Setting:
Tertiary care facility.
Patients:
Patients placed on ECMO from January 1, 2017 to December 31, 2023.
Intervention:
Daily bathing with 4% chlorhexidine soap and universal mupirocin nasal decolonization were initiated for all ECMO patients in May 2021. The primary outcome was the rate of ECMO-attributable positive blood cultures. Zero-inflated Poisson regression analysis was performed to estimate rate ratios (RRs) for the association between decolonization and BSI rates.
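To make the modelling approach concrete, the sketch below shows how a zero-inflated Poisson rate model with ECMO days as the exposure can yield rate ratios of the kind reported here. The data and variable names are hypothetical illustrations rather than the study dataset, and statsmodels is assumed as the fitting library.

```python
# Illustrative sketch only: a zero-inflated Poisson model of BSI counts with
# ECMO days as exposure. Data and variable names are hypothetical, not the
# study's dataset; statsmodels is assumed as the fitting library.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 200
df = pd.DataFrame({
    "post_implementation": rng.integers(0, 2, n),  # 0 = pre, 1 = post period
    "ecmo_days": rng.integers(2, 40, n),           # exposure time per patient
})
# Simulate BSI counts with excess zeros (most patients never develop a BSI)
lam = np.exp(-4.5 + 0.1 * df["post_implementation"]) * df["ecmo_days"]
df["bsi_count"] = np.where(rng.random(n) < 0.4, 0, rng.poisson(lam))

X = sm.add_constant(df[["post_implementation"]])
model = sm.ZeroInflatedPoisson(
    df["bsi_count"], X,
    exposure=df["ecmo_days"],  # models infections per ECMO day
)
fit = model.fit(disp=False)
# exp() of the post_implementation coefficient is the rate ratio (post vs pre)
print(np.exp(fit.params))
```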
Results:
A total of 776 patients met inclusion criteria during the study period, 425 (55%) pre-implementation and 351 (45%) post-implementation. Following implementation of decolonization, the overall incidence rate of BSI increased nonsignificantly from 10.7 to 14.0 infections per 1000 ECMO days (aRR 1.09, 95% CI 0.74–1.59). For gram-positive cocci (GPC) pathogens, a nonsignificant 40% increased rate was observed in the post-implementation period (RR 1.40, 95% CI 0.89–2.21), due mostly to a significant increase in the crude rate of Enterococcus BSI (RR 1.89, 95% CI 1.01–3.55). Excluding Enterococcus resulted in a nonsignificant 28% decreased rate (aRR 0.72, 95% CI 0.39–1.36) due to a nonsignificant 55% decreased rate of MRSA (aRR 0.45, 95% CI 0.18–3.58).
Conclusions:
Implementation of a universal decolonization protocol did not significantly reduce rates of certain BSIs, including MRSA and other gram-positive pathogens. Although nonsignificant, the reduction in BSI rates in this patient population has important implications for surveillance metrics, such as MRSA, and, in the future, hospital-onset bacteremia.
Vaccines have revolutionised the field of medicine, eradicating and controlling many diseases. Recent pandemic vaccine successes have highlighted the accelerated pace of vaccine development and deployment. Leveraging this momentum, attention has shifted to cancer vaccines and personalised cancer vaccines, aimed at targeting individual tumour-specific abnormalities. The UK, now recognised for its vaccine capabilities, is an ideal nation for pioneering cancer vaccine trials. For this article, experts were convened to share insights and approaches for navigating the challenges of cancer vaccine development with personalised or precision cancer vaccines, as well as fixed vaccines. Emphasising partnership and proactive strategies, this article outlines the ambition to harness national and local system capabilities in the UK; to work in collaboration with potential pharmaceutical partners; and to seize the opportunity to set the pace for rapid advances in cancer vaccine technology.
The 16–8-, 8–5-, 5–2-, 2–1-, 1–0.5-, 0.5–0.3-, 0.3–0.1-, and <0.1-μm size fractions were centrifuged from a Georgia (U.S.A.) sedimentary kaolin and a hydrothermal kaolin from the Sasso mine (Italy) and analyzed by scanning electron microscopy (SEM), X-ray powder diffraction (XRD), infrared spectroscopy (IR), differential thermal analysis (DTA), and thermogravimetric analysis (TGA), together with the corresponding whole rocks. All size fractions of the Georgia sample consisted dominantly of well-crystallized, fine-grained kaolinite, associated with minor quantities of smectite. Some halloysite-like elongate particles were noted by SEM in the intermediate size fractions, minor amounts of quartz were identified in the coarsest size fractions, and <1% noncrystalline material and traces of organic material were suspected in the finest size fraction. The crystallinity of the kaolinite as measured by XRD and IR varied moderately with size. IR suggested that nacrite-like stacking disorder increased with decreasing size for particles <5 μm.
In the Sasso sample, kaolinite dominated all size fractions and was accompanied by dickite in the coarse and by halloysite in the fine size fractions. Regular mixed-layer illite/smectite (I/S) was present in all size fractions and dominated in the finest. Abundant quartz and traces of alunite were identified in the whole rock and coarsest size fractions. The kaolinite in this sample showed marked variation in stacking order and crystallinity, as shown by changes in XRD, IR, and DTA patterns.
The observed compositional and structural variations in the size fractions of the Georgia sedimentary kaolin are small, as expected from its formational environment, which was characterized by low temperatures and relatively stable genetic conditions. The much more marked differences in composition within the size fractions of the Sasso hydrothermal kaolin are likely a result of the broad range of temperatures and fluid chemistries of its formational environment. The sequence dickite → well-crystallized kaolinite → kaolinite → halloysite is probably temperature-dependent.
Until now the career of Elise Hall (1853–1924), the world's first female orchestral saxophonist and altruistic patroness of the arts, has been regarded as a twentieth-century affair. Between 1900 and 1920, she appeared in select performances with the Boston Symphony Orchestra, Boston Orchestral Club, and Longy Club, and was heard on Parisian programs of La Société nationale de musique and Salon des musiciens français. Most famously, she was the sole financial benefactor of the Boston Orchestral Club and commissioned twenty-two works by prominent French composers (Debussy, d’Indy, Loeffler, Schmitt, and others), many of which she premiered. During this time, Hall also presented several self-produced concerts on both sides of the Atlantic and participated in countless private musicales. By all measures, hers was a monumental and enduring achievement, especially in view of the resistance that women of her generation encountered with regard to performing in public and, in her case, playing the saxophone. Although recognition was slow in coming, Elise Hall is now celebrated for her pioneering efforts.
Paradoxically, these twentieth-century triumphs in the East have long been regarded as having less than exceptional nineteenth-century antecedents out West. According to her friend Renée Longy, it was while in California in the mid-1890s that Hall suffered a bout of typhoid fever resulting in hearing loss. Her husband, Richard J. Hall, MD, is alleged to have prescribed a novel treatment: playing a wind instrument to stimulate hearing and prevent further impairment. This she did, but because Santa Barbara was a small, remote city, finding a teacher was said to have been difficult; hence, according to Longy, lessons began with a village laborer who happened to have a saxophone. William Street, in his dissertation, mentioned that Elise Hall's musical involvement in Santa Barbara included an organization known as “The Amateurs.” Reports of these curious and unremarkable beginnings led historians to conclude that her time in California lacked noteworthy achievement. However, a thorough reinvestigation reveals that such an unflattering narrative is largely false. The truth is that Hall's unprecedented successes in Boston were a continuation of what she had begun eight years earlier and some three thousand miles away.
The Hall family moved from New York to Santa Barbara in late 1889, and Elise began saxophone lessons in 1891. Although dozens of newspaper accounts refer to her throughout the 1890s (including societal happenings, charitable events, and musical performances), none mention illness or hearing loss.
Although the term neuroendocrine is now applied to many different contexts in which the nervous system interacts with the endocrine system to regulate hormone release and bring about important changes in physiology, the hypothalamic–pituitary axis remains the best-known and best-characterised example of a neuroendocrine system. Here, the hypothalamus acts as the major coordinating centre, integrating a diverse array of intrinsic (e.g. higher cortical, autonomic, endocrine) and extrinsic (e.g. environmental) signals to direct the function of multiple different target cells/tissues within the central nervous system, pituitary gland and peripheral sites. Hypothalamic regulation of pituitary function has far-reaching consequences, governing the release of hormones from other key endocrine glands (e.g. adrenal, thyroid, gonad), which, in turn, regulate the function of many physiological pathways with important consequences for energy balance and metabolism, osmo- and thermoregulation, heart rate and blood pressure control, central nervous system function, growth and reproduction.
Purity violations overlap with other moral domains. They are not uniquely characterized by hypothesized markers of purity – the witness's emotion of disgust, taint to the perpetrator's soul, or the diminished role of intention in moral judgment. Thus, Fitouchi et al.'s proposition that puritanical morality (a subset of violations in the purity domain) is part of cooperation-based morality is an important advance.
From the safety of their vehicles, visitors to Knowsley Safari can have a close-up encounter with captive olive baboons. As exiting vehicles may be contaminated with baboon stool, a comprehensive coprological inspection was conducted to address public health concerns. Baboon stools were obtained from vehicles and sleeping areas, complemented by video analysis of baboon–vehicle interactions. A purposely selected 4-day sampling period enabled comparative inspections of 2662 vehicles, with a total of 669 baboon stools examined (371 from vehicles and 298 from sleeping areas). As informed by our pilot study, front-line diagnostic methods were: QUIK-CHEK rapid diagnostic test (RDT) (Giardia and Cryptosporidium), Kato–Katz coproscopy (Trichuris) and charcoal culture (Strongyloides). Some 13.9% of vehicles were contaminated with baboon stool. Prevalence of giardiasis was 37.4%, while that of cryptosporidiosis was <0.01%; however, an absence of faecal cysts on quality-control coproscopy, alongside lower-than-expected levels of Giardia-specific DNA, indicated that the RDT results were misleading, grossly overestimating prevalence. Prevalence of trichuriasis was 48.0% and of strongyloidiasis 13.7%, the first report of Strongyloides fuelleborni in the UK. We advise regular blanket administration(s) of anthelminthics to the colony, exploring pour-on formulations; thereafter, smaller-scale indicator surveys would be adequate.
Malformed trilobite specimens provide important insight into how this extinct arthropod group recovered from developmental or moulting malfunctions, pathologies, and injuries. Previously documented examples of malformed trilobite specimens are often considered in isolation, with few studies reporting on multiple malformations in the same species. Here we report malformed specimens of the ellipsocephaloid trilobite Estaingia bilobata from the Emu Bay Shale Konservat-Lagerstätte (Cambrian Series 2, Stage 4) on Kangaroo Island, South Australia. Ten malformed specimens exhibiting injuries, pathologies, and a range of teratologies are documented. Furthermore, five examples of mangled exoskeletons are presented, indicative of predation on E. bilobata. Considering the position of malformed and normal specimens of E. bilobata in bivariate space, we demonstrate that the majority of malformed specimens cluster among the larger individuals. Such specimens may exemplify larger forms successfully escaping predation attempts, but could equally represent individuals with old injuries sustained during earlier (smaller) growth stages that healed through subsequent moulting events. The available evidence from the Emu Bay Shale suggests that this small, extremely abundant trilobite likely played an important role in the structure of the local ecosystem, occupying a low trophic level and being preyed upon by multiple durophagous arthropods. Furthermore, the scarcity of malformed E. bilobata specimens demonstrates how rarely injuries, developmental malfunctions, and pathological infestations occurred within the species.
The Residual Lesion Score is a novel tool for assessing the achievement of surgical objectives in congenital heart surgery based on widely available clinical and echocardiographic characteristics. This article describes the methodology used to develop the Residual Lesion Score from the previously developed Technical Performance Score for five common congenital cardiac procedures using the RAND Delphi methodology.
Methods:
A panel of 11 experts from the fields of paediatric and congenital cardiology and cardiac surgery, 2 co-chairs, and a consultant were assembled to review and comment on the validity and feasibility of measuring the sub-components of the intraoperative and discharge Residual Lesion Score for five congenital cardiac procedures. In the first email round, the panel reviewed and commented on the Residual Lesion Score and provided validity and feasibility scores for the sub-components of each of the five procedures. In the second in-person round, the email comments and scores were reviewed and the Residual Lesion Score was revised. The modified Residual Lesion Score was scored independently by each panellist for validity and feasibility and used to develop the “final” Residual Lesion Score.
Results:
The Residual Lesion Score sub-components with a median validity score of ≥7 and median feasibility score of ≥4 that were scored without disagreement and with low absolute deviation from the median were included in the “final” Residual Lesion Score.
Conclusion:
Using the RAND Delphi methodology, we were able to develop Residual Lesion Score modules for five important congenital cardiac procedures for the Pediatric Heart Network’s Residual Lesion Score study.
During the past decade—more precisely during the last five to seven years—the increased use of urban guerrilla warfare and terrorism has characterized the activities of many revolutionary groups in the less developed world. Highlighted by the Olympic assassinations of 1972, this phenomenon has also been evident in various African and Asian states. It is in Latin America, however, that the change from the traditional rural base for guerrilla operations to an urban environment has been most pronounced. The years from 1962 to 1967 saw many Latin American insurgents copying the Cuban revolutionary model, with its emphasis on rural guerrilla operations and the peasantry as the ultimate motive force, but recent years have seen an equally strong pull toward either purely urban insurgency or a more balanced strategy according equal importance to both rural and urban activities. In either case, the identifiable shift away from a totally rural guerrilla strategy for most Latin American revolutionary groups seems an established fact.
Urban insurgency has been used with increasing frequency and effectiveness in many areas of the developed and less developed world during the past decade. In Latin America, this trend toward expanded urban guerrilla warfare has been most pronounced in Brazil, Uruguay, and Argentina. In these three nations, revolutionary forces have completely rejected the concept of the primacy of guerrilla activities based in the countryside, a theory adapted to the Latin American environment by Cuba's Ernesto “Che” Guevara and the French Marxist Régis Debray. Instead, attention has been focused on organizing and developing guerrilla and terrorist operations in such population centers as Rio de Janeiro, São Paulo, Montevideo, Buenos Aires, Rosario, and Córdoba. (For a discussion of factors leading to the development of urban insurgency in Latin America, see “The Urban Guerrilla in Latin America: A Select Bibliography,” LARR 9:1.)
Prior to the COVID-19 pandemic, our research group initiated a pediatric practice-based randomized trial for the treatment of childhood obesity in rural communities. Approximately 6 weeks into the originally planned 10-week enrollment period, the trial was forced to pause all study activity due to the COVID-19 pandemic. This pause necessitated a substantial revision of recruitment, enrollment, and other study methods in order to complete the trial using virtual procedures. This descriptive paper outlines the methods used to recruit, enroll, and manage clinical trial participants using technology to obtain informed consent, collect height and weight measurements by video, and maintain participant engagement throughout the duration of the trial.
Methods:
The study team reviewed IRB records and protocol team meeting minutes and records, and surveyed the site teams, to document the impact of the COVID-19 shift to virtual procedures on the study. The IRB-approved study changes allowed for flexibility between clinical sites given variations in site resources, which was key to the success of the implementation.
Results:
All study sites faced a variety of logistical challenges unique to their location yet successfully recruited the required number of patients for the trial. Ultimately, virtual procedures enhanced our ability to establish relationships with participants who were previously beyond our reach, but presented several challenges and required additional resources.
Conclusion:
Lessons learned from this study can assist other study groups in navigating challenges, especially when recruiting and implementing studies with rural and underserved populations or during challenging events like the pandemic.
Despite the impact of inappropriate prescribing on antibiotic resistance, data on surgical antibiotic prophylaxis in sub-Saharan Africa are limited. In this study, we evaluated antibiotic use and consumption in surgical prophylaxis in 4 hospitals located in 2 geographic regions of Sierra Leone.
Methods:
We used a prospective cohort design to collect data from surgical patients aged 18 years or older between February and October 2021. Data were analyzed using Stata version 16 software.
Results:
Of the 753 surgical patients, 439 (58.3%) were female, and 723 (96%) had received at least 1 dose of antibiotics. Only 410 (54.4%) patients had indications for surgical antibiotic prophylaxis consistent with local guidelines. Factors associated with preoperative antibiotic prophylaxis were the type of surgery, wound class, and consistency of surgical antibiotic prophylaxis with local guidelines. Postoperatively, type of surgery, wound class, and consistency of antibiotic use with local guidelines were important factors associated with antibiotic use. Of the 2,482 doses administered, 1,410 (56.8%) were given postoperatively; preoperative and intraoperative use accounted for 645 (26%) and 427 (17.2%) doses, respectively. The most commonly used antibiotic was ceftriaxone (949 doses, 38.2%), with a consumption of 41.6 defined daily doses (DDD) per 100 bed days. Overall, antibiotic consumption was 117.9 DDD per 100 bed days. Antibiotics in the Access group accounted for 72.7 DDD per 100 bed days (61.7%).
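As a brief illustration of how figures such as "41.6 DDD per 100 bed days" are derived, the sketch below applies the standard calculation (grams administered divided by the WHO-assigned DDD, scaled per 100 occupied bed days). The input quantities are hypothetical, chosen only to reproduce a value of similar magnitude; 2 g is the WHO ATC/DDD reference value for parenteral ceftriaxone.

```python
# Illustrative sketch of the standard DDD per 100 bed-days calculation.
# Input quantities are hypothetical; 2 g is the WHO-assigned DDD for
# parenteral ceftriaxone.

def ddd_per_100_bed_days(total_grams: float, who_ddd_grams: float,
                         bed_days: float) -> float:
    """Number of WHO-defined daily doses dispensed per 100 occupied bed days."""
    return (total_grams / who_ddd_grams) / bed_days * 100

# Hypothetical example: 850 g of ceftriaxone given over 1022 occupied bed days
print(round(ddd_per_100_bed_days(850, 2.0, 1022), 1))  # ~41.6
```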
Conclusions:
We report a high rate of antibiotic consumption for surgical prophylaxis, most of which was not based on local guidelines. To address this growing threat, urgent action is needed to reduce irrational antibiotic prescribing for surgical prophylaxis.
Drug development is a long and arduous process that requires many researchers at different types of institutions. These include researchers in university settings, researchers in government settings, researchers in non-profit organizations, and researchers in the pharmaceutical industry. The pharmaceutical industry itself is heterogeneous, ranging from tiny biotech companies to large multi-national organizations. This chapter emphasizes drug development efforts by the pharmaceutical industry but will also make note of the many collaborations between pharma and researchers at other types of institutions.
Historical accounts of the Internet's origins tend to emphasize U.S. government investment and university-based researchers. In contrast, this article introduces actors who have been overlooked: the entrepreneurs and private firms that developed standards, evaluated competing standards, educated consumers about the value of new products, and built products to sell. Start-up companies such as 3Com and Cisco Systems succeeded because they met rapidly rising demand from users, particularly those in large organizations, who were connecting computers into networks and networks into internetworks. We consider a relatively brief yet dynamic period, from the late 1960s to the late 1980s, when regulators attacked incumbent American firms, entrepreneurs flourished in new market niches, and engineers set industry standards for networking and internetworking. As a consequence, their combined efforts forged new processes and institutions for so-called open standards that, in turn, created the conditions favorable for the “network effects” that sustained the formative years of the digital economy.
The idea of Zoroastrian mysticism might at first glance seem a contradiction in terms. The Good Religion, after all, is čīmīg, “rational,” above all else: Zoroaster elegantly solved the most intractable mystery of all faiths, theodicy, by the revelation of cosmic dualism. The mere assertion that there are esoteric doctrines within Zoroastrianism has been criticized. This criticism springs in part from a flawed perception of mysticism itself, which, as will be argued, is not an independent entity, everywhere the same. Rather, each religion has a mysticism of its own, often irreconcilable in some of its features with the mainstream, and, in the case of Zoroastrianism, with some of the religion's plain logic. Also, the existence of mysticism within a religious tradition does not imply its centrality to that tradition. Mysticism exists in Christianity, but could scarcely be called essential to it, considering the claim, elaborated as the Christian church rose to universal prominence, to the radically overt and sufficient truth of the Gospel.
Two introduced carnivores, the European red fox Vulpes vulpes and domestic cat Felis catus, have had extensive impacts on Australian biodiversity. In this study, we collate information on consumption of Australian birds by the fox, paralleling a recent study reporting on birds consumed by cats. We found records of consumption by foxes of 128 native bird species (18% of the non-vagrant bird fauna and 25% of those species within the fox’s range), a smaller tally than for cats (343 species, including 297 within the fox’s Australian range, a subset of that of the cat). Most (81%) bird species eaten by foxes are also eaten by cats, suggesting that predation impacts are compounded. As with consumption by cats, birds that nest or forage on the ground are most likely to be consumed by foxes. However, there is also some partitioning, with records of consumption by foxes but not cats for 25 bird species, indicating that impacts of the two predators may also be complementary. Bird species ≥3.4 kg were more likely to be eaten by foxes, and those <3.4 kg by cats. Our compilation provides an inventory and describes characteristics of Australian bird species known to be consumed by foxes, but we acknowledge that records of predation do not imply population-level impacts. Nonetheless, there is sufficient information from other studies to demonstrate that fox predation has significant impacts on the population viability of some Australian birds, especially larger birds, and those that nest or forage on the ground.
The purpose of this study was to pilot the safety and tolerability of a 1-week aerobic exercise program during the post-acute phase of concussion (14–25 days post-injury) by examining adherence, symptom response, and key functional outcomes (e.g., cognition, mood, sleep, postural stability, and neurocognitive performance) in young adults.
Method:
A randomized, non-blinded pilot clinical trial was performed to compare the effects of aerobic versus non-aerobic exercise (placebo) in concussion patients. The study enrolled three groups: 1) patients with concussion/mild traumatic brain injury (mTBI) randomized to an aerobic exercise intervention performed daily for 1 week, 2) patients with concussion/mTBI randomized to a non-aerobic (stretching and calisthenics) exercise program performed daily for 1 week, and 3) a non-injured, no-intervention reference group.
Results:
Mixed-model analysis of variance results indicated a significant decrease in symptom severity scores from pre- to post-intervention (mean difference = −7.44, 95% CI [−12.37, −2.20]) for both concussion groups. However, the pre- to post-intervention change did not differ between groups. Secondary outcomes all showed improvement by post-intervention, with no differences in trajectory between the groups. By three months post-injury, all outcomes in the concussion groups were within the ranges of the non-injured reference group.
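For readers interested in how such a group-by-time comparison can be set up, the sketch below fits an analogous linear mixed model (random intercept per participant, fixed group × time effects) with statsmodels. The column names and simulated data are hypothetical, and the mixed model is offered as a close analogue of, not a reproduction of, the mixed-model ANOVA used in the study.

```python
# Illustrative sketch only: a group-by-time linear mixed model of symptom
# severity, analogous to the mixed-model ANOVA described above. Column names
# and data are hypothetical, not the trial's dataset.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n_subj = 40
subject = np.repeat(np.arange(n_subj), 2)                  # 2 visits per participant
time = np.tile(["pre", "post"], n_subj)
group = np.repeat(rng.choice(["aerobic", "stretch"], n_subj), 2)
# Simulate an overall pre-to-post drop in symptom severity of about 7 points
severity = 20 - 7 * (time == "post") + rng.normal(0, 5, 2 * n_subj)

df = pd.DataFrame({"subject": subject, "time": time,
                   "group": group, "severity": severity})

# Random intercept per participant; fixed effects for group, time, interaction
model = smf.mixedlm("severity ~ group * time", df, groups=df["subject"])
fit = model.fit()
print(fit.summary())  # group:time interaction tests whether the change differs by group
```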
Conclusions:
Results from this study indicate that the feasibility and tolerability of administering aerobic exercise via stationary cycling in the post-acute period following concussion (14–25 days post-injury) are tentatively favorable. Aerobic exercise does not appear to negatively impact recovery trajectories of neurobehavioral outcomes; however, tolerability may be poorer for patients with a high symptom burden.