Through analyzing the compensation accounts and stock ledgers in the Bank of England Archive, this article explores how British firms—especially those in the City of London—profited from the unique business opportunity that arose through the payment of slavery compensation in 1835. It uses a new dataset with 18,930 observations to establish that a cohort of 27 “compensation agents” handled, as intermediaries, approximately two-thirds of the transactions associated with £5 million paid in compensation as government stock (3.5% Reduced Annuities) to slave owners in Barbados, Mauritius, the Cape of Good Hope, and the Virgin Islands. The article argues that this demonstrates how the City’s financial capacity, infrastructure, and business community were significant in delivering the efficient payment of compensation. It also underscores the need to understand the slavery compensation process as contemporaries did: as an important moment in the history of the City and its financial markets.
Clostridioides difficile (C. difficile) is one of the most common causes of healthcare-associated infections (HAIs). Elimination of C. difficile spores is difficult as they are resistant to common hospital-grade disinfectants. Copper-impregnated surfaces provide continuous reduction of multiple pathogens, potentially lowering the risk of infections. This manuscript aims to evaluate the efficacy of copper-impregnated surfaces on C. difficile spores.
Methods:
Control (no copper) coupons and copper coupons containing 20% copper-oxide were inoculated with C. difficile spore loads ranging from 10⁵ to 10⁷ spores, with or without 5% fetal bovine serum soil load. After 4 hours of contact time, the C. difficile spores were recovered, plated on C. difficile growth media, and colony forming units were counted. The efficacy of copper (log10 kill) was estimated using a Bayesian latent variables model.
Results:
After 4 hours, unsoiled copper bedrail and copper table coupons at mean spore inoculation showed 97.3% and 96.8% reductions in spore count (1.57 and 1.50 log10 kill, respectively). Soiled bedrail and table coupons showed 91.8% and 91.7% reductions (1.10 and 1.10 log10 kill, respectively).
Conclusions:
Copper coupons can substantially reduce C. difficile spores after 4 hours, but results vary depending on the initial spore concentration and presence or absence of organic material. Higher initial spore loads or excess organic material may prevent spores from contact with copper surfaces, thus decreasing kill efficacy. Continuous sporicidal effect of copper-impregnated surfaces may decrease spore burden and help prevent transmission of spores.
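The relationship between the percentage reductions and log10 kill values reported above is a simple logarithm. A minimal sketch of the conversion (illustrative arithmetic only, not the study's Bayesian latent variables model):

```python
import math

def log10_kill(percent_reduction):
    """Convert a percentage reduction in viable spores to a log10 kill value."""
    surviving_fraction = 1 - percent_reduction / 100
    return -math.log10(surviving_fraction)

# Unsoiled copper bedrail coupons: a 97.3% reduction corresponds to ~1.57 log10 kill
print(round(log10_kill(97.3), 2))  # → 1.57
```

A 1 log10 kill is a 90% reduction, 2 log10 is 99%, and so on; the soiled coupons' ~1.10 log10 kill therefore corresponds to the reported ~92% reduction.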
To meet the specific education needs of ethics committee members (primarily full-time healthcare professionals), the Regional Ethics Department of Kaiser Permanente Northern California (KPNCAL) and Washington State University’s Elson Floyd School of Medicine have partnered to create a one-academic-year Medical Ethics Certificate Program. The mission-driven KPNCAL-WSU Certificate Program was designed to be a low-cost, high-quality option for busy full-time practitioners who may not otherwise opt to pursue additional education.
This article discusses the specific competency-focused methodologies and pedagogies adopted, as well as how the Certificate Program made permanent changes in response to the global pandemic. This article also discusses in detail one of the Program’s signature features, its Practicum—an extensive simulated clinical ethics consultation placing students in the role of ethics consultant, facilitating a conflict between family members played by paid professional actors. This article concludes with survey data responses from Program alumni gathered as part of a quality study.
Low-intensity psychological interventions are effective for children and young people (CYP) with mental health difficulties and can help bridge the demand–capacity gap. Despite increasing awareness, training and use of low-intensity psychological interventions, it is not yet understood what is being implemented in clinical practice in the UK and the associated evidence base.
Method:
This paper presents two studies: first, a national survey (n=102) of practitioners to identify low-intensity psychological interventions currently delivered in practice, and second, an exploration of the availability and the strength of empirical support (characterised as ‘gold’, ‘silver’ and ‘bronze’) of low-intensity CBT interventions for CYP.
Results:
The first study found a wide variety of interventions being used across different services; 101/102 respondents reported using routine outcome measures. The second study identified 44 different low-intensity interventions, 28 of which were rated as having gold empirical support. However, only 13 of the gold interventions were considered accessible for practitioners and only two were reported being used in routine practice.
Conclusion:
These findings show that many such interventions have been developed and empirically tested but are not easily accessible, highlighting the ‘research–practice’ gap in the provision of low-intensity interventions. There is a need for greater standardisation of care and improved accessibility of gold interventions. This paper aims to begin the process of creating a hub of accessible, empirically supported low-intensity interventions to improve equity of access and outcomes of low-intensity psychological interventions for CYP.
A variational principle is proposed to derive the governing equations for the problem of ocean wave interactions with a floating ice shelf, where the ice shelf is modelled by the full linear equations of elasticity and has an Archimedean draught. The variational principle is used to form a thin-plate approximation for the ice shelf, which includes water–ice coupling at the shelf front and extensional waves in the shelf, in contrast to the benchmark thin-plate approximation for ocean wave interactions with an ice shelf. The thin-plate approximation is combined with a single-mode approximation in the water, where the vertical motion is constrained to the eigenfunction that supports propagating waves. The new terms in the approximation are shown to have a major impact on predictions of ice shelf strains for wave periods in the swell regime.
Access to local, population specific, and timely data is vital in understanding factors that impact population health. The impact of place (neighborhood, census tract, and city) is particularly important in understanding the Social Determinants of Health. The University of Rochester Medical Center’s Clinical and Translational Science Institute created the web-based tool RocHealthData.org to provide access to thousands of geographically displayed publicly available health-related datasets. The site has also hosted a variety of locally curated datasets (e.g., COVID-19 vaccination rates and community-derived health indicators), helping set community priorities and impacting outcomes. Usage statistics (available through Google Analytics) show that returning visitors had a lower bounce rate (leaving the site after a single page access) and spent longer at the site than new visitors. Of the currently registered 1033 users, 51.7% were from within our host university, 20.1% were from another educational institution, and 28.2% identified as community members. Our assessments indicate that these data are useful and valued across a variety of domains. Continuing site improvement depends on new sources of locally relevant data, as well as increased usage of data beyond our local region.
With persistent incidence, incomplete vaccination rates, confounding respiratory illnesses, and few therapeutic interventions available, COVID-19 continues to be a burden on the pediatric population. During a surge, it is difficult for hospitals to direct limited healthcare resources effectively. While the overwhelming majority of pediatric infections are mild, there have been life-threatening exceptions that illuminated the need to proactively identify pediatric patients at risk of severe COVID-19 and other respiratory infectious diseases. However, a nationwide capability for developing validated computational tools to identify pediatric patients at risk using real-world data does not exist.
Methods:
HHS ASPR BARDA sought, through the power of competition in a challenge, to create computational models to address two clinically important questions using the National COVID Cohort Collaborative: (1) Of pediatric patients who test positive for COVID-19 in an outpatient setting, who are at risk for hospitalization? (2) Of pediatric patients who test positive for COVID-19 and are hospitalized, who are at risk for needing mechanical ventilation or cardiovascular interventions?
Results:
This challenge was the first, multi-agency, coordinated computational challenge carried out by the federal government as a response to a public health emergency. Fifty-five computational models were evaluated across both tasks and two winners and three honorable mentions were selected.
Conclusion:
This challenge serves as a framework for how the government, research communities, and large data repositories can be brought together to source solutions when resources are strapped during a pandemic.
Face masks reduce disease transmission by protecting the wearer from inhaled pathogens and reducing the emission of infectious aerosols. Although methods quantifying efficiency for wearer protection are established, current methods for assessing face mask containment efficiency rely on measurement of a low concentration of aerosols emitted from an infected or noninfected individual.
Methods:
A small port enabled the introduction of 0.05 µm sodium chloride particles at a constant rate behind the mask worn by a study participant. A condensation particle counter monitored ambient particle numbers 60 cm in front of the participant over 3-minute periods of rest, speaking, and coughing. The containment efficiency (%) for each mask and procedure was calculated as follows: 100 × (1 − average ambient concentration with face covering worn/average ambient concentration with a sham face covering in place). The protection efficiency (%) was also measured using previously published methods. The probability of transmission (%) from infected to uninfected (a function of both the containment efficiency and the protection efficiency) was calculated as follows: {1 − (containment efficiency/100)}×{1 − (protection efficiency/100)}×100.
Results:
The average containment efficiencies for each mask over all procedures and repeated measures were 94.6%, 60.9%, 38.8%, and 43.2%, respectively, for the N95 mask, the KN95 mask, the procedure face mask, and the gaiter. The corresponding protection efficiencies for each mask were 99.0%, 63.7%, 45.3%, and 24.2%, respectively. For example, the transmission probability for 1 infected and 1 uninfected individual in close proximity was ∼14.2% for KN95 masks, compared to 36%–39% when only 1 individual wore a KN95 mask.
Conclusion:
Overall, we detected a good correlation between the protection and containment that a face covering afforded to a wearer.
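The transmission probability formula stated in the Methods can be sketched directly, using the reported KN95 efficiencies as inputs (illustrative only; percentages are the study's reported averages):

```python
def transmission_probability(containment_pct, protection_pct):
    """Probability of transmission (%) when the infected individual's mask has the
    given containment efficiency and the uninfected individual's mask has the
    given protection efficiency, per the formula in the Methods above."""
    return (1 - containment_pct / 100) * (1 - protection_pct / 100) * 100

# Both individuals wearing KN95 masks (containment 60.9%, protection 63.7%)
both = transmission_probability(60.9, 63.7)        # ≈ 14.2%
# Only the infected individual masked (containment acts alone)
infected_only = transmission_probability(60.9, 0)  # ≈ 39.1%
# Only the uninfected individual masked (protection acts alone)
uninfected_only = transmission_probability(0, 63.7)  # ≈ 36.3%
print(round(both, 1), round(infected_only, 1), round(uninfected_only, 1))
```

This reproduces the ~14.2% figure for two KN95 wearers and the 36%–39% range reported when only one individual wore a mask.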
The success of agriculture relies on healthy bees to pollinate crops. Commercially managed pollinators are often kept under temperature-controlled conditions to better control development and optimize field performance. One such pollinator, the alfalfa leafcutting bee, Megachile rotundata, is the most widely used solitary bee in agriculture. Problematically, very little is known about the thermal physiology of M. rotundata or the consequences of artificial thermal regimes used in commercial management practices. Therefore, we took a broad look at the thermal performance of M. rotundata across development and the effects of commonly used commercial thermal regimes on adult bee physiology. After the termination of diapause, we hypothesized thermal sensitivity would vary across pupal metamorphosis. Our data show that bees in the post-diapause quiescent stage were more tolerant of low temperatures compared to bees in active development. We found that commercial practices applied during development decrease the likelihood of a bee recovering from another bout of thermal stress in adulthood, thereby decreasing their resilience. Lastly, commercial regimes applied during development affected the number of days to adult emergence, but the time of day that adults emerged was unaffected. Our data demonstrate the complex interactions between bee development and thermal regimes used in management. This knowledge can help improve the commercial management of these bees by optimizing the thermal regimes used and the timing of their application to alleviate negative downstream effects on adult performance.
The purpose of this document is to highlight practical recommendations to assist acute-care hospitals in prioritization and implementation of strategies to prevent healthcare-associated infections through hand hygiene. This document updates the Strategies to Prevent Healthcare-Associated Infections in Acute Care Hospitals through Hand Hygiene, published in 2014. This expert guidance document is sponsored by the Society for Healthcare Epidemiology (SHEA). It is the product of a collaborative effort led by SHEA, the Infectious Diseases Society of America, the Association for Professionals in Infection Control and Epidemiology, the American Hospital Association, and The Joint Commission, with major contributions from representatives of a number of organizations and societies with content expertise.
The term “blue justice” was coined in 2018 during the 3rd World Small-Scale Fisheries Congress. Since then, academic engagement with the concept has grown rapidly. This article reviews 5 years of blue justice scholarship and synthesizes some of the key perspectives, developments, and gaps. We then connect this literature to wider relevant debates by reviewing two key areas of research – first on blue injustices and second on grassroots resistance to these injustices. Much of the early scholarship on blue justice focused on injustices experienced by small-scale fishers in the context of the blue economy. In contrast, more recent writing and the empirical cases reviewed here suggest that intersecting forms of oppression render certain coastal individuals and groups vulnerable to blue injustices. These developments signal an expansion of the blue justice literature to a broader set of affected groups and underlying causes of injustice. Our review also suggests that while grassroots resistance efforts led by coastal communities have successfully stopped unfair exposure to environmental harms, preserved their livelihoods and ways of life, defended their culture and customary rights, renegotiated power distributions, and proposed alternative futures, these efforts have been underemphasized in the blue justice scholarship and in the marine and coastal literature more broadly. We conclude with some suggestions for understanding and supporting blue justice now and into the future.
This study compares various morphometric features of two strains of broilers, selected and ‘relaxed’ (ie random-bred), raised under two feeding regimes, ad-libitum-fed and restricted-fed. We consider the possible consequences of the different body shapes on the musculoskeletal system. The ad-libitum-fed selected birds reached heavier bodyweights at younger ages, had wider girths, and developed large amounts of breast muscle which probably displaced their centre of gravity cranially. At cull weight, they had shorter legs than birds in the other groups and greater thigh-muscle masses; therefore, greater forces would have to be exerted by shorter lever arms in order to move the body. The tarsometatarsi were broader, providing increased resistance to greater loads, but the bones had a lower calcium and phosphorus content, which would theoretically make them weaker. Many of these morphological changes are likely to have detrimental effects on the musculoskeletal system and therefore compromise the walking ability and welfare of the birds.
This study tests the hypothesis that growth rate and bodyweight affect walking ability in broilers by comparing objective measurements of the spatial and temporal gait parameters of several groups of birds. Two strains of birds were used (relaxed and selected), raised on two feeding regimes (ad-libitum and restricted), and culled at the same final bodyweight (commercial cull weight of 2.4 kg). The ad-libitum-fed selected birds walked more slowly, with lower cadences, and took shorter steps. The steps were wider, and the toes were pointed outwards, resulting in a wider walking base. They kept their feet in contact with the ground for longer periods, having longer percentage stance times, shorter percentage swing times and increased double-contact times compared to the relaxed birds. These changes serve to increase stability during walking and are a likely consequence of the morphological changes in the selected broiler — in particular, the rapid growth of breast muscle moving the centre of gravity forward, and the relatively short legs compared to their bodyweight (see Corr et al, pp 145-157, this issue). This altered gait would be very inefficient and would rapidly tire the birds, and could help to explain the low level of activity seen in the modern broiler.
The post-release survival of hand-reared tawny owls, Strix aluco, was studied. Hand-rearing did not appear to affect the birds’ instinctive behaviour or post-release survival. The recovery of several pellets confirmed that hunting in this species is an innate process. In terms of animal welfare, hand-reared tawny owls do not appear to be at a disadvantage when compared with wild juveniles, indicating that current rearing and release practices are effective.
Children with genetic conditions may experience significant mental health difficulties such as anxiety and challenging behaviour. However, understanding of the feasibility and effectiveness of psychological interventions for emotional and behavioural problems in the context of genetic conditions is limited. Low-intensity psychological interventions have demonstrated promise in paediatric populations and may be able to address their mental health difficulties. A case series design was used to assess the feasibility of low-intensity interventions for emotional and behavioural difficulties in children and young people with genetic conditions recruited from a mental health drop-in centre at a tertiary hospital. Participants received seven weekly sessions with a trained practitioner. The intervention was based on existing modular treatments and evidence-based self-help materials. Feasibility and treatment satisfaction were assessed, as well as measures of symptoms of anxiety and challenging behaviour, treatment goals and quality of life, at baseline, during treatment and 6-month follow-up. Five participants received treatment for challenging behaviour, one for anxiety, and one for obsessive compulsive disorder. All participants completed treatment. Clinically significant change in the SDQ Total score was found in three participants. All participants demonstrated progress in goals and symptoms of emotional and behavioural difficulties over the course of treatment. Low-intensity psychological interventions for emotional and behavioural difficulties in children and young people with genetic conditions are feasible, acceptable and potentially beneficial. Further research is warranted to examine the effectiveness of the intervention and its use in clinical paediatric settings.
Key learning aims
(1) To gain a basic understanding of low-intensity psychological intervention in children and young people with genetic conditions.
(2) To enhance understanding of the practicalities and acceptability of delivering low-intensity psychological intervention to children and young people with genetic conditions and co-morbid emotional and behavioural difficulties.
(3) To learn about the potential clinical benefits of delivering low-intensity psychological intervention to children and young people with genetic conditions in the context of stepped care.
Late-life depression (LLD) is characterized by differences in resting state functional connectivity within and between intrinsic functional networks. This study examined whether clinical improvement to antidepressant medications is associated with pre-randomization functional connectivity in intrinsic brain networks.
Methods
Participants were 95 elders aged 60 years or older with major depressive disorder. After clinical assessments and baseline MRI, participants were randomized to escitalopram or placebo with a two-to-one allocation for 8 weeks. Non-remitting participants subsequently entered an 8-week trial of open-label bupropion. The main clinical outcome was depression severity measured by MADRS. Resting state functional connectivity was measured between a priori key seeds in the default mode (DMN), cognitive control, and limbic networks.
Results
In primary analyses of blinded data, lower post-treatment MADRS score was associated with higher resting connectivity between: (a) posterior cingulate cortex (PCC) and left medial prefrontal cortex; (b) PCC and subgenual anterior cingulate cortex (ACC); (c) right medial PFC and subgenual ACC; (d) right orbitofrontal cortex and left hippocampus. Lower post-treatment MADRS was further associated with lower connectivity between: (e) the right orbitofrontal cortex and left amygdala; and (f) left dorsolateral PFC and left dorsal ACC. Secondary analyses associated mood improvement on escitalopram with anterior DMN hub connectivity. Exploratory analyses of the bupropion open-label trial associated improvement with subgenual ACC, frontal, and amygdala connectivity.
Conclusions
Response to antidepressants in LLD is related to connectivity in the DMN, cognitive control and limbic networks. Future work should focus on clinical markers of network connectivity informing prognosis.
As COVID-19 was declared a health emergency in March 2020, there was immense demand for information about the novel pathogen. This paper examines the clinician-reported impact of Project ECHO COVID-19 Clinical Rounds on clinician learning. Primary sources of study data were Continuing Medical Education (CME) surveys for each session from March 24, 2020 to July 30, 2020 and impact surveys conducted in November 2020, which sought to understand participants’ overall assessment of sessions. Quantitative analyses included descriptive statistics and Mann-Whitney testing. Qualitative data were analyzed through inductive thematic analysis. Clinicians rated their knowledge after each session as significantly higher than before that session. 75.8% of clinicians reported they would ‘definitely’ or ‘probably’ use content gleaned from each attended session, and clinicians reported specific clinical and operational changes made as a direct result of sessions. 94.6% of respondents reported that COVID-19 Clinical Rounds helped them provide better care to patients. 89% of respondents indicated they ‘strongly agree’ that they would join ECHO calls again. COVID-19 Clinical Rounds offers a promising model for the establishment of dynamic peer-to-peer tele-mentoring communities for low- or no-notice response where scientifically tested or clinically verified practice evidence is limited.
This study aimed to determine the probability of hearing recovery in patients with idiopathic sudden sensorineural hearing loss following salvage intratympanic steroids.
Method
A retrospective review of all patients receiving salvage intratympanic steroid injections for idiopathic sudden sensorineural hearing loss was performed (January 2014 to December 2019). Twenty-two patients were identified, of whom 15 met inclusion criteria. Pre- and post-treatment audiograms were compared with the unaffected ear. Hearing recovery was categorised based on American Academy of Otolaryngology Head and Neck Surgery criteria.
Results
Only 1 patient out of 15 (6.7 per cent) made a partial recovery, and the remainder were non-responders. The median duration of time between symptom onset and first salvage intratympanic steroid treatment was 52 days (range, 14–81 days). No adverse reactions were observed.
Conclusion
‘Real world’ patients with idiopathic sudden sensorineural hearing loss present differently to those in the literature. Sudden sensorineural hearing loss should be diagnosed with care and intratympanic steroid injections initiated early if considered appropriate. Patients should make an informed decision on treatment based on prognostic factors and local success rates.
Few investigations have evaluated the validity of current body composition technology among racially and ethnically diverse populations. This study assessed the validity of common body composition methods in a multi-ethnic sample stratified by race and ethnicity. One hundred and ten individuals (55 % female, age: 26·5 (sd 6·9) years) identifying as Asian, African American/Black, Caucasian/White, Hispanic, Multi-racial and Native American were enrolled. Seven body composition models (dual-energy X-ray absorptiometry (DXA), air displacement plethysmography (ADP), two bioelectrical impedance devices (BIS, IB) and three multi-compartment models) were evaluated against a four-compartment criterion model by assessing total error (TE) and standard error of the estimate. For the total sample, measures of % fat and fat-free mass (FFM) from multi-compartment models were all excellent to ideal (% fat: TE = 0·94–2·37 %; FFM: TE = 0·72–1·78 kg) compared with the criterion. % fat measures were very good to excellent for DXA, ADP and IB (TE = 2·52–2·89 %) and fairly good for BIS (TE = 4·12 %). For FFM, single device estimates were good (BIS; TE = 3·12 kg) to ideal (DXA, ADP, IB; TE = 1·21–2·15 kg). Results did not vary meaningfully between each race and ethnicity, except BIS was not valid for African American/Black, Caucasian/White and Multi-racial participants for % fat (TE = 4·3–4·9 %). The multi-compartment models evaluated can be utilised in a multi-ethnic sample and in each individual race and ethnicity to obtain highly valid results for % fat and FFM. Estimates from DXA, ADP and IB were also valid. The BIS may demonstrate greater TE for all racial and ethnic cohorts and results should be interpreted cautiously.
Initial assessments of coronavirus disease 2019 (COVID-19) preparedness revealed resource shortages and variations in infection prevention policies across US hospitals. Our follow-up survey revealed improvement in resource availability, increase in testing capacity, and uniformity in infection prevention policies. Most importantly, the survey highlighted an increase in staffing shortages and use of travel nursing.