To meet the specific education needs of ethics committee members (primarily full-time healthcare professionals), the Regional Ethics Department of Kaiser Permanente Northern California (KPNCAL) and Washington State University's Elson S. Floyd College of Medicine have partnered to create a one-academic-year Medical Ethics Certificate Program. Reflecting its mission-driven origins, the KPNCAL-WSU Certificate Program was designed to be a low-cost, high-quality option for busy full-time practitioners who might not otherwise pursue additional education.
This article discusses the specific competency-focused methodologies and pedagogies adopted, as well as the permanent changes the Certificate Program made in response to the global pandemic. It also discusses in detail one of the Program's signature features, its Practicum—an extensive simulated clinical ethics consultation that places students in the role of ethics consultant, facilitating a conflict between family members played by paid professional actors. The article concludes with survey responses from Program alumni gathered as part of a quality study.
Access to local, population-specific, and timely data is vital to understanding the factors that affect population health. The impact of place (neighborhood, census tract, and city) is particularly important in understanding the Social Determinants of Health. The University of Rochester Medical Center's Clinical and Translational Science Institute created the web-based tool RocHealthData.org to provide access to thousands of geographically displayed, publicly available health-related datasets. The site has also hosted a variety of locally curated datasets (e.g., COVID-19 vaccination rates and community-derived health indicators), helping to set community priorities and influence outcomes. Usage statistics (available through Google Analytics) show that returning visitors had a lower bounce rate (the rate of leaving a site after a single page access) and spent longer at the site than new visitors. Of the 1,033 currently registered users, 51.7% were from within our host university, 20.1% were from another educational institution, and 28.2% identified as community members. Our assessments indicate that these data are useful and valued across a variety of domains. Continued site improvement depends on new sources of locally relevant data, as well as increased usage of data beyond our local region.
With persistent incidence, incomplete vaccination rates, confounding respiratory illnesses, and few therapeutic interventions available, COVID-19 continues to be a burden on the pediatric population. During a surge, it is difficult for hospitals to direct limited healthcare resources effectively. While the overwhelming majority of pediatric infections are mild, there have been life-threatening exceptions that illuminated the need to proactively identify pediatric patients at risk of severe COVID-19 and other respiratory infectious diseases. However, a nationwide capability for developing validated computational tools to identify pediatric patients at risk using real-world data does not exist.
Methods:
HHS ASPR BARDA sought, through a competitive challenge, to create computational models addressing two clinically important questions using the National COVID Cohort Collaborative: (1) Of pediatric patients who test positive for COVID-19 in an outpatient setting, who are at risk for hospitalization? (2) Of pediatric patients who test positive for COVID-19 and are hospitalized, who are at risk for needing mechanical ventilation or cardiovascular interventions?
Results:
This challenge was the first multi-agency, coordinated computational challenge carried out by the federal government in response to a public health emergency. Fifty-five computational models were evaluated across the two tasks, and two winners and three honorable mentions were selected.
Conclusion:
This challenge serves as a framework for how the government, research communities, and large data repositories can be brought together to source solutions when resources are strained during a pandemic.
Face masks reduce disease transmission by protecting the wearer from inhaled pathogens and by reducing the emission of infectious aerosols. Although methods for quantifying wearer-protection efficiency are established, current methods for assessing face mask containment efficiency rely on measuring the low concentration of aerosols emitted by an infected or noninfected individual.
Methods:
A small port enabled the introduction of 0.05 µm sodium chloride particles at a constant rate behind the mask worn by a study participant. A condensation particle counter monitored ambient particle numbers 60 cm in front of the participant over 3-minute periods of rest, speaking, and coughing. The containment efficiency (%) for each mask and procedure was calculated as follows: 100 × (1 − average ambient concentration with face covering worn/average ambient concentration with a sham face covering in place). The protection efficiency (%) was also measured using previously published methods. The probability of transmission (%) from infected to uninfected (a function of both the containment efficiency and the protection efficiency) was calculated as follows: {1 − (containment efficiency/100)}×{1 − (protection efficiency/100)}×100.
Results:
The average containment efficiencies for each mask over all procedures and repeated measures were 94.6%, 60.9%, 38.8%, and 43.2%, respectively, for the N95 mask, the KN95 mask, the procedure face mask, and the gaiter. The corresponding protection efficiencies were 99.0%, 63.7%, 45.3%, and 24.2%, respectively. For example, the transmission probability for 1 infected and 1 uninfected individual in close proximity was ∼14.2% when both wore KN95 masks, compared with 36%–39% when only 1 individual wore a KN95 mask.
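As a quick check on the Methods formula, the following minimal Python sketch (not the study's code; the function and variable names are ours) reproduces the KN95 figures above:

```python
# Illustrative sketch of the transmission-probability arithmetic from the
# Methods, using the KN95 values reported in the Results.

def transmission_probability(containment_pct: float, protection_pct: float) -> float:
    """P(transmission) = {1 - CE/100} x {1 - PE/100} x 100, per the Methods."""
    return (1 - containment_pct / 100) * (1 - protection_pct / 100) * 100

KN95_CONTAINMENT = 60.9  # % containment efficiency (Results)
KN95_PROTECTION = 63.7   # % protection efficiency (Results)

# Both individuals wear KN95 masks:
print(transmission_probability(KN95_CONTAINMENT, KN95_PROTECTION))  # ~14.2

# Only one individual is masked (the unmasked role contributes no reduction):
print(transmission_probability(KN95_CONTAINMENT, 0.0))  # ~39.1
print(transmission_probability(0.0, KN95_PROTECTION))   # ~36.3
```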
Conclusion:
Overall, we detected a good correlation between the protection and containment that a face covering afforded to a wearer.
The success of agriculture relies on healthy bees to pollinate crops. Commercially managed pollinators are often kept under temperature-controlled conditions to better control development and optimize field performance. One such pollinator, the alfalfa leafcutting bee, Megachile rotundata, is the most widely used solitary bee in agriculture. Problematically, very little is known about the thermal physiology of M. rotundata or the consequences of the artificial thermal regimes used in commercial management practices. Therefore, we took a broad look at the thermal performance of M. rotundata across development and the effects of commonly used commercial thermal regimes on adult bee physiology. We hypothesized that, after the termination of diapause, thermal sensitivity would vary across pupal metamorphosis. Our data show that bees in the post-diapause quiescent stage were more tolerant of low temperatures than bees in active development. We found that commercial practices applied during development decrease the likelihood of a bee recovering from another bout of thermal stress in adulthood, thereby decreasing their resilience. Lastly, commercial regimes applied during development affected the number of days to adult emergence, but the time of day that adults emerged was unaffected. Our data demonstrate the complex interactions between bee development and the thermal regimes used in management. This knowledge can help improve the commercial management of these bees by optimizing the thermal regimes used and the timing of their application to alleviate negative downstream effects on adult performance.
The purpose of this document is to highlight practical recommendations to assist acute-care hospitals in prioritization and implementation of strategies to prevent healthcare-associated infections through hand hygiene. This document updates the Strategies to Prevent Healthcare-Associated Infections in Acute Care Hospitals through Hand Hygiene, published in 2014. This expert guidance document is sponsored by the Society for Healthcare Epidemiology of America (SHEA). It is the product of a collaborative effort led by SHEA, the Infectious Diseases Society of America, the Association for Professionals in Infection Control and Epidemiology, the American Hospital Association, and The Joint Commission, with major contributions from representatives of a number of organizations and societies with content expertise.
This study compares various morphometric features of two strains of broilers, selected and ‘relaxed’ (ie random-bred), raised under two feeding regimes, ad-libitum-fed and restricted-fed. We consider the possible consequences of the different body shapes on the musculoskeletal system. The ad-libitum-fed selected birds reached heavier bodyweights at younger ages, had wider girths, and developed large amounts of breast muscle which probably displaced their centre of gravity cranially. At cull weight, they had shorter legs than birds in the other groups and greater thigh-muscle masses; therefore, greater forces would have to be exerted by shorter lever arms in order to move the body. The tarsometatarsi were broader, providing increased resistance to greater loads, but the bones had a lower calcium and phosphorus content, which would theoretically make them weaker. Many of these morphological changes are likely to have detrimental effects on the musculoskeletal system and therefore compromise the walking ability and welfare of the birds.
This study tests the hypothesis that growth rate and bodyweight affect walking ability in broilers by comparing objective measurements of the spatial and temporal gait parameters of several groups of birds. Two strains of birds were used (relaxed and selected), raised on two feeding regimes (ad-libitum and restricted), and culled at the same final bodyweight (commercial cull weight of 2.4 kg). The ad-libitum-fed selected birds walked more slowly, with lower cadences, and took shorter steps. The steps were wider, and the toes were pointed outwards, resulting in a wider walking base. They kept their feet in contact with the ground for longer periods, having longer percentage stance times, shorter percentage swing times and increased double-contact times compared to the relaxed birds. These changes serve to increase stability during walking and are a likely consequence of the morphological changes in the selected broiler — in particular, the rapid growth of breast muscle moving the centre of gravity forward, and the relatively short legs compared to their bodyweight (see Corr et al, pp 145-157, this issue). This altered gait would be very inefficient and would rapidly tire the birds, and could help to explain the low level of activity seen in the modern broiler.
Late-life depression (LLD) is characterized by differences in resting state functional connectivity within and between intrinsic functional networks. This study examined whether clinical improvement to antidepressant medications is associated with pre-randomization functional connectivity in intrinsic brain networks.
Methods
Participants were 95 elders aged 60 years or older with major depressive disorder. After clinical assessments and baseline MRI, participants were randomized to escitalopram or placebo with a two-to-one allocation for 8 weeks. Non-remitting participants subsequently entered an 8-week trial of open-label bupropion. The main clinical outcome was depression severity measured by the Montgomery-Åsberg Depression Rating Scale (MADRS). Resting state functional connectivity was measured between a priori key seeds in the default mode (DMN), cognitive control, and limbic networks.
Results
In primary analyses of blinded data, lower post-treatment MADRS score was associated with higher resting connectivity between: (a) posterior cingulate cortex (PCC) and left medial prefrontal cortex; (b) PCC and subgenual anterior cingulate cortex (ACC); (c) right medial PFC and subgenual ACC; (d) right orbitofrontal cortex and left hippocampus. Lower post-treatment MADRS was further associated with lower connectivity between: (e) the right orbitofrontal cortex and left amygdala; and (f) left dorsolateral PFC and left dorsal ACC. Secondary analyses associated mood improvement on escitalopram with anterior DMN hub connectivity. Exploratory analyses of the bupropion open-label trial associated improvement with subgenual ACC, frontal, and amygdala connectivity.
Conclusions
Response to antidepressants in LLD is related to connectivity in the DMN, cognitive control, and limbic networks. Future work should focus on whether clinical markers of network connectivity can inform prognosis.
As COVID-19 was declared a health emergency in March 2020, there was immense demand for information about the novel pathogen. This paper examines the clinician-reported impact of Project ECHO COVID-19 Clinical Rounds on clinician learning. The primary sources of study data were Continuing Medical Education (CME) surveys for each session from March 24, 2020 to July 30, 2020 and impact surveys conducted in November 2020, which sought to understand participants' overall assessment of the sessions. Quantitative analyses included descriptive statistics and Mann-Whitney testing. Qualitative data were analyzed through inductive thematic analysis. Clinicians rated their knowledge after each session as significantly higher than before it. Of the clinicians, 75.8% reported they would 'definitely' or 'probably' use content gleaned from each attended session, and clinicians reported specific clinical and operational changes made as a direct result of the sessions. Of respondents, 94.6% reported that COVID-19 Clinical Rounds helped them provide better care to patients, and 89% indicated they 'strongly agree' that they would join ECHO calls again. COVID-19 Clinical Rounds offers a promising model for the establishment of dynamic peer-to-peer tele-mentoring communities for low- or no-notice response where scientifically tested or clinically verified practice evidence is limited.
This study aimed to determine the probability of hearing recovery in patients with idiopathic sudden sensorineural hearing loss following salvage intratympanic steroids.
Method
A retrospective review of all patients receiving salvage intratympanic steroid injections for idiopathic sudden sensorineural hearing loss was performed (January 2014 to December 2019). Twenty-two patients were identified, of whom 15 met inclusion criteria. Pre- and post-treatment audiograms were compared with the unaffected ear. Hearing recovery was categorised based on American Academy of Otolaryngology Head and Neck Surgery criteria.
Results
Only 1 patient out of 15 (6.7 per cent) made a partial recovery, and the remainder were non-responders. The median duration of time between symptom onset and first salvage intratympanic steroid treatment was 52 days (range, 14–81 days). No adverse reactions were observed.
Conclusion
‘Real world’ patients with idiopathic sudden sensorineural hearing loss present differently to those in the literature. Sudden sensorineural hearing loss should be diagnosed with care and intratympanic steroid injections initiated early if considered appropriate. Patients should make an informed decision on treatment based on prognostic factors and local success rates.
Few investigations have evaluated the validity of current body composition technology among racially and ethnically diverse populations. This study assessed the validity of common body composition methods in a multi-ethnic sample stratified by race and ethnicity. One hundred and ten individuals (55 % female, age: 26·5 (sd 6·9) years) identifying as Asian, African American/Black, Caucasian/White, Hispanic, Multi-racial and Native American were enrolled. Seven body composition models (dual-energy X-ray absorptiometry (DXA), air displacement plethysmography (ADP), two bioelectrical impedance devices (BIS, IB) and three multi-compartment models) were evaluated against a four-compartment criterion model by assessing total error (TE) and standard error of the estimate. For the total sample, measures of % fat and fat-free mass (FFM) from multi-compartment models were all excellent to ideal (% fat: TE = 0·94–2·37 %; FFM: TE = 0·72–1·78 kg) compared with the criterion. % fat measures were very good to excellent for DXA, ADP and IB (TE = 2·52–2·89 %) and fairly good for BIS (TE = 4·12 %). For FFM, single device estimates were good (BIS; TE = 3·12 kg) to ideal (DXA, ADP, IB; TE = 1·21–2·15 kg). Results did not vary meaningfully across races and ethnicities, except that BIS was not valid for African American/Black, Caucasian/White and Multi-racial participants for % fat (TE = 4·3–4·9 %). The multi-compartment models evaluated can be utilised in a multi-ethnic sample and in each individual race and ethnicity to obtain highly valid results for % fat and FFM. Estimates from DXA, ADP and IB were also valid. The BIS may demonstrate greater TE for all racial and ethnic cohorts and results should be interpreted cautiously.
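As context for the validity metrics above, the following minimal Python sketch shows how total error and the standard error of the estimate are conventionally computed against a criterion method in the body-composition literature; it is illustrative only (the data below are hypothetical, not the study's), and the authors' exact definitions may differ.

```python
import numpy as np

def total_error(predicted: np.ndarray, criterion: np.ndarray) -> float:
    """Total error (TE): root mean squared deviation from the criterion."""
    return float(np.sqrt(np.mean((predicted - criterion) ** 2)))

def standard_error_of_estimate(predicted: np.ndarray, criterion: np.ndarray) -> float:
    """SEE: residual error from regressing the criterion on the prediction."""
    slope, intercept = np.polyfit(predicted, criterion, 1)
    residuals = criterion - (slope * predicted + intercept)
    return float(np.sqrt(np.sum(residuals ** 2) / (residuals.size - 2)))

# Hypothetical % fat values from a single device vs. the four-compartment
# criterion (illustrative numbers only, not data from this study):
device = np.array([18.2, 25.4, 31.0, 22.7, 28.9, 35.1])
four_comp = np.array([17.5, 26.1, 30.2, 23.5, 29.8, 33.9])

print(f"TE  = {total_error(device, four_comp):.2f} % fat")
print(f"SEE = {standard_error_of_estimate(device, four_comp):.2f} % fat")
```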
This SHEA white paper identifies knowledge gaps and challenges in healthcare epidemiology research related to coronavirus disease 2019 (COVID-19), with a focus on core principles of healthcare epidemiology. These gaps, revealed during the worst phases of the COVID-19 pandemic, are described in 10 sections: epidemiology, outbreak investigation, surveillance, isolation precaution practices, personal protective equipment (PPE), environmental contamination and disinfection, drug and supply shortages, antimicrobial stewardship, healthcare personnel (HCP) occupational safety, and return-to-work policies. Each section highlights three critical healthcare epidemiology research questions, with detailed descriptions provided in the supplementary materials. This research agenda calls for translational studies, from laboratory-based basic science research to well-designed, large-scale studies and health outcomes research. Research gaps and challenges related to nursing homes and social disparities are included. Collaboration across disciplines, areas of expertise, and diverse geographic locations will be critical.
Necrotising otitis externa is a progressive infection of the external auditory canal which extends to affect the temporal bone and adjacent structures. Progression of the disease process can result in serious sequelae, including cranial nerve palsies and death. There is currently no formal published treatment guideline.
Objective
This study aimed to integrate current evidence and data from our own retrospective case series in order to develop a guideline to optimise necrotising otitis externa patient management.
Methods
A retrospective review of necrotising otitis externa cases within NHS Lothian, Scotland, between 2013 and 2018, was performed, along with a PubMed review.
Results
Prevalent presenting signs, symptoms and patient demographic data were established. Furthermore, features of cases associated with adverse outcomes were defined. A key feature of the guideline is the identification of at-risk patients, who receive initial intensive treatment; investigations and outcomes are then assessed, and treatment is adjusted appropriately.
Conclusion
This multi-departmental approach has facilitated the development of a succinct, systematic guideline for the management of necrotising otitis externa. Initial patient outcomes appear promising.
To update current estimates of non–device-associated pneumonia (ND pneumonia) rates and their frequency relative to ventilator-associated pneumonia (VAP), and to identify risk factors for ND pneumonia.
Design:
Cohort study.
Setting:
Academic teaching hospital.
Patients:
All adult hospitalizations between 2013 and 2017 were included. Pneumonia cases (device-associated and non–device-associated) were captured through comprehensive, hospital-wide active surveillance using CDC definitions and methodology.
Results:
From 2013 to 2017, there were 163,386 hospitalizations (97,485 unique patients) and 771 pneumonia cases (520 ND pneumonia and 191 VAP). The rate of ND pneumonia remained stable, with 4.15 and 4.54 ND pneumonia cases per 10,000 hospitalization days in 2013 and 2017, respectively (P = .65). In 2017, 74% of pneumonia cases were ND pneumonia. Male sex and increasing age were both associated with increased risk of ND pneumonia. Additionally, patients with chronic bronchitis or emphysema (hazard ratio [HR], 2.07; 95% confidence interval [CI], 1.40–3.06), congestive heart failure (HR, 1.48; 95% CI, 1.07–2.05), or paralysis (HR, 1.72; 95% CI, 1.09–2.73) were also at increased risk, as were those who were immunosuppressed (HR, 1.54; 95% CI, 1.18–2.00) or in the ICU (HR, 1.49; 95% CI, 1.06–2.09). We did not detect a change in ND pneumonia risk with use of chlorhexidine mouthwash, total parenteral nutrition, any of the medications of interest, or prior ventilation.
Conclusion:
The incidence rate of ND pneumonia did not change from 2013 to 2017, and 3 of 4 nosocomial pneumonia cases were non–device associated. Hospital infection prevention programs should consider expanding the scope of surveillance to include non-ventilated patients. Future research should continue to look for modifiable risk factors and should assess potential prevention strategies.
To update current estimates of non–device-associated urinary tract infection (ND-UTI) rates and their frequency relative to catheter-associated UTIs (CA-UTIs) and to identify risk factors for ND-UTIs.
Design:
Cohort study.
Setting:
Academic teaching hospital.
Patients:
All adult hospitalizations between 2013 and 2017 were included. UTIs (device and non-device associated) were captured through comprehensive, hospital-wide active surveillance using Centers for Disease Control and Prevention case definitions and methodology.
Results:
From 2013 to 2017 there were 163,386 hospitalizations (97,485 unique patients) and 1,273 UTIs (715 ND-UTIs and 558 CA-UTIs). The rate of ND-UTIs remained stable, decreasing slightly from 6.14 to 5.57 ND-UTIs per 10,000 hospitalization days during the study period (P = .15). However, the proportion of UTIs that were non–device related increased from 52% to 72% (P < .0001). Female sex (hazard ratio [HR], 1.94; 95% confidence interval [CI], 1.50–2.50) and increasing age were associated with increased ND-UTI risk. Additionally, the following conditions were associated with increased risk: peptic ulcer disease (HR, 2.25; 95% CI, 1.04–4.86), immunosuppression (HR, 1.48; 95% CI, 1.15–1.91), trauma admissions (HR, 1.36; 95% CI, 1.02–1.81), total parenteral nutrition (HR, 1.99; 95% CI, 1.35–2.94) and opioid use (HR, 1.62; 95% CI, 1.10–2.32). Urinary retention (HR, 1.41; 95% CI, 0.96–2.07), suprapubic catheterization (HR, 2.28; 95% CI, 0.88–5.91), and nephrostomy tubes (HR, 2.02; 95% CI, 0.83–4.93) may also increase risk, but estimates were imprecise.
Conclusion:
Greater than 70% of UTIs are now non–device associated. Current targeted surveillance practices should be reconsidered in light of this changing landscape. We identified several modifiable risk factors for ND-UTIs, and future research should explore the impact of prevention strategies that target these factors.
We examined the epidemiology of invasive meningococcal disease (IMD) in the Republic of Ireland (ROI) between epidemiological year (EY) 1996/1997 and EY2015/2016. Over the 20 EYs, 3707 cases were reported, with annual incidence rates per 100 000 peaking at 11.6 in EY1999/2000 and decreasing significantly to 1.5 in EY2015/2016. The highest disease burden was in infants and children aged <5 years, whereas adults aged ⩾65 years experienced the highest case fatality ratio (CFR), at 15.7%, although over the study period the median annual CFR remained low (4.4%). Meningococcal serogroup B (menB) dominated (78%), followed by menC (17%), menW (1%) and menY (1%). The incidence of menC IMD declined significantly in all age groups after menC vaccine introduction in 2000. MenB incidence also declined over the 20 EYs, with decreasing trends in all age groups under 65, including an almost 50% decrease in infants over the final four EYs. IMD incidence in the ROI has declined, partly attributable to the success of menC vaccination, coupled with a spontaneous decline in menB. However, recent gradual increases in non-menB IMD and the introduction of vaccines targeting menB demand continued detailed surveillance to accurately monitor trends and to assess vaccine impact.
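For reference, the incidence and case fatality measures reported above follow the standard surveillance definitions, sketched below in Python; the counts used are hypothetical, not the study's data.

```python
# Standard surveillance definitions (illustrative sketch; hypothetical counts,
# not data from the Irish IMD study).

def incidence_per_100k(cases: int, population: int) -> float:
    """Annual incidence rate per 100 000 population."""
    return cases / population * 100_000

def case_fatality_ratio(deaths: int, cases: int) -> float:
    """Case fatality ratio (CFR) as a percentage of reported cases."""
    return 100 * deaths / cases

# Example: 70 cases and 3 deaths in a population of 4.7 million.
print(f"{incidence_per_100k(70, 4_700_000):.1f} per 100 000")  # ~1.5
print(f"CFR = {case_fatality_ratio(3, 70):.1f}%")              # ~4.3%
```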
Molecular characterization of pediatric low-grade glioma (pLGG) over the last decade has identified recurrent alterations, most commonly involving BRAF and less frequently other pathways, including MYB and MYBL1. Many of these molecular markers have been exploited clinically to aid in diagnosis and treatment decisions. However, their frequency and prognostic significance remain unknown. Further, a significant portion of cases harbor none of these alterations, and what underlies those cases remains unknown. To address this, we compiled a cohort of 562 patients diagnosed at SickKids from 1990-2017. We identified molecular alterations in 454 cases (81% of the cohort). The most frequent events were those involving BRAF, either as fusions (most commonly with KIAA1549; 30%) or as V600E mutations (17%), and NF-1 (22%). Less frequently, we identified recurrent FGFR1 fusions and mutations (3%), MYB/MYBL alterations (2%), and H3F3A K27M (2%) or IDH1 R132H (0.5%) mutations, as well as other novel rare events. Survival analysis revealed significantly better progression-free survival (PFS) and overall survival (OS) in patients with KIAA1549-BRAF fusions than in those with BRAF V600E, with 10-year OS of 97.7% (95% CI, 95.5-100) and 83.9% (95% CI, 72.5-95.6), respectively. In addition to survival, molecular alterations predicted differences in response to conventional therapeutics: BRAF-fused patients showed a 46% response rate, versus only 14% in V600E patients. pLGGs harboring H3F3A K27M progressed early, with a median PFS of 11 months. In patients with MYB/MYBL1 or FGFR1/FGFR2 alterations, we observed only one death (an FGFR1 N546K case). This work represents the largest cohort of pLGGs with molecular profiling and an assessment of its impact on the clinical behaviour of the disease.
Thirty pyrite samples from a wide range of localities were analysed using relative comparator and k0 neutron activation analysis (NAA) techniques at the University of Missouri Research Reactor, Columbia, Missouri, USA (MURR) and the Australian Nuclear Science and Technology Organisation, Lucas Heights, NSW, Australia (ANSTO), respectively. Statistical analyses of the trace-element data produced by the two methods showed a generally good correlation, with the majority of elemental concentrations of paired data reported by MURR and ANSTO being indistinguishable at a 0.05 significance level. Trace-element analyses of pyrite from Navajún in Spain by both techniques compare well with published data. There is evidence for contamination by Al, Na and Ti in one set of samples; this is likely to have been introduced by contact with a plastic used in sample preparation.