An alternative to an “all or none” approach to contact precautions for patients with MRSA carriage may be a “risk-tailored” approach – using gloves and gowns only for certain high-risk activities, locations, or roles.
Methods:
We distributed a discrete choice experiment to healthcare personnel (HCPs) in three cities. Respondents were presented with eight choice sets, each consisting of two hypothetical policy options for glove and gown use to prevent MRSA transmission. In each comparison, respondents selected their preferred option. Using mixed logit modeling, we calculated the utility derived from each policy component, the probability of uptake for the most favored policies, and heterogeneity in preferences by HCP role.
Results:
In total, 326 HCPs completed the survey; 237 (54%) respondents reported wearing gloves and gowns ‘all the time’ when required. Respondents’ preferred policy, with the highest utility scores, was glove and gown use by all HCP roles (utility, 0.17; 95% CI, 0.12–0.23), in high-risk settings (utility, 0.12; 95% CI, 0.07–0.18), and when touching the patient (utility, 0.11; 95% CI, 0.06–0.17). Sixty-three percent (95% CI, 60–66) would support a risk-tailored approach over an approach in which contact precautions are used by all HCPs in all settings and for all activities. Support varied by HCP role (p < 0.02), with the strongest probability of support from physicians and advanced practice providers (77%; 95% CI, 72%–82%) and the least support from environmental services personnel (45%; 95% CI, 37%–53%).
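As a rough illustration of how utilities estimated in a discrete choice experiment translate into a predicted probability of policy uptake, the sketch below applies the standard logit choice rule to hypothetical part-worth utilities; the attribute names and values are illustrative assumptions, not the study's coefficients.

```python
import numpy as np

# Hypothetical part-worth utilities for the attributes of a risk-tailored policy
# (illustrative values only, not the study's estimates).
utilities = {"all_hcp_roles": 0.17, "high_risk_settings": 0.12, "patient_contact": 0.11}

def choice_probability(u_option_a, u_option_b):
    """Logit probability that option A is chosen over option B."""
    return np.exp(u_option_a) / (np.exp(u_option_a) + np.exp(u_option_b))

u_risk_tailored = sum(utilities.values())  # total utility of the hypothetical policy
u_reference = 0.0                          # reference (all-or-none) policy
print(f"Predicted uptake probability: {choice_probability(u_risk_tailored, u_reference):.2f}")
```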
Conclusions:
This discrete choice survey demonstrates that most HCPs prefer a risk-tailored approach to contact precautions when caring for patients with MRSA.
HOPE (National Institute for Health and Care Research Global Health Research Group on Homelessness and Mental Health in Africa) aims to develop and evaluate interventions that address the unmet needs of people who are homeless and have severe mental illness (SMI) living in three African countries in ways that are rights-based, contextually grounded, scalable and sustainable.
Methods
We will work in the capital city (Addis Ababa) in Ethiopia, a regional city (Tamale) in Ghana, and the capital city (Nairobi) and a rural county (Makueni) in Kenya to understand different approaches to intervention needed across varied settings.
We will be guided by the MRC/NIHR framework on complex interventions and implementation frameworks and emphasise co-production. Formative work will include synthesis of global evidence (systematic review, including grey literature, and a Delphi consensus exercise) on interventions and approaches to homelessness and SMI. We will map contexts; conduct focused ethnography to understand lived experiences of homelessness and SMI; carry out a cross-sectional survey of people who are homeless (n = 750 Ghana/Ethiopia; n = 350 Kenya) to estimate prevalence of SMI and identify prioritised needs; and conduct in-depth interviews and focus group discussions with key stakeholders to understand experiences, challenges and opportunities for intervention. This global and local evidence will feed into Theory of Change (ToC) workshops with stakeholders to establish agreement about valued primary outcomes, map pathways to impact and inform selection and implementation of interventions. Intervention packages to address prioritised needs will be co-produced, piloted and optimised for feasibility and acceptability using participatory action research. We will use rights-based approaches and focus on community-based care to ensure sustainability. Realist approaches will be employed to analyse how contextual variation affects mechanisms and outcomes to inform methods for a subsequent evaluation of larger scale implementation. Extensive capacity-strengthening activities will focus on equipping early career researchers and peer researchers. People with lived experience of SMI and policymakers are an integral part of the research team. Community engagement is supported by working closely with multisectoral Community Advisory Groups.
Conclusions
HOPE will develop evidence to support action to respond to the needs and preferences of people experiencing homelessness and SMI in diverse settings in Africa. We are creating a new partnership of researchers, policymakers, community members and people with lived experience of SMI and homelessness to enable African-led solutions. Key outputs will include contextually relevant practice and policy guidance that supports achievement of inclusive development.
Both the “all” and the “none” approaches to the use of contact precautions for methicillin-resistant Staphylococcus aureus (MRSA) fail to recognize that transmission risk varies. This qualitative study assessed healthcare personnel perspectives regarding the feasibility of a risk-tailored approach to using contact precautions for MRSA more strategically in the acute care setting.
Most students in MD-PhD programs take a leave of absence from medical school to complete PhD training, which leads to a natural loss of clinical skills and knowledge and could negatively impact a student’s clinical knowledge in the long term. To address this concern, clinical refresher courses in the final year of PhD training have traditionally been used; however, the effectiveness of such courses relative to a longitudinal clinical course spanning all PhD training years is unclear.
Methods:
The University of Alabama at Birmingham MD-PhD Program implemented a comprehensive continuing clinical education (CCE) course spanning PhD training years that features three course components: (1) clinical skills; (2) clinical knowledge; and (3) specialty exposure activities. To evaluate course effectiveness, data from an anonymous student survey completed at the end of each semester were analyzed.
Results:
Five hundred and ninety-seven surveys were completed by MD-PhD students from fall 2014 to 2022. Survey responses indicated that the majority of students found the course helpful for maintaining clinical skills and knowledge (544/597, 91%, and 559/597, 94%, respectively), gaining exposure to clinical specialties (568/597, 95%), and preparing them for responsibilities during clinical clerkships. During semesters following COVID-19 pandemic lockdowns, there were significant drops in students’ perceived preparedness.
Conclusions:
Positive student survey feedback and improved preparedness to return to the clinic after development of the course suggest that the CCE course is a useful approach to maintaining clinical knowledge during research training.
Situated within the public will and political will framework, this paper explores frames to address the social issue of gender pay inequity. Specifically, the authors examine whether demographic characteristics affect perceived acceptability of different frames describing gender pay inequity and perceptions of this social issue. First, the authors identified 26 terms used to discuss gender pay inequity; this list was narrowed to 12, representing four categories. Next, the authors solicited sentiment reactions to those frames and perceptions of gender pay inequity. Taken together, the results indicated that although respondents had consistently positive reactions to the frames ‘fair pay’, ‘equal pay’, and ‘pay fairness’, perceptions varied across demographic groups. The largest effects were consistently associated with political party-related variables. One frame, ‘strategic compensation practices’, emerged as a value-neutral frame that could potentially be used to reframe the issue and re-engage business and political stakeholders who do not perceive gender pay inequity as problematic.
Innovative shoe insoles, designed to enhance sensory information on the plantar surface of the feet, could help to improve walking in people with multiple sclerosis.
Objective:
To compare the effects of wearing textured versus smooth insoles on measures of gait, foot sensation and patient-reported outcomes in people with multiple sclerosis.
Methods:
A prospective, randomised controlled trial was conducted with concealed allocation, assessor blinding and intention-to-treat analysis. Thirty ambulant men and women with multiple sclerosis (MS) (Disease Steps rating 1–4) were randomly allocated to wear textured or smooth insoles for 12 weeks. Self-reported insole wear and falls diaries were completed over the intervention period. Laboratory assessments of spatiotemporal gait patterns, foot sensation and proprioception, and patient-reported outcomes, were performed at Weeks 0 (Baseline 1), 4 (Baseline 2) and 16 (Post-Intervention). The primary outcome was the size of the mediolateral base of support (stride/step width) when walking over even and uneven surfaces. Independent t-tests were performed on change from baseline (average of baseline measures) to post-intervention.
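A minimal sketch of the stated analysis (independent t-tests on change from the averaged baseline measures to post-intervention) is given below; the data values, column names and group sizes are invented for illustration, not taken from the trial.

```python
import numpy as np
import pandas as pd
from scipy import stats

rng = np.random.default_rng(42)
# Hypothetical trial data: stride width (cm) at two baselines and post-intervention,
# one row per participant. Column and group names are assumptions.
df = pd.DataFrame({
    "group": ["textured"] * 15 + ["smooth"] * 15,
    "baseline1": rng.normal(10, 1, 30),
    "baseline2": rng.normal(10, 1, 30),
    "post": rng.normal(10, 1, 30),
})

# Change from the averaged baselines to post-intervention, then an independent t-test.
df["change"] = df["post"] - df[["baseline1", "baseline2"]].mean(axis=1)
t, p = stats.ttest_ind(df.loc[df.group == "textured", "change"],
                       df.loc[df.group == "smooth", "change"])
print(f"t = {t:.2f}, p = {p:.3f}")
```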
Results:
There were no between-group differences in stride width when walking over the even or uneven surfaces at post-intervention (P ≥ 0.20). There were also no between-group differences for any secondary outcomes, including gait (all P values > 0.23), foot sensory function (all P values ≥ 0.08) and patient-reported outcomes (all P values ≥ 0.23).
Conclusions:
In our small trial, prolonged wear of textured insoles did not appear to alter walking or foot sensation in people with MS who have limited foot sensory loss. Further investigation is needed to explore optimal insole design.
Clinical Trial Registration:
Australian and New Zealand Clinical Trials Registry (ACTRN12615000421538).
To explore communities’ perspectives on the factors in the social food environment that influence dietary behaviours in African cities.
Design:
A qualitative study using participatory photography (Photovoice). Participants took and discussed photographs representing factors in the social food environment that influence their dietary behaviours. Follow-up in-depth interviews allowed participants to tell the ‘stories’ of their photographs. Thematic analysis was conducted, using data-driven and theory-driven (based on the socio-ecological model) approaches.
Setting:
Three low-income areas, in Nairobi (n 48) in Kenya, and in Accra (n 62) and Ho (n 32) in Ghana.
Participants:
Adolescents and adults, male and female, aged ≥13 years.
Results:
The ‘people’ most commonly reported as influencing dietary behaviours within the social food environment included family members, friends, health workers and food vendors. They mainly influenced food purchase, preparation and consumption through (1) considerations for family members’ food preferences, (2) considerations for family members’ health and nutrition needs, (3) social support from family and friends, (4) provision of nutritional advice and modelling of food behaviour by parents and health professionals, and (5) food vendors’ services and social qualities.
Conclusions:
The family presents an opportunity for promoting healthy dietary behaviours among family members. Peer groups could be harnessed to promote healthy dietary behaviours among adolescents and youth. Empowering food vendors to provide healthier and safer food options could enhance healthier food sourcing, purchasing and consumption in African low-income urban communities.
People diagnosed with a severe mental illness (SMI) are at elevated risk of dying prematurely compared to the general population. We aimed to understand the additional risk among people with SMI after discharge from inpatient psychiatric care, when many patients experience an acute phase of their illness.
Methods
In the Clinical Practice Research Datalink (CPRD) GOLD and Aurum datasets, adults aged 18 years and older who were discharged from psychiatric inpatient care in England between 2001 and 2018 with primary diagnoses of SMI (schizophrenia, bipolar disorder, other psychoses) were matched by age and gender with up to five individuals with SMI and without recent hospital stays. Using survival analysis approaches, cumulative incidence and adjusted hazard ratios were estimated for all-cause mortality, external and natural causes of death, and suicide. All analyses were stratified by younger, middle and older ages and also by gender.
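One conventional way to obtain adjusted hazard ratios from such a matched cohort is a Cox proportional hazards model; the sketch below, using the lifelines package with invented data and variable names, illustrates the general approach rather than the exact models fitted in the study.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 500
# Hypothetical matched-cohort data: exposure (recent psychiatric discharge vs matched
# comparator with SMI), age, follow-up time in days, and a death indicator.
# All variable names and values are assumptions for illustration.
df = pd.DataFrame({
    "recently_discharged": rng.integers(0, 2, n),
    "age": rng.normal(45, 12, n),
})
df["time_days"] = rng.exponential(365 * (1.5 - 0.4 * df["recently_discharged"]))
df["died"] = (rng.random(n) < 0.2).astype(int)

cph = CoxPHFitter()
cph.fit(df, duration_col="time_days", event_col="died")
cph.print_summary()  # exp(coef) column gives adjusted hazard ratios with 95% CIs
```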
Results
In the year after discharge, the risk of dying from all causes examined was higher among recently discharged patients than among individuals with SMI who had not recently received inpatient psychiatric care. Suicide risk was 11.6 times higher (95% CI 6.4–20.9) in the first 3 months and remained greater at 2–5 years after discharge (HR 2.3, 1.7–3.2). This risk elevation remained after adjustment for self-harm in the 6 months prior to the discharge date. The relative risk of dying from natural causes was raised in the first 3 months (HR 1.6, 1.3–1.9), with no evidence of elevation during the second year following discharge.
Conclusions
People with SMI who have recently been discharged from inpatient care face an additional risk of death by suicide and from natural causes, over and above the general risk among people with the same diagnoses who have not recently been treated as inpatients. This mortality gap shows the importance of continued focus on individuals who require inpatient care in the period following discharge.
To assess the burden of respiratory virus coinfections with severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2), this study reviewed 4,818 specimens positive for SARS-CoV-2 and tested using respiratory virus multiplex testing. Coinfections with SARS-CoV-2 were uncommon (2.8%), with enterovirus or rhinovirus as the most prevalent target (88.1%). Respiratory virus coinfection with SARS-CoV-2 remains low 1 year into the coronavirus disease 2019 (COVID-19) pandemic.
This article addresses how French academics, doctors and state bureaucrats formulated sex work as a pathology, an area of inquiry that had to be studied in the interest of public safety. French colonisation in the Levant extended the reach of this ‘expertise’ from the metropole to Lebanon under the guise of public health. Knowledge produced by academics was used to buttress colonial state policy, which demanded that sex workers be contained to protect society against medical contagion. No longer drawing conclusions based on speculation, the medical establishment asserted its authority by harnessing modern advances in science and uniting them with extensive observation. ‘Empirical facts’ replaced ‘opinions’, as doctors forged new approaches to studying and containing venereal disease. They accomplished this through the use of statistics and new methods of diagnosing and treating maladies. Their novel approach was used to treat sex workers and to support commercial sex work policy both at home and abroad. Sex workers became the objects of scientific study and were consequently problematised by the state in medicalised terms.
The critical period for weed control (CPWC) adds value to integrated weed management by identifying the period during which weeds need to be controlled to avoid yield losses exceeding a defined threshold. However, the traditional application of the CPWC does not identify the timing of control needed for weeds that emerge late in the critical period. In this study, CPWC models were developed from field data in high-yielding cotton crops during three summer seasons from 2005 to 2008, using the mimic weed, common sunflower, at densities of two to 20 plants per square meter. Common sunflower plants were introduced at up to 450 growing degree days (GDD) after crop planting and removed at successive 200 GDD intervals after introduction. The CPWC models were described using extended Gompertz and logistic functions that included weed density, time of weed introduction, and time of weed removal (logistic function only) in the relationships. The resulting models defined the CPWC for late-emerging weeds, identifying a period after weed emergence before weed control was required to prevent yield loss exceeding the yield-loss threshold. When weeds emerged in sufficient numbers toward the end of the critical period, the model predicted that crop yield loss resulting from competition by these weeds would not exceed the yield-loss threshold until well after the end of the CPWC. These findings support the traditional practice of ensuring weeds are controlled before crop canopy closure, with later weed control inputs used as required.
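The abstract does not reproduce the extended Gompertz and logistic functions; as a rough sketch of the general CPWC modelling approach, the code below fits a standard Gompertz curve relating relative yield to the weed-free period with scipy. The functional form, data and parameter values are assumptions, not the authors' fitted models, which additionally include weed density and time of weed introduction.

```python
import numpy as np
from scipy.optimize import curve_fit

def gompertz_yield(weed_free_gdd, ymax, b, k):
    """Standard Gompertz form relating relative crop yield (%) to the length of the
    weed-free period (GDD after planting). Illustrative only."""
    return ymax * np.exp(-b * np.exp(-k * weed_free_gdd))

# Hypothetical observations: weed-free period (GDD) versus relative yield (%).
gdd = np.array([0, 100, 200, 400, 600, 900])
rel_yield = np.array([55, 62, 72, 85, 93, 98])

params, _ = curve_fit(gompertz_yield, gdd, rel_yield, p0=[100, 0.6, 0.004])
print(dict(zip(["ymax", "b", "k"], np.round(params, 4))))
```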
We compared the rates of hospital-onset secondary bacterial infections in patients with coronavirus disease 2019 (COVID-19) with rates in patients with influenza and controls, and we investigated reports of increased incidence of Enterococcus infections in patients with COVID-19.
Design:
Retrospective cohort study.
Setting:
An academic quaternary-care hospital in San Francisco, California.
Patients:
Patients admitted between October 1, 2019, and October 1, 2020, with a positive SARS-CoV-2 PCR (N = 314) or influenza PCR (N = 82) within 2 weeks of admission were compared with inpatients without positive SARS-CoV-2 or influenza tests during the study period (N = 14,332).
Methods:
National Healthcare Safety Network definitions were used to identify infection-related ventilator-associated complications (IVACs), probable ventilator-associated pneumonia (PVAP), bloodstream infections (BSIs), and catheter-associated urinary tract infections (CAUTIs). A multiple logistic regression model was used to control for likely confounders.
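A minimal sketch of the stated adjustment approach (multiple logistic regression for a binary infection outcome) is shown below; the variable names, covariates and simulated effects are assumptions for illustration, not the study's model specification.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 1000
# Hypothetical patient-level data: COVID-19 status, ventilator days and age, with a
# binary infection outcome (e.g., IVAC). Names and effect sizes are assumptions.
df = pd.DataFrame({
    "covid": rng.integers(0, 2, n),
    "ventilator_days": rng.poisson(3, n),
    "age": rng.normal(60, 15, n),
})
linear_predictor = -4 + 1.2 * df["covid"] + 0.15 * df["ventilator_days"]
df["ivac"] = (rng.random(n) < 1 / (1 + np.exp(-linear_predictor))).astype(int)

model = smf.logit("ivac ~ covid + ventilator_days + age", data=df).fit()
print(np.exp(model.params))      # adjusted odds ratios
print(np.exp(model.conf_int()))  # 95% confidence intervals
```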
Results:
COVID-19 patients had significantly higher rates of IVAC and PVAP compared to controls, with adjusted odds ratios of 4.7 (95% confidence interval [CI], 1.7–13.9) and 10.4 (95% CI, 2.1–52.1), respectively. COVID-19 patients had a higher incidence of BSI due to Enterococcus but not of BSI generally, and whole-genome sequencing of Enterococcus isolates demonstrated that nosocomial transmission did not explain the increased rate. Subanalyses of patients admitted to the intensive care unit and patients who required mechanical ventilation revealed similar findings.
Conclusions:
COVID-19 is associated with an increased risk of IVAC, PVAP, and Enterococcus BSI compared with hospitalized controls, which is not fully explained by factors such as immunosuppressive treatments and duration of mechanical ventilation. The mechanism underlying increased rates of Enterococcus BSI in COVID-19 patients requires further investigation.
The mental health status of indigenous people in Bangladesh has attracted little or no attention. The objective of the present study is to determine the extent of symptoms of anxiety and depression in the two largest indigenous communities in Bangladesh.
Methods
In total, 240 participants were recruited, 120 from each of the Marma and Chakma communities, with an overall mean age of 44.09 years (s.d. 15.73). Marma participants were older (mean ages 48.92 v. 39.25, p < 0.001). Participants completed the Anxiety Scale (AS) and Depression Scale (DS), which were developed and standardised in Bangladesh in the Bangla (Bengali) language.
Results
Results indicated that anxiety and depression scores were elevated in both communities, with 59.2% of participants scoring above the cut-off for clinical significance on the AS and 58.8% scoring above the cut-off on the DS. Marma people were more anxious (M = 59.49 v. 43.00, p < 0.001) and more depressed (M = 106.78 v. 82.30, p < 0.001) than Chakma people. The demographic variables of age, sex and socioeconomic status were weakly or inconsistently related to scores. Among the Marma, females scored higher on both the AS and the DS, but in the Chakma community, males scored higher on the AS and the same on the DS.
Conclusion
The finding of significant anxiety and depression in communities with such limited mental health services is a matter of concern and emphasises the need to formulate and implement appropriate mental health policies for indigenous people in Bangladesh and other parts of the world.
Various host and parasite factors interact to determine the outcome of infection. We investigated the effects of two factors on the within-host dynamics of malaria in mice: initial infectious dose and co-infection with a helminth that limits the availability of red blood cells (RBCs). Using a statistical, time-series approach to model the within-host ‘epidemiology’ of malaria, we found that increasing initial dose reduced the time to peak cell-to-cell parasite propagation, but also reduced its magnitude, while helminth co-infection delayed peak cell-to-cell propagation, except at the highest malaria doses. Using a mechanistic model of within-host infection dynamics, we identified dose-dependence in parameters describing host responses to malaria infection and uncovered a plausible explanation of the observed differences in single vs co-infections. Specifically, in co-infections, our model predicted a higher background death rate of RBCs. However, at the highest dose, when intraspecific competition between malaria parasites would be highest, these effects of co-infection were not observed. Such interactions between initial dose and co-infection, although difficult to predict a priori, are key to understanding variation in the severity of disease experienced by hosts and could inform studies of malaria transmission dynamics in nature, where co-infection and low doses are the norm.
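The mechanistic within-host model is not specified in the abstract; the sketch below is a generic red-blood-cell/parasite model solved with scipy, meant only to illustrate the kind of dynamics such models track, for example how raising the background RBC death rate (as a stand-in for helminth co-infection) or the initial parasite abundance (dose) changes trajectories. All equations and parameter values are assumptions, not the authors' fitted model.

```python
import numpy as np
from scipy.integrate import solve_ivp

def within_host(t, y, supply, mu, beta, alpha):
    """Generic within-host model (illustrative only).
    R: uninfected red blood cells; I: parasitised cells."""
    R, I = y
    dR = supply - mu * R - beta * R * I   # RBC replenishment, background death, invasion
    dI = beta * R * I - alpha * I         # newly parasitised cells, clearance
    return [dR, dI]

# Hypothetical parameters; a higher mu can mimic helminth co-infection, and the
# initial value of I plays the role of the infectious dose.
sol = solve_ivp(within_host, (0, 30), [1e7, 1e3],
                args=(2.5e5, 0.025, 1e-7, 0.5))
print(sol.y[:, -1])  # RBC and parasitised-cell abundances at day 30
```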
To describe interfacility transfer communication (IFTC) methods for notification of multidrug-resistant organism (MDRO) status in a diverse sample of acute-care hospitals.
Design:
Cross-sectional survey.
Participants:
Hospitals within the Society for Healthcare Epidemiology of America (SHEA) Research Network (SRN).
Methods:
SRN members completed an electronic survey on protocols and methods for IFTC. We assessed differences in IFTC frequency, barriers, and perceived benefit by presence of an IFTC protocol.
Results:
Among 136 hospital representatives who were sent the survey, 54 (40%) responded, of whom 72% reported having an IFTC protocol in place. The presence of a protocol did not differ significantly by hospital size, academic affiliation, or international status. Of those with IFTC protocols, 44% reported consistent notification of MDRO status (>75% of the time) to receiving facilities, as opposed to 13% from those with no IFTC protocol (P = .04). Respondents from hospitals with IFTC protocols reported significantly fewer barriers to communication compared to those without (2.8 vs 4.3; P = .03). Overall, however, most respondents (56%) reported a lack of standardization in communication. Presence of an IFTC protocol did not affect whether respondents perceived IFTC protocols as having a significant impact on infection prevention or antimicrobial stewardship.
Conclusions:
Most respondents reported having an IFTC protocol, which was associated with reduced communication barriers at transfer. Standardization of protocols and clarity about expectations for sending and receipt of information related to MDRO status may facilitate IFTC and promote appropriate and timely infection prevention practices.
Energy deficit is common during prolonged periods of strenuous physical activity and limited sleep, but the extent to which appetite suppression contributes is unclear. The aim of this randomised crossover study was to determine the effects of energy balance on appetite and physiological mediators of appetite during a 72-h period of high physical activity energy expenditure (about 9·6 MJ/d (2300 kcal/d)) and limited sleep designed to simulate military operations (SUSOPS). Ten men consumed an energy-balanced diet while sedentary for 1 d (REST) followed by energy-balanced (BAL) and energy-deficient (DEF) controlled diets during SUSOPS. Appetite ratings, gastric emptying time (GET) and appetite-mediating hormone concentrations were measured. Energy balance was positive during BAL (18 (sd 20) %) and negative during DEF (–43 (sd 9) %). Relative to REST, hunger, desire to eat and prospective consumption ratings were all higher during DEF (26 (sd 40) %, 56 (sd 71) %, 28 (sd 34) %, respectively) and lower during BAL (–55 (sd 25) %, −52 (sd 27) %, −54 (sd 21) %, respectively; P for condition < 0·05). Fullness ratings did not differ from REST during DEF, but were 65 (sd 61) % higher during BAL (P for condition < 0·05). Regression analyses predicted that hunger and prospective consumption would be reduced and fullness increased if energy balance was maintained during SUSOPS, and that energy deficits of ≥25 % would be required to elicit increases in appetite. Between-condition differences in GET and appetite-mediating hormones identified slowed gastric emptying, increased anorexigenic hormone concentrations and decreased fasting acylated ghrelin concentrations as potential mechanisms of appetite suppression. Findings suggest that physiological responses that suppress appetite may deter energy balance from being achieved during prolonged periods of strenuous activity and limited sleep.
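As a hedged sketch of how such a regression can yield a deficit threshold, the code below fits a simple line relating energy balance to the change in hunger rating and solves for the balance at which the predicted change crosses zero; the data points and single-predictor form are assumptions, not the study's analysis.

```python
import numpy as np

# Hypothetical observations: energy balance (% of requirement) versus change in hunger
# rating relative to REST. All values are invented for illustration.
energy_balance = np.array([-45, -40, -38, -35, 15, 18, 20, 25])
hunger_change = np.array([30, 22, 28, 18, -50, -58, -52, -62])

# Ordinary least squares: hunger_change ~ slope * energy_balance + intercept.
slope, intercept = np.polyfit(energy_balance, hunger_change, 1)

# Energy balance at which the predicted change in hunger crosses zero, i.e. the
# deficit beyond which appetite would be expected to rise.
threshold = -intercept / slope
print(f"Appetite predicted to increase once energy balance falls below {threshold:.0f}%")
```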
Clarifying the relationship between depression symptoms and cardiometabolic and related health could identify risk factors and treatment targets. The objective of this study was to assess whether depression symptoms in midlife are associated with the subsequent onset of cardiometabolic health problems.
Methods
The study sample comprised 787 male twin veterans with polygenic risk score data who participated in the Harvard Twin Study of Substance Abuse (‘baseline’) and the longitudinal Vietnam Era Twin Study of Aging (‘follow-up’). Depression symptoms were assessed at baseline [mean age 41.42 years (s.d. = 2.34)] using the Diagnostic Interview Schedule, Version III, Revised. The onset of eight cardiometabolic conditions (atrial fibrillation, diabetes, erectile dysfunction, hypercholesterolemia, hypertension, myocardial infarction, sleep apnea, and stroke) was assessed via self-reported doctor diagnosis at follow-up [mean age 67.59 years (s.d. = 2.41)].
Results
Total depression symptoms were longitudinally associated with incident diabetes (OR 1.29, 95% CI 1.07–1.57), erectile dysfunction (OR 1.32, 95% CI 1.10–1.59), hypercholesterolemia (OR 1.26, 95% CI 1.04–1.53), and sleep apnea (OR 1.40, 95% CI 1.13–1.74) over 27 years after controlling for age, alcohol consumption, smoking, body mass index, C-reactive protein, and polygenic risk for specific health conditions. In sensitivity analyses that excluded somatic depression symptoms, only the association with sleep apnea remained significant (OR 1.32, 95% CI 1.09–1.60).
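For readers unfamiliar with how such odds ratios and confidence intervals are derived, the short sketch below shows the standard conversion from a logistic-regression coefficient and its standard error; the numbers are hypothetical, chosen only to produce an odds ratio of a similar magnitude to those reported.

```python
import numpy as np

def odds_ratio_ci(beta, se, z=1.96):
    """Convert a logistic-regression coefficient and its standard error into an
    odds ratio with a 95% confidence interval."""
    return np.exp(beta), np.exp(beta - z * se), np.exp(beta + z * se)

# Hypothetical coefficient and standard error, shown only to illustrate the arithmetic.
or_est, ci_lo, ci_hi = odds_ratio_ci(beta=0.255, se=0.098)
print(f"OR {or_est:.2f} (95% CI {ci_lo:.2f}-{ci_hi:.2f})")
```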
Conclusions
A history of depression symptoms by early midlife is associated with an elevated risk for subsequent development of several self-reported health conditions. When isolated, non-somatic depression symptoms are associated with incident self-reported sleep apnea. Depression symptom history may be a predictor or marker of cardiometabolic risk over decades.
Gravitational waves from coalescing neutron stars encode information about nuclear matter at extreme densities, inaccessible to laboratory experiments. The late inspiral is influenced by the presence of tides, which depend on the neutron star equation of state. Neutron star mergers are expected to often produce rapidly rotating remnant neutron stars that emit gravitational waves, which will provide clues to the extremely hot post-merger environment. This signature of nuclear matter in gravitational waves carries most of its information in the 2–4 kHz frequency band, which lies outside the most sensitive band of current detectors. We present the design concept and science case for a Neutron Star Extreme Matter Observatory (NEMO): a gravitational-wave interferometer optimised to study nuclear physics with merging neutron stars. The concept uses high circulating laser power, quantum squeezing, and a detector topology specifically designed to achieve the high-frequency sensitivity necessary to probe nuclear matter using gravitational waves. Above 1 kHz, the proposed strain sensitivity is comparable to that of full third-generation detectors at a fraction of the cost. Such sensitivity would increase the expected detection rate of post-merger remnants from approximately one per few decades with two A+ detectors to a few per year, and would potentially allow the first gravitational-wave observations of supernovae, isolated neutron stars, and other exotica.