Ward round quality is a pivotal component of surgical care and is intimately associated with patient outcomes. Despite this, ward rounds remain largely understudied and underrepresented in medical literature. Accurate and thorough ward round documentation is known to improve communication and patient outcomes and to reduce hospital expenditure. This study aimed to determine the accuracy of ward round documentation.
Methods
A prospective observational cohort study was performed as a sub-analysis of a larger study by reviewing 135 audiovisual recordings of surgical ward rounds over two years at two hospitals. The recordings were transcribed verbatim, and content was designated a level of importance by an external reviewer. This was then compared to the written case notes to determine the accuracy and importance of omitted documentation. Patient age, sex, and length of stay, as well as the senior doctor leading and the intern documenting the ward round, were assessed using multivariable linear mixed-effect models to determine their impact on documentation accuracy.
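As an illustration of the kind of analysis described above (not the authors' code), the sketch below fits a linear mixed-effects model of documentation accuracy with patient age, sex, length of stay, and senior doctor as fixed effects and the documenting intern as a random grouping factor; the data file and column names are hypothetical.

```python
# Minimal sketch (illustrative, not the study code): a linear mixed-effects model of
# documentation accuracy with a random intercept per intern, assuming a tidy
# per-round data frame with hypothetical column names.
import pandas as pd
import statsmodels.formula.api as smf

rounds = pd.read_csv("ward_rounds.csv")  # hypothetical file: one row per reviewed round

# Fixed effects: patient age, sex, length of stay, senior doctor leading the round.
# Random effect: intern documenting the round (grouping factor).
model = smf.mixedlm(
    "accuracy_pct ~ age + C(sex) + length_of_stay + C(senior_doctor)",
    data=rounds,
    groups=rounds["intern_id"],
)
result = model.fit()
print(result.summary())
```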
Results
Nearly one-third (32.4%) of spoken information on the surgical ward round that was deemed “important”, including discharge plans and bookings for surgery, was absent from the patients’ electronic medical records. Additionally, in 11% of case notes there was a major conflict between the ward round discussion and what was documented. Younger patients (p=0.04) and patients who had been on the ward longer (p=0.005) were less likely to have accurate documentation. Some interns were significantly worse at documenting discussions than others (p<0.0001). Day of the week, location, and the senior doctor present did not affect documentation accuracy.
Conclusions
This study demonstrates that a significant amount of important discussion during surgical ward rounds regarding patient care is not recorded accurately, or at all, in the patient medical record. This can lead to preventable patient complications and longer hospital stays, resulting in increased strain on hospital resources. This study emphasizes the need for further research to address this problem.
Special education enrollment increased in Flint following the 2014–2015 Flint Water Crisis, but lead exposure is not plausibly responsible. Labeling Flint children as lead poisoned and/or brain damaged may have contributed to rising special education needs (ie, nocebo effect). To better document this possibility, we surveyed schoolteachers and reviewed neuropsychological assessments of children for indications of negative labeling.
Methods
A survey of Flint and Detroit (control) public schoolteachers using a modified Illness Perception Questionnaire was conducted 5 years post-crisis. We also examined neuropsychological assessments from a recently settled class lawsuit.
Results
Relative to Detroit (n = 24), Flint teachers (n = 11) believed that a higher proportion of their students had harmful lead exposure (91.8% Flint vs 46% Detroit; P = 0.00034), were lead poisoned (51.3% vs 24.3%; P = 0.018), or brain damaged (28.8% vs 12.9%; P = 0.1), even though blood lead of Flint children was always less than half of that of Detroit children. Neuropsychological assessments diagnosed lead poisoning and/or brain damage from water lead exposure in all tested children (n = 8), even though none had evidence of elevated blood lead and a majority had prior learning disability diagnoses.
Conclusion
Teachers’ responses and neuropsychological assessments suggest Flint children were harmed by a nocebo effect.
Cross-species evidence suggests that the ability to exert control over a stressor is a key dimension of stress exposure that may sensitize frontostriatal-amygdala circuitry to promote more adaptive responses to subsequent stressors. The present study examined neural correlates of stressor controllability in young adults. Participants (N = 56; mean age = 23.74 years, range = 18–30) completed either the controllable or uncontrollable stress condition of the first of two novel stressor controllability tasks during functional magnetic resonance imaging (fMRI) acquisition. Participants in the uncontrollable stress condition were yoked to age- and sex-matched participants in the controllable stress condition. All participants were subsequently exposed to uncontrollable stress in the second task, which is the focus of fMRI analyses reported here. A whole-brain searchlight classification analysis revealed that patterns of activity in the right dorsal anterior insula (dAI) during subsequent exposure to uncontrollable stress could be used to classify participants' initial exposure to either controllable or uncontrollable stress with a peak of 73% accuracy. Previous experience of exerting control over a stressor may change the computations performed within the right dAI during subsequent stress exposure, shedding further light on the neural underpinnings of stressor controllability.
Anticholinergic medications block cholinergic transmission. The central effects of anticholinergic drugs can be particularly marked in patients with dementia. Furthermore, anticholinergics antagonise the effects of cholinesterase inhibitors, the main dementia treatment.
Objectives
This study aimed to assess anticholinergic drug prescribing among dementia patients before and after admission to UK acute hospitals.
Methods
A total of 352 patients with dementia were included from 17 hospitals in the UK. All were admitted to surgical, medical or Care of the Elderly wards in 2019. Information about patients’ prescriptions was recorded on a standardised form. An evidence-based online calculator was used to calculate the anticholinergic drug burden of each patient. The correlation between anticholinergic burden scores on admission and at discharge was tested with Spearman’s rank correlation.
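The paired admission/discharge comparison could, in principle, be computed as in the following sketch (illustrative only, with made-up example scores), using Spearman's rank correlation from SciPy.

```python
# Minimal sketch (illustrative only): Spearman rank correlation between
# anticholinergic burden scores on admission and at discharge, assuming
# paired scores per patient.
from scipy.stats import spearmanr

admission_scores = [0, 1, 3, 0, 2, 1]   # hypothetical example data
discharge_scores = [1, 1, 4, 0, 3, 1]

rho, p_value = spearmanr(admission_scores, discharge_scores)
print(f"Spearman rho = {rho:.3f}, p = {p_value:.2g}")
```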
Results
Table 1 shows patient demographics. On admission, 37.8% of patients had an anticholinergic burden score ≥1 and 5.68% had a score ≥3. At discharge, 43.2% of patients had a score ≥1 and 9.1% had a score ≥3. The increase was statistically significant (rho = 0.688; p = 2.2 × 10⁻¹⁶). The most common group of anticholinergic medications prescribed at discharge was psychotropics (see Figure 1). Among patients prescribed cholinesterase inhibitors, 44.9% were also taking anticholinergic medications.
Conclusions
This multicentre cross-sectional study found that people with dementia are frequently prescribed anticholinergic drugs, even if also taking cholinesterase inhibitors, and are significantly more likely to be discharged with a higher anticholinergic drug burden than on admission to hospital.
Conflict of interest
This project was planned and executed by the authors on behalf of SPARC (Student Psychiatry Audit and Research Collaborative). We thank the National Student Association of Medical Research for allowing us use of the Enketo platform. Judith Harrison was su
Recent cannabis exposure has been associated with lower rates of neurocognitive impairment in people with HIV (PWH). Cannabis’s anti-inflammatory properties may underlie this relationship by reducing chronic neuroinflammation in PWH. This study examined relations between cannabis use and inflammatory biomarkers in cerebrospinal fluid (CSF) and plasma, and cognitive correlates of these biomarkers within a community-based sample of PWH.
Methods:
A total of 263 individuals were categorized into four groups: HIV− non-cannabis users (n = 65), HIV+ non-cannabis users (n = 105), HIV+ moderate cannabis users (n = 62), and HIV+ daily cannabis users (n = 31). Differences in pro-inflammatory biomarkers (IL-6, MCP-1/CCL2, IP-10/CXCL10, sCD14, sTNFR-II, TNF-α) by study group were determined by Kruskal–Wallis tests. Multivariable linear regressions examined relationships between biomarkers and seven cognitive domains, adjusting for age, sex/gender, race, education, and current CD4 count.
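For readers unfamiliar with these procedures, the sketch below illustrates (with hypothetical column names, not the study's code) a Kruskal–Wallis comparison of a CSF biomarker across study groups, followed by a covariate-adjusted linear regression of a cognitive domain score on that biomarker.

```python
# Minimal sketch (not the study code): group comparison of a biomarker with
# Kruskal-Wallis, then a covariate-adjusted linear regression of a cognitive
# domain score on that biomarker. Column names are hypothetical.
import pandas as pd
from scipy.stats import kruskal
import statsmodels.formula.api as smf

df = pd.read_csv("biomarkers.csv")  # hypothetical file: one row per participant

# Kruskal-Wallis test of CSF MCP-1 across the four study groups.
groups = [g["csf_mcp1"].dropna() for _, g in df.groupby("study_group")]
h_stat, p_value = kruskal(*groups)
print(f"Kruskal-Wallis H = {h_stat:.2f}, p = {p_value:.3f}")

# Multivariable linear regression: learning score on CSF MCP-1, adjusting for
# age, sex/gender, race, education, and current CD4 count.
fit = smf.ols(
    "learning_score ~ csf_mcp1 + age + C(sex) + C(race) + education + cd4_count",
    data=df,
).fit()
print(fit.summary())
```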
Results:
HIV+ daily cannabis users showed lower MCP-1 and IP-10 levels in CSF compared to HIV+ non-cannabis users (p = .015; p = .039) and were similar to HIV− non-cannabis users. Plasma biomarkers showed no differences by cannabis use. Among PWH, lower CSF MCP-1 and lower CSF IP-10 were associated with better learning performance (all ps < .05).
Conclusions:
Current daily cannabis use was associated with lower levels of pro-inflammatory chemokines implicated in HIV pathogenesis and these chemokines were linked to the cognitive domain of learning which is commonly impaired in PWH. Cannabinoid-related reductions of MCP-1 and IP-10, if confirmed, suggest a role for medicinal cannabis in the mitigation of persistent inflammation and cognitive impacts of HIV.
Hemiparetic walking after stroke is typically slow, asymmetric, and inefficient, significantly impacting activities of daily living. Extensive research shows that functional, intensive, and task-specific gait training is instrumental for effective gait rehabilitation, characteristics that our group aims to encourage with soft robotic exosuits. However, standard clinical assessments may lack the precision and frequency to detect subtle changes in intervention efficacy during both conventional and exosuit-assisted gait training, potentially impeding targeted therapy regimes. In this paper, we use exosuit-integrated inertial sensors to reconstruct three clinically meaningful gait metrics related to circumduction, foot clearance, and stride length. Our method corrects sensor drift using instantaneous information from both sides of the body. This approach makes our method robust to irregular walking conditions poststroke as well as usable in real-time applications, such as real-time movement monitoring, exosuit assistance control, and biofeedback. We validate our algorithm in eight people poststroke in comparison to lab-based optical motion capture. Mean errors were below 0.2 cm (9.9%) for circumduction, −0.6 cm (−3.5%) for foot clearance, and 3.8 cm (3.6%) for stride length. A single-participant case study shows our technique’s promise in daily-living environments by detecting exosuit-induced changes in gait while walking in a busy outdoor plaza.
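The authors' bilateral drift-correction algorithm is not reproduced here; the sketch below shows only a generic, simplified version of stride-wise drift correction, in which the residual velocity at each assumed foot-flat event is linearly removed over the preceding stride. All signals and event indices are synthetic.

```python
# Generic illustration only (not the authors' bilateral method): stride-wise drift
# correction of an integrated foot velocity signal, assuming the foot is stationary
# at each detected foot-flat event and linearly removing the residual drift.
import numpy as np

def correct_stride_drift(velocity, foot_flat_idx):
    """Within each stride, linearly remove the velocity residual remaining at the
    next foot-flat event, where the true foot velocity should be ~zero."""
    corrected = velocity.copy()
    for start, end in zip(foot_flat_idx[:-1], foot_flat_idx[1:]):
        residual = corrected[end]                      # drift accumulated over the stride
        ramp = np.linspace(0.0, 1.0, end - start + 1)  # 0 at stride start, 1 at stride end
        corrected[start:end + 1] -= ramp * residual
    return corrected

# Hypothetical usage: a synthetic forward-velocity signal with linear drift and
# foot-flat events assumed every 100 samples.
t = np.linspace(0, 5, 500)
velocity = np.sin(2 * np.pi * t) + 0.05 * t
foot_flat_idx = np.arange(0, 500, 100)
clean_velocity = correct_stride_drift(velocity, foot_flat_idx)
```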
The UK has longstanding problems with psychiatry recruitment. Various initiatives aim to improve psychiatry's image among medical students, but few involve research and none are student-led. Providing opportunities to take part in psychiatry research and quality improvement could increase the number of students who choose to enter the speciality.
Objectives
We have developed the student psychiatry audit and research collaborative (SPARC), a student-led initiative for nationwide collaboration in high-quality research and audits.
Methods
Our model is inspired by the success of the UK Student Audit and Research in Surgery collaborative (STARSurg). Area teams, located in medical schools, take part in multi-centre projects. The area teams consist of medical students, who have the main responsibility for collecting data; a junior doctor, to supervise the process; and a consultant, with overall responsibility for patient care. The data are collected centrally and analysed by a team of medical students and doctors. Student leads from each site are named authors on resulting papers. All other students are acknowledged and are able to present the work.
Results
We have completed our first audits in Cardiff and London; other sites will return data in 2017. Student feedback indicated a high level of satisfaction with the project and interest in psychiatry as a future career.
Conclusions
This initiative aims to tackle the recruitment problems in psychiatry by giving students a chance to take part in high-quality research and audits.
Disclosure of interest
The authors have not supplied their declaration of competing interest.
Mechanistic models (MMs) have served as causal pathway analysis and ‘decision-support’ tools within animal production systems for decades. Such models quantitatively define how a biological system works based on causal relationships and use that cumulative biological knowledge to generate predictions and recommendations (in practice) and generate/evaluate hypotheses (in research). Their limitations revolve around obtaining sufficiently accurate inputs, user training and accuracy/precision of predictions on-farm. The new wave in digitalization technologies may negate some of these challenges. New data-driven (DD) modelling methods such as machine learning (ML) and deep learning (DL) examine patterns in data to produce accurate predictions (forecasting, classification of animals, etc.). The deluge of sensor data and new self-learning modelling techniques may address some of the limitations of traditional MM approaches – access to input data (e.g. sensors) and on-farm calibration. However, most of these new methods lack transparency in the reasoning behind predictions, in contrast to MMs, which have historically been used to translate knowledge into wisdom. The objective of this paper is to propose means to hybridize these two seemingly divergent methodologies to advance the models we use in animal production systems and support movement towards truly knowledge-based precision agriculture. In order to identify potential niches for models in animal production of the future, a cross-species (dairy, swine and poultry) examination of the current state of the art in MM and new DD methodologies (ML, DL analytics) is undertaken. We hypothesize that there are several ways via which synergy may be achieved to advance both our predictive capabilities and system understanding, namely: (1) building and utilizing data streams (e.g. intake, rumination behaviour, rumen sensors, activity sensors, environmental sensors, cameras and near IR) to apply MM in real time and/or with new resolution and capabilities; (2) hybridization of MM and DD approaches where, for example, an ML framework is augmented by MM-generated parameters or predicted outcomes; and (3) hybridization of the MM and DD approaches, where biological bounds are placed on parameters within an MM framework, and the DD system parameterizes the MM for individual animals, farms or other such clusters of data. As animal systems modellers, we should expand our toolbox to explore new DD approaches and big data to find opportunities to increase understanding of biological systems, find new patterns in data and move the field towards intelligent, knowledge-based precision agriculture systems.
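To make hybridization route (2) concrete, the following conceptual sketch (an assumption-laden toy example, not an established implementation) feeds the output of a simple mechanistic intake equation into a gradient-boosting regressor alongside raw sensor features, so that the data-driven layer learns the structure the mechanistic model misses.

```python
# Conceptual sketch (toy assumptions throughout): hybridizing a mechanistic model (MM)
# with a data-driven model by using the MM prediction as an extra feature for a
# gradient-boosting regressor (hybridization route 2).
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

def mechanistic_intake_model(body_weight_kg, milk_yield_kg):
    """Toy MM: predicted dry-matter intake (kg/d) from a simple empirical equation
    standing in for a full causal model."""
    return 0.025 * body_weight_kg + 0.1 * milk_yield_kg

rng = np.random.default_rng(0)
n = 500
body_weight = rng.normal(650, 60, n)   # hypothetical cow body weights (kg)
milk_yield = rng.normal(30, 8, n)      # hypothetical milk yields (kg/d)
activity = rng.normal(400, 80, n)      # hypothetical sensor feature (steps/h)

mm_prediction = mechanistic_intake_model(body_weight, milk_yield)
observed_intake = mm_prediction + 0.002 * activity + rng.normal(0, 0.8, n)  # synthetic target

# The DD layer learns residual structure the MM misses, using the MM output
# alongside raw sensor features.
X = np.column_stack([mm_prediction, body_weight, milk_yield, activity])
hybrid = GradientBoostingRegressor().fit(X, observed_intake)
print("Hybrid R^2 (in-sample):", round(hybrid.score(X, observed_intake), 3))
```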
The aim of this study was to assess the impact of a urinary tract infection (UTI) management bundle to reduce the treatment of asymptomatic bacteriuria (AB) and to improve the management of symptomatic UTIs.
Design
Before-and-after intervention study.
Settings
Tertiary-care hospital.
Patients
Consecutive sample of inpatients with positive single or mixed-predominant urine cultures collected and reported while admitted to the hospital.
Methods
The UTI management bundle consisted of nursing and prescriber education, modification of the reporting of positive urine cultures, and pharmacists’ prospective audit and feedback. A retrospective chart review of consecutive inpatients with positive urinary cultures was performed before and after implementation of the management bundle.
Results
Prior to the implementation of the management bundle, 276 patients met eligibility criteria for chart review. Of these 276 patients, 165 (59·8%) were found to have AB; of these 165 patients with AB, 111 (67·3%) were treated with antimicrobials. Moreover, 268 patients met eligibility criteria for postintervention review. Of these 268, 133 patients (49·6%) were found to have AB; of these 133 with AB, 22 (16·5%) were treated with antimicrobials. Thus, a 75·5% reduction of AB treatment was achieved. Educational components of the bundle resulted in a substantial decrease in nonphysician-directed urine sample submission. Adherence to a UTI management algorithm improved substantially in the intervention period, with a notable decrease in fluoroquinolone prescription for empiric UTI treatment.
Conclusions
A UTI management bundle resulted in a dramatic improvement in the management of urinary tract infection, particularly a reduction in the treatment of AB and improved management of symptomatic UTI.
The construction of zirconium (Zr) budgets for metamorphic reactions in high-grade rocks provides new insight into zircon growth during metamorphism. In this study we target reactions involving garnet, as they enable zircon growth to be related to known pressure and temperature conditions. Two reactions involving the breakdown of Zr-bearing garnet from Rogaland, SW Norway have been investigated in detail, showing contrasting behaviour of Zr, with zircon formation being subject to the solubility of Zr in product phases. In the decompression reaction garnet + sillimanite + quartz → cordierite, Zr released during garnet breakdown cannot be incorporated into the cordierite structure, resulting in zircon nucleation and growth. In contrast, for the reaction garnet + biotite + sillimanite + quartz → osumilite + orthopyroxene + spinel + magnetite, no new zircon growth takes place, despite the garnet involved containing more than double the Zr concentration of the former reaction. In the latter case, all the Zr released by garnet breakdown can be detected in the product phases osumilite and orthopyroxene, thereby preventing growth of new metamorphic zircon. This study highlights the potential for high resolution geochronology in metamorphic rocks by relating zircon growth to specific metamorphic reactions.
Introduction: Emergency departments (ED) across Canada acknowledge the need to transform in order to provide high quality care for the increasing proportion of older patients presenting for treatment. Older people are more complex than younger ED users. They have a disproportionately high use of EDs, increased rates of hospitalization, and are more likely to suffer adverse events. The objective of this initiative was to develop minimum standards for the care of older people in the emergency department. Methods: We created a panel of international leaders in geriatrics and emergency medicine to develop a policy framework on minimum standards for care of older people in the ED. We conducted a literature review of international guidelines, frameworks, recommendations, and best practices for the acute care of older people and developed a draft standards document. This preliminary document was circulated to interdisciplinary members of the International Federation of Emergency Medicine (IFEM) geriatric emergency medicine (GEM) group. Following review, the standards were presented to the IFEM clinical practice group. At each step, verbal, written and online feedback were gathered and integrated into the final minimum standards document. Results: Following the developmental process, a series of eight minimum standard statements were created and accepted by IFEM. These standards utilise the IFEM Framework for Quality and Safety in the ED, and are centred on the recognition that older people are a core population of emergency health service users whose care needs are different from those of children and younger adults. They cover key areas, including the overall approach to older patients, the physical environment and equipment, personnel and training, policies and protocols, and strategies for navigating the health-care continuum. Conclusion: These standards aim to improve the evaluation, management and integration of care of older people in the ED in an effort to improve outcomes. The minimum standards represent a first step on which future activities can be built, including the development of specific indicators for each of the minimum standards. The standards are designed to apply across the spectrum of EDs worldwide, and it is hoped that they will act as a catalyst to change.
In autumn 2014, enterovirus D68 (EV-D68) cases presenting with severe respiratory or neurological disease were described in countries worldwide. To describe the epidemiology and virological characteristics of EV-D68 in England, we collected clinical information on laboratory-confirmed EV-D68 cases detected in secondary care (hospitals), between September 2014 and January 2015. In primary care (general practitioners), respiratory swabs collected (September 2013–January 2015) from patients presenting with influenza-like illness were tested for EV-D68. In secondary care 55 EV-D68 cases were detected. Among those, 45 cases had clinical information available and 89% (40/45) presented with severe respiratory symptoms. Detection of EV-D68 among patients in primary care increased from 0.4% (4/1074; 95% CI 0.1–1.0) (September 2013–January 2014) to 0.8% (11/1359; 95% CI 0.4–1.5) (September 2014–January 2015). Characterization of EV-D68 strains circulating in England since 2012 and up to winter 2014/2015 indicated that those strains were genetically similar to those detected in 2014 in USA. We recommend reinforcing enterovirus surveillance through screening respiratory samples of suspected cases.
Field research was conducted for 2 yr to evaluate response of corn and rice to simulated drift rates of a commercial premix of imazethapyr plus imazapyr [3:1 (w/w)]. Drift rates of the imazethapyr plus imazapyr premix represented 0.8, 1.6, 3.2, 6.3, and 12.5% of the usage rate of 63 g ai/ha (0.5, 1, 2, 4, and 7.9 g/ha, respectively). The imazethapyr plus imazapyr premix applied to six-leaf corn at 7.9 g/ha reduced height 11% compared with the nontreated control 7 days after treatment (DAT) but did not affect corn height 14 and 28 DAT. Corn yield was equivalent regardless of imazethapyr plus imazapyr rate and ranged from 10,200 to 11,500 kg/ha. At 28 DAT, rice height was reduced 12% when 7.9 g/ha of the imazethapyr plus imazapyr premix was applied early postemergence (EPOST) at two- to three-leaf and 14 and 5% when the imazethapyr plus imazapyr premix at 7.9 and 4 g/ha, respectively, was applied late postemergence (LPOST) at panicle differentiation. Reductions in mature rice height of 11 and 6% were observed when the imazethapyr plus imazapyr premix was applied LPOST at 7.9 and 4 g/ha, respectively, and a 5% reduction was observed for 7.9 g/ha of the imazethapyr plus imazapyr premix applied EPOST. Application of the imazethapyr plus imazapyr premix EPOST at 7.9 g/ha delayed heading in only 1 yr, but heading was delayed both years when applied LPOST. Rice yield was reduced 39 and 16% when the imazethapyr plus imazapyr premix was applied LPOST at 7.9 and 4 g/ha, respectively, compared with a 9% yield reduction for 7.9 g/ha applied EPOST.
Aberrant microbiota composition and function have been linked to several pathologies, including type 2 diabetes. In animal models, prebiotics induce favourable changes in the intestinal microbiota, intestinal permeability (IP) and endotoxaemia, which are linked to concurrent improvement in glucose tolerance. This is the first study to investigate the link between IP, glucose tolerance and intestinal bacteria in human type 2 diabetes. In all, twenty-nine men with well-controlled type 2 diabetes were randomised to a prebiotic (galacto-oligosaccharide mixture) or placebo (maltodextrin) supplement (5·5 g/d for 12 weeks). Intestinal microbial community structure, IP, endotoxaemia, inflammatory markers and glucose tolerance were assessed at baseline and post intervention. IP was estimated by the urinary recovery of oral 51Cr-EDTA and glucose tolerance by insulin-modified intravenous glucose tolerance test. Intestinal microbial community analysis was performed by high-throughput next-generation sequencing of 16S rRNA amplicons and quantitative PCR. Prebiotic fibre supplementation had no significant effects on clinical outcomes or bacterial abundances compared with placebo; however, changes in the bacterial family Veillonellaceae correlated inversely with changes in glucose response and IL-6 levels (r −0·90, P=0·042 for both) following prebiotic intake. The absence of significant changes to the microbial community structure at a prebiotic dosage/length of supplementation shown to be effective in healthy individuals is an important finding. We propose that concurrent metformin treatment and the high heterogeneity of human type 2 diabetes may have played a significant role. The current study does not provide evidence for the role of prebiotics in the treatment of type 2 diabetes.
The horse is a non-ruminant herbivore adapted to eating plant-fibre or forage-based diets. Some horses are stabled for most or the majority of the day with limited or no access to fresh pasture and are fed preserved forage typically as hay or haylage and sometimes silage. This raises questions with respect to the quality and suitability of these preserved forages (considering production, nutritional content, digestibility as well as hygiene) and required quantities. Especially for performance horses, forage is often replaced with energy dense feedstuffs which can result in a reduction in the proportion of the diet that is forage based. This may adversely affect the health, welfare, behaviour and even performance of the horse. In the past 20 years a large body of research work has contributed to a better and deeper understanding of equine forage needs and the physiological and behavioural consequences if these are not met. Recent nutrient requirement systems have incorporated some, but not all, of this new knowledge into their recommendations. This review paper amalgamates recommendations based on the latest understanding in forage feeding for horses, defining forage types and preservation methods, hygienic quality, feed intake behaviour, typical nutrient composition, digestion and digestibility as well as health and performance implications. Based on this, consensual applied recommendations for feeding preserved forages are provided.
This study uses a field experiment involving 251 adult participants to determine which messages related to climate change, extreme weather events, and decaying infrastructure are most effective in encouraging people to pay more for investments that could alleviate future water-quality risks. The experiment also assesses whether people prefer the investments to be directed toward gray or green infrastructure projects. Messages about global warming-induced climate change and decaying infrastructure lead to larger contributions than messages about extreme weather events. The results suggest that people are likely to pay more for green infrastructure projects than for gray infrastructure projects.
To assess whether diet quality before or during pregnancy predicts adverse pregnancy and birth outcomes in a sample of Australian women.
Design
The Dietary Questionnaire for Epidemiological Studies was used to calculate diet quality using the Australian Recommended Food Score (ARFS) methodology modified for pregnancy.
Setting
A population-based cohort participating in the Australian Longitudinal Study on Women’s Health (ALSWH).
Subjects
A national sample of Australian women, aged 20–25 and 31–36 years, who were classified as preconception or pregnant when completing Survey 3 or Survey 5 of the ALSWH, respectively. The 1907 women with biologically plausible energy intake estimates were included in regression analyses of associations between preconception and pregnancy ARFS and subsequent pregnancy outcomes.
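The adjusted odds ratios reported in the results could be obtained with a logistic regression along the lines of the following sketch (hypothetical column names and confounders, not the study's code).

```python
# Minimal sketch (not the study code): logistic regression of gestational
# hypertension on diet-quality (ARFS) category with confounder adjustment,
# reporting odds ratios with 95% confidence intervals. Column names are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

cohort = pd.read_csv("alswh_subset.csv")  # hypothetical analysis file

fit = smf.logit(
    "gest_hypertension ~ C(arfs_quartile) + age + C(smoking) + bmi + parity",
    data=cohort,
).fit()

odds_ratios = pd.DataFrame({
    "OR": np.exp(fit.params),
    "2.5%": np.exp(fit.conf_int()[0]),
    "97.5%": np.exp(fit.conf_int()[1]),
})
print(odds_ratios)
```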
Results
Preconception and pregnancy groups were combined as no significant differences were detected for total and component ARFS. Women with gestational hypertension, compared with those without, had lower scores for total ARFS, vegetable, fruit, grain and nuts/bean/soya components. Women with gestational diabetes had a higher score for the vegetable component only, and women who had a low-birth-weight infant had lower scores for total ARFS and the grain component, compared with those who did not report these outcomes. Women with the highest ARFS had the lowest odds of developing gestational hypertension (OR=0·4; 95 % CI 0·2, 0·7) or delivering a child of low birth weight (OR=0·4; 95 % CI 0·2, 0·9), which remained significant for gestational hypertension after adjustment for potential confounders.
Conclusions
A high-quality diet before and during pregnancy may reduce the risk of gestational hypertension for the mother.