Objectives/Goals: To assess theory of mind and empathy in adolescents with Tourette syndrome (TS) and examine their association with social problems. This study aims to extend research in social cognition to an adolescent cohort with TS and identify a potential modifiable risk factor for social problems in TS that may serve as a novel intervention target. Methods/Study Population: We will enroll 50 adolescents with TS (ages 11–17) and 50 demographically matched controls along with one parent to complete a single in-person study visit. Adolescents with TS will be recruited through the Vanderbilt Center for TS and other Tic disorders. Controls will be recruited using university listservs and flyers posted in community and primary care settings. Adolescents will complete the NEPSY-II to assess theory of mind abilities and the Multifaceted Empathy Test – Juvenile to assess empathy with negative emotions. Parents will complete the Child Behavior Checklist to assess adolescent social problems. Results/Anticipated Results: Based on evidence of low self-other distinction in TS, we hypothesize TS adolescents will make more errors about the mental states of others (theory of mind) and report greater emotional reactions to faces (empathy) compared to controls. Further, greater social problems will be associated with greater disturbances in social cognition. To date, 15 adolescents with TS and 15 matched controls have completed the assessment (67% male; Mage = 14.33 in both groups). Within this sample, adolescents with TS experienced more social problems than controls (Cohen’s d = .74, p = .03). There were no between-group differences in theory of mind or empathy in this pilot sample. However, higher levels of both theory of mind and empathy were linked to experiencing greater social problems in the TS sample only (p’s < .05). 
Discussion/Significance of Impact: Preliminary findings suggest that while social cognition did not differ between groups, TS adolescents exhibiting high levels of theory of mind and empathy appear to struggle socially. This work could inform future interventions by highlighting the need to focus on social cognition and how these skills translate into social behaviors.
Although cognitive remediation (CR) improves cognition and functioning, the key features that promote or inhibit its effectiveness, especially across cognitive domains, remain unknown. Identifying these key features would help to develop more impactful CR.
Aim
To identify interrelations between cognition, symptoms, and functioning using a novel network analysis approach, and to examine how CR affects these recovery outcomes.
Methods
A secondary analysis of randomized controlled trial data (N = 165) of CR in early psychosis. Regularized partial correlation networks were estimated, including symptoms, cognition, and functioning, for pre-, post-treatment, and change over time. Pre- and post-CR networks were compared on global strength, structure, edge invariance, and centrality invariance.
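A minimal sketch of the core network-estimation step, using simulated data standing in for the trial variables: partial correlations are derived from the inverse covariance (precision) matrix, and "global strength" sums the absolute edge weights. The study used regularized (lasso-penalized) networks; this unregularized numpy version omits that penalty.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated data: 165 participants x 6 variables (symptom, cognition,
# functioning scores). Illustrative only; not the trial data.
X = rng.standard_normal((165, 6))

cov = np.cov(X, rowvar=False)
prec = np.linalg.inv(cov)          # precision (inverse covariance) matrix

# Partial correlation between variables i and j, controlling for all others:
# pcor_ij = -prec_ij / sqrt(prec_ii * prec_jj)
d = np.sqrt(np.diag(prec))
pcor = -prec / np.outer(d, d)
np.fill_diagonal(pcor, 0.0)        # no self-edges in the network

# "Global strength" of a network: sum of absolute edge weights
global_strength = np.abs(np.triu(pcor, k=1)).sum()
```

Comparing pre- and post-treatment networks on this global-strength quantity (and on edge-by-edge structure) is the basis of the network comparison reported in the Results.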
Results
Cognition, negative symptoms, and positive symptoms were separable constructs, with symptoms showing independent relationships with cognition. Negative symptoms were central to the CR networks and most strongly associated with change in functioning. Verbal and visual learning improvement showed independent relationships to improved social functioning and negative symptoms. Only visual learning improvement was positively associated with personal goal achievement. Pre- and post-CR networks did not differ in structure (M = 0.20, p = 0.45) but differed in global strength, reflecting greater overall connectivity in the post-CR network (S = 0.91, p = 0.03).
Conclusions
Negative symptoms influenced network changes following therapy, and their reduction was linked to improvement in verbal and visual learning following CR. Independent relationships between visual and verbal learning and functioning suggest that they may be key intervention targets to enhance social and occupational functioning.
Evidence for necrotising otitis externa (NOE) diagnosis and management is limited, and outcome reporting is heterogeneous. International best practice guidelines were used to develop consensus diagnostic criteria and a core outcome set (COS).
Methods
The study was pre-registered on the Core Outcome Measures in Effectiveness Trials (COMET) database. Systematic literature review identified candidate items. Patient-centred items were identified via a qualitative study. Items and their definitions were refined by multidisciplinary stakeholders in a two-round Delphi exercise and subsequent consensus meeting.
Results
The final COS incorporates 36 items within 12 themes: Signs and symptoms; Pain; Advanced Disease Indicators; Complications; Survival; Antibiotic regimes and side effects; Patient comorbidities; Non-antibiotic treatments; Patient compliance; Duration and cessation of treatment; Relapse and readmission; Multidisciplinary team management.
Consensus diagnostic criteria include 12 items within 6 themes: Signs and symptoms (oedema, otorrhoea, granulation); Pain (otalgia, nocturnal otalgia); Investigations (microbiology [does not have to be positive], histology [malignancy excluded], positive CT and MRI); Persistent symptoms despite local and/or systemic treatment for at least two weeks; At least one risk factor for impaired immune response; Indicators of advanced disease (not obligatory but must be reported when present at diagnosis). Stakeholders were unanimous that there is no role for secondary, graded, or optional diagnostic items. The consensus meeting identified themes for future research.
Conclusion
The adoption of consensus-defined diagnostic criteria and COS facilitates standardised research reporting and robust data synthesis. Inclusion of patient and professional perspectives ensures best practice stakeholder engagement.
Edited by
David Kingdon, University of Southampton; Paul Rowlands, Derbyshire Healthcare NHS Foundation Trust; George Stein, Emeritus of the Princess Royal University Hospital
Bipolar disorder is an affective disorder defined on the basis of the presence of periods of elevated mood. Patients often present with depression, and previous episodes of elevated mood may be missed if not specifically explored during assessment. Bipolar disorder may be difficult to differentiate from other conditions causing mood instability and impulsivity. It is important to identify comorbidities such as substance use, neurodiversity and physical illnesses. The first-line treatment for mania is antipsychotic medication. Antidepressants are reported to have little to no efficacy in treating bipolar depression on average. Lithium is not the only long-term prophylactic agent, but it remains the gold standard, with good evidence that it reduces mood episodes and adverse outcomes. Monitoring is required to ensure lithium level is optimised and potential side-effects minimised.
“Social isolation among older adults is associated with increased chance of premature death; depression; dementia; disability from chronic diseases; poor mental health; increased use of health and support services; reduced quality of life; poor general health; and an increased number of falls.” (National Academies of Sciences, Engineering, and Medicine, 2020).
Without question, the global pandemic has significantly exacerbated both the prevalence and awareness of social isolation and loneliness as a growing health and societal challenge for older populations.
“Because of growing calls for Canada’s health-care systems to identify, prevent and mitigate loneliness as part of COVID-19-related public health efforts, there is a unique opportunity to build capacity to identify and intervene with older adults who are experiencing social isolation or loneliness.” (National Institute on Aging, 2022).
Over the past two decades, the Canadian Coalition for Seniors’ Mental Health (CCSMH) has developed a number of internationally recognized clinical guidelines in support of mental health for older adults. CCSMH is responding to the growing mental health crisis of isolation and loneliness with the development of evidence-based guidelines, to support the vital work of health and social service providers across Canada. The focus of these guidelines is to develop a broad range of evidence-based, manageable, and stepped care approaches to identify and address social isolation and loneliness in older adults. It is recognized that this topic is extremely complex and vast in potential scope. Through the guidance of a national working group of experts, these guidelines will draw upon both academic and grey literature, as well as on the experience of a diversity of health and social service providers, older adults, and their caregivers. This project will also provide guidance, promoting wellness and reducing the risk of social isolation with targeted messaging, knowledge translation and useful tools for supporting social connection among those at highest risk.
This presentation will share the Guidelines’ preliminary recommendations, as well as data from two national surveys alongside other insights gained from ongoing research and stakeholder engagement.
Although risk markers for depressive disorders (DD) are dynamic, especially during adolescence, few studies have examined how change in risk levels during adolescence predict DD onset during transition to adulthood. We compared two competing hypotheses of the dynamic effects of risk. The risk escalation hypothesis posits that worsening of risk predicts DD onset beyond risk level. The chronic risk hypothesis posits that persistently elevated risk level, rather than risk change, predicts DD onset.
Methods
Our sample included 393 girls (baseline age 13.5–15.5 years) from the Adolescent Development of Emotions and Personality Traits (ADEPT) project. Participants underwent five diagnostic interviews and assessments of risk markers for DD at 9-month intervals and were re-interviewed at a 6-year follow-up. We focused on 17 well-established risk markers. For each risk marker, we examined the prospective effects of risk level and change on first DD onset at wave six, estimated by growth curve modeling using data from the first five waves.
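The growth-curve decomposition described above, which separates a participant's overall risk level from her change in risk across waves, can be illustrated with a minimal numpy sketch. The scores below are simulated for a single hypothetical participant; the study estimated these quantities with formal growth curve models across the full sample.

```python
import numpy as np

waves = np.arange(5)                      # five 9-month assessment waves

# Simulated risk-marker scores for one hypothetical participant
scores = np.array([2.1, 2.3, 2.2, 2.6, 2.8])

# A per-person linear fit separates the two predictors contrasted in the
# text: overall risk level (mean / intercept) vs. risk change (slope).
slope, intercept = np.polyfit(waves, scores, deg=1)

risk_level = scores.mean()                # "chronic risk" predictor
risk_change = slope                       # "risk escalation" predictor
```

Both quantities would then enter a prediction model for first DD onset at wave six; the study's finding is that `risk_level`, not `risk_change`, carries the prognostic signal.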
Results
For 13 of the 17 depression risk markers, elevated levels of risk during adolescence, but not change in risk, predicted first DD onset during transition to adulthood, supporting the chronic risk hypothesis. Minimal evidence was found for the risk escalation hypothesis.
Conclusions
Participants who had a first DD onset during the transition to adulthood exhibited elevated levels of risk throughout adolescence. Researchers and practitioners should administer multiple assessments and focus on persistently elevated levels of risk to identify individuals who are most likely to develop DD and to provide targeted DD prevention.
Emotional functioning is linked to HIV-associated neurocognitive impairment, yet research on this association among diverse people with HIV (PWH) is scant. We examined emotional health and its association with neurocognition in Hispanic and White PWH.
Methods:
Participants included 107 Hispanic (41% primarily Spanish-speakers; 80% Mexican heritage/origin) and 216 White PWH (overall age: M = 53.62, SD = 12.19; 86% male; 63% AIDS; 92% on antiretroviral therapy). Emotional health was assessed via the National Institutes of Health Toolbox (NIHTB)-Emotion Battery, which yields T-scores for three factor-based summary scores (negative affect, social satisfaction, and psychological well-being) and 13 individual component scales. Neurocognition was measured via demographically adjusted fluid cognition T-scores from the NIHTB Cognition Battery.
Results:
27%–39% of the sample had problematic socioemotional summary scores. Hispanic PWH showed less loneliness, better social satisfaction, higher meaning and purpose, and better psychological well-being than Whites (ps < .05). Within Hispanics, Spanish speakers showed better meaning and purpose, a higher psychological well-being summary score, and less anger hostility, but greater fear affect, than English speakers. Only in Whites was worse negative affect (fear affect, perceived stress, and sadness) associated with worse neurocognition (p < .05); in both groups, worse social satisfaction (emotional support, friendship, and perceived rejection) was linked with worse neurocognition (p < .05).
Conclusion:
Adverse emotional health is common among PWH, with subgroups of Hispanics showing relative strengths in some domains. Aspects of emotional health differentially relate to neurocognition among PWH and cross-culturally. Understanding these varying associations is an important step towards the development of culturally relevant interventions that promote neurocognitive health among Hispanic PWH.
Many patients with Fontan physiology are unable to achieve the minimum criteria for peak effort during cardiopulmonary exercise testing. The purpose of this study is to determine the influence of physical activity and other clinical predictors on achieving peak exercise criteria, signified by a respiratory exchange ratio ≥ 1.1, in youth with Fontan physiology.
Methods:
Secondary analysis of a cross-sectional study of 8–18-year-olds with single ventricle post-Fontan palliation who underwent cardiopulmonary exercise testing (James cycle protocol) and completed a past-year physical activity survey. Bivariate associations were assessed by Wilcoxon rank-sum test and simple regression. A conditional inference forest algorithm was used to classify participants achieving a respiratory exchange ratio ≥ 1.1 and to predict peak respiratory exchange ratio.
Results:
Of the n = 43 participants, 65% were male, mean age was 14.0 ± 2.4 years, and 67.4% (n = 29) achieved a respiratory exchange ratio ≥ 1.1. Although some cardiopulmonary exercise test variables achieved statistical significance in bivariate associations with achieving a respiratory exchange ratio ≥ 1.1, the classification accuracy had an area under the precision-recall curve of 0.55. All variables together explained 21.4% of the variance in respiratory exchange ratio, with peak oxygen pulse being the most informative.
Conclusion:
Demographic, physical activity, and cardiopulmonary exercise test measures could not classify meeting peak exercise criteria (respiratory exchange ratio ≥ 1.1) at a satisfactory accuracy. Correlations between respiratory exchange ratio and oxygen pulse suggest the augmentation of stroke volume with exercise may affect the Fontan patient’s ability to sustain high-intensity exercise.
The Hierarchical Taxonomy of Psychopathology (HiTOP) has emerged out of the quantitative approach to psychiatric nosology. This approach identifies psychopathology constructs based on patterns of co-variation among signs and symptoms. The initial HiTOP model, which was published in 2017, is based on a large literature that spans decades of research. HiTOP is a living model that undergoes revision as new data become available. Here we discuss advantages and practical considerations of using this system in psychiatric practice and research. We especially highlight limitations of HiTOP and ongoing efforts to address them. We describe differences and similarities between HiTOP and existing diagnostic systems. Next, we review the types of evidence that informed development of HiTOP, including populations in which it has been studied and data on its validity. The paper also describes how HiTOP can facilitate research on genetic and environmental causes of psychopathology as well as the search for neurobiologic mechanisms and novel treatments. Furthermore, we consider implications for public health programs and prevention of mental disorders. We also review data on clinical utility and illustrate clinical application of HiTOP. Importantly, the model is based on measures and practices that are already used widely in clinical settings. HiTOP offers a way to organize and formalize these techniques. This model already can contribute to progress in psychiatry and complement traditional nosologies. Moreover, HiTOP seeks to facilitate research on linkages between phenotypes and biological processes, which may enable construction of a system that encompasses both biomarkers and precise clinical description.
Interactions with parents are integral in shaping the development of children’s emotional processes. Important aspects of these interactions are overall (mean level) affective experience and affective synchrony (linkages between parent and child affect across time). Respectively, mean-level affect and affective synchrony reflect aspects of the content and structure of dyadic interactions. Most research on parent–child affect during dyadic interactions has focused on infancy and early childhood; adolescence, however, is a key period for both normative emotional development and the emergence of emotional disorders. We examined affect in early to mid-adolescents (N = 55, Mage = 12.27) and their parents using a video-mediated recall task of 10-min conflict-topic discussions. Using multilevel modeling, we found evidence of significant level-2 effects (mean affect) and level-1 effects (affective synchrony) for parents and their adolescents. Level-2 and level-1 associations were differentially moderated by adolescent age and adolescent internalizing and externalizing symptoms. More specifically, parent–adolescent synchrony was stronger when adolescents were older and had more internalizing problems. Further, more positive adolescent mean affect was associated with more positive parent affect (and vice versa), but only for dyads with low adolescent externalizing problems. Results underscore the importance of additional research examining parent–child affect in adolescence.
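The level-1/level-2 distinction in the multilevel models above can be illustrated with person-mean centering, the standard way such models separate mean affect (level 2) from moment-to-moment synchrony (level 1). The data below are simulated for a single hypothetical dyad, not drawn from the study.

```python
import numpy as np

# Simulated affect ratings for one hypothetical parent-adolescent dyad
# across a 10-minute discussion sampled in 20 windows (illustrative values).
rng = np.random.default_rng(1)
parent = rng.normal(3.0, 0.5, 20)
adolescent = 0.4 * parent + rng.normal(1.5, 0.5, 20)

# Person-mean centering splits each series into a level-2 component
# (the dyad's mean affect) and a level-1 component (within-dyad deviations).
parent_mean = parent.mean()              # level-2 predictor (mean affect)
parent_within = parent - parent_mean     # level-1 predictor (fluctuations)

# Affective synchrony ~ association of the moment-to-moment deviations
synchrony = np.corrcoef(parent_within, adolescent - adolescent.mean())[0, 1]
```

In the study's multilevel models, moderators such as adolescent age and internalizing/externalizing symptoms would enter as cross-level interactions on these two components.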
Risk factors for depressive disorders (DD) change substantially over time, but the prognostic value of these changes remains unclear. Two basic types of dynamic effects are possible. The ‘Risk Escalation hypothesis’ posits that worsening of risk levels predicts DD onset over and above the average level of risk factors. Alternatively, the ‘Chronic Risk hypothesis’ posits that the average level, rather than change, predicts first-onset DD.
Methods
We utilized data from the ADEPT project, a cohort of 496 girls (baseline age 13.5–15.5 years) from the community followed for 3 years. Participants underwent five waves of assessments for risk factors and diagnostic interviews for DD. For illustration purposes, we selected 16 well-established dynamic risk factors for adolescent depression, such as depressive and anxiety symptoms, personality traits, clinical traits, and social risk factors. We conducted Cox regression analyses with time-varying covariates to predict first DD onset.
Results
Consistently elevated risk factors (i.e. the mean of multiple waves), but not recent escalation, predicted first-onset DD, consistent with the Chronic Risk hypothesis. This hypothesis was supported across all 16 risk factors.
Conclusions
Across a range of risk factors, girls who had first-onset DD generally did not experience a sharp increase in risk level shortly before the onset of disorder; rather, for years before onset, they exhibited elevated levels of risk. Our findings suggest that chronicity of risk should be a particular focus in screening high-risk populations to prevent the onset of DDs. In particular, regular monitoring of risk factors in school settings is highly informative.
Given the aging population of people with HIV (PWH), along with increasing rates of binge drinking among both PWH and the general older adult population, this study examined the independent and interactive effects of HIV, binge drinking, and age on neurocognition.
Method:
Participants were 146 drinkers stratified by HIV and binge drinking status (i.e., ≥4 drinks for women and ≥5 drinks for men within approximately 2 h): HIV+/Binge+ (n = 30), HIV−/Binge+ (n = 23), HIV+/Binge− (n = 55), HIV−/Binge− (n = 38). All participants completed a comprehensive neuropsychological battery measuring demographically-corrected global and domain-specific neurocognitive T scores. ANCOVA models examined independent and interactive effects of HIV and binge drinking on neurocognitive outcomes, adjusting for overall alcohol consumption, lifetime substance use, sex, and age. Subsequent multiple linear regressions examined whether HIV/Binge group moderated the relationship between age and neurocognition.
Results:
HIV+/Binge+ participants had worse global neurocognition, processing speed, delayed recall, and working memory than HIV−/Binge− participants (p’s < .05). While there were significant main effects of HIV and binge drinking, their interaction did not predict any of those neurocognitive outcomes (p’s > .05). Significant interactions between age and HIV/Binge group showed that HIV+/Binge+ participants demonstrated steeper negative relationships between age and neurocognitive outcomes of learning, delayed recall, and motor skills compared to HIV−/Binge− participants (p’s < .05).
Conclusions:
Results showed adverse additive effects of HIV and binge drinking on neurocognitive functioning, with older adults demonstrating the most vulnerability to these effects. Findings support the need for interventions to reduce binge drinking, especially among older PWH.
The United States Centers for Disease Control and Prevention and the World Health Organization broadly categorize mass gathering events as high risk for amplification of coronavirus disease 2019 (COVID-19) spread in a community due to the nature of respiratory diseases and the transmission dynamics. However, various measures and modifications can be put in place to limit or reduce the risk of further spread of COVID-19 for the mass gathering. During this pandemic, the Johns Hopkins University Center for Health Security produced a risk assessment and mitigation tool for decision-makers to assess SARS-CoV-2 transmission risks that may arise as organizations and businesses hold mass gatherings or increase business operations: The JHU Operational Toolkit for Businesses Considering Reopening or Expanding Operations in COVID-19 (Toolkit). This article describes the deployment of a data-informed, risk-reduction strategy that protects local communities, preserves local health-care capacity, and supports democratic processes through the safe execution of the Republican National Convention in Charlotte, North Carolina. The successful use of the Toolkit and the lessons learned from this experience are applicable in a wide range of public health settings, including school reopening, expansion of public services, and even resumption of health-care delivery.
As rising seas, spreading wildfires, and unbearable heat shrink the expanse of the habitable earth, the prospect of a contracting world resonates in particular and forceful ways within the American imaginary. Recent American climate fiction responds to the specter of a shrinking world by reprising narratives of the American frontier, simultaneously unsettling and reanimating elements of these stories. This chapter pays attention to stories of neo-agrarian settlements, depictions of internal displacements and migrations, and portrayals of corporate collapse in the wake of dwindling carbon economies. It argues that American climate fiction can run retrograde, reiterating the very seizures of land and political suppressions that underwrote the American frontier. However, the radical environmental changes envisioned in this genre also intensify ongoing struggles for racial and economic justice in the United States, opening the possibility of more equitable forms of relation. Although the climatic future is often depicted as a brave new world, an unknown terrain, climate narratives must acknowledge rather than subsume history: A changed world must not be mistaken for a wholly new one.
Rock debris covers ~30% of glacier ablation areas in the Central Himalaya and modifies the impact of atmospheric conditions on mass balance. The thermal properties of supraglacial debris are diurnally variable but remain poorly constrained for monsoon-influenced glaciers over the timescale of the ablation season. We measured vertical debris profile temperatures at 12 sites on four glaciers in the Everest region with debris thickness ranging from 0.08 to 2.8 m. Typically, the length of the ice ablation season beneath supraglacial debris was 160 days (15 May to 22 October)—a month longer than the monsoon season. Debris temperature gradients were approximately linear (r2 > 0.83), measured as −40°C m–1 where debris was up to 0.1 m thick, −20°C m–1 for debris 0.1–0.5 m thick, and −4°C m–1 for debris greater than 0.5 m thick. Our results demonstrate that the influence of supraglacial debris on the temperature of the underlying ice surface, and therefore melt, is stable at a seasonal timescale and can be estimated from near-surface temperature. These results have the potential to greatly improve the representation of ablation in calculations of debris-covered glacier mass balance and projections of their response to climate change.
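The piecewise-linear gradients reported above imply a simple estimate of the temperature at the debris-ice interface from a near-surface debris temperature and the debris thickness. The thresholds and gradients below come from the text; the 10°C near-surface temperature in the example is illustrative, not a study value.

```python
def debris_gradient(thickness_m: float) -> float:
    """Approximate debris temperature gradient (deg C per m) by thickness,
    using the piecewise values reported for the Everest-region profiles."""
    if thickness_m <= 0.1:
        return -40.0
    if thickness_m <= 0.5:
        return -20.0
    return -4.0

def ice_surface_temp(near_surface_temp_c: float, thickness_m: float) -> float:
    """Linear extrapolation from near-surface temperature down to the
    ice surface, assuming the gradient holds through the full layer."""
    return near_surface_temp_c + debris_gradient(thickness_m) * thickness_m

# e.g. 0.3 m of debris with a 10 deg C near-surface temperature:
# 10 + (-20 * 0.3) = 4 deg C at the debris-ice interface
```

This kind of estimate is what makes near-surface temperature a practical proxy for sub-debris melt in mass-balance calculations, as the Results suggest.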
Background: When control mechanisms such as water temperature and biocide level are insufficient, Legionella, the causative bacteria of Legionnaires’ disease, can proliferate in water distribution systems in buildings. Guidance and oversight bodies are increasingly prioritizing water safety programs in healthcare facilities to limit Legionella growth. However, ensuring optimal implementation in large buildings is challenging. Much is unknown, and sometimes assumed, about whether building and campus characteristics influence Legionella growth. We used an extensive real-world environmental Legionella data set in the Veterans Health Administration (VHA) healthcare system to examine infrastructure characteristics and Legionella positivity. Methods: VHA medical facilities across the country perform quarterly potable water sampling of healthcare buildings for Legionella detection as part of a comprehensive water safety program. Results are reported to a standardized national database. We did an exploratory univariate analysis of facility-reported Legionella data from routine potable water samples taken in 2015 to 2018, in conjunction with infrastructure characteristics available in a separate national data set. This review examined the following characteristics: building height (number of floors), building age (reported construction year), and campus acreage. Results: The final data set included 201,936 water samples from 819 buildings. Buildings with 1–5 floors (n = 634) had a Legionella positivity rate of 5.3%, 6–10 floors (n = 104) had a rate of 6.4%, 11–15 floors (n = 36) had a rate of 8.1%, and 16–22 floors (n = 9) had a rate of 8.8%. All rates were significantly different from each other except 11–15 floors and 16–22 floors (P < .05, χ2). The oldest buildings (1800s) had significantly less (P < .05, χ2) Legionella positivity than those built between 1900 and 1939 and between 1940 and 1979, but they were no different than the newest buildings (Fig. 1). 
In newer buildings (1980–2019), all decades had buildings with Legionella positivity (Fig. 1 inset). Campus acreage varied from ~3 acres to almost 500 acres. Although significant differences were found in Legionella positivity for different campus sizes, there was no clear trend and campus acreage may not be a suitable proxy for the extent or complexity of water systems feeding buildings. Conclusions: The analysis of this large, real-world data set supports an assumption that taller buildings are more likely to be associated with Legionella detection, perhaps a result of more extensive piping. In contrast, the assumption that newer buildings are less associated with Legionella was not fully supported. These results demonstrate the variability in Legionella positivity in buildings, and they also provide evidence that can inform implementation of water safety programs.
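The pairwise comparisons above rely on χ² tests of Legionella positivity between building groups. A minimal sketch of the test statistic follows; the counts are hypothetical, since the abstract reports positivity rates rather than raw sample counts per group.

```python
import numpy as np

def chi_square_stat(table) -> float:
    """Pearson chi-square statistic for an RxC contingency table."""
    obs = np.asarray(table, dtype=float)
    row = obs.sum(axis=1, keepdims=True)
    col = obs.sum(axis=0, keepdims=True)
    expected = row @ col / obs.sum()   # expected counts under independence
    return float(((obs - expected) ** 2 / expected).sum())

# Hypothetical [positive, negative] water-sample counts for two
# building-height bins (illustrative only, not the VHA data):
shorter_buildings = [10, 90]
taller_buildings = [30, 70]

stat = chi_square_stat([shorter_buildings, taller_buildings])
```

The statistic would then be compared against the χ² distribution with (R−1)(C−1) degrees of freedom to obtain the P values reported in the Results.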
Funding: None
Disclosures: Chetan Jinadatha, Principal Investigator/Co-I: Research: NIH/NINR, AHRQ, NSF; Principal Investigator: Research: Xenex Healthcare Services. Funds provided to institution. Inventor: Methods for organizing the disinfection of one or more items contaminated with biological agents. Owner: Department of Veterans Affairs. Licensed to Xenex Disinfection System, San Antonio, TX.
Chronic kidney disease continues to be under-recognised and is associated with a significant global health burden and costs. An adverse intrauterine environment may result in a depleted nephron number and an increased risk of chronic kidney disease. Antenatal ultrasound was used to measure the foetal renal parenchymal thickness (RPT), as a novel method to estimate nephron number. Foetal renal artery blood flow was also assessed. This prospective, longitudinal study evaluated the foetal kidneys of 102 appropriately grown and 30 growth-restricted foetuses between 20 and 37 weeks gestational age (GA) to provide vital knowledge on the influence foetal growth restriction has on the developing kidneys. The foetal RPT and renal artery blood flow were measured at least every 4 weeks using ultrasound. The RPT was found to be significantly thinner in growth-restricted foetuses compared to appropriately grown foetuses [likelihood ratio (LR) = 21.06, P ≤ 0.0001], and the difference increases with GA. In foetuses with the same head circumference, a growth-restricted foetus was more likely to have a thinner parenchyma than an appropriately grown foetus (LR = 8.9, P = 0.0028), supporting the principle that growth-restricted foetuses preferentially shunt blood towards the brain. No significant difference was seen in the renal arteries between appropriately grown and growth-restricted foetuses. Measurement of the RPT appears to be a more sensitive measure than current methods. It has the potential to identify infants with a possible reduced nephron endowment, allowing for monitoring and interventions to be focused on individuals at a higher risk of developing future hypertension and chronic kidney disease.
Antimicrobial use in the surgical setting is common and frequently inappropriate. Understanding the behavioral context of antimicrobial use is a critical step to developing stewardship programs.
Design:
In this study, we employed qualitative methodologies to describe the phenomenon of antimicrobial use in 2 surgical units: orthopedic surgery and cardiothoracic surgery.
Setting:
This study was conducted at a public, quaternary, university-affiliated hospital.
Participants:
Healthcare professionals from the 2 surgical unit teams participated in the study.
Methods:
We used focused ethnographic and face-to-face semi-structured interviews to observe antimicrobial decision-making behaviors across the patient’s journey from the preadmission clinic to the operating room to the postoperative ward.
Results:
We identified 4 key themes influencing decision making in the surgical setting. Compartmentalized communication (theme 1) was observed with demarcated roles and defined pathways for communication (theme 2). Antimicrobial decisions in the operating room were driven by the most senior members of the team. These decisions, however, were delegated to more junior members of staff in the ward and clinic environment (theme 3). Throughout the patient’s journey, communication with the patient about antimicrobial use was limited (theme 4).
Conclusions:
Approaches to decision making in surgery are highly structured. Although this structure appears to facilitate smooth flow of responsibility, more junior members of the staff may be disempowered. In addition, opportunities for shared decision making with patients were limited. Antimicrobial stewardship programs need to recognize the hierarchal structure as well as opportunities to engage the patient in shared decision making.
The elimination of unwanted catch in mixed species fisheries is technically challenging given the complexity of fish behaviour within nets. Most approaches to date have employed technologies that modify the nets themselves or use physical sorting grids within the gear. There is currently increasing interest in the use of artificial light to either deter fish from entering the net or to enhance their escapement from within the net. Here, we evaluated the differences in catch retained in a standard otter trawl, relative to the same gear fitted with a square-mesh panel, or a square-mesh panel fitted with LEDs. We found that the selectivity of the gear differed depending on water depth. When using a square-mesh panel in shallow depths of 29–40 m, the unwanted bycatch of whiting and haddock was reduced by 86% and 58%, respectively. In deep, darker water (45–95 m), no change in catch was observed in the square-mesh panel treatment; however, when LEDs were added to the square-mesh panel, haddock and flatfish catches were reduced by 47% and 25%, respectively. These findings demonstrate the potential to improve the performance of bycatch reduction devices through the addition of light devices to enhance selectivity. The results also highlight species-specific and site-specific differences in the performance of bycatch reduction devices, and hence a more adaptive approach to reduce bycatch is probably required to maximize performance.