Community advisory boards (CABs) are a promising approach for strengthening patient and partner voices in community health center (CHC) evidence-based decision-making. This paper aims to describe how CHCs used CABs during the COVID-19 pandemic to improve the reach of testing among populations experiencing health disparities and identify transferable lessons for future implementation.
Methods:
This mixed methods study integrates brief quantitative surveys of community engagement (N = 20) and one-on-one qualitative interviews (N = 13) of staff and community partners engaged in CHC CABs with a cost analysis and qualitative feedback from CHC staff participating in an online learning community (N = 17).
Results:
Community partners and staff engaged in the CHC CABs reported high ratings of engagement, with all mean ratings of community engagement principles above a 4 (“very good” or “often”) out of 5. Qualitative findings provided a more in-depth understanding of experiences serving on the CHC CAB and highlighted how engagement principles such as trust and mutual respect were reflected in CAB practices. We developed a CHC CAB toolkit with strategies for governance and prioritization, cost estimates to ensure sustainment, guidance on integrating quality improvement expertise, testimonies from community members on the benefits of joining, and template agendas and facilitator training to ensure meeting success.
Conclusion:
In alignment with the Translational Science Benefits Model, this study expands research impact through comprehensive mixed methods measurement of community engagement and by transforming findings into an action-orientated guide for CHCs to implement CABs to guide evidence-based decision-making for community and public health impact.
Psychotic symptoms in adolescence are associated with social adversity and genetic risk for schizophrenia. This gene–environment interplay may be mediated by personality, which also develops during adolescence. We hypothesized that (i) personality development predicts later Psychosis Proneness Signs (PPS), and (ii) personality traits mediate the association between genetic risk for schizophrenia, social adversities, and psychosis.
Methods
A total of 784 individuals were selected within the IMAGEN cohort (Discovery Sample-DS: 526; Validation Sample-VS: 258); personality was assessed at baseline (13–15 years), follow-up-1 (FU1, 16–17 years), and FU2 (18–20 years). Latent growth curve models served to compute coefficients of individual change across 14 personality variables. A support vector machine algorithm employed these coefficients to predict PPS at FU3 (21–24 years). We computed mediation analyses, including personality-based predictions and self-reported bullying victimization as serial mediators along the pathway between polygenic risk score (PRS) for schizophrenia and FU3 PPS. We also replicated the main findings in 1132 adolescents recruited within the TRAILS cohort.
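The prediction step described above — feeding per-individual growth coefficients into a support vector machine and scoring with balanced accuracy — can be sketched as follows. This is an illustrative reconstruction on synthetic data, not the IMAGEN pipeline; the variable names and effect structure are assumptions.

```python
# Hypothetical sketch: predict a binary outcome (a stand-in for PPS status)
# from per-individual personality growth coefficients with an SVM, scored by
# balanced accuracy. All data below are synthetic.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import balanced_accuracy_score

rng = np.random.default_rng(0)
n, n_traits = 500, 14                      # individuals x growth coefficients
X = rng.normal(size=(n, n_traits))         # latent-growth change scores
# outcome loosely driven by two traits (stand-ins for neuroticism/openness)
y = (X[:, 0] + 0.8 * X[:, 1] + rng.normal(size=n) > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y)
clf = SVC(kernel="linear").fit(X_tr, y_tr)
bacc = balanced_accuracy_score(y_te, clf.predict(X_te))
print(round(bacc, 3))  # balanced accuracy = mean of sensitivity and specificity
```

Balanced accuracy (the metric reported in the Results) averages sensitivity and specificity, which guards against inflated scores when the two outcome classes are unequal in size.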
Results
Growth scores in neuroticism and openness predicted PPS with 65.6% balanced accuracy in the DS and 69.5% in the VS. Mediations revealed a significant positive direct effect of PRS on PPS (confidence interval [CI] 0.01–0.15), and an indirect effect, serially mediated by personality-based predictions and victimization (CI 0.006–0.01), replicated in the TRAILS cohort (CI 0.0004–0.004).
Conclusions
Adolescent personality changes may predate future experiences associated with psychosis susceptibility. Personality-based PPS predictions, together with bullying victimization, serially mediate the pathway from PRS to adult PPS, suggesting that the gene–environment correlations proposed for psychosis are partly mediated by personality.
Inappropriate antibiotic use is a key driver of antibiotic resistance and one that can be mitigated through stewardship. A better understanding of current prescribing practices is needed to develop successful stewardship efforts. This study aims to identify factors that are associated with human cases of enteric illness receiving an antibiotic prescription. Cases of laboratory-confirmed enteric illness reported to the FoodNet Canada surveillance system between 2015 and 2019 were the subjects of this study. Laboratory data were combined with self-reported data collected from an enhanced case questionnaire that included demographic data, illness duration and symptoms, and antibiotic prescribing. The data were used to build univariable logistic regression models and a multivariable logistic regression model to explore what factors were associated with a case receiving an antibiotic prescription. The final multivariable model identified several factors as being significantly associated with cases being prescribed an antibiotic. Some of the identified associations indicate that current antibiotic prescribing practices include a substantial level of inappropriate use. This study provides evidence that antibiotic stewardship initiatives targeting infectious diarrhoea are needed to optimize antibiotic use and combat the rise of antibiotic resistance.
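The modelling approach described above — screening candidate factors and fitting a logistic regression for the odds of receiving an antibiotic prescription — can be sketched as below. The data, factor names, and effect sizes are entirely synthetic stand-ins, not FoodNet Canada records.

```python
# Illustrative sketch of a logistic regression for antibiotic prescribing.
# Predictors and effects are invented for demonstration only.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 2000
bloody = rng.integers(0, 2, n).astype(float)    # bloody diarrhoea (0/1)
duration = rng.gamma(2.0, 3.0, n)               # illness duration, days
logit = -2.0 + 1.2 * bloody + 0.15 * duration   # assumed true effects
rx = rng.binomial(1, 1 / (1 + np.exp(-logit)))  # antibiotic prescribed?

X = np.column_stack([bloody, duration])
clf = LogisticRegression(C=1e6).fit(X, rx)      # near-unpenalized fit
odds_ratios = np.exp(clf.coef_[0])              # OR > 1: associated with Rx
print(odds_ratios.round(2))
```

An odds ratio above 1 for a factor means cases with that factor were more likely to be prescribed an antibiotic, which is how associations of the kind reported in the abstract are typically expressed.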
To evaluate whether an antimicrobial stewardship bundle (ASB) can safely empower frontline providers in the treatment of gram-negative bloodstream infections (GN-BSI).
Intervention and Method:
From March 2021 to February 2022, we implemented an ASB intervention for GN-BSI in the electronic medical record (EMR) to guide clinicians at the point of care to optimize their own antibiotic decision-making. We conducted a before-and-after quasi-experimental pre-bundle (preBG) and post-bundle (postBG) study evaluating a composite of in-hospital mortality, infection-related readmission, GN-BSI recurrence, and bundle-related outcomes.
Setting:
New York University Langone Health (NYULH), Tisch/Kimmel (T/K) and Brooklyn (BK) campuses, in New York City, New York.
Patients:
Out of 1097 patients screened, the study included 225 adults aged ≥18 years (101 preBG vs 124 postBG) admitted with at least one positive blood culture with a monomicrobial gram-negative organism.
Results:
There was no difference in the primary composite outcome (12.9% preBG vs. 7.3% postBG; P = 0.159) nor in its individual components of in-hospital mortality, 30-day infection-related readmission, and GN-BSI recurrence. Vancomycin (VAN) discontinuation (DC) was performed more frequently by the primary team in the postBG period (37.9% vs 66.7%; P < 0.001). In postBG, de-escalation performed by the primary team increased by 8.8% (P = 0.310), and there was an 11.1% increase in the use of aminopenicillin-based antibiotics (P = 0.043).
Conclusions:
The GN-BSI bundle worked as a nudge-based strategy that guided providers in VAN DC and increased de-escalation to aminopenicillin-based antibiotics without negatively impacting patient outcomes.
Emotional Intelligence (EI) plays a substantial role in shaping the behavior, overall well-being, and performance of individuals. In the context of healthcare, where professionals frequently confront a demanding work environment, there is a notable prevalence of high Psychological Distress (PD). Consequently, conflicts are a recurrent phenomenon within healthcare settings, exerting impacts on healthcare professionals, patients, and their families.
Objectives
1. Investigate the link between Emotional Intelligence (EI) and conflict management among healthcare professionals.
2. Examine how Psychological Distress (PD) relates to conflict management in healthcare.
3. Explore age, specialization, and experience’s influence on EI dimensions.
4. Analyze EI’s impact on healthcare professionals’ conflict resolution choices.
5. Assess how demographics affect conflict resolution preferences among healthcare workers.
These aims explore EI, PD, demographics, and conflict management in healthcare, informing skill enhancement and improved conflict resolution practices.
Methods
This study involved 143 healthcare professionals from diverse regions of Greece. Electronic surveys gathered demographic data and assessed Emotional Intelligence (via a dedicated questionnaire), Psychological Distress (using the Kessler K6+ questionnaire), and Conflict Resolution strategies.
Results
The majority of participants were female (69.2%), with 42.7% aged 46–55 and 30.8% aged 36–45. Age was significantly associated with “Self-awareness” (P=0.032) and “Social Skills” (P=0.009 and 0.007) within Emotional Intelligence dimensions. Negative correlations emerged between Psychological Distress and Emotional Intelligence dimensions (-0.46 to -0.19). Additionally, Psychological Distress showed negative correlations with several Conflict Resolution dimensions: ‘Atmosphere’ (-0.20), ‘Doables’ (-0.28), ‘Mutual Benefit Agreements’ (-0.18), ‘Needs’ (-0.23), and ‘Extra Considerations’ (-0.27). Participants aged below 35 had higher scores in “Power” (P=0.002), while those aged 46 and above scored higher in “Options” (P=0.002 and 0.009) for conflict resolution.
Conclusions
In summary, this study underscores EI’s relevance in healthcare, especially its influence on PD and conflict resolution. Developing EI competencies offers promise for improving healthcare professionals’ emotional well-being and conflict-handling abilities, ultimately benefiting patient care and staff satisfaction. Further research and tailored interventions are warranted to advance this field at an academic level.
We present a theory of atypical development based on a developmental theory of the typical mind integrating developmental, cognitive, and psychometric theory and research. The paper comprises three parts. First, it outlines the theory of typical development. The theory postulates central cognitive mechanisms, such as relational integration, executive and inferential processes, and domain-specific processes underlying different environmental relations, such as visuospatial or quantitative relations. Cognitive development advances in cycles satisfying developmental priorities in mastering these systems, such as executive control from 2–6 years, inferential control from 7–11 years, and truth control from 12–18 years. Second, we discuss atypical development, showing how each neurodevelopmental disorder emerges from deficiencies in one or more of the processes comprising the architecture of the mind. Deficiencies in relational integration mechanisms, together with deficiencies in social understanding, yield autism spectrum disorder. Deficiencies in executive processes yield attention-deficit/hyperactivity disorder. Deficiencies in symbolic representation yield specific learning difficulties, such as dyslexia and dyscalculia. Finally, we discuss clinical and educational implications, suggesting the importance of early diagnosis of malfunctioning in each of these dimensions and specific programs for their remediation.
Bivalve molluscs are a diverse group of animals with particular economic and ecological importance. Their morphological characteristics frequently confuse their identification, leading to mislabelling of edible species. Genetic diversity is critical to the resilience of marine bivalve populations in the face of environmental stressors such as ocean acidification and warming. In this study, we characterized the phylogeny and defined the first DNA barcodes of six marine bivalves [Ostrea edulis (Linnaeus, 1758), Arca noae (Linnaeus, 1758), Pinctada radiata (Leach, 1814), Venus verrucosa (Linnaeus, 1758), Callista chione (Linnaeus, 1758) and Ruditapes decussatus (Linnaeus, 1758)] sampled from different coastal areas of the Aegean and Ionian Seas using the molecular markers cytochrome c oxidase subunit I (COI) and 18S ribosomal RNA (18S rRNA). Further, the COI gene was employed to investigate population genetic diversity, since 18S rRNA exhibited no conspecific differences. The sequence of 18S rRNA successfully discriminated the bivalves at family or superfamily level but occasionally proved insufficient for species identification. Contrariwise, COI was highly informative and could reliably distinguish all species. Population haplotype diversity was moderate to high and was always accompanied by generally low nucleotide diversity, indicating genetically closely related haplotypes. The invasive Pinctada radiata was found to be panmictic even among distant sampling areas, while Ostrea edulis was the only species that exhibited moderate levels of population subdivision. Finally, here we report for the first time the presence of Ostrea stentina in Thermaikos Gulf sampled among Ostrea edulis specimens, demonstrating a new invasive bivalve species in the Eastern Mediterranean.
The sensitivities of the Knott's test (four 20-μl sediment aliquots), quantitative buffy coat capillary tube method (QBC tube, 111 μl of whole blood) and direct blood smear (DBS, 20 μl of whole blood) were evaluated for the detection of microfilaraemia in dogs. Undiluted whole blood samples taken from 70 Dirofilaria immitis antigen-positive dogs and 10 serially diluted microfilaraemic blood samples at concentrations of 400, 200, 100, 50, 25 and 12 microfilariae (mff) ml−1 were examined. For filarial speciation, the buffy coat of QBC tubes was mixed with one drop of methylene blue–formalin solution and examined as a direct smear. In 52/70 microfilaraemic blood samples, the number of mff ranged from 12 to 321987 ml−1 (median: 3199 ml−1). The diagnostic sensitivities of the Knott's test, QBC tube method and DBS in undiluted blood samples were 100%, 98% and 92.3%, respectively. Eighteen dogs tested amicrofilaraemic by all three methods. At concentrations of 400 mff ml−1, a 100% sensitivity was found by all three methods, while at 200 mff ml−1 the Knott's test, QBC tube and DBS were 100%, 100% and 90% sensitive, respectively. The relevant figures at 100 mff ml−1 were 100%, 100% and 80%, at 50 mff ml−1 100%, 100% and 50%, at 25 mff ml−1 100%, 100% and 10% and at 12 mff ml−1 80%, 50% and 10%. At 50 and 25 mff ml−1, the DBS was less sensitive compared to the other two methods, while at 12 mff ml−1 it was less sensitive only than the Knott's test. A significant correlation was found between the QBC tube method and Knott's test regarding mff speciation. Therefore, the QBC method may be considered a reliable alternative to the Knott's test for both the detection and speciation of mff in the dog.
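The sensitivity figures above follow the standard definition: true positives divided by all truly positive samples. The sketch below reproduces the arithmetic; the counts for the QBC tube and DBS are back-calculated from the reported percentages (52 microfilaraemic samples) and are illustrative rather than taken from the raw data.

```python
# Diagnostic sensitivity = 100 * true positives / total truly positive.
# Counts for QBC and DBS are back-calculated from the reported percentages.
def sensitivity(true_pos: int, total_pos: int) -> float:
    return 100.0 * true_pos / total_pos

knotts = sensitivity(52, 52)   # Knott's test: 100.0%
qbc = sensitivity(51, 52)      # QBC tube: ~98.1%, reported as 98%
dbs = sensitivity(48, 52)      # direct blood smear: ~92.3%
print(round(knotts, 1), round(qbc, 1), round(dbs, 1))
```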
This chapter seeks to promote both awareness and understanding of evidence-based psychosocial factors that enhance well-being, adjustment, and recovery in older people admitted to hospital.
The chapter begins by exploring ageing from biological, psychosocial, and existential perspectives. It then focusses upon the psychological sequelae of illness and disability in this population and goes on to identify components of psychological well-being drawn from both qualitative and quantitative research studies that promote recovery in older people who have been admitted to hospital.
The chapter also explores the role of culture, faith, and ethnicity in the well-being of hospitalised older people and concludes by highlighting essential components in the development of a positive, recovery-focused culture of care.
This chapter seeks to promote both awareness and understanding of anxiety-based conditions that many older people experience in acute settings and of the evidence-based medical and psychosocial interventions that support recovery.
The chapter begins by exploring and identifying the conditions, difficulties, and circumstances that give rise to anxiety in hospitalised older people. This is followed by a description of common anxiety types, their symptomatic presentation, and their causes. The chapter goes on to explore those evidence-based medical and psychosocial treatment interventions that promote recovery and adjustment.
Humankind's main defence against the virus that causes COVID-19 (SARS-CoV-2), besides vaccine development, was co-ordinated behaviour change. In many countries, co-ordination was assisted by tracking surveys designed to measure self-reported behaviour and attitudes. This paper describes an alternative, complementary approach, which was undertaken in close collaboration with officials in the Department of the Taoiseach (Irish Prime Minister). We adapted the Day Reconstruction Method (DRM) to develop the ‘Social Activity Measure’ (SAM). The study was conducted fortnightly for 18 months, with findings delivered directly to the Department. This paper describes the method and shows how SAM generated a detailed picture of where and why transmission risk occurred. By using the DRM, we built aggregate measures from narrative accounts of how individuals spent their previous day. SAM recorded the amount, location and type of social activity, including the incidence of close contact and mask-wearing, as well as compliance with public health restrictions by shops and businesses. The method also permitted a detailed analysis of how public perceptions and comprehension are related to behaviour. The results informed government communications and strategies for lifting public health restrictions. The method could be applied to other future situations that might require co-ordinated public behaviour over an extended period.
How do we engage with the threat of social and environmental degradation while creating and maintaining liveable and just worlds? Researchers from diverse backgrounds unpack this question through a series of original and committed contributions to this wide-ranging volume.
Several factors shape the neurodevelopmental trajectory. A key area of focus in neurodevelopmental research is to estimate the factors that have maximal influence on the brain and can tip the balance from typical to atypical development.
Methods
Utilizing a dissimilarity maximization algorithm on the dynamic mode decomposition (DMD) of the resting state functional MRI data, we classified subjects from the cVEDA neurodevelopmental cohort (n = 987, aged 6–23 years) into homogeneously patterned DMD (representing typical development in 809 subjects) and heterogeneously patterned DMD (indicative of atypical development in 178 subjects).
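Dynamic mode decomposition, the core technique named above, factorizes a sequence of snapshots into spatial modes with associated temporal eigenvalues. The sketch below implements standard exact DMD on a synthetic signal; it is not the cVEDA pipeline, and the dissimilarity-maximization classification step is not reproduced.

```python
# Minimal exact DMD: fit X2 ≈ A @ X1 in a rank-r SVD subspace and return
# the eigenvalues/modes of the reduced linear operator. Data are synthetic.
import numpy as np

def dmd_eigs(X, r):
    X1, X2 = X[:, :-1], X[:, 1:]
    U, S, Vh = np.linalg.svd(X1, full_matrices=False)
    U, S, Vh = U[:, :r], S[:r], Vh[:r]
    A_tilde = U.conj().T @ X2 @ Vh.conj().T / S   # reduced operator
    eigvals, W = np.linalg.eig(A_tilde)
    modes = X2 @ (Vh.conj().T / S) @ W            # exact DMD modes
    return eigvals, modes

# synthetic snapshots: two oscillatory spatiotemporal patterns (rank 4)
t = np.linspace(0, 4 * np.pi, 200)
x = np.linspace(-5, 5, 64)
data = (np.outer(np.cos(x), np.sin(t)) + np.outer(np.sin(x), np.cos(t))
        + np.outer(np.exp(-x**2), np.sin(2 * t))
        + np.outer(x * np.exp(-x**2), np.cos(2 * t)))
eigvals, modes = dmd_eigs(data, r=4)
print(np.abs(eigvals).round(3))   # |λ| ≈ 1 for purely oscillatory dynamics
```

The eigenvalue magnitudes indicate growth, decay, or pure oscillation of each mode, which is the kind of per-region dynamic signature that group comparisons of "homogeneously" versus "heterogeneously" patterned DMD rest on.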
Results
Significant DMD differences were primarily identified in the default mode network (DMN) regions across these groups (p < 0.05, Bonferroni corrected). While the groups were comparable in cognitive performance, the atypical group had more frequent exposure to adversities and experienced higher levels of abuse (p < 0.05, Bonferroni corrected). Upon evaluating brain-behavior correlations, we found that correlation patterns between adversity and DMN dynamic modes exhibited age-dependent variations for atypical subjects, hinting at differential utilization of the DMN due to chronic adversities.
Conclusion
Adversities (particularly abuse) maximally influence the DMN during neurodevelopment and lead to the failure in the development of a coherent DMN system. While the DMN's integrity is preserved in typical development, atypically developing individuals show contrasting, age-dependent variability. The flexibility of the DMN might be a compensatory mechanism to protect an individual in an abusive environment. However, such adaptability might deprive the neural system of the faculties of normal functioning and may incur long-term effects on the psyche.
There is a paucity of data guiding treatment duration of oral vancomycin for Clostridioides difficile infection (CDI) in patients requiring concomitant systemic antibiotics.
Objectives:
To evaluate prescribing practices of vancomycin for CDI in patients who required concurrent systemic antibiotics and to determine whether a prolonged duration of vancomycin (>14 days), compared to a standard duration (10–14 days), decreased CDI recurrence.
Methods:
In this retrospective cohort study, we evaluated adult hospitalized patients with an initial episode of CDI who were treated with vancomycin and who received overlapping systemic antibiotics for >72 hours. Outcomes of interest included CDI recurrence and isolation of vancomycin-resistant Enterococcus (VRE).
Results:
Among the 218 patients included, 36% received a standard duration and 64% received a prolonged duration of treatment for a median of 13 days (11–14) and 20 days (16–26), respectively. Patients who received a prolonged duration had a longer median duration of systemic antibiotic overlap with vancomycin (11 vs 8 days; P < .001) and significantly more carbapenem use and infectious disease consultation. Recurrence at 8 weeks (12% standard duration vs 8% prolonged duration; P = .367), recurrence at 6 months (15% standard duration vs 10% prolonged duration; P = .240), and VRE isolation (3% standard duration vs 9% prolonged duration; P = .083) were not significantly different between groups. Discontinuation of vancomycin prior to completion of antibiotics was an independent predictor of 8-week recurrence on multivariable logistic regression (OR, 4.8; 95% CI, 1.3–18.1).
Conclusions:
Oral vancomycin prescribing relative to the systemic antibiotic end date may affect CDI recurrence to a greater extent than total vancomycin duration alone. Further studies are needed to confirm these findings.
Greece is a key EU entry country for unaccompanied migrant minors seeking safety but such children are frequently criminalised through detention processes. Giving voice to migrant children throughout, Papadopoulos promotes child-friendly practices and the safeguarding of fundamental rights.
Visuospatial skills are frequently assessed with drawing tests. Research has suggested that drawing tasks may lack the ability to discriminate healthy individuals from clinical populations in groups with low education. The aims of this study were to investigate the validity of visuoconstructional tests in a sample of older Greek Australian immigrants and compare their performances to a matched sample of patients with Alzheimer’s disease (AD).
Participants and Methods:
We assessed visuoconstructional performances in a sample of 90 healthy older Greek Australians, with a primary school level of education, and compared performances to a demographically matched sample of 20 Greek Australians with a diagnosis of AD on four visuoconstructional drawing tests: Greek cross, four-pointed star, intersecting pentagons, and the Necker Cube.
Results:
While healthy participants tended to outperform the AD group on most copy tasks, high fail rates within the healthy sample were observed for the intersecting pentagons and Necker cube (78% and 73% fail rates respectively) when using established clinical cutoff scores. High rates of curved angle, omission, distorted relation between elements, spatial disorganization and three-dimensional design errors were found across the four-pointed star, intersecting pentagons, and the Necker cube in both healthy participants and those with AD. Exploratory receiver operating characteristic curve analysis revealed that, with perhaps the exception of the Greek cross, meaningful sensitivity and specificity could not be reached for the four-pointed star, intersecting pentagons, and Necker cube.
Conclusions:
Cognitively healthy immigrants with low education appear to be at a disadvantage when completing visuoconstructional drawing tests, as their performance may be misinterpreted as indicating cognitive impairment. Future research is needed to identify alternative approaches to assess visuoconstructional ability in low education older cohorts.
It is important to create and maintain durable hemodialysis (HD) access in health systems to reduce morbidity and maintain overall cost control in patients with end-stage kidney disease (ESKD). To evaluate the choice of HD vascular access creation procedures and their related economic costs, we aimed to identify economic evaluations on vascular access (VA) creation procedures in patients with ESKD.
Methods
A systematic literature review was conducted using the Cochrane methodology to identify cost-effectiveness analyses (CEAs), budget impact analyses, and cost analyses of various HD access creation procedures. Eligible publications published from 2012 onwards were retrieved by searching PubMed, Embase, and the Cochrane Library. The Consolidated Health Economic Evaluation Reporting Standards 2022 checklist and ISPOR Task Force guidelines were used to appraise the quality of the economic evaluations and budget impact analyses, respectively. Costs were adjusted for inflation and purchasing power parity and standardized to US dollars.
Results
A total of 40 economic evaluations met the inclusion criteria, including 28 cost analyses, three budget impact analyses, and nine CEAs. Widely evaluated procedures in the published literature were endovascular and surgical arteriovenous fistula (AVF), arteriovenous graft (AVG), and central venous catheterization (CVC). The results indicated that AVF was the most cost-effective strategy, followed by AVG and CVC. Three studies showed that endovascular AVF was cost-effective compared with surgical AVF, and resulted in overall cost savings of about USD53 million over a five-year period. Results of the quality assessment showed that budget impact analyses scored 63 percent, while the average score for economic evaluations was 58 percent.
Conclusions
It was challenging to identify a single effective method of managing vascular access due to the substantial heterogeneity among VA creation techniques. However, most of the included economic evaluations showed that AVF was a cost-effective method of VA creation relative to other identified techniques for patients with ESKD on HD.
A high proportion of patients with end-stage kidney disease (ESKD) are treated with hemodialysis (HD). To lower morbidity and maintain overall cost control in patients with ESKD, it is crucial for health systems to establish and maintain durable HD access. Our objective was to assess the budget impact of utilizing the ‘WavelinQ Endo Arteriovenous Fistula (AVF) system’ (WavelinQ) for HD patients.
Methods
A one-year economic model from the Hospital (Flinders Medical Centre, FMC) perspective was developed with Australian epidemiological and costing data. Clinical data were collected from real-world sources. The incident (n=50) and prevalent (n=250) cohorts were based on FMC utilization patterns. The current standard of care was surgical AVF (sAVF) and/or central venous catheters (CVC). With introduction of WavelinQ into practice, the substitution rate was set at 50 percent in new patients and 10 percent amongst existing patients. Index procedure and reintervention costs for the patient were based on the weighted average cost using the National Efficient Price Determination 2020–21. Total costs before WavelinQ introduction were compared with costs after WavelinQ substitution to determine the budget impact.
Results
Based on the FMC expected patient cohort and WavelinQ substitution rates, the mean annual cost savings per incident and prevalent patient were AUD26,873 and AUD3,549, respectively, leading to overall mean annual cost savings per patient of AUD7,437. The calculated per patient additional upfront cost of AUD7,010 with the WavelinQ index procedure versus sAVF was more than offset by the savings due to fewer post-procedure reinterventions. Overall, at the assumed substitution rates with WavelinQ, the model predicted a cost saving of approximately AUD2.2 million for FMC.
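The headline figures above can be cross-checked from the numbers stated in the abstract alone (cohort sizes and per-patient mean annual savings); the small gap against the reported per-patient mean comes from rounding of the inputs.

```python
# Arithmetic check using only figures stated in the abstract.
incident_n, prevalent_n = 50, 250                  # FMC cohort sizes
saving_incident, saving_prevalent = 26_873, 3_549  # AUD per patient per year

total_saving = incident_n * saving_incident + prevalent_n * saving_prevalent
per_patient = total_saving / (incident_n + prevalent_n)

print(total_saving)   # 2,230,900 — consistent with the reported ~AUD2.2 million
print(per_patient)    # ~7,436, close to the reported AUD7,437 (input rounding)
```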
Conclusions
The use of WavelinQ is expected to lead to cost savings of AUD2.2 million from the FMC perspective. Hospitals should consider not just the increase in upfront costs but also potential savings from fewer reintervention procedures. There is a need for continued research on the budget impact of different HD modalities across multiple settings.
African (Bantu) philosophy conceptualizes morality through ubuntu, which emphasizes the role of community in producing moral agents. This community is characterized by practices that respond to and value interdependence, such as care, cooperation, and respect for elders and ancestral knowledge. While there have been attributions of morality to nonhuman animals in the interdisciplinary animal morality debate, this debate has focused on Western concepts. We argue that the ubuntu conception of morality as a communal practice applies to some nonhuman animals. African elephant communities are highly cooperative and structured around elders; they alloparent, protect their communities, mourn their dead, and pass on cultural knowledge between generations. Identifying these as important moral practices, ubuntu provides a theoretical framework to expand our ethical concern for elephants to their communities. In practice, this will deepen our understanding of the wrongness of atrocities like culling for population management or trophy hunting.