Objective:
To describe Candida auris infections from two different geographical regions within a large health system, both of which have experienced a significant increase in the occurrence of C. auris.
Design:
Multicenter, retrospective, descriptive analysis across a large healthcare system.
Methods:
Patients were included in this study if they were admitted as an inpatient between January 1, 2021 and September 30, 2022 and had a clinical specimen that grew C. auris.
Results:
A total of 321 patients were included. Clinical outcomes were comparable between the geographical regions (Western and Eastern), with the exception of the proportion of patients who died or transitioned to hospice care at discharge (Western 32.1% vs Eastern 19.1%, P = .014). Over one-third of patients required mechanical ventilation at some point during their admission, and more than half of the total study population received a blood transfusion. Among all patients, 25.2% received hemodialysis and 24.3% received total parenteral nutrition during their hospital stay. More than 50% of patients in both regions required admission to the intensive care unit at some point during their stay. Fluconazole-resistant isolates were more prevalent in the Western region, but both regions demonstrated a high prevalence of resistance.
Conclusion:
Patients identified with C. auris were characterized by significant underlying morbidity and disease burden. Further studies are warranted to identify infection prevention best practices to reduce transmission and reduce mortality through earlier identification and appropriate antifungal therapy.
This paper describes the development process of a mobile app-based version of the World Health Organization mental health Gap Action Programme Intervention Guide, testing of the app prototypes, and its functionality in the assessment and management of people with mental health conditions in Nepal. Health workers’ perception of the feasibility and acceptability of using mobile technology in mental health care was assessed during the inspiration phase (N = 43); the ideation phase involved the creation of prototypes; and prototype testing was conducted over multiple rounds with 15 healthcare providers. The app provides provisional diagnoses and treatment options based on reported symptoms. Participants found the app prototype useful in reminding them of the process of assessment and management of mental disorders. Several challenges were noted: the app prototype was slow and had multiple technical problems, including difficulty navigating the ‘yes’/‘no’ options and difficulty reviewing the detailed symptoms of a particular disorder via the “more information” icon. This initial feasibility work suggests that, if the technical issues are addressed, the e-mhGAP warrants further research to determine whether it is a useful method for improving the detection of people with mental health conditions and the initiation of evidence-based treatment in primary healthcare facilities.
Objective:
To estimate the prevalence of unmet needs for assistance among middle-aged and older adults with subjective cognitive decline (SCD) in the US and to evaluate whether unmet needs were associated with health-related quality of life (HRQOL).
Design:
Cross-sectional
Setting:
US – 50 states, District of Columbia, and Puerto Rico
Participants:
Community-dwelling adults aged 45 years and older who completed the Cognitive Decline module of the 2015–2018 Behavioral Risk Factor Surveillance System, reported experiencing SCD, and always, usually, or sometimes needed assistance with day-to-day activities because of SCD (n = 6,568).
Measurements:
We defined SCD as confusion or memory loss that was happening more often or getting worse over the past 12 months. Respondents with SCD were considered to have an unmet need for assistance if they sometimes, rarely, or never got the help they needed with day-to-day activities. We measured three domains of HRQOL: (1) mental (frequent mental distress, ≥14 days of poor mental health in the past 30 days), (2) physical (frequent physical distress, ≥14 days of poor physical health in the past 30 days), and (3) social (SCD always, usually, or sometimes interfered with the ability to work, volunteer, or engage in social activities outside the home). We used log-binomial regression models to estimate prevalence ratios (PRs). All estimates were weighted.
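As an aside for readers who want to see the modeling step in concrete terms, the sketch below fits a log-binomial generalized linear model (binomial family, log link) to estimate a prevalence ratio. The column names and covariates are hypothetical, and the BRFSS survey weighting and complex-design adjustments used in the published analysis are omitted; it is a minimal sketch, not the authors' code.

```python
# A minimal sketch of a log-binomial model for prevalence ratios (PRs), assuming a
# pandas DataFrame `df` with a binary outcome and a binary exposure (hypothetical
# column names). Survey weights and complex-design adjustments are omitted here.
import numpy as np
import pandas as pd
import statsmodels.api as sm

def prevalence_ratio(df: pd.DataFrame, outcome: str, exposure: str, covariates=()):
    """Fit a log-binomial GLM and return the PR for `exposure` with its 95% CI."""
    X = sm.add_constant(df[[exposure, *covariates]])
    family = sm.families.Binomial(link=sm.families.links.Log())  # spelled `log()` in older statsmodels
    result = sm.GLM(df[outcome], X, family=family).fit()
    pr = np.exp(result.params[exposure])
    ci_low, ci_high = np.exp(result.conf_int().loc[exposure])
    return pr, (ci_low, ci_high)

# Hypothetical usage:
# pr, ci = prevalence_ratio(df, "frequent_mental_distress", "unmet_need", ["age_group", "sex"])
```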
Results:
In total, 40.2% of people who needed SCD-related assistance reported an unmet need. Among respondents without depression, an unmet need was associated with a higher prevalence of frequent mental distress (PR = 1.55, 95% CI: 1.12–2.13, p = 0.007). Frequent physical distress and social limitations did not differ between people with met and unmet needs.
Conclusions:
Middle-aged and older adults with SCD-related needs for assistance frequently did not have those needs met, which could negatively impact their mental health. Interventions to identify and meet the unmet needs among people with SCD may improve HRQOL.
OBJECTIVES/GOALS: Active surveillance (AS) is a recognized strategy to manage low-risk prostate cancer (PCa) in the absence of cancer progression. Little prospective data exists on the decisional factors associated with selecting and adhering to AS in the absence of cancer progression. We developed a survey instrument to predict AS uptake and adherence. METHODS/STUDY POPULATION: We utilized a three-step process to develop and refine a survey instrument designed to predict AS uptake and adherence among men with low-risk PCa: 1) We identified relevant conceptual domains based on prior research and a literature review. 2) We conducted 21 semi-structured concept elicitation interviews to identify patient-perceived barriers and facilitators to AS uptake and adherence among men with low-risk PCa who had been on AS for ≥1 year. The identified concepts became the basis of our draft survey instrument. 3) We conducted two rounds of cognitive interviews with men with low-risk PCa (n = 12; n = 6) to refine and initially validate the instrument. RESULTS/ANTICIPATED RESULTS: Relevant concepts identified from the initial interviews included the importance of patients’ knowledge of their PCa risk, the value they placed on delaying treatment, their trust in their urologist and the AS surveillance protocol, and their perceived social support. Initially, the survey was drafted as a single instrument, to be administered after a patient had selected AS, comprising sections on patient health, AS selection, and AS adherence. Based on the first round of cognitive interviews, we revised the single instrument into two surveys to track shifts in patient preference and experience. The first, administered at diagnosis, focuses on selection; the second, a 6-month follow-up, focuses on adherence. Following revisions, participants indicated that the revised two-part instrument was clear and not burdensome to complete. DISCUSSION/SIGNIFICANCE OF IMPACT: The instrument’s content validity was evaluated through cognitive interviews, which supported that the survey items were understood as intended. In the next phase, we plan to conduct a large-scale prospective cohort study to evaluate the instrument’s predictive validity, after which it will be available for public research use.
At present, analysis of diet and bladder cancer (BC) is mostly based on the intake of individual foods. The examination of food combinations provides scope to deal with the complexity and unpredictability of the diet and aims to overcome the limitations of studying nutrients and foods in isolation. This article aims to demonstrate the usability of supervised data mining methods to extract the food groups related to BC. In order to derive key food groups associated with BC risk, we applied the data mining technique C5.0 with 10-fold cross-validation in the BLadder cancer Epidemiology and Nutritional Determinants study, including data from eighteen case–control studies and one nested case–cohort study, comprising 8320 BC cases out of 31 551 participants. Dietary data on the eleven main food groups of the Eurocode 2 Core classification codebook, and relevant non-diet data (i.e. sex, age and smoking status), were available. Overall, five key food groups were extracted; in order of importance, beverages (non-milk); grains and grain products; vegetables and vegetable products; fats, oils and their products; and meats and meat products were associated with BC risk. Since these food groups correspond with previously proposed BC-related dietary factors, data mining seems to be a promising technique in the field of nutritional epidemiology and deserves further examination.
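As a concrete illustration of this kind of supervised data mining, the sketch below runs an analogous workflow in Python. scikit-learn does not implement C5.0 (the study used C5.0 itself, typically available through R's C50 package), so a CART-style decision tree with 10-fold cross-validation stands in, and all dataset and column names are hypothetical.

```python
# Illustrative analogue only: a CART-style decision tree with 10-fold cross-validation
# ranking food-group predictors of case status. Column names are hypothetical.
import pandas as pd
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_val_score, StratifiedKFold

# Hypothetical features: main food-group intakes plus sex, age, and smoking status.
food_groups = ["beverages_non_milk", "grains", "vegetables", "fats_oils", "meats"]
covariates = ["sex", "age", "smoking_status"]

def rank_food_groups(df: pd.DataFrame):
    """Return mean cross-validated AUC and feature importances for a hypothetical dataset."""
    X = df[food_groups + covariates]
    y = df["bladder_cancer"]                      # 1 = case, 0 = control (hypothetical)
    tree = DecisionTreeClassifier(max_depth=5, random_state=0)
    cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=0)
    scores = cross_val_score(tree, X, y, cv=cv, scoring="roc_auc")
    tree.fit(X, y)
    importance = pd.Series(tree.feature_importances_, index=X.columns)
    return scores.mean(), importance.sort_values(ascending=False)
```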
The updated Common Rule for human subjects research requires that consents “begin with a ‘concise and focused’ presentation of the key information that will most likely help someone make a decision about whether to participate in a study” (Menikoff, Kaneshiro, Pritchard. The New England Journal of Medicine. 2017; 376(7): 613–615). We utilized a community-engaged technology development approach to inform feature options within the REDCap software platform, centered on the collection and storage of electronic consent (eConsent), to address issues of transparency, clinical trial efficiency, and regulatory compliance for informed consent (Harris, et al. Journal of Biomedical Informatics. 2009; 42(2): 377–381). eConsent may also improve recruitment and retention in clinical research studies by addressing: (1) barriers to accessing rural populations, by facilitating remote consent, and (2) cultural and literacy barriers, by including optional explanatory material (e.g., defining terms by hovering over them with the cursor) or the choice of displaying different videos/images based on the participant’s race, ethnicity, or educational level (Phillippi, et al. Journal of Obstetric, Gynecologic, & Neonatal Nursing. 2018; 47(4): 529–534).
Methods:
We developed and pilot-tested our eConsent framework to provide a personalized consent experience whereby users are guided through a consent document that utilizes avatars, contextual glossary information, and videos to facilitate communication of information.
Results:
The eConsent framework includes a portfolio of eight features that were reviewed by community stakeholders and tested at two academic medical centers.
Conclusions:
Early adoption and utilization of this eConsent framework have demonstrated acceptability. Next steps will emphasize testing efficacy of features to improve participant engagement with the consent process.
Objective:
To describe an outbreak of bacteremia caused by vancomycin-sensitive Enterococcus faecalis (VSEfe).
Design:
A retrospective case–control investigation with molecular typing by whole-genome sequencing (WGS).
Setting:
A tertiary-care neonatal unit in Melbourne, Australia.
Methods:
Risk factors for 30 consecutive neonates with VSEfe bacteremia from June 2011 to December 2014 were analyzed using a case–control study. Controls were neonates matched for gestational age, birth weight, and year of birth. Isolates were typed using WGS, and multilocus sequence types (MLSTs) were determined.
Results:
Bacteremia in case patients occurred at a median of 23.5 days after delivery (interquartile range, 14.9–35.8). Previously described risk factors for nosocomial bacteremia did not contribute to excess risk for VSEfe. WGS typing designated 43% of isolates as ST179 and identified 14 other sequence types, indicating a polyclonal outbreak. A multimodal intervention that included education, insertion checklists, guidelines on the maintenance and access of central lines, adjustments to the late-onset sepsis antibiotic treatment, and the introduction of diaper bags for the disposal of soiled diapers handled inside the bed led to termination of the outbreak.
Conclusions:
Typing using WGS identified this outbreak as predominantly nonclonal and therefore not due to cross-transmission. A multimodal approach was then pursued to reduce the incidence of VSEfe bacteremia.
Objective: Traumatic brain injury (TBI) sustained in childhood is associated with poor social outcomes. This study investigated the role of theory of mind (ToM) as a mediator of the relation between TBI and peer rejection/victimization and reciprocated friendships, as well as the moderating effect of parental nurturance on those relationships. Method: Participants were children 8–13 years of age (M = 10.45, SD = 1.47), including 13 with severe TBI, 39 with complicated mild/moderate TBI, and 32 with orthopedic injuries. Data on peer rejection/victimization and friendship were collected in school classrooms using the Extended Class Play and friendship nominations. Parents rated parental nurturance using the Child-Rearing Practices Report. Finally, ToM was measured as children’s average performance across three tasks measuring different aspects of ToM. Results: Severe TBI was associated with poorer ToM, greater peer rejection/victimization, and fewer reciprocated friendships. ToM mediated the relation between severe TBI and peer rejection/victimization (i.e., severe TBI predicted poorer ToM, which in turn predicted greater rejection/victimization). Parental nurturance significantly moderated this relation, such that the mediating effect of ToM was significant only at low and average levels of parental nurturance, for both the severe and complicated mild/moderate TBI groups. Neither the mediating effect of ToM nor the moderating effect of parental nurturance was significant for reciprocated friendships. Conclusion: High parental nurturance may mitigate the negative effects of ToM deficits on the risk of peer rejection/victimization among children with TBI. Interventions designed to increase parental nurturance or ToM may promote better social outcomes among children with TBI.
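For readers unfamiliar with mediation analysis, the sketch below shows a generic product-of-coefficients mediation test with a percentile-bootstrap confidence interval for the indirect effect. It is a hedged illustration with hypothetical variable names, not the authors' analysis, which also modeled moderation by parental nurturance.

```python
# Generic mediation sketch (not the authors' analysis): the indirect effect of a
# predictor on an outcome via a mediator is the product of two regression
# coefficients (a*b), with a percentile-bootstrap 95% CI. Column names
# (`severe_tbi`, `tom_score`, `rejection`) in DataFrame `df` are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

def indirect_effect(data: pd.DataFrame) -> float:
    """a*b: effect of severe TBI on ToM (a) times effect of ToM on rejection (b)."""
    a = smf.ols("tom_score ~ severe_tbi", data=data).fit().params["severe_tbi"]
    b = smf.ols("rejection ~ tom_score + severe_tbi", data=data).fit().params["tom_score"]
    return a * b

def bootstrap_ci(df: pd.DataFrame, n_boot: int = 2000, seed: int = 0):
    """Percentile bootstrap 95% CI for the indirect (mediated) effect."""
    rng = np.random.default_rng(seed)
    effects = []
    for _ in range(n_boot):
        idx = rng.integers(0, len(df), size=len(df))   # resample rows with replacement
        effects.append(indirect_effect(df.iloc[idx]))
    return np.percentile(effects, [2.5, 97.5])
```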
A national need is to prepare for and respond to accidental or intentional disasters categorized as chemical, biological, radiological, nuclear, or explosive (CBRNE). These incidents require specific subject-matter expertise, yet they also have commonalities. We identify 7 core elements comprising CBRNE science that require integration for effective preparedness planning and for public health and medical response and recovery. These core elements are (1) basic and clinical sciences, (2) modeling and systems management, (3) planning, (4) response and incident management, (5) recovery and resilience, (6) lessons learned, and (7) continuous improvement. A key feature is the ability of relevant subject-matter experts to integrate information into response operations. We propose the CBRNE medical operations science support expert as a professional who (1) understands that CBRNE incidents require an integrated systems approach, (2) understands the key functions and contributions of CBRNE science practitioners, (3) helps direct strategic and tactical CBRNE planning and responses through first-hand experience, and (4) provides advice to senior decision-makers managing response activities. Recognition of CBRNE science as a distinct competency, together with establishment of the CBRNE medical operations science support expert, informs the public of the enormous progress made, broadcasts opportunities for new talent, and enhances the sophistication and analytic expertise of senior managers planning for and responding to CBRNE incidents.
This study tests novel methods for automatically identifying annual layers in a shallow Antarctic ice core (WDC05Q) using images that were collected with an optical scanner at the US National Ice Core Laboratory. A new method of optimized variance maximization (OVM) modeled the density-related changes in annual layer thickness directly from image variance. This was done by using multi-objective complex (MOCOM) parameter optimization to drive a low-pass filtering scheme. The OVM-derived changes in annual layer thickness corresponded well with the results of an independent glaciochemical interpretation of the core. Individual annual cycles in image brightness were then identified by using OVM results to apply a depth-varying low-pass filter and fitting a second-order polynomial to a locally detrended neighborhood. The resulting map of annual cycles agreed to within 1% of the overall annual count of the glaciochemical interpretation. Agreement on the presence of specific annual layer features was 96%. It was also shown that the MOCOM parameter optimization could calibrate the image-based results to match directly the date of a specific volcanic marker.
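A greatly simplified sketch of the brightness-based layer picking is given below. It applies a fixed low-pass filter tied to an assumed local annual-layer thickness and takes annual cycles as peaks in the filtered signal; the full OVM/MOCOM optimization and the depth-varying filtering are not reproduced, and all names and parameter choices are assumptions.

```python
# Simplified sketch of annual-layer picking from an image-brightness profile.
# Not the published OVM/MOCOM implementation; names and parameters are assumptions.
import numpy as np
from scipy.signal import butter, filtfilt, find_peaks

def pick_annual_layers(brightness: np.ndarray, depth_step_m: float,
                       layer_thickness_m: float) -> np.ndarray:
    """Return sample indices of candidate annual peaks in a brightness profile."""
    fs = 1.0 / depth_step_m                       # samples per metre of core
    cutoff = 1.5 / layer_thickness_m              # pass the annual cycle, suppress finer noise
    b, a = butter(4, cutoff, btype="low", fs=fs)
    smoothed = filtfilt(b, a, brightness - brightness.mean())
    min_separation = max(1, int(0.5 * layer_thickness_m / depth_step_m))
    peaks, _ = find_peaks(smoothed, distance=min_separation)
    return peaks
```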
A novel ‘selection curve’ method is developed to interpret annual layers in the West Antarctic ice sheet (WAIS) Divide ice core based on dielectric properties (DEP). Because dielectric measurements are non-contact and represent the integrated response of the ice volume, they are particularly useful for the brittle zone of the core. Seasonal differences in ice chemistry create an annual signal in DEP, though multiple peaks of varying strength within a year may complicate the interpretation of annual layers. The selection curve algorithm uses a spline curve whose shape selects successive annual peaks in plots of DEP. This spline curve was scaled to the average annual-layer thickness at a given depth, where the layer thickness was best estimated using the fast Fourier transform (FFT) power spectrum within a sliding 10 m window. To explore the accuracy and stability of the method, several spline curves were generated from varying lengths of calibration data taken from multiple depths in the WAIS core. Using 50 m of manually interpreted calibration data, the selection curve method matched a manual interpretation throughout the entire 1200 m dataset to within 2% root-mean-square error (RMSE). This method is equally applicable to glaciochemical and other time/depth series measurements.
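The sliding-window FFT estimate of annual-layer thickness described above can be sketched roughly as follows; the window handling and variable names are assumptions rather than the published implementation.

```python
# Sketch of a sliding-window FFT estimate of local annual-layer thickness from a
# DEP record sampled at a uniform depth step. Names and windowing are assumptions.
import numpy as np

def layer_thickness_profile(dep: np.ndarray, depth_step_m: float,
                            window_m: float = 10.0) -> np.ndarray:
    """Estimate local annual-layer thickness (m) from the dominant FFT frequency."""
    n_win = int(window_m / depth_step_m)
    n_win -= n_win % 2                              # even window so the centre index is exact
    half = n_win // 2
    freqs = np.fft.rfftfreq(n_win, d=depth_step_m)  # spatial frequency, cycles per metre
    thickness = np.full(dep.size, np.nan)
    for i in range(half, dep.size - half):
        segment = dep[i - half:i + half] - dep[i - half:i + half].mean()
        power = np.abs(np.fft.rfft(segment)) ** 2
        k = np.argmax(power[1:]) + 1                # skip the zero-frequency bin
        thickness[i] = 1.0 / freqs[k]               # wavelength of the dominant cycle
    return thickness
```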
Objectives: The current study examines whether psychosocial outcomes following pediatric traumatic brain injury (TBI) vary as a function of children’s rejection sensitivity (RS), defined as their disposition to be hypersensitive to cues of rejection from peers. Methods: Children ages 8–13 with a history of severe TBI (STBI, n=16), complicated mild/moderate TBI (n=35), or orthopedic injury (OI, n=49) completed measures assessing self-esteem and RS on average 3.28 years post-injury (SD=1.33, range=1.25–6.34). Parents reported on their child’s emotional and behavioral functioning and social participation. Results: Regression analyses found moderation of group differences by RS for three outcomes: social participation, self-perceptions of social acceptance, and externalizing behavior problems. Conditional effects at varying levels of RS indicated that externalizing problems and social participation were significantly worse for children with STBI at high levels of RS, compared to children with OI. Social participation for the STBI group remained significantly lower than the OI group at mean levels of RS, but not at low levels of RS. At high levels of RS, self-perceptions of social acceptance were lower for children with moderate TBI compared to OI, but group differences were not significant at mean or low levels of RS. No evidence of moderation was found for global self-worth, self-perceptions of physical appearance or athletic ability, or internalizing problems. Conclusions: The findings highlight the salient nature of social outcomes in the context of varying levels of RS. These findings may have implications for the design of interventions to improve social outcomes following TBI. (JINS, 2017, 23, 451–459)
Objectives: This study examined whether children with distinct brain disorders show different profiles of strengths and weaknesses in executive functions, and differ from children without brain disorder. Methods: Participants were children with traumatic brain injury (N=82; 8–13 years of age), arterial ischemic stroke (N=36; 6–16 years of age), and brain tumor (N=74; 9–18 years of age), each with a corresponding matched comparison group consisting of children with orthopedic injury (N=61), asthma (N=15), and classmates without medical illness (N=68), respectively. Shifting, inhibition, and working memory were assessed, respectively, using three Test of Everyday Attention: Children’s Version (TEA-Ch) subtests: Creature Counting, Walk-Don’t-Walk, and Code Transmission. Comparison groups did not differ in TEA-Ch performance and were merged into a single control group. Profile analysis was used to examine group differences in TEA-Ch subtest scaled scores after controlling for maternal education and age. Results: As a whole, children with brain disorder performed more poorly than controls on measures of executive function. Relative to controls, the three brain injury groups showed significantly different profiles of executive functions. Importantly, post hoc tests revealed that performance on TEA-Ch subtests differed among the brain disorder groups. Conclusions: Results suggest that different childhood brain disorders result in distinct patterns of executive function deficits that differ from children without brain disorder. Implications for clinical practice and future research are discussed. (JINS, 2017, 23, 529–538)
This study examined differences in friendship quality between children with traumatic brain injury (TBI) and orthopedic injury (OI) and behavioral outcomes for children from both groups. Participants were 41 children with TBI and 43 children with OI (M age=10.4). Data were collected using peer- and teacher-reported measures of participants’ social adjustment and parent-reported measures of children’s post-injury behaviors. Participants and their mutually nominated best friends also completed a measure of the quality of their friendships. Children with TBI reported significantly more support and satisfaction in their friendships than children with OI. Children with TBI and their mutual best friend were more similar in their reports of friendship quality compared to children with OI and their mutual best friends. Additionally, for children with TBI who were rejected by peers, friendship support buffered against maladaptive psychosocial outcomes, and predicted skills related to social competence. Friendship satisfaction was related to higher teacher ratings of social skills for the TBI group only. Positive and supportive friendships play an important role for children with TBI, especially for those not accepted by peers. Such friendships may protect children with TBI who are rejected against maladaptive psychosocial outcomes, and promote skills related to social competence. (JINS, 2014, 21, 1–10)
Edited by
Stephen Taylor, Professor in the History of Early Modern England at the University of Durham; Grant Tapsell, Lecturer in Early Modern History, University of Oxford, and Fellow and Tutor at Lady Margaret Hall
Modern accounts of the re-establishment of the Church of England in 1660-2 have usually focussed on the politics of court and parliament, on set pieces such as the Worcester House and Savoy conferences, and on the revival of cathedral communities and the machinery of diocesan government. Ordination, by contrast, has been largely neglected. Robert Bosher declared that it was ‘not a major issue’; other historians have noted that re-ordination, namely the requirement that presbyterians take episcopal orders to remain within the ministry, was highly contentious and that its inclusion in the Act of Uniformity of 1662 helped to swell the numbers ejected after St Bartholomew's day, but even this important point has not been systematically pursued. A thorough study of the number and pattern of ordinations in 1660-2, building on evidence in the Clergy of the Church of England Database, gives us a rather different view of the restoration of the Church in three important ways. First, this was an extraordinary and unsettled period. Very large numbers of candidates, among them many former presbyterians, obtained episcopal orders. These were dispensed by a minority of bishops, including several holding Scottish and Irish sees, with little regard for canonical regulations. The return of the customary administration of ordination only dates from the very end of 1662. Second, the fact that only a handful of bishops regularly conferred orders reveals starkly different practices of ordination among the episcopate and the paradox of high churchmen, such as Sheldon, leaving the restocking of the parish ministry to bishops who were more accommodating to tender puritan consciences.
Clinical supervision is key to the delivery and governance of effective psychological work. We place increasing emphasis on the evidence base in our clinical decision making, and yet there is no comparable body of information to inform our supervisory practice. This is a serious problem for psychological therapists; there is an urgent need for theoretically driven and empirically evaluated approaches to supervision, and the training of such skills. This preliminary evaluation examined the impact of a 5-day training designed for Improving Access to Psychological Therapies (IAPT) supervisors new to the role. A within-subject, repeated-measures design was used to compare self-assessed supervision competencies over the course of training. Twenty-eight IAPT supervisors completed 5 days’ training based on the Supervision Competencies Framework and IAPT Supervision Guidance. Significant improvements were found in ratings of generic, specific, applied and meta-supervision competencies, as well as overall competency. This evaluation gives preliminary support for the impact of training on supervisory competencies. There are clear limitations, particularly the lack of objective measures and comparison training. Nevertheless, in the context of a very limited evidence base to date, the study contributes to a more robust approach to developing supervisory competence in clinical practice.
The purpose of this article is to set the context for this special issue of Disaster Medicine and Public Health Preparedness on the allocation of scarce resources in an improvised nuclear device incident. A nuclear detonation occurs when a sufficient amount of fissile material is brought suddenly together to reach critical mass and cause an explosion. Although the chance of a nuclear detonation is thought to be small, the consequences are potentially catastrophic, so planning for an effective medical response is necessary, albeit complex. A substantial nuclear detonation will result in physical effects and a great number of casualties that will require an organized medical response to save lives. With this type of incident, the demand for resources to treat casualties will far exceed what is available. To meet the goal of providing medical care (including symptomatic/palliative care) with fairness as the underlying ethical principle, planning for allocation of scarce resources among all involved sectors needs to be integrated and practiced. With thoughtful and realistic planning, the medical response in the chaotic environment may be made more effective and efficient for both victims and medical responders.
(Disaster Med Public Health Preparedness. 2011;5:S20-S31)
Social communication involves influencing what other people think and feel about themselves. We use the term conative theory of mind (ToM) to refer to communicative interactions involving one person trying to influence the mental and emotional state of another, paradigmatic examples of which are irony and empathy. This study reports how children with traumatic brain injury (TBI) understand ironic criticism and empathic praise, on a task requiring them to identify speaker belief and intention for direct conative speech acts involving literal truth, and indirect speech acts involving either ironic criticism or empathic praise. Participants were 71 children in the chronic state of a single TBI and 57 age- and gender-matched children with orthopedic injuries (OI). Group differences emerged on indirect speech acts involving conation (i.e., irony and empathy), but not on structurally and linguistically identical direct speech acts, suggesting specific deficits in this aspect of social cognition in school-age children with TBI. Deficits in children with mild-moderate TBI were less widespread and more selective than those of children with more severe injuries. Deficits in understanding the social, conative function of indirect speech acts like irony and empathy have widespread and deep implications for social function in children with TBI. (JINS, 2013, 19, 1–11)