We assessed healthcare workers’ knowledge, attitudes, and practices around disposable personal protective equipment (PPE) use. We observed that healthcare workers are interested in sustainable policies and identified areas for policy changes to reduce PPE waste.
Objectives/Goals: Upon diagnosis, patients with acute myeloid leukemia (AML) have significant information needs. Given ChatGPT’s recent increase in popularity, patients may use it to access information about AML. We will examine the quality, reliability, and readability of information that ChatGPT provides in response to frequently asked questions (FAQs) about AML. Methods/Study Population: From FAQs on the top 3 patient-facing websites about AML, we derived 26 questions, written in lay terms, about AML diagnosis, treatment, prognosis, and functional impact. We queried ChatGPT-4o on 10/14/2024 using a new Google account with no prior history. We asked each question in a separate chat window once, verbatim, and without prompt engineering. After calibration, 5 oncologists independently reviewed ChatGPT responses. We assessed quality via the Global Quality Scale (GQS), scored from 1 (poor) to 5 (excellent) based on flow, topic coverage, and usefulness. For reliability, we assessed whether each response addressed the query and was factually accurate, elaborating on specific inaccuracies. For readability, we assessed Flesch-Kincaid Grade Level, Gunning Fog Index, and Simple Measure of Gobbledygook. Results/Anticipated Results: This will be a descriptive analysis of ChatGPT responses. For quality and reliability assessments, we will report Fleiss’ kappa for inter-rater reliability and expect substantial agreement or greater (≥0.61). Per prior studies in other domains, we hypothesize that ChatGPT responses will have good quality on average (i.e., GQS score near 4). We hypothesize that nearly all responses will address their query and will mostly be accurate; a minority of responses may have partial inaccuracies. Finally, we hypothesize that readability metrics will suggest that a higher educational level (e.g., college-level education) is required for comprehension. Overall, these findings will help elucidate strengths and limitations of ChatGPT for AML and guide discussion of factors patients should be aware of when using ChatGPT. Discussion/Significance of Impact: No prior study has examined the educational quality of ChatGPT for AML. Our study will detail whether patients are receiving trustworthy and meaningful information, identify misinformation, and provide guidance to oncologists when recommending information resources to patients or fielding questions that patients may raise after using ChatGPT.
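For readers unfamiliar with the readability formulas named above, the short Python sketch below shows how Flesch-Kincaid Grade Level, Gunning Fog, and SMOG scores are conventionally computed from sentence, word, and syllable counts; the naive syllable counter and the sample text are illustrative assumptions and not part of the study’s actual pipeline.

```python
import math
import re

def count_syllables(word):
    # Crude vowel-group heuristic; published studies typically rely on a
    # validated readability tool or dictionary-based syllable counts.
    groups = re.findall(r"[aeiouy]+", word.lower())
    n = len(groups)
    if word.lower().endswith("e") and n > 1:
        n -= 1
    return max(n, 1)

def readability(text):
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = [count_syllables(w) for w in words]
    n_sent, n_words, n_syll = len(sentences), len(words), sum(syllables)
    complex_words = sum(1 for s in syllables if s >= 3)  # words with 3+ syllables

    fkgl = 0.39 * n_words / n_sent + 11.8 * n_syll / n_words - 15.59
    fog = 0.4 * (n_words / n_sent + 100 * complex_words / n_words)
    smog = 1.0430 * math.sqrt(complex_words * 30 / n_sent) + 3.1291
    return {"flesch_kincaid_grade": fkgl, "gunning_fog": fog, "smog": smog}

# Hypothetical usage with an excerpt of a chatbot-style response.
print(readability(
    "Acute myeloid leukemia is a cancer of the blood and bone marrow. "
    "Treatment often begins with induction chemotherapy."
))
```

All three indices map text features to an approximate US school grade level, which is why they are natural choices for asking whether a college-level education would be needed to comprehend a response.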
Comprehensive planning for family reunification following a disaster is complex and often underdeveloped, especially in hospitals. The 2013 and subsequent 2021 National Pediatric Readiness Project assessments revealed that fewer than half of hospitals had disaster plans that addressed the needs of children. Leveraging quality improvement (QI) language and methodology allows for alignment and engagement of hospital leaders and personnel unaccustomed to disaster planning. Using QI methodology, we aimed to create a family reunification plan that would enable child-safe reunification within 4 hours of an event. QI tools such as the fishbone diagram, key driver diagram, and process maps enhanced the planning process. We then used the Plan-Do-Study-Act model to test and revise our plan. Active involvement of key stakeholders was crucial. By using QI methodology, hospital personnel unfamiliar with disaster management helped develop and improve our hospital’s family reunification plan.
Understanding and forecasting mortality by cause is an essential branch of actuarial science, with wide-ranging implications for decision-makers in public policy and industry. To accurately capture trends in cause-specific mortality, it is critical to consider dependencies between causes of death and produce forecasts by age and cause coherent with aggregate mortality forecasts. One way to achieve these aims is to model cause-specific deaths using compositional data analysis (CODA), treating the density of deaths by age and cause as a set of dependent, nonnegative values that sum to one. A major drawback of standard CODA methods is the challenge of zero values, which frequently occur in cause-of-death mortality modeling. Thus, we propose using a compositional power transformation, the $\alpha$-transformation, to model cause-specific life-table death counts. The $\alpha$-transformation offers a statistically rigorous approach to handling zero-valued subgroups in CODA, in contrast to ad hoc techniques such as adding an arbitrarily small amount to zero cells. We illustrate the $\alpha$-transformation using England and Wales and US death counts by cause from the Human Cause-of-Death database, for cardiovascular-related causes of death. The results demonstrate that the $\alpha$-transformation improves forecast accuracy of cause-specific life-table death counts compared with log-ratio-based CODA transformations. The forecasts suggest declines in the proportions of deaths from major cardiovascular causes (myocardial infarction and other ischemic heart diseases).
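For reference, one standard formulation of the $\alpha$-transformation (following Tsagris and co-authors) applied to a composition $\mathbf{x}=(x_1,\dots,x_D)$ on the simplex is sketched below; the exact centring and scaling used for life-table death counts may differ from this variant.

$$ u_j(\alpha)=\frac{x_j^{\alpha}}{\sum_{k=1}^{D}x_k^{\alpha}},\qquad z(\alpha)=\frac{1}{\alpha}\,H\bigl(D\,u(\alpha)-\mathbf{1}_D\bigr), $$

where $H$ is the $(D-1)\times D$ Helmert sub-matrix. As $\alpha\to 0$, $z(\alpha)$ approaches the isometric log-ratio transformation, while for $\alpha>0$ the transformation remains well defined when some $x_j=0$, which is why no arbitrarily small constant needs to be added to zero cells.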
Evidence-based insertion and maintenance bundles are effective in reducing the incidence of central line-associated bloodstream infections (CLABSI) in intensive care unit (ICU) settings. We studied the adoption and compliance of CLABSI prevention bundle programs and CLABSI rates in ICUs in a large network of acute care hospitals across Canada.
A translator of Shakespeare in the twenty-first century must perform many roles: editor, to solve textual cruxes; director, to determine the dramatic function of the scene; and adaptor and poet, to recreate the drama in a target language. Translations activate potentialities of Shakespeare’s text, which resonates with different meanings.
Non-Suicidal Self-Injury (NSSI) occurs when direct, deliberate harm is caused to one's physical body without suicidal intent. Approximately 22.1% of youth worldwide engage in NSSI during their lifetimes. Due to the increased risk of harm and future suicide attempts, NSSI is a behaviour that warrants attention and has been identified as a condition in need of further study. While some studies have examined the prevalence and experiences of NSSI in Singapore, there is a lack of detailed studies on the presentation and overall phenomenology of NSSI in the local context. This study aims to assess the characteristics of NSSI using the Non-Suicidal Self-Injury – Assessment Tool (NSSI-AT) in a cross-sectional design. We investigated the functions, characteristics, and personal experiences of local youths who engage in NSSI, to inform the development and improvement of patient-centred care.
Methods
A total of 121 youths aged between 12 and 25 years were recruited from the National University Health System. The study included patients who were seeking treatment for mood disorders and had self-reported NSSI behaviours such as cutting, hitting, and scratching prior to or at the time of their visit. Outcomes for the NSSI-AT, including the actions, functions, frequency, age of onset, initial motivations, severity, practice patterns, disclosure, and treatment experiences of self-harm, were reported using descriptive analysis. Personal reflections were analysed using thematic analysis.
Results
Participants were mostly female (n = 86, 71.1%), with a mean age of 16.2 years (SD = 2.33). Many participants engaged in NSSI actions such as cutting, scratching, and banging on objects to manage high-pressure agitating and low-pressure depressive emotional states. Most participants started engaging in NSSI in early adolescence (mean = 13.0 years old, SD = 2.37, range = 7–23), and most had hurt themselves more severely than intended (n = 79, 65.3%). When reflecting on their overall NSSI experiences, participants expressed similar levels of ambivalence toward NSSI and of growth attributed to NSSI. Participants also offered encouragement to others going through similar experiences and reported on the negative aspects of self-harm.
Conclusion
Findings support emotional regulation as a function of NSSI in the local population, where self-harm was generally not used for social communication purposes. Findings also suggest that youths may be more vulnerable to NSSI during early adolescence, corresponding to a time of substantial life changes. This study also demonstrated the individuality of NSSI experiences among the local youth, highlighting the importance of a person-centred approach to NSSI treatment. Taken together, these findings highlight the need to develop interventions that can effectively serve this age group and their specific challenges.
The COVID-19 pandemic led to an initial increase in the incidence of carbapenem-resistant Enterobacterales (CRE) from clinical cultures in South-East Asian hospitals, which was not sustained as the pandemic progressed. Conversely, there was a decrease in CRE incidence from surveillance cultures and in the overall combined incidence. Further studies are needed for future pandemic preparedness.
To identify risk factors for mortality in intensive care units (ICUs) in Asia.
Design:
Prospective cohort study.
Setting:
The study included 317 ICUs of 96 hospitals in 44 cities in 10 countries of Asia: China, India, Malaysia, Mongolia, Nepal, Pakistan, the Philippines, Sri Lanka, Thailand, and Vietnam.
Participants:
Patients aged >18 years admitted to ICUs.
Results:
In total, 157,667 patients were followed during 957,517 patient days, and 8,157 healthcare-associated infections (HAIs) occurred. In multiple logistic regression, the following variables were associated with an increased mortality risk: central-line–associated bloodstream infection (CLABSI; adjusted odds ratio [aOR], 2.36; P < .0001), ventilator-associated event (VAE; aOR, 1.51; P < .0001), catheter-associated urinary tract infection (CAUTI; aOR, 1.04; P < .0001), and female sex (aOR, 1.06; P < .0001). Older age increased mortality risk by 1% per year (aOR, 1.01; P < .0001). Length of stay (LOS) increased mortality risk by 1% per bed day (aOR, 1.01; P < .0001). Central-line days increased mortality risk by 2% per central-line day (aOR, 1.02; P < .0001). Urinary catheter days increased mortality risk by 4% per urinary catheter day (aOR, 1.04; P < .0001). The highest mortality risks were associated with mechanical ventilation utilization ratio (aOR, 12.48; P < .0001), upper middle-income country (aOR, 1.09; P = .033), surgical hospitalization (aOR, 2.17; P < .0001), pediatric oncology ICU (aOR, 9.90; P < .0001), and adult oncology ICU (aOR, 4.52; P < .0001). Patients at university hospitals had the lowest mortality risk (aOR, 0.61; P < .0001).
Conclusions:
Some variables associated with an increased mortality risk are unlikely to change, such as age, sex, national economy, hospitalization type, and ICU type. Other variables can be modified, such as LOS, central-line use, urinary catheter use, and mechanical ventilation, as well as acquisition of CLABSI, VAE, or CAUTI. To reduce mortality risk, we should focus on strategies to reduce LOS; strategies to reduce central-line, urinary catheter, and mechanical ventilation use; and HAI prevention recommendations.
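The adjusted odds ratios reported above come from a multiple logistic regression; purely as an illustration (the file name and column names below are hypothetical, not the study’s dataset or code), such aORs and their confidence intervals can be obtained with statsmodels along these lines.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical patient-level data: one row per ICU admission.
# Assumed columns: died (0/1), clabsi, vae, cauti, female, age, los_days,
# central_line_days, urinary_catheter_days.
df = pd.read_csv("icu_patients.csv")

model = smf.logit(
    "died ~ clabsi + vae + cauti + female + age + los_days"
    " + central_line_days + urinary_catheter_days",
    data=df,
).fit()

# Exponentiated coefficients are adjusted odds ratios; exponentiated
# confidence limits give the corresponding 95% CIs.
aor = np.exp(model.params).rename("aOR")
ci = np.exp(model.conf_int()).rename(columns={0: "2.5%", 1: "97.5%"})
print(pd.concat([aor, ci], axis=1))
```

Reading the output, an aOR of 1.01 for age corresponds to the reported 1% increase in mortality odds per additional year, holding the other covariates fixed.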
Understanding the size of oil droplets released from a jet in crossflow is crucial for estimating the trajectory of hydrocarbons and the rates of oil biodegradation/dissolution in the water column. We present experimental results of an oil jet with a jet-to-crossflow velocity ratio of 9.3. The oil was released from a vertical pipe 25 mm in diameter with a Reynolds number of 25 000. We measured the size of oil droplets near the top and bottom boundaries of the plume using shadowgraph cameras and we also filmed the whole plume. In parallel, we developed a multifluid large eddy simulation model to simulate the plume and coupled it with our VDROP population balance model to compute the local droplet size. We accounted for the slip velocity of oil droplets in the momentum equation and in the volume fraction equation of oil through the local, mass-weighted average droplet rise velocity. The top and bottom boundaries of the plume were captured well in the simulation. Larger droplets shaped the upper boundary of the plume, and the mean droplet size increased with elevation across the plume, most likely due to the individual rise velocity of droplets. At the same elevation across the plume, the droplet size was smaller at the centre axis as compared with the side boundaries of the plume due to the formation of the counter-rotating vortex pair, which induced upward velocity at the centre axis and downward velocity near the sides of the plume.
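For context on the rise-velocity coupling mentioned above, a mass-weighted average rise velocity over droplet size classes can be written as follows; this is a generic Stokes-regime sketch, not necessarily the closure implemented in the authors’ solver.

$$ \bar{w}=\frac{\sum_i m_i\,w_i}{\sum_i m_i},\qquad w_i=\frac{g\,d_i^{2}\,(\rho_w-\rho_o)}{18\,\mu_w}, $$

where $m_i$ is the oil mass in size class $i$, $d_i$ the droplet diameter, $\rho_w$ and $\rho_o$ the water and oil densities, and $\mu_w$ the dynamic viscosity of water; droplets outside the Stokes regime require an empirical drag correction.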
Providing good end-of-life (EOL) care for noncancer patients has been made a national priority in Singapore. A combined medical and nursing ward-based intervention known as the EOL care plan was piloted in a general medicine ward at our institution, aiming to guide key aspects of EOL care. The aim of this study is to assess the EOL care plan's effect on EOL care for general medicine patients.
Method
We conducted a retrospective cohort study of inpatients who died in a general ward under the discipline “General Medicine” from May to October 2019. We collected data on symptom management, rationalization of care, and communication with families. The primary analysis compared care received by patients who died in the pilot ward with that of a control group of patients who died in other wards.
Results
In total, 112 records were included in the analysis. Pain assessment was more common in the pilot ward than in the control group (35.3% vs. 6.3%, p < 0.001), as were antipsychotic prescriptions for delirium (64.7% vs. 24.4%, p = 0.001). Fewer patients received blood glucose monitoring in the last 48 h of life in the pilot ward (69.5% vs. 35.3%, p = 0.007), and monitoring of parameters was also less frequent in the pilot ward (p < 0.004).
Significance of results
The implementation of the EOL care plan was associated with process-level indicators of better EOL care, suggesting that it could have a significant positive impact when implemented on a wider scale.
Background: Identification of hospitalized patients with enteric multidrug-resistant organism (MDRO) carriage, combined with implementation of targeted infection control interventions, may help reduce MDRO transmission. However, the optimal surveillance approach has not been defined. We sought to determine whether daily serial rectal surveillance for MDROs detects more incident cases (acquisition) of MDRO colonization in medical intensive care unit (MICU) patients than admission and discharge surveillance alone. Methods: Prospective longitudinal observational single-center study from January 11, 2017, to January 11, 2018. Inclusion criteria were ≥3 consecutive MICU days and ≥2 rectal or stool swabs per MICU admission. Daily rectal or stool swabs were collected from patients and cultured for MDROs, including vancomycin-resistant Enterococcus (VRE), carbapenem-resistant Enterobacterales (CRE), third-generation cephalosporin-resistant Enterobacterales (3GCR), and extended-spectrum β-lactamase–producing Enterobacterales (ESBL-E) (as a subset of 3GCR). MDRO detection at any time during the MICU stay was used to calculate prevalent colonization. Incident colonization (acquisition) was defined as new detection of an MDRO after at least 1 prior negative swab. We then determined the proportion of prevalent and incident cases detected by daily testing that were also detected when only first swabs (admission) and last swabs (discharge) were tested. Data were analyzed using SAS version 9.4 software. Results: In total, 939 MICU stays of 842 patients were analyzed. Patient characteristics were median age 64 years (interquartile range [IQR], 51–74), median MICU length of stay 5 days (IQR, 3–8), median number of samples per admission 3 (IQR, 2–5), and median Charlson index 4 (IQR, 2–7). Prevalent colonization with any MDRO was detected by daily swabbing in 401 stays (42.7%). Compared to daily serial swabbing, an admission- and discharge-only approach detected ≥86% of MDRO cases (ie, overall prevalent MDRO colonization). For incident MDRO colonization (acquisition), however, the admission- and discharge-only approach would have detected fewer cases than daily swabbing (Figure 1); ≥34% of total MDRO acquisitions would have been missed. Conclusions: Testing patients upon admission and discharge to an MICU may fail to detect MDRO acquisition in more than one-third of patients, thereby reducing the effectiveness of MDRO control programs that are targeted against known MDRO carriers. The poor performance of a single discharge swab may be due to intermittent or low-level MDRO shedding, inadequate sampling, or transient MDRO colonization. Additional research is needed to determine the optimal surveillance approach for enteric MDRO carriage.
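To make the comparison between surveillance strategies concrete, the sketch below (with a hypothetical file and column names, not the study’s SAS code) shows how prevalent and incident colonization under daily swabbing can be contrasted with an admission- and discharge-only strategy.

```python
import pandas as pd

# Hypothetical long-format data: one row per swab, assumed columns
# stay_id, swab_date, mdro_positive (0/1).
swabs = pd.read_csv("micu_swabs.csv").sort_values(["stay_id", "swab_date"])

grouped = swabs.groupby("stay_id")["mdro_positive"]
first, last, any_pos = grouped.first(), grouped.last(), grouped.max()

prevalent = any_pos == 1                   # positive at any point during the stay
incident = (first == 0) & (any_pos == 1)   # simplified acquisition: negative on admission, positive later
ad_prevalent = (first == 1) | (last == 1)  # admission/discharge-only detection, prevalent
ad_incident = (first == 0) & (last == 1)   # admission/discharge-only detection, acquisition

print(f"Prevalent colonization also caught by admission/discharge swabs: {ad_prevalent[prevalent].mean():.0%}")
print(f"Acquisitions also caught by admission/discharge swabs: {ad_incident[incident].mean():.0%}")
```

The second printed fraction corresponds to the kind of comparison reported above, where roughly a third of daily-swab acquisitions would be missed by testing only the first and last swabs.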
Sleep disturbance is common in gestational parents during pregnancy and postpartum periods. This study evaluated the feasibility and efficacy of a scalable cognitive behavioural therapy (CBT) sleep intervention tailored for these periods.
Methods
This is a two-arm, parallel-group, single-blind, superiority randomised controlled trial. Nulliparous females without severe medical/psychiatric conditions were randomised 1:1 to CBT or attention- and time-matched control. All participants received a 1 h telephone session and automated multimedia emails from the third trimester until 6 months postpartum. Outcomes were assessed with validated instruments at gestation weeks 30 (baseline) and 35 (pregnancy endpoint), and postpartum months 1.5, 3, 6 (postpartum endpoint), 12 and 24.
Results
In total, 163 eligible participants (age M ± s.d. = 33.35 ± 3.42 years) were randomised. The CBT intervention was well accepted, with no adverse effects reported. Intention-to-treat analyses showed that, compared with control, receiving CBT was associated with lower insomnia severity and sleep disturbance (the two primary outcomes), and lower sleep-related impairment at the pregnancy endpoint (p values ⩽ 0.001), as well as at 24 months postpartum (p range 0.012–0.052). Group differences across the first postpartum year were non-significant. Participants with elevated insomnia symptoms at baseline benefitted substantially more from CBT (v. control), including having significantly lower insomnia symptoms throughout the first postpartum year. Group differences in symptoms of depression or anxiety were non-significant.
Conclusions
A scalable CBT sleep intervention was efficacious in buffering against sleep disturbance during pregnancy and benefitted sleep at 2 years postpartum, especially for individuals with insomnia symptoms during pregnancy. The intervention holds promise for implementation into routine perinatal care.
Polycystic ovary syndrome (PCOS) is associated with a higher prevalence of sleep disturbances and obesity. Treatment of PCOS includes modifying lifestyle behaviours associated with weight management. However, poor sleep in the non-PCOS population has been associated with poorer lifestyle behaviours. The aim was to investigate whether sleep disturbance confounds or modifies the association between lifestyle factors and PCOS. This was a cross-sectional analysis of women aged 31–36 years in 2009 from the Australian Longitudinal Study on Women’s Health cohort (n 6067; 464 PCOS, 5603 non-PCOS). Self-reported data were collected on PCOS, anthropometry, physical activity (validated modified version of the Active Australia Physical Activity survey), diet (validated FFQ) and sleep disturbances (characterised through latent class analysis). Women with PCOS had more adverse sleep symptoms, including severe tiredness (P = 0·001), difficulty sleeping (P < 0·001) and restless sleep (P < 0·001), compared with women without PCOS. Women with PCOS also had higher energy consumption (6911 (sd 2453) v. 6654 (sd 2215) kJ, P = 0·017), fibre intake (19·8 (sd 7·8) v. 18·9 (sd 6·9) g, P = 0·012) and diet quality (dietary guidelines index (DGI)) (88·1 (sd 11·6) v. 86·7 (sd 11·1), P = 0·008), lower glycaemic index (50·2 (sd 4·0) v. 50·7 (sd 3·9), P = 0·021) and increased sedentary behaviour (6·3 (sd 2·8) v. 5·9 (sd 2·8) h, P = 0·009). There was a significant interaction between PCOS and sleep disturbances for DGI (P = 0·035); hence, PCOS was associated with a higher DGI only in women who had adequate sleep. For women with poorer sleep, there was no association between PCOS and DGI. The association between PCOS and improved diet quality may only be maintained if women can obtain enough good quality sleep.
Background: Carbapenem-resistant Enterobacteriaceae (CRE) are endemic in the Chicago region. We assessed the regional impact of a CRE control intervention targeting high-prevalence facilities; that is, long-term acute-care hospitals (LTACHs) and ventilator-capable skilled nursing facilities (vSNFs). Methods: In July 2017, an academic–public health partnership launched a regional CRE prevention bundle: (1) identifying patient CRE status by querying Illinois’ XDRO registry and periodic point-prevalence surveys reported to public health, (2) cohorting or private rooms with contact precautions for CRE patients, (3) combining hand hygiene adherence, monitoring with general infection control education, and guidance by project coordinators and public health, and (4) daily chlorhexidine gluconate (CHG) bathing. Informed by epidemiology and modeling, we targeted LTACHs and vSNFs in a 13-mile radius from the coordinating center. Illinois mandates CRE reporting to the XDRO registry, which can also be manually queried or generate automated alerts to facilitate interfacility communication. The regional intervention promoted increased automation of alerts to hospitals. The prespecified primary outcome was incident clinical CRE culture reported to the XDRO registry in Cook County by month, analyzed by segmented regression modeling. A secondary outcome was colonization prevalence measured by serial point-prevalence surveys for carbapenemase-producing organism colonization in LTACHs and vSNFs. Results: All eligible LTACHs (n = 6) and vSNFs (n = 9) participated in the intervention. One vSNF declined CHG bathing. vSNFs that implemented CHG bathing typically bathed residents 2–3 times per week instead of daily. Overall, there were significant gaps in infection control practices, especially in vSNFs. Also, 75 Illinois hospitals adopted automated alerts (56 during the intervention period). Mean CRE incidence in Cook County decreased from 59.0 cases per month during baseline to 40.6 cases per month during intervention (P < .001). In a segmented regression model, there was an average reduction of 10.56 cases per month during the 24-month intervention period (P = .02) (Fig. 1), and an estimated 253 incident CRE cases were averted. Mean CRE incidence also decreased among the stratum of vSNF/LTACH intervention facilities (P = .03). However, evidence of ongoing CRE transmission, particularly in vSNFs, persisted, and CRE colonization prevalence remained high at intervention facilities (Table 1). Conclusions: A resource-intensive public health regional CRE intervention was implemented that included enhanced interfacility communication and targeted infection prevention. There was a significant decline in incident CRE clinical cases in Cook County, despite high persistent CRE colonization prevalence in intervention facilities. vSNFs, where understaffing or underresourcing were common and lengths of stay range from months to years, had a major prevalence challenge, underscoring the need for aggressive infection control improvements in these facilities.
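The segmented regression referenced above is a standard interrupted time-series specification; a minimal sketch (with a hypothetical file and column names for monthly county-level counts, not the study’s analysis code) might look like this.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical monthly series with assumed columns: month_index (1, 2, ...),
# cre_cases, and post (0 before the July 2017 launch, 1 afterwards).
ts = pd.read_csv("cre_monthly.csv")
start = ts.loc[ts["post"] == 1, "month_index"].min()
ts["months_since"] = (ts["month_index"] - start + 1).clip(lower=0)

# Baseline level and trend, plus a level change (post) and a trend change
# (months_since) at the start of the regional intervention.
model = smf.ols("cre_cases ~ month_index + post + months_since", data=ts).fit()
print(model.params)
```

In this specification, the coefficients on `post` and `months_since` capture the immediate level shift and the change in monthly trend after the intervention began, which is how an average case reduction during the intervention period is typically estimated.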
Funding: The Centers for Disease Control and Prevention (SHEPheRD Contract No. 200-2011-42037)
Disclosures: M.Y.L. has received research support in the form of contributed product from OpGen and Sage Products (now part of Stryker Corporation), and has received an investigator-initiated grant from CareFusion Foundation (now part of BD).
Depression is a major cause of disability in adolescents. Higher dietary fibre intake has been associated with lower depressive symptoms in adults, but there is a lack of research in adolescents. We examined the association between dietary fibre intake (Commonwealth Scientific and Industrial Research Organisation (CSIRO) FFQ) and depressive symptoms (Beck Depression Inventory for Youth) in adolescents with prospective data from the Raine Study Gen2 14- and 17-year follow-ups (n 1260 and 653). Odds of moderate/extreme (clinically relevant) depressive symptoms by quartile of fibre intake were calculated using mixed-effects logistic regression for all participants, in a paired sample without moderate/extreme depressive symptoms at 14 years and in a sub-sample of participants with available inflammatory data at the ages of 14 and 17 years (n 718 and 547). Odds of moderate/extreme depressive symptoms were lower in the fourth (highest) quartile of overall fibre intake (OR 0·273, 95 % CI 0·09, 0·81) compared with the first (lowest) quartile, adjusting for sex, age, energy intake, adiposity, and family and lifestyle factors. However, further adjustment for dietary patterns attenuated the results. Associations of depressive symptoms with cereal or fruit and vegetable fibre intake were not significant in the final model. Adjustment for inflammation had no effect on OR. The association between a higher dietary fibre intake and lower odds of clinically relevant depressive symptoms may be more reflective of a high-fibre diet with all its accompanying nutrients than of an independent effect of fibre.
We prove that sums of length about $q^{3/2}$ of Hecke eigenvalues of automorphic forms on $\operatorname{SL}_{3}(\mathbf{Z})$ do not correlate with $q$-periodic functions with bounded Fourier transform. This generalizes the earlier results of Munshi and Holowinsky–Nelson, corresponding to multiplicative Dirichlet characters, and applies, in particular, to trace functions of small conductor modulo primes.
Integration of photonic devices on silicon (Si) substrates is a key method for enabling large-scale manufacturing of Si-based photonic–electronic circuits for next-generation systems with high performance, small form factor, low power consumption, and low cost. Germanium (Ge) is a promising material due to its pseudo-direct bandgap and its compatibility with Si-CMOS processing. In this article, we present our recent progress on achieving high-quality germanium-on-silicon (Ge/Si) materials. Subsequently, we discuss the performance of various functional devices, such as photodetectors, lasers, waveguides, and sensors, fabricated on the Ge/Si platform. Possible future work, such as the incorporation of tin (Sn) into Ge, is then proposed. Finally, we highlight applications based on a fully monolithically integrated photonic–electronic chip on an Si platform.