This study presents the black hole accretion history of obscured active galactic nuclei (AGNs) identified from the JWST CEERS survey by Chien et al. (2024) using mid-infrared (MIR) SED fitting. We compute black hole accretion rates (BHARs) to estimate the black hole accretion density (BHAD), $\rho_{L_{\text{disk}}}$, across $0 \lt z \lt 4.25$. MIR luminosity functions (LFs) are also constructed for these sources, modeled with modified Schechter and double power law forms, and the corresponding BHAD, $\rho_{\text{LF}}$, is derived by integrating the product of the LFs and the luminosity. Both $\rho_{\text{LF}}$ estimates extend to luminosities as low as $10^7 \, {\rm L}_{\odot}$, two orders of magnitude fainter than pre-JWST studies. Our results show that BHAD peaks between redshifts 1 and 3, with the peak varying by method and model: $z \simeq 1$–$2$ for $\rho_{L_{\text{disk}}}$ and the double power law, and $z \simeq 2$–$3$ for the modified Schechter function. A scenario in which AGN activity peaks before cosmic star formation would challenge existing black hole formation theories, but our present study, based on early JWST observations, provides an initial exploration of this possibility. At $z \sim 3$, $\rho_{\text{LF}}$ appears higher than X-ray estimates, suggesting that MIR observations are more effective at detecting obscured AGNs missed by X-ray observations. However, given the overlapping error bars, this difference remains within the uncertainties and requires confirmation with larger samples. These findings highlight the potential of JWST surveys to enhance our understanding of the co-evolution between galaxies and AGNs.
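The LF-integration step described above (a BHAD-like density obtained by integrating the product of the LF and the luminosity) can be sketched numerically. This is a minimal illustration under stated assumptions: the double power law form is generic, the parameter values in the test are placeholders rather than the fitted CEERS values, and the further conversion from a luminosity density to an accretion density (e.g., via a radiative efficiency) is omitted.

```python
import math

def double_power_law(L, phi_star, L_star, alpha, beta):
    """Generic double power law LF, Phi(L) per dex (illustrative form):
    Phi(L) = phi_star / ((L/L_star)**alpha + (L/L_star)**beta)."""
    x = L / L_star
    return phi_star / (x ** alpha + x ** beta)

def luminosity_density(phi, L_min=1e7, L_max=1e14, n=2000, **params):
    """Trapezoidal integral of L * Phi(L) over log10 L between L_min and L_max.
    L_min = 1e7 Lsun mirrors the faint limit quoted in the abstract."""
    lo, hi = math.log10(L_min), math.log10(L_max)
    logs = [lo + i * (hi - lo) / n for i in range(n + 1)]
    ys = [10 ** lg * phi(10 ** lg, **params) for lg in logs]
    dlog = logs[1] - logs[0]
    return dlog * (sum(ys) - 0.5 * (ys[0] + ys[-1]))
```

Because the integrand is linear in the normalization, doubling phi_star doubles the integrated density, which is a convenient sanity check on the quadrature.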
Adults living with obesity have a higher risk of eating disorders and disordered eating behaviours such as binge eating(1,2). However, the prevalence of disordered eating/eating disorders in adults presenting for obesity treatment is unknown, and this information is needed to guide service provision. This systematic review aimed to estimate the prevalence of disordered eating/eating disorders in adults presenting for obesity treatment. Embase, MEDLINE and PsycINFO were searched to March 2024. Eligible studies (k) measured disordered eating/eating disorders in adults with overweight/obesity presenting for obesity treatment and included ≥ 325 participants to ensure a representative sample. Prevalence estimates were synthesised using random-effects meta-analysis. Eighty-one studies were included (n = 92,002; 75.9% female; median (IQR) age 44 (6) years; BMI 45 (11) kg/m2). Most studies were conducted in the United States (k = 44) and Italy (k = 15). Most prevalence data related to binge eating disorder or binge eating severity. The pooled prevalence of binge eating disorder, assessed by clinical interview, was 17% (95% CI: 12–22, 95% prediction interval (PI): 0–42, k = 19, n = 13,447, τ2 = 0.01) using DSM-IV criteria and 12% (95% CI: 5–20, 95% PI: 0–40, k = 9, n = 7,680, τ2 = 0.01) using DSM-5 criteria. The pooled prevalence of severe binge eating (Binge Eating Scale score > 25) was 12% (95% CI: 8–16, 95% PI: 0–31, k = 18, n = 12,136, τ2 = 0.01). For binge eating disorder, measured by clinical interview, the prevalence ranged from 14.9 to 27.0% for females (k = 12) and from 4.0 to 24.1% for males (k = 3). For moderate to severe binge eating (Binge Eating Scale score ≥ 18), the prevalence ranged from 20.0 to 32.8% for females and from 7.1 to 77.5% for males (k = 2). Three studies reported prevalence by ethnicity.
The prevalence of severe binge eating (Binge Eating Scale scores ≥ 27) was 9.5 to 41.7% in white populations (k = 2), 7.5 to 35.8% in black populations (k = 2), and 5.7% in Hispanic populations (k = 1). One study reported binge eating disorder, assessed by clinical interview, for white, black and Hispanic populations, with prevalences of 15.3%, 11.3% and 11.4% respectively. Overall, there was high variability in the prevalence of binge eating and binge eating disorder in adults presenting for obesity treatment, with available data indicating that prevalence can range up to 42%. It is important to identify which population-level factors drive this heterogeneity to inform service provision; however, the limited data highlight a significant knowledge gap in the reporting of eating disorders in underrepresented populations that needs to be addressed.
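The pooled prevalence and τ2 figures above come from random-effects meta-analysis; the standard DerSimonian–Laird estimator can be sketched in a few lines. This is an illustrative implementation on made-up study data, not the review's actual analysis (the abstract does not specify the transformation or software used).

```python
import math

def dersimonian_laird(p, n, z=1.96):
    """DerSimonian-Laird random-effects pooling of prevalence estimates.
    p: per-study prevalence (proportions); n: per-study sample sizes.
    Returns (pooled prevalence, tau^2, (CI low, CI high))."""
    v = [pi * (1 - pi) / ni for pi, ni in zip(p, n)]      # within-study variances
    w = [1 / vi for vi in v]                               # fixed-effect weights
    p_fe = sum(wi * pi for wi, pi in zip(w, p)) / sum(w)   # fixed-effect mean
    q = sum(wi * (pi - p_fe) ** 2 for wi, pi in zip(w, p))  # Cochran's Q
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(p) - 1)) / c)                # between-study variance
    w_re = [1 / (vi + tau2) for vi in v]                   # random-effects weights
    p_re = sum(wi * pi for wi, pi in zip(w_re, p)) / sum(w_re)
    se = math.sqrt(1 / sum(w_re))
    return p_re, tau2, (p_re - z * se, p_re + z * se)
```

The random-effects pooled estimate always falls within the range of the study estimates, and tau2 is truncated at zero when observed heterogeneity is less than expected by chance.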
Eggs contain several nutrients that are positively linked to neurological function, including phospholipids (regulate neurotransmitter receptors), choline (produces acetylcholine, involved in memory and learning)(1), tryptophan (produces serotonin, reduces depression-like symptoms)(2), and docosahexaenoic acid (DHA, important for neurogenesis and synaptic plasticity)(3). However, most studies examining egg intake and cognition have been observational, primarily focusing on cognitive decline with aging. Few studies have explored the impact of egg consumption on cognition in young people, with a paucity of high-quality study designs used. This randomised controlled trial (RCT) aimed to examine the impact of egg consumption on cognition, including interoception, risk-taking, decision-making and reaction time, in young adults. Ninety healthy young adults (aged 18–40) were recruited to a 6-week parallel-arm RCT. Participants were randomised to either the intervention group (n = 45), who consumed 2 whole eggs/day, or the control group (n = 45), who avoided eating eggs. Participants completed 4 computer-based validated cognitive tests at baseline and end-of-study. The Schandry task measured interoception (heartbeat perception (Hb) in 25s, 35s and 55s rounds), the balloon analogue risk task (BART) measured risk-taking, the Iowa Gambling Task (IGT) assessed decision-making, and the Deary-Liewald test measured simple reaction time (SRT) and choice reaction time (CRT). Changes in cognitive task results between groups were analysed via linear mixed models using SPSS, with results controlled for sleepiness (measured by the Karolinska Sleepiness Scale) and gender. Continuous data are presented as mean ± SD where parametric, or median (IQR) where non-parametric. Of the n = 90 participants enrolled, n = 89 completed the 6-week program and were included in data analysis (Egg n = 45, Control n = 44). Participants were 71% female, aged 24.0 (9) y, with a mean BMI of 23.2 ± 3.4 kg/m2.
Most individuals reported a prior habitual egg intake of 2–4 times/week (49.4%). At baseline, cognitive task results between groups were not significantly different. At study completion, interoception was not impacted by egg consumption (Hb: 25s p = 0.931; 35s p = 0.936; 55s p = 0.679), nor was risk-taking (BART p = 0.828), decision-making (IGT p = 0.923), or reaction time (SRT p = 0.625, CRT p = 0.839). Consuming 2 eggs/day for 6 weeks did not affect cognitive test results in healthy young adults compared to those who avoided eggs, suggesting no detrimental effect of regular egg consumption (14 eggs per week). This is the first RCT to explore the impact of egg consumption on the above parameters of cognition. Given the null findings, future research should explore longer intervention durations or alternative cognitive assessments to fully understand the potential cognitive effects of egg consumption.
The need for collaborative and transparent sharing of COVID-19 clinical trial and large-scale observational study data to accelerate scientific discovery and inform clinical practice is critical. Responsible data-sharing requires addressing challenges associated with data privacy and confidentiality, data linkage, data quality, variable harmonization, data formats, and comprehensive metadata documentation to produce a high-quality, contextually rich, findable, accessible, interoperable, and reusable (FAIR) dataset. This communication explores the experiences and lessons learned from sharing National Heart, Lung, and Blood Institute (NHLBI) COVID-19 clinical trial (including adaptive platform trials) and cohort study datasets through the NHLBI BioData Catalyst® (BDC) ecosystem, focusing on the challenges and successes of harmonizing these datasets for broader research use. Our findings highlight the importance of establishing standardized data formats, adopting common data elements, and creating and maintaining robust data governance structures that address common challenges (i.e., data privacy and data-sharing limitations resulting from informed consent). These efforts resulted in a set of comprehensive and interoperable datasets from 5 clinical trials and 13 cohort studies that will enable downstream reuse in analyses and collaborations. The principles and strategies outlined, derived through experience with consortia data, can lay the groundwork for advancing collaborative and efficient data sharing.
Hong Kong’s 3-year dynamic zero-COVID policy caused prolonged exposure to stringent, pervasive anti-epidemic measures, which posed additional stressors on emotional well-being through pandemic fatigue, beyond the ongoing fear of the pandemic.
Aims
To investigate how major policy shifts in the zero-COVID strategy corresponded with changing relationships between emotional well-being, pandemic fatigue from policy adherence, and pandemic fear, from the pandemic peak through to a living-with-COVID policy.
Method
A three-wave repeated cross-sectional study (N = 2266) was conducted on the Chinese working-age population (18–64 years) during the peak outbreak (Wave 1), and subsequent policy shifts towards a living-with-COVID policy during the initial relaxation (Wave 2) and full relaxation (Wave 3) of anti-epidemic measures from March 2022 to March 2023. Non-parametric tests, consisting of robust analysis of covariance tests and quantile regression analysis, were performed.
Results
The severity of all measures was lower after Wave 1; however, extreme pandemic fear reported in Wave 2 (n = 38, 7.7%) was associated with worse emotional well-being than at the pandemic peak (Wave 1), an effect that subsided in Wave 3. Pandemic fatigue had the greater negative impact on emotional well-being in Wave 1, whereas pandemic fear was the dominant predictor in Waves 2 and 3.
Conclusions
Pandemic fatigue and pandemic fear together robustly highlight the psychological cost of prolonged pandemic responses, expanding on a framework for monitoring and minimising the unintended mental health ramifications of anti-epidemic policies.
Research participants’ feedback about their participation experiences offers critical insights for improving programs. A shared Empowering the Participant Voice (EPV) infrastructure enabled a multiorganization collaborative to collect, analyze, and act on participants’ feedback using validated participant-centered measures.
Methods:
A consortium of academic research organizations with Clinical and Translational Science Awards (CTSA) programs administered the Research Participant Perception Survey (RPPS) to active or recent research participants. Local response data were also aggregated into a Consortium database, facilitating analysis of feedback overall and for subgroups.
Results:
From February 2022 to June 2024, participating organizations sent surveys to 28,096 participants and received 5045 responses (18%). Respondents were 60% female, 80% White, 13% Black, 2% Asian, and 6% Latino/x. Most respondents (85–95%) felt respected and listened to by study staff; 68% gave their overall experience the top rating. Only 60% felt fully prepared by the consent process. Consent, feeling valued, language assistance, age, study demands, and other factors were significantly associated with overall experience ratings. 63% of participants said that receiving a summary of the study results would be very important to joining a future study. Intersite scores differed significantly for some measures; initiatives piloted in response to local findings raised experience scores.
Conclusion:
RPPS results from 5045 participants from seven CTSAs provide a valuable evidence base for evaluating participants’ research experiences and using participant feedback to improve research programs. Analyses revealed opportunities for improving research practices. Sites piloting local change initiatives based on RPPS findings demonstrated measurable positive impact.
Non-native plants negatively impact ecosystems via a variety of mechanisms, including in forested riparian areas. Japanese knotweed [Polygonum cuspidatum Siebold & Zucc.] and its hybrids (referred to as Polygonum spp. hereafter) are widely spread throughout North America and can impact the flora and fauna of riparian habitats. Thus, information improving our ability to understand and predict the potential spread and colonization of Polygonum spp. is valuable. One dispersal mechanism is hydrochory (i.e., dispersal by water), including the downstream dispersal of viable stems, which can facilitate rapid invasion within a watershed. We used passive integrated transponder (PIT) telemetry in experimental releases of Polygonum spp. stems to track their downstream transport in a small (second-order) stream in northern New Hampshire, USA, in the summers of 2021 and 2022. A total of 180 (90 each year) Polygonum spp. stems were released at three sites within the stream reach, with 185 (∼98%) being recaptured at least once, for a total of 686 recaptures. Individual relocated stems moved a maximum distance of 30 to 875 m downstream in 2021 and 13 to 1,233 m in 2022 during regular flows; however, a high-streamflow event in July 2021 flushed all remaining stems downstream of the study area. Generalized additive mixed models (GAMMs) identified site-specific differences in stem movement rates and a general reduction in movement rates with increasing time since release. In general, Polygonum spp. stems moved farther downstream at sites with lower channel sinuosity, although other fine-scale habitat factors (e.g., water depth, habitat type, and presence of wood and debris jams) likely influence whether Polygonum spp. stems disperse farther or are retained within the channel. Thus, stream morphology and stream flow are likely to affect where Polygonum spp. stems will be retained and potentially reestablish.
Predictive tools identifying areas of higher probability of hydrochory-based dispersal could help to focus removal efforts when employed or to identify riparian habitats at highest risk for spread.
There is a growing trend for studies run by academic and nonprofit organizations to have regulatory submission requirements. As a result, there is greater reliance on REDCap, an electronic data capture (EDC) system widely used by researchers in these organizations. This paper discusses the development and implementation of the Rapid Validation Process (RVP) developed by the REDCap Consortium, aimed at enhancing regulatory compliance and operational efficiency in response to the dynamic demands of modern clinical research. The RVP introduces a structured validation approach that categorizes REDCap functionalities, develops targeted validation tests, and applies structured and standardized testing syntax. This approach ensures that REDCap can meet regulatory standards while maintaining flexibility to adapt to new challenges. Results from the application of the RVP on recent successive REDCap software version releases illustrate significant improvements in testing efficiency and process optimization, demonstrating the project’s success in setting new benchmarks for EDC system validation. The project’s community-driven responsibility model fosters collaboration and knowledge sharing and enhances the overall resilience and adaptability of REDCap. As REDCap continues to evolve based on feedback from clinical trialists, the RVP ensures that REDCap remains a reliable and compliant tool, ready to meet regulatory and future operational challenges.
Latinx populations are underrepresented in clinical research. Asking Latinx research participants about their research experiences, barriers, and facilitators could help to improve research participation for these populations.
Methods:
The Salud Estres y Resilencia (SER) Hispano cohort study is a longitudinal cohort study of young adult Latinx immigrants whose design and conduct were tailored to its study population. We administered the Research Participant Perception Survey (RPPS) to SER Hispano participants to assess their experiences in the study. We describe overall results from the RPPS and compare results of surveys administered to SER Hispano participants via email versus telephone.
Results:
Of 340 participants who were contacted with the RPPS, 142 (42%) responded. Among respondents, 53 (37%) responded to the initial email contact and 89 (63%) to subsequent phone contact. The majority of respondents were between 35 and 44 years of age (54%), female (76%), and of Cuban origin (50%). Overall, research participants expressed high satisfaction with their research experience; 84% stated that they would “definitely” recommend research participation to friends and family, with no significant difference by method of survey administration (P = 0.45). The factor most commonly chosen as influencing future research participation was having summary results of the research shared with them (72%).
Conclusion:
We found that culturally tailored studies can be good experiences for Latinx research participants, and that the RPPS can be administered successfully, particularly by more than one method, including telephone, to evaluate and improve research experiences for this population.
Eggs are highly digestible, nutrient-rich and a valuable source of protein and choline, thereby promoting a range of health benefits. Several studies have found an association between protein intake and gastrointestinal microbial diversity(1), while bacterial fermentation of undigested protein in the large bowel can produce short-chain fatty acids, such as butyrate, positively influencing host metabolic health, gut integrity and immune function(2). On the other hand, dietary choline stimulates gastrointestinal bacterial production of trimethylamine and the prothrombotic compound trimethylamine-N-oxide (TMAO)(3). Despite these established links, limited studies have explored the effects of whole egg intake on indices of gastrointestinal health. This systematic literature review aimed to synthesise research that has investigated the impact of egg-supplemented diets or egg consumption on markers of gastrointestinal health, including microbiome, function and symptoms. This review was conducted in accordance with PRISMA guidelines. Five databases (Ovid Medline, Embase, CINAHL Plus, SCOPUS, and PsycINFO), and reference lists of relevant papers, were searched from inception until April 2023. Studies were included if they examined the link between whole chicken egg consumption and gastrointestinal health in healthy adults (aged > 16 years). Indices of gastrointestinal health were defined as any outcomes related to gastrointestinal factors, including symptoms, microbiome, inflammation, colonic fermentation and TMAO. Reviews and case studies were excluded. All studies underwent risk of bias assessment. Overall, 548 studies were identified and 19 studies were included after screening. Eight of these were randomised controlled trials (RCTs), 8 were cross-sectional studies, and 3 were prospective cohort studies. Participants ranged in number from 20 to 32,166 and in age from 18 to 84 years. Study periods varied from 3 to 14 weeks for RCTs and from 6 months to 12.5 years for prospective cohort studies.
RCTs examined intakes of 1–4 eggs/day, with the majority examining 3 eggs/day (n = 6). The primary outcome across 15 articles was TMAO levels, with most reporting no significant associations (n = 13). Five studies examined inflammation, with inconsistent findings ranging from no alterations (in TNF-α, IL-8, CRP) to increases (in the anti-inflammatory marker LTB5 and TNF-α) and decreases (in IL-6, CRP). Lastly, 7 studies explored alterations in the microbiome. Two RCTs and 2 cross-sectional trials reported no alterations in microbial diversity in response to eggs. Meanwhile, 2 cross-sectional and 1 prospective study linked specific bacteria to consistent egg intake. Eggs were associated with species that produce butyrate (E. rectale, F. prausnitzii, M. smithii, and R. bromii) and protect against metabolic syndrome (A. muciniphila). This systematic review found that egg consumption did not increase levels of the undesirable biomarker TMAO and that egg intake was associated with butyrate-producing bacteria. Evidence regarding the effect of egg intake on inflammation was inconsistent. This review revealed a general lack of available research investigating whole eggs and gastrointestinal health. Future carefully designed RCTs are required to improve understanding of how eggs may influence the gastrointestinal microbiome and colonic fermentation.
Eggs provide several nutrients that have been linked to neurological function. Phospholipids, which comprise 30% of lipids in egg yolk, modulate neurotransmitter receptors and have been shown to lower reaction time in healthy adults(1). Eggs are also high in choline (340mg per egg), a building block for acetylcholine, a neurotransmitter involved in memory, learning and attention(2). Finally, eggs contain the omega-3 fatty acid docosahexaenoic acid (DHA) (25mg per egg), which has roles in neurological function including neurogenesis, synaptic plasticity and myelination(3). The impact of whole egg consumption on cognition has not been widely explored. This systematic review aimed to consolidate studies that investigated frequency of egg consumption or egg-supplemented diets and cognitive function. This review followed PRISMA guidelines and involved a search of five databases (Ovid Medline, Embase, CINAHL Plus, SCOPUS, and PsycINFO) from inception until April 2023. Included studies examined the link between whole chicken egg consumption and brain function, including cognitive decline, memory, risk-taking, reaction time, decision-making, and executive function, in healthy adults (aged > 16 y). All studies underwent risk of bias assessment. Twelve studies were included in the review. Four were prospective cohort studies, 4 were retrospective, 3 were cross-sectional and 1 was a randomised controlled trial (RCT). Participant numbers, with the exception of the RCT, ranged from 178 to 9,028, and participants were aged between 42 and 97 years. The duration of prospective studies varied from 2 to 5 years. Egg intake was measured via food frequency questionnaires (n = 8), 24-hr diet recalls (n = 2), a 4-day food record (n = 1) and a 7-day food record (n = 1). The RCT provided 2 DHA-fortified eggs/day compared to 2 whole eggs/day for 8 weeks. The primary outcome across 9 studies was cognitive decline, followed by memory (n = 7), reaction time (n = 2), attention (n = 2), and executive function (n = 1).
For outcome measures, studies used 9 different validated task-oriented tools (including the Montreal Cognitive Assessment, n = 3, and the California Verbal Learning Test, n = 2) or 4 self-completed questionnaires. Several studies found no significant associations between egg consumption and cognitive decline (n = 4) or memory (n = 2). Conversely, 5 studies reported significant inverse associations between egg consumption and rates of cognitive decline. The RCT found that reaction times were faster with both whole eggs and DHA-eggs after 8 weeks (p > 0.05 between groups). Although conflicting results were found, more studies showed greater frequency of habitual egg consumption to be associated with reduced cognitive decline. However, the variety of outcome measures across studies makes direct comparisons challenging, preventing definitive conclusions about the impact of eggs on cognitive health. This review highlights the need for future RCTs.
OBJECTIVES/GOALS: The correction of spinopelvic parameters is associated with better outcomes in patients with adult spinal deformity (ASD). This study presents a novel artificial intelligence (AI) tool that automatically predicts spinopelvic parameters from spine x-rays with high accuracy and without the need for any manual entry. METHODS/STUDY POPULATION: The AI model was trained/validated on 761 sagittal whole-spine x-rays to predict the following parameters: Sagittal Vertical Axis (SVA), Pelvic Tilt (PT), Pelvic Incidence (PI), Sacral Slope (SS), Lumbar Lordosis (LL), T1-Pelvic Angle (T1PA), and L1-Pelvic Angle (L1PA). A separate test set of 40 x-rays was labeled by 4 reviewers, including fellowship-trained spine surgeons and a neuroradiologist. Median errors relative to the most senior reviewer were calculated to determine model accuracy on test and cropped-test (i.e., lumbosacral) images. Intraclass correlation coefficients (ICC) were used to assess inter-rater reliability. RESULTS/ANTICIPATED RESULTS: The AI model exhibited the following median (IQR) parameter errors: SVA [2.1mm (8.5mm), p=0.97], PT [1.5° (1.4°), p=0.52], PI [2.3° (2.4°), p=0.27], SS [1.7° (2.2°), p=0.64], LL [2.6° (4.0°), p=0.89], T1PA [1.3° (1.1°), p=0.41], and L1PA [1.3° (1.2°), p=0.51]. The parameter errors on cropped lumbosacral images were: LL [2.9° (2.6°), p=0.80] and SS [1.9° (2.2°), p=0.78]. The AI model exhibited excellent reliability for all parameters in both whole-spine (ICC: 0.92-1.0) and lumbosacral x-rays (ICC: 0.92-0.93). DISCUSSION/SIGNIFICANCE: Our AI model accurately predicts spinopelvic parameters with excellent reliability, comparable to fellowship-trained spine surgeons and neuroradiologists. Utilization of predictive AI tools in spine imaging can substantially aid in patient selection and surgical planning.
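Inter-rater reliability via ICC is computed from a subjects × raters matrix of measurements. The sketch below implements ICC(2,1) (two-way random effects, absolute agreement, single rater) as one common choice; the abstract does not state which ICC form the study used, and the toy ratings in the test are placeholders, not the study's data.

```python
def icc2_1(ratings):
    """ICC(2,1): two-way random effects, absolute agreement, single rater.
    ratings: list of rows, one per subject; one column per rater."""
    n, k = len(ratings), len(ratings[0])
    grand = sum(sum(row) for row in ratings) / (n * k)
    row_means = [sum(row) / k for row in ratings]
    col_means = [sum(row[j] for row in ratings) / n for j in range(k)]
    ss_total = sum((x - grand) ** 2 for row in ratings for x in row)
    ss_rows = k * sum((m - grand) ** 2 for m in row_means)   # between-subject SS
    ss_cols = n * sum((m - grand) ** 2 for m in col_means)   # between-rater SS
    ms_r = ss_rows / (n - 1)                                 # subject mean square
    ms_c = ss_cols / (k - 1)                                 # rater mean square
    ms_e = (ss_total - ss_rows - ss_cols) / ((n - 1) * (k - 1))  # residual
    return (ms_r - ms_e) / (ms_r + (k - 1) * ms_e + k * (ms_c - ms_e) / n)
```

Perfect agreement across raters yields an ICC of 1.0; a constant offset between raters (absolute agreement, not just consistency) pulls the coefficient below 1.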
Recent research has shown the potential of speleothem δ13C to record a range of environmental processes. Here, we report on 230Th-dated stalagmite δ13C records for southwest Sulawesi, Indonesia, over the last 40,000 yr to investigate the relationship between tropical vegetation productivity and atmospheric methane concentrations. We demonstrate that the Sulawesi stalagmite δ13C record is driven by changes in vegetation productivity and soil respiration and explore the link between soil respiration and tropical methane emissions using HadCM3 and the Sheffield Dynamic Global Vegetation Model. The model indicates that changes in soil respiration are primarily driven by changes in temperature and CO2, in line with our interpretation of stalagmite δ13C. In turn, modelled methane emissions are driven by soil respiration, providing a mechanism that links methane to stalagmite δ13C. This relationship is particularly strong during the last glaciation, indicating a key role for the tropics in controlling atmospheric methane when emissions from high-latitude boreal wetlands were suppressed. With further investigation, the link between δ13C in stalagmites and tropical methane could provide a low-latitude proxy complementary to polar ice core records to improve our understanding of the glacial–interglacial methane budget.
People with dementia are more prone to premature nursing home placement after hospitalization due to physical and mental deconditioning, which makes care at home more difficult. This study aimed to evaluate the effect of a post-hospital discharge transitional care program on reducing nursing home placement in people with dementia.
Methods:
A matched case-control study was conducted between 2018 and 2021. A transitional care program using a case management approach was developed. Participants enrolled in the program by self-enrolment or referral from hospitals or NGOs. Community-dwelling people with dementia discharged from hospitals received four-week residential care at a dementia care centre, with intensive nursing care, physiotherapy and group activities promoting social engagement, followed by eight-week day care rehabilitation activities to improve their mobility and cognitive functioning. They were matched on a 1:5 ratio by age and sex to people with dementia discharged from a convalescent hospital who did not participate in this program for comparison. The study outcome was nursing home admission, measured three months (i.e. post-intervention), six months, and nine months after hospital discharge. Multinomial logistic regression was conducted to investigate factors associated with nursing home placement at each measurement time-point.
Results:
361 hospital admission episodes (n = 67 intervention, n = 294 control) were examined. The regression results showed that participants in the intervention group were significantly less likely than the controls to be admitted to a nursing home three months (OR = 0.023, 95% CI: 0.003-0.201, p = .001) and six months (OR = 0.094, 95% CI: 0.025-0.353, p = .001) after hospital discharge, but the intervention effect was not sustained nine months after hospital discharge. Longer hospital length of stay, and hospital admission due to dementia, mental disturbances such as delirium, or mental disorders such as schizophrenia, significantly predicted nursing home admission three months and six months after hospital discharge.
Conclusion:
The transitional care program could help reduce nursing home placement in people with dementia after hospital discharge. To sustain the intervention effect, continued support after the intervention, as well as family caregiver training, would be required.
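The odds ratios reported above come from multinomial logistic regression with covariates. As a simpler illustration of how an unadjusted odds ratio and Wald 95% CI are derived from a 2×2 table (intervention vs. control × admitted vs. not admitted), here is a minimal sketch; the counts in the test are hypothetical, not the study's data.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Unadjusted odds ratio with Wald confidence interval from a 2x2 table:
    a = exposed with outcome, b = exposed without outcome,
    c = unexposed with outcome, d = unexposed without outcome."""
    or_ = (a * d) / (b * c)                              # cross-product ratio
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)        # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, (lo, hi)
```

An OR below 1 with an upper CI bound below 1 indicates significantly lower odds of admission in the intervention group; the study's adjusted ORs additionally control for matching factors and covariates, which this unadjusted sketch does not.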
Translation is the process of turning observations in the research laboratory, clinic, and community into interventions that improve people’s health. The Clinical and Translational Science Awards (CTSA) program is a National Center for Advancing Translational Sciences (NCATS) initiative to advance translational science and research. Currently, 64 “CTSA hubs” exist across the nation. Since 2006, the Houston-based Center for Clinical Translational Sciences (CCTS) has assembled a well-integrated, high-impact hub in Texas that includes six partner institutions within the state, encompassing ∼23,000 sq. miles and over 16 million residents. To achieve the NCATS goal of “more treatments for all people more quickly,” the CCTS promotes diversity and inclusion by integrating underrepresented populations into clinical studies, workforce training, and career development. In May 2023, we submitted the UM1 application and six “companion” proposals: K12, R25, T32-Predoctoral, T32-Postdoctoral, and RC2 (two applications). In October 2023, we received priority scores for the UM1 (22), K12 (25), T32-Predoctoral (20), and T32-Postdoctoral (23), which historically fall within the NCATS funding range. This report describes the grant preparation and submission approach, coupled with data from an internal survey designed to assimilate feedback from principal investigators, writers, reviewers, and administrative specialists. Herein, we share the challenges faced, the approaches developed, and the lessons learned.
Our earth is immersed in the near-earth space plasma environment, which plays a vital role in protecting our planet against the solar-wind impact and in influencing space activities. Investigating the physical processes that dominate this environment is important for deepening our scientific understanding of it and improving our ability to forecast space weather. As a crucial part of the National Major Scientific and Technological Infrastructure–Space Environment Simulation Research Infrastructure (SESRI) in Harbin, the Space Plasma Environment Research Facility (SPERF) has built a system to replicate the near-earth space plasma environment in the laboratory. The system aims to simulate the three-dimensional (3-D) structure and processes of the terrestrial magnetosphere for the first time in the world, providing a unique platform to reveal the physics of 3-D asymmetric magnetic reconnection relevant to the earth's magnetopause, wave–particle interaction in the earth's radiation belt, particle dynamics during geomagnetic storms, etc. This paper presents the engineering design and construction of the near-earth space plasma simulation system of the SPERF, with a focus on the critical technologies that have been resolved to achieve the scientific goals. The possible physical issues that can be studied with the apparatus are also briefly sketched. The earth-based system is of great value in understanding the space plasma environment and supporting space exploration.
Aphis spiraecola Patch is one of the most economically important tree fruit pests worldwide. The pyrethroid insecticide lambda-cyhalothrin is commonly used to control A. spiraecola. In this 2-year study, we quantified the resistance level of A. spiraecola to lambda-cyhalothrin in different regions of the Shaanxi province, China. The results showed that A. spiraecola had reached extremely high resistance levels, with a 174-fold resistance ratio (RR) found in the Xunyi region. In addition, we compared the enzymatic activity and expression levels of P450 genes among eight A. spiraecola populations. The P450 activity of A. spiraecola was significantly increased in five regions (Xunyi, Liquan, Fengxiang, Luochuan, and Xinping) compared to the susceptible strain (SS). The expression levels of the CYP6CY7, CYP6CY14, CYP6CY22, P4504C1-like, P4506a13, CYP4CZ1, CYP380C47, and CYP4CJ2 genes were significantly increased under lambda-cyhalothrin treatment and in the resistant field populations. An L1014F mutation in the sodium channel gene was found, and the mutation rate was positively correlated with the LC50 of lambda-cyhalothrin. In conclusion, the levels of lambda-cyhalothrin resistance in A. spiraecola field populations were associated with P450s and the L1014F mutation. Our combined findings provide evidence on the resistance mechanism of A. spiraecola to lambda-cyhalothrin and give a theoretical basis for rational and effective control of this pest species.
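The resistance ratio mentioned above is conventionally computed as the LC50 of a field population divided by the LC50 of the susceptible strain. A minimal sketch, where the LC50 values are hypothetical illustrations (only the 174-fold figure for Xunyi comes from the study):

```python
def resistance_ratio(lc50_field: float, lc50_susceptible: float) -> float:
    """Fold-resistance of a field population relative to the
    susceptible strain (RR = LC50_field / LC50_SS)."""
    return lc50_field / lc50_susceptible

# Hypothetical example: if the susceptible strain has LC50 = 1.0 mg/L,
# a 174-fold resistant population would have LC50 = 174 mg/L.
print(resistance_ratio(174.0, 1.0))  # 174.0
```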
We explored the utility of the standardized infection ratio (SIR) for surgical site infection (SSI) reporting in an Australian jurisdiction.
Design:
Retrospective chart review.
Setting:
Statewide SSI surveillance data from 2013 to 2019.
Patients:
Individuals who had cardiac bypass surgery (CABG), colorectal surgery (COLO), cesarean section (CSEC), hip prosthesis (HPRO), or knee prosthesis (KPRO) procedures.
Methods:
The SIR was calculated by dividing the number of observed infections by the number of predicted infections as determined using the National Healthcare Safety Network procedure-specific risk models. In line with a minimum precision criterion, an SIR was not calculated if the number of predicted infections was <1.
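The calculation described above can be sketched as follows; the predicted-infection counts are assumed to come from the NHSN procedure-specific risk models, and only the ratio and the minimum precision criterion (no SIR when predicted infections <1) are taken from the text:

```python
from typing import Optional

def standardized_infection_ratio(observed: int, predicted: float) -> Optional[float]:
    """Return the SIR (observed / predicted SSIs), or None when the
    number of predicted infections is <1, per the minimum precision
    criterion (the SIR is not calculated in that case)."""
    if predicted < 1:
        return None
    return observed / predicted

# Hypothetical quarter: 3 observed SSIs against 2.5 predicted infections.
print(standardized_infection_ratio(3, 2.5))  # 1.2
# A quarter with predicted infections <1 yields no SIR.
print(standardized_infection_ratio(0, 0.8))  # None
```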
Results:
An SIR >0 (≥1 observed SSI, predicted number of SSIs ≥1, no missing covariates) could be calculated for a median of 89.3% of reporting quarters for CABG, 75.0% for COLO, 69.0% for CSEC, 0% for HPRO, and 7.1% for KPRO. Of the reporting quarters for which an SIR was not calculated, 80.6% were due to no observed infections or predicted infections <1, and 19.4% were due to missing covariates alone. Within hospitals, the median percentage of quarters during which zero infections were observed was 8.9% for CABG, 20.0% for COLO, 25.4% for CSEC, 67.3% for HPRO, and 71.4% for KPRO.
Conclusions:
Calculating an SIR for SSIs is challenging for hospitals in our regional network, primarily because of low event numbers and many facilities with predicted infections <1. Our SSI reporting will continue to use risk-indexed rates, in tandem with SIR values when the predicted number of SSIs is ≥1.