The migratory phase is a critical time for Fasciola hepatica as it must locate, penetrate and migrate through the alimentary tract to the liver parenchyma whilst under attack from the host immune response. Here, scanning and transmission electron microscopy were used to monitor the in vitro effects of sera (with and without complement depletion) on F. hepatica newly excysted juveniles (NEJs) and flukes recovered at 7, 35, 70 and 98 days post infection (dpi) from the liver and bile ducts of rats. Test sera were from these F. hepatica-infected rats. A F. hepatica NEJ-specific rabbit antiserum was also used. All fluke stages demonstrated release of the tegumental glycocalyx and microvesicles, and intense activity within the tegumental syncytium characterized by eccrine secretion of T-0/T-1/T-2 secretory bodies with subsequent microvillar formation and shedding of microvesicles from the apical plasma membrane. Exposure of both NEJs and 35 dpi flukes to 35 and 70 dpi rat sera produced significant amounts of eccrine-derived secretory material and putative attached immunocomplex. Rabbit anti-F. hepatica NEJ-specific antiserum produced similar responses at the NEJ tegument, including binding of putative immunocomplex to the surface, but with additional blistering of some regions of the apical plasma membrane. Our data suggest that immune sera stimulate multiple interrelated secretory mechanisms to maintain the integrity of the tegumental barrier in response to immune attack. Concurrent release of microvesicles may also serve both to divert the immune response away from the fluke itself and to permit delivery of immunomodulatory cargo to immune effector cells.
Ensuring more equitable transformations requires addressing how different contextual dimensions of identity, such as gender and class, hinder equity. However, previous analyses of equity have addressed these dimensions separately. We suggest advancing beyond these methods by integrating intersectional analysis into the distributive, procedural, and recognition aspects of equity when examining social–ecological transformations. A review of 37 studies on social–ecological transformation shows that scholars commonly addressed social, spatial, and environmental transformations. In contrast, few studies have analyzed in depth the reasons for power imbalances. We encourage scholars to use critical questions to reflect on social–ecological transformations collectively.
Technical summary
Ensuring equity in social–ecological transformations involves understanding how aspects of identity – such as gender, age, and class – affect experiences on the path to sustainability. Previous studies have often focused on one dimension of difference, but an intersectionality framework is essential for recognizing interconnected identities. In this paper, we review 37 empirical studies on social–ecological transformations, identifying key assets of transformation, including economic, social, cultural, political, spatial, environmental, and knowledge-based assets. We apply an analytical framework based on intersectional equity, incorporating intersectionality into equity analysis, which examines how power dynamics contribute to inequities in distribution, procedure, and recognition. Our findings show that social, spatial, and environmental assets of transformation are the most frequently mentioned in our sampled literature, together with the benefits, costs, inclusiveness, and knowledge dimensions of equity. Power imbalances were mentioned most often, while different aspects of identity were mentioned in only two-thirds of the studies. We believe an intersectional equity approach will help better conceptualize transformation in relation to (in)equity. Based on our reflections, we suggest critical questions that encourage scholars to evaluate transformations iteratively within an interdisciplinary group.
Social media summary
An intersectional equity approach is key to just social–ecological transformations. We review 37 studies to show why.
Mental disorders and physical-health conditions frequently co-occur, impacting treatment outcomes. While most prior research has focused on single pairs of mental disorders and physical-health conditions, this study explores broader associations between multiple mental disorders and physical-health conditions.
Methods
Using the Norwegian primary-care register, this population-based cohort study encompassed all 2 203 553 patients born in Norway from January 1945 through December 1984, who were full-time residents from January 2006 until December 2019 (14 years; 363 million person-months). Associations between seven mental disorders (sleep disturbance, anxiety, depression, acute stress reaction, substance-use disorders, phobia/compulsive disorder, psychosis) and 16 physical-health conditions were examined, diagnosed according to the International Classification of Primary Care.
Results
Of 112 mental-disorder/physical-health condition pairs, 96% of associations yielded positive and significant ORs, averaging 1.41 and ranging from 1.05 (99.99% CI 1.00–1.09) to 2.38 (99.99% CI 2.30–2.46). Across 14 years, every mental disorder was associated with multiple different physical-health conditions. Across 363 million person-months, having any mental disorder was associated with increased subsequent risk of all physical-health conditions (HRs: 1.40 [99.99% CI 1.35–1.45] to 2.85 [99.99% CI 2.81–2.89]) and vice versa (HRs: 1.56 [99.99% CI 1.54–1.59] to 3.56 [99.99% CI 3.54–3.58]). Associations were observed in both sexes, across age groups, and among patients with and without university education.
Conclusions
The breadth of associations between virtually every mental disorder and physical-health condition among patients treated in primary care underscores a need for integrated mental and physical healthcare policy and practice. This remarkable breadth also calls for research into etiological factors and underlying mechanisms that can explain it.
In this cohort profile article we describe the lifetime major depressive disorder (MDD) database that has been established as part of the BIObanks Netherlands Internet Collaboration (BIONIC). Across the Netherlands we collected data on Diagnostic and Statistical Manual of Mental Disorders, Fifth Edition (DSM-5) lifetime MDD diagnosis in 132,850 Dutch individuals. Currently, N = 66,684 of these also have genomewide single nucleotide polymorphism (SNP) data. We initiated this project because the complex genetic basis of MDD requires large population-wide studies with uniform in-depth phenotyping. For standardized phenotyping we developed the LIDAS (LIfetime Depression Assessment Survey), which then was used to measure MDD in 11 Dutch cohorts. Data from these cohorts were combined with diagnostic interview depression data from 5 clinical cohorts to create a dataset of N = 29,650 lifetime MDD cases (22%) meeting DSM-5 criteria and 94,300 screened controls. In addition, genomewide genotype data from the cohorts were assembled into a genomewide association study (GWAS) dataset of N = 66,684 Dutch individuals (25.3% cases). Phenotype data include DSM-5-based MDD diagnoses, sociodemographic variables, information on lifestyle and BMI, characteristics of depressive symptoms and episodes, and psychiatric diagnosis and treatment history. We describe the establishment and harmonization of the BIONIC phenotype and GWAS datasets and provide an overview of the available information and sample characteristics. Our next step is the GWAS of lifetime MDD in the Netherlands, with future plans including fine-grained genetic analyses of depression characteristics, international collaborations and multi-omics studies.
Cognitive training is a non-pharmacological intervention aimed at improving cognitive function across a single or multiple domains. Although the underlying mechanisms of cognitive training and transfer effects are not well-characterized, cognitive training has been thought to facilitate neural plasticity to enhance cognitive performance. Indeed, the Scaffolding Theory of Aging and Cognition (STAC) proposes that cognitive training may enhance the ability to engage in compensatory scaffolding to meet task demands and maintain cognitive performance. We therefore evaluated the effects of cognitive training on working memory performance in older adults without dementia. This study will help begin to elucidate non-pharmacological intervention effects on compensatory scaffolding in older adults.
Participants and Methods:
48 participants were recruited for a Phase III randomized clinical trial (Augmenting Cognitive Training in Older Adults [ACT]; NIH R01AG054077) conducted at the University of Florida and University of Arizona. Participants across sites were randomly assigned to complete cognitive training (n=25) or an education training control condition (n=23). Cognitive training and the education training control condition were each completed during 60 sessions over 12 weeks for 40 hours total. The education training control condition involved viewing educational videos produced by the National Geographic Channel. Cognitive training was completed using the Posit Science Brain HQ training program, which included 8 cognitive training paradigms targeting attention/processing speed and working memory. All participants also completed demographic questionnaires, cognitive testing, and an fMRI 2-back task at baseline and at 12 weeks following cognitive training.
Results:
Repeated measures analysis of covariance (ANCOVA), adjusted for training adherence, transcranial direct current stimulation (tDCS) condition, age, sex, years of education, and Wechsler Test of Adult Reading (WTAR) raw score, revealed a significant 2-back by training group interaction (F[1,40]=6.201, p=.017, η2=.134). Examination of simple main effects revealed baseline differences in 2-back performance (F[1,40]=.568, p=.455, η2=.014). After controlling for baseline performance, training group differences in 2-back performance were no longer statistically significant (F[1,40]=1.382, p=.247, η2=.034).
Conclusions:
After adjusting for baseline performance differences, there were no significant training group differences in 2-back performance, suggesting that the randomization was not sufficient to ensure adequate distribution of participants across groups. Results may indicate that cognitive training alone is not sufficient for significant improvement in working memory performance on a near transfer task. Additional improvement may occur in the next phase of this clinical trial, such that tDCS augments the effects of cognitive training and results in enhanced compensatory scaffolding even within this high-performing cohort. Limitations of the study include a highly educated sample with higher literacy levels and a small sample size that was not powered for transfer-effects analysis. Future analyses will include evaluation of the combined intervention effects of cognitive training and tDCS on n-back performance in a larger sample of older adults without dementia.
Cognitive training has shown promise for improving cognition in older adults. Aging involves a variety of neuroanatomical changes that may affect response to cognitive training. White matter hyperintensities (WMH) are one common age-related brain change, visible on T2-weighted and Fluid Attenuated Inversion Recovery (FLAIR) MRI. WMH are associated with older age, are suggestive of cerebral small vessel disease, and reflect decreased white matter integrity. Higher WMH load is associated with a reduced threshold for the clinical expression of cognitive impairment and dementia. The effects of WMH on response to cognitive training interventions are relatively unknown. The current study assessed (a) proximal cognitive training performance following a 3-month randomized control trial and (b) the contribution of baseline whole-brain WMH load, defined as total lesion volume (TLV), to pre-post proximal training change.
Participants and Methods:
Sixty-two healthy older adults ages 65-84 completed either adaptive cognitive training (CT; n=31) or educational training control (ET; n=31) interventions. Participants assigned to CT completed 20 hours of attention/processing speed training and 20 hours of working memory training delivered through commercially-available Posit Science BrainHQ. ET participants completed 40 hours of educational videos. All participants also underwent sham or active transcranial direct current stimulation (tDCS) as an adjunctive intervention, although not a variable of interest in the current study. Multimodal MRI scans were acquired during the baseline visit. T1- and T2-weighted FLAIR images were processed using the Lesion Segmentation Tool (LST) for SPM12. The Lesion Prediction Algorithm of LST automatically segmented brain tissue and calculated lesion maps. A lesion threshold of 0.30 was applied to calculate TLV. A log transformation was applied to TLV to normalize the distribution of WMH. Repeated-measures analysis of covariance (RM-ANCOVA) assessed pre/post change in proximal composite (Total Training Composite) and sub-composite (Processing Speed Training Composite, Working Memory Training Composite) measures in the CT group compared to their ET counterparts, controlling for age, sex, years of education and tDCS group. Linear regression assessed the effect of TLV on post-intervention proximal composite and sub-composite, controlling for baseline performance, intervention assignment, age, sex, years of education, multisite scanner differences, estimated total intracranial volume, and binarized cardiovascular disease risk.
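The TLV computation described above (binarize the lesion-probability map at 0.30, sum the lesion volume, then log-transform) can be sketched numerically. This is only an illustrative stand-in for the LST/SPM12 pipeline, not its implementation; the voxel-volume argument and the +1 offset in the log transform are assumptions not stated in the text.

```python
import numpy as np

def total_lesion_volume(prob_map: np.ndarray, voxel_ml: float,
                        threshold: float = 0.30) -> float:
    """Binarize a lesion-probability map at `threshold` (0.30 in the study)
    and return total lesion volume (TLV) as lesion-voxel count x voxel volume."""
    lesion_mask = prob_map >= threshold
    return float(lesion_mask.sum()) * voxel_ml

def log_tlv(tlv_ml: float) -> float:
    """Log-transform TLV to normalize the right-skewed WMH distribution.
    The +1 offset (our assumption) keeps lesion-free maps defined."""
    return float(np.log(tlv_ml + 1.0))

# Toy 3-voxel probability map with 2 mL voxels: two voxels exceed 0.30.
tlv = total_lesion_volume(np.array([0.50, 0.20, 0.35]), voxel_ml=2.0)  # 4.0 mL
```

In practice the probability map would come from the Lesion Prediction Algorithm output, and the log-transformed value would enter the regression as a covariate.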
Results:
RM-ANCOVA revealed two-way group*time interactions such that those assigned cognitive training demonstrated greater improvement on proximal composite (Total Training Composite) and sub-composite (Processing Speed Training Composite, Working Memory Training Composite) measures compared to their ET counterparts. Multiple linear regression showed higher baseline TLV associated with lower pre-post change on the Processing Speed Training sub-composite (β = -0.19, p = 0.04) but not other composite measures.
Conclusions:
These findings demonstrate the utility of cognitive training for improving post-intervention proximal performance in older adults. Additionally, pre-post change in proximal processing speed training appears to be particularly sensitive to white matter hyperintensity load, in contrast to working memory training change. These data suggest that TLV may serve as an important factor for consideration when planning processing speed-based cognitive training interventions for remediation of cognitive decline in older adults.
Cognitive training using a visual speed-of-processing task, called the Useful Field of View (UFOV) task, reduced dementia risk and reduced decline in activities of daily living at a 10-year follow-up in older adults. However, there is variability in the level of cognitive gains after cognitive training across studies. One potential explanation for this variability could be moderating factors. Prior studies suggest that variables moderating cognitive training gains share features of the training task. Learning trials of the Hopkins Verbal Learning Test-Revised (HVLT-R) and Brief Visuospatial Memory Test-Revised (BVMT-R) recruit similar cognitive abilities and have overlapping neural correlates with the UFOV task and speed-of-processing/working memory tasks, and therefore could serve as potential moderators. Exploring moderating factors of cognitive training gains may boost the efficacy of interventions, improve rigor in the cognitive training literature, and eventually help provide tailored treatment recommendations. This study explored the association between HVLT-R and BVMT-R learning and the UFOV task, and assessed the moderation of HVLT-R and BVMT-R learning on UFOV improvement after a 3-month speed-of-processing/attention and working memory cognitive training intervention in cognitively healthy older adults.
Participants and Methods:
75 healthy older adults (M age = 71.11, SD = 4.61) were recruited as part of a larger clinical trial through the Universities of Florida and Arizona. Participants were randomized into a cognitive training (n=36) or education control (n=39) group and underwent a 40-hour, 12-week intervention. The cognitive training intervention consisted of practicing 4 attention/speed-of-processing tasks (including the UFOV task) and 4 working memory tasks. The education control intervention consisted of watching 40-minute educational videos. The HVLT-R and BVMT-R were administered at the pre-intervention timepoint as part of a larger neurocognitive battery. The learning ratio was calculated as: (trial 3 total − trial 1 total)/(12 − trial 1 total). UFOV performance was measured at pre- and post-intervention time points via the POSIT Brain HQ Double Decision Assessment. Multiple linear regressions predicted baseline Double Decision performance from HVLT-R and BVMT-R learning ratios, controlling for study site, age, sex, and education. A repeated measures moderation analysis assessed the moderation of HVLT-R and BVMT-R learning ratio on Double Decision change from pre- to post-intervention for the cognitive training and education control groups.
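The learning-ratio formula described above can be written as a short function. This is a minimal sketch assuming the 12-item maximum shared by the HVLT-R and BVMT-R; the function name and the ceiling check are ours, not the study's.

```python
def learning_ratio(trial1: int, trial3: int, max_score: int = 12) -> float:
    """Items gained between trials 1 and 3, normalized by the items still
    available to learn after trial 1 (max_score = 12 items on both the
    HVLT-R and BVMT-R)."""
    if trial1 >= max_score:
        # A perfect trial 1 leaves nothing to learn; the ratio is undefined.
        raise ValueError("learning ratio undefined when trial 1 is at ceiling")
    return (trial3 - trial1) / (max_score - trial1)

ratio = learning_ratio(trial1=5, trial3=10)  # (10 - 5) / (12 - 5) ≈ 0.714
```

The denominator rescales raw gain by room to improve, so a participant recalling 10 of 12 items on trial 1 is not penalized relative to one starting at 5 of 12.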
Results:
Baseline Double Decision performance was significantly associated with BVMT-R learning ratio (β=-.303, p=.008), but not HVLT-R learning ratio (β=-.142, p=.238). BVMT-R learning ratio moderated gains in Double Decision performance (p<.01); for each unit increase in BVMT-R learning ratio, there was a .6173 unit decrease in training gains. The HVLT-R learning ratio did not moderate gains in Double Decision performance (p>.05). There were no significant moderation effects in the education control group.
Conclusions:
Better visuospatial learning was associated with faster Double Decision performance at baseline. Those with poorer visuospatial learning improved most on the Double Decision task after training, suggesting that healthy older adults who perform below expectations may show the greatest training gains. Future cognitive training research studying visual speed-of-processing interventions should account for differing levels of visuospatial learning at baseline, as this could impact the magnitude of training outcomes.
Aging is associated with disruptions in functional connectivity within the default mode (DMN), frontoparietal control (FPCN), and cingulo-opercular (CON) resting-state networks. Greater within-network connectivity predicts better cognitive performance in older adults. Therefore, strengthening network connectivity, through targeted intervention strategies, may help prevent age-related cognitive decline or progression to dementia. Small studies have demonstrated synergistic effects of combining transcranial direct current stimulation (tDCS) and cognitive training (CT) on strengthening network connectivity; however, this association has yet to be rigorously tested on a large scale. The current study leverages longitudinal data from the first-ever Phase III clinical trial for tDCS to examine the efficacy of an adjunctive tDCS and CT intervention on modulating network connectivity in older adults.
Participants and Methods:
This sample included 209 older adults (mean age = 71.6) from the Augmenting Cognitive Training in Older Adults multisite trial. Participants completed 40 hours of CT over 12 weeks, which included 8 attention, processing speed, and working memory tasks. Participants were randomized into active or sham stimulation groups, and tDCS was administered during CT daily for two weeks, then weekly for 10 weeks. For both stimulation groups, two electrodes in saline-soaked 5 × 7 cm² sponges were placed at F3 (cathode) and F4 (anode) using the 10-20 measurement system. The active group received 2 mA of current for 20 minutes. The sham group received 2 mA for 30 seconds, then no current for the remaining 20 minutes.
Participants underwent resting-state fMRI at baseline and post-intervention. The CONN toolbox was used to preprocess imaging data and conduct region-of-interest (ROI-ROI) connectivity analyses. The Artifact Detection Toolbox, using intermediate settings, identified outlier volumes. Two participants were excluded for having greater than 50% of volumes flagged as outliers. ROI-ROI analyses modeled the interaction between tDCS group (active versus sham) and occasion (baseline connectivity versus post-intervention connectivity) for the DMN, FPCN, and CON, controlling for age, sex, education, site, and adherence.
Results:
Compared to sham, the active group demonstrated ROI-ROI increases in functional connectivity within the DMN following intervention (left temporal to right temporal [T(202) = 2.78, pFDR < 0.05] and left temporal to right dorsal medial prefrontal cortex [T(202) = 2.74, pFDR < 0.05]). In contrast, compared to sham, the active group demonstrated ROI-ROI decreases in functional connectivity within the FPCN following intervention (left dorsal prefrontal cortex to left temporal [T(202) = -2.96, pFDR < 0.05] and left dorsal prefrontal cortex to left lateral prefrontal cortex [T(202) = -2.77, pFDR < 0.05]). There were no significant interactions detected for CON regions.
Conclusions:
These findings (a) demonstrate the feasibility of modulating network connectivity using tDCS and CT and (b) provide important information regarding the pattern of connectivity changes occurring at these intervention parameters in older adults. Importantly, the active stimulation group showed increases in connectivity within the DMN (a network particularly vulnerable to aging and implicated in Alzheimer’s disease) but decreases in connectivity between left frontal and temporal FPCN regions. Future analyses from this trial will evaluate the association between these changes in connectivity and cognitive performance post-intervention and at a one-year timepoint.
The ability to generate, plan for, and follow through with goals is essential to everyday functioning. Compared to young adults, cognitively normal older adults have more difficulty on a variety of cognitive functions that contribute to goal setting and follow through. However, how these age-related cognitive differences impact real-world goal planning and success remains unclear. In the current study, we aimed to better understand the impact of older age on everyday goal planning and success.
Participants and Methods:
Cognitively normal young adults (18-35 years, n = 57) and older adults (60-80 years, n = 49) participated in a 10-day, 2-session study. In the first session, participants described 4 real-world goals that they hoped to pursue in the next 10 days. These goals were subjectively rated for personal significance, significance to others, and vividness, and goal descriptions were objectively scored for temporal, spatial, and event specificity, among other measures. Ten days later, participants rated the degree to which they planned for and made progress in their real-world goals since session one. Older adults also completed a battery of neuropsychological tests.
Results:
Some key results are as follows. Relative to the young adults, cognitively normal older adults described real-world goals that navigated smaller spaces (p=0.01) and that they perceived as more important to other people (p=0.03). Older adults also planned more during the 10-day window (p<0.001). There was not a statistically significant age group difference, however, in real-world goal progress (p=0.65). Nonetheless, among older participants, goal progress was related to higher mental processing speed as shown by the Trail Making Test Part A (r=0.36, p=0.02) and the creation of goals confined to specific temporal periods (r=0.35, p=0.01). Older participants who scored lower on the Rey Complex Figure Test (RCFT) long delay recall trial reported that their goals were more like ones that they had set in the past (r=-0.34, p=0.02), and higher episodic memory as shown by the RCFT was associated with more spatially specific goals (r=0.32, p=0.02), as well as a greater use of implementation intentions in goal descriptions (r=0.35, p=0.02).
Conclusions:
Although older adults tend to show decline in several cognitive domains relevant to goal setting, we found that cognitively normal older adults did not make significantly less progress toward a series of real-world goals over a 10-day window. However, relative to young adults, older adults tended to pursue goals which were more important to others, as well as goals that involved navigating smaller spaces. Older adults also appear to rely on planning more than young adults to make progress toward their goals. These findings reveal age group differences in the quality of goals and individual differences in goal success among older adults. They are also in line with prior research suggesting that cognitive aging effects may be more subtle in real-world contexts.
Nonpathological aging has been linked to decline in both verbal and visuospatial memory abilities in older adults. Disruptions in resting-state functional connectivity within well-characterized, higher-order cognitive brain networks have also been coupled with poorer memory functioning in healthy older adults and in older adults with dementia. However, there is a paucity of research on the association between higher-order functional connectivity and verbal and visuospatial memory performance in the older adult population. The current study examines the association between resting-state functional connectivity within the cingulo-opercular network (CON), frontoparietal control network (FPCN), and default mode network (DMN) and verbal and visuospatial learning and memory in a large sample of healthy older adults. We hypothesized that greater within-network CON and FPCN functional connectivity would be associated with better immediate verbal and visuospatial memory recall. Additionally, we predicted that within-network DMN functional connectivity would be associated with better delayed verbal and visuospatial memory recall. This study helps to glean insight into whether within-network CON, FPCN, or DMN functional connectivity is associated with verbal and visuospatial memory abilities in later life.
Participants and Methods:
330 healthy older adults between 65 and 89 years old (mean age = 71.6 ± 5.2) were recruited at the University of Florida (n = 222) and the University of Arizona (n = 108). Participants underwent resting-state fMRI and completed verbal memory (Hopkins Verbal Learning Test - Revised [HVLT-R]) and visuospatial memory (Brief Visuospatial Memory Test - Revised [BVMT-R]) measures. Immediate (total) and delayed recall scores on the HVLT-R and BVMT-R were calculated using each test manual’s scoring criteria. Learning ratios on the HVLT-R and BVMT-R were quantified by dividing the number of stimuli (verbal or visuospatial) learned between the first and third trials by the number of stimuli not recalled after the first learning trial. CONN Toolbox was used to extract average within-network connectivity values for CON, FPCN, and DMN. Hierarchical regressions were conducted, controlling for sex, race, ethnicity, years of education, number of invalid scans, and scanner site.
Results:
Greater CON connectivity was significantly associated with better HVLT-R immediate (total) recall (β = 0.16, p = 0.01), HVLT-R learning ratio (β = 0.16, p = 0.01), BVMT-R immediate (total) recall (β = 0.14, p = 0.02), and BVMT-R delayed recall performance (β = 0.15, p = 0.01). Greater FPCN connectivity was associated with better BVMT-R learning ratio (β = 0.13, p = 0.04). HVLT-R delayed recall performance was not associated with connectivity in any network, and DMN connectivity was not significantly related to any measure.
Conclusions:
Connectivity within CON demonstrated a robust relationship with different components of memory function across both verbal and visuospatial domains. In contrast, FPCN evidenced a relationship only with visuospatial learning, and DMN was not significantly associated with any memory measure. These data suggest that CON may be a valuable target in longitudinal studies of age-related memory changes, as well as a possible target for future non-invasive interventions to attenuate memory decline in older adults.
We performed a preimplementation assessment of workflows, resources, needs, and antibiotic prescribing practices of trainees and practicing dentists to inform the development of an antibiotic-stewardship clinical decision-support tool (CDST) for dentists.
Methods:
We used a technology implementation framework to conduct the preimplementation assessment via surveys and focus groups of students, residents, and faculty members. Using Likert scales, the survey assessed baseline knowledge and confidence in dental providers’ antibiotic prescribing. The focus groups gathered information on existing workflows, resources, and needs for end users for our CDST.
Results:
Of 355 dental providers recruited to take the survey, 213 (60%) responded: 151 students, 27 residents, and 35 faculty. The average confidence in antibiotic prescribing decisions was 3.2 ± 1.0 on a scale of 1 to 5 (ie, moderate). Dental students were less confident about prescribing antibiotics than residents and faculty (P < .01). However, antibiotic prescribing knowledge was no different between dental students, residents, and faculty. The mean likelihood of prescribing an antibiotic when it was not needed was 2.7 ± 0.6 on a scale of 1 to 5 (unlikely to maybe) and was not meaningfully different across subgroups (P = .10). We had 10 participants across 3 focus groups: 7 students, 2 residents, and 1 faculty member. Four major themes emerged, which indicated that dentists: (1) make antibiotic prescribing decisions based on anecdotal experiences; (2) defer to physicians’ recommendations; (3) have limited access to evidence-based resources; and (4) want CDST for antibiotic prescribing.
Conclusions:
Dentists’ confidence in antibiotic prescribing increased by training level, but knowledge did not. Trainees and practicing dentists would benefit from a CDST to improve appropriateness of antibiotic prescribing.
The current definition of dietary fibre was adopted by the Codex Alimentarius Commission in 2009, but implementation requires updating food composition databases with values based on appropriate analysis methods. Previous data on population intakes of dietary fibre fractions are sparse. We studied the intake and sources of total dietary fibre (TDF) and the dietary fibre fractions insoluble dietary fibre (IDF), dietary fibre soluble in water but insoluble in 76 % aqueous ethanol (SDFP), and dietary fibre soluble in water and soluble in 76 % aqueous ethanol (SDFS) in Finnish children, based on new CODEX-compliant values in the Finnish National Food Composition Database Fineli. Our sample included 5193 children at increased genetic risk of type 1 diabetes from the Type 1 Diabetes Prediction and Prevention birth cohort, born between 1996 and 2004. We assessed intake and sources based on 3-day food records collected at the ages of 6 months, 1, 3 and 6 years. Both absolute and energy-adjusted intakes of TDF were associated with the age, sex and breast-feeding status of the child. Children of older parents, parents with a higher level of education, non-smoking mothers, and children with no older siblings had higher energy-adjusted TDF intake. IDF was the major dietary fibre fraction in non-breastfed children, followed by SDFP and SDFS. Cereal products, fruits and berries, potatoes and vegetables were major food sources of dietary fibre. Breast milk was a major source of dietary fibre in 6-month-olds due to its human milk oligosaccharide content, and it resulted in high SDFS intakes in breastfed children.
We describe the association between job roles and coronavirus disease 2019 (COVID-19) among healthcare personnel. A wide range of hazard ratios were observed across job roles. Medical assistants had higher hazard ratios than nurses, while attending physicians, food service workers, laboratory technicians, pharmacists, residents and fellows, and temporary workers had lower hazard ratios.
Despite a recent wave in global recognition of the rights of transgender and gender-diverse populations, referred to in this text by the umbrella label of trans*, international law continues to presume a cisgender binary definition of gender — dismissing the lived realities of trans* individuals throughout the world. This gap in international legal recognition and protection has fundamental implications for health, where trans* persons have been and continue to be subjected to widespread discrimination in health care, longstanding neglect of health needs, and significant violations of bodily autonomy.
Integration of clinical skills during graduate training in dual-degree programs remains a challenge. The present study investigated the availability and self-perceived efficacy of clinical continuity strategies for dual-degree trainees preparing for clinical training.
Methods:
Survey participants were MD/DO-PhD students enrolled in dual-degree-granting institutions in the USA. Responses were received from 95% of the 73 unique institutions surveyed, representing 56% of the 124 MD-PhD and 7 DO-PhD recognized training programs. Respondents were asked to indicate the availability and self-perceived efficacy of each strategy.
Results:
Reported available clinical continuity strategies included clinical volunteering (95.6%), medical grand rounds (86.9%), mentored clinical experiences (84.2%), standardized patients/practice Objective Structured Clinical Examinations (OSCEs) (70.3%), clinical case reviews (45.9%), clinical journal clubs (38.3%), and preclinical courses/review sessions (37.2%). Trainees rated standardized patients (µ = 6.98 ± 0.356), mentored clinical experiences (µ = 6.94 ± 0.301), clinical skills review sessions (µ = 6.89 ± 0.384), preclinical courses/review sessions (µ = 6.74 ± 0.482), and clinical volunteering (µ = 6.60 ± 0.369) significantly (p < 0.050) higher than clinical case reviews (µ = 5.34 ± 0.412), clinical journal clubs (µ = 4.75 ± 0.498), and medical grand rounds (µ = 4.45 ± 0.377). Further, 84.4% of respondents stated they would be willing to devote at least 0.5–1 hour per week to clinical continuity opportunities during graduate training.
Conclusion:
Less than half of the institutions surveyed offered strategies perceived as the most efficacious in preparing trainees for clinical reentry, such as clinical skills review sessions. Broader implementation of these strategies could help better prepare dual-degree students for their return to clinical training.
We describe COVID-19 cases among nonphysician healthcare personnel (HCP) by work location. The proportion of HCP with coronavirus disease 2019 (COVID-19) was highest in the emergency department and lowest among those working remotely. COVID-19 and non–COVID-19 units had similar proportions of HCP with COVID-19 (13%). Cases decreased across all work locations following COVID-19 vaccination.
We analyzed blood-culture practices to characterize the utilization of the Infectious Diseases Society of America (IDSA) recommendations related to catheter-related bloodstream infection (CRBSI) blood cultures. Most patients with a central line had only peripheral blood cultures. Increasing the utilization of CRBSI guidelines may improve clinical care, but may also affect other quality metrics.
Survival after paediatric in-hospital cardiac arrest is worse on nights and weekends, despite no demonstrated disparity in cardiopulmonary resuscitation quality. It is unknown whether these findings differ in children with CHD. This study aimed to determine whether cardiopulmonary resuscitation quality might explain the hypothesised worse outcomes of children with CHD during nights and weekends.
Methods:
In-hospital cardiac arrest data were collected by the Pediatric Resuscitation Quality Collaborative for children with CHD. Chest compression quality metrics and survival outcomes were compared between events that occurred during day versus night, and during weekday versus weekend, using multivariable logistic regression.
Results:
We evaluated 3614 sixty-second epochs of chest compression data from 132 subjects between 2015 and 2020. There was no difference in chest compression quality metrics during day versus night or weekday versus weekend. Weekday events (versus weekend events) were associated with improved survival to hospital discharge (adjusted odds ratio 4.56 [1.29, 16.11]; p = 0.02) and survival to hospital discharge with favourable neurological outcome (adjusted odds ratio 6.35 [1.36, 29.6]; p = 0.02), but there was no difference in the rate of return of spontaneous circulation or return of circulation. There was no difference in outcomes for day versus night.
Conclusion:
For children with CHD and in-hospital cardiac arrest, there was no difference in chest compression quality metrics by time of day or day of week. Although there was no difference in outcomes for events during days versus nights, there was improved survival to hospital discharge and survival to hospital discharge with favourable neurological outcome for events occurring on weekdays compared to weekends.