Diversifying the simplified landscape of corn and soybeans in the Midwest is an emerging priority in both the public and private sectors to reap a suite of climate, social, agronomic, and economic benefits. However, little research has documented the perspectives of farmers, the primary stakeholders in diversification efforts. This preliminary report uses newly collected survey data (n = 725) from farmers in the states of Illinois, Indiana, and Iowa to provide descriptive statistics and tests to understand what farmers in the region think about agricultural diversification, including their perspectives on its benefits, barriers, and opportunities. For the purposes of the study, we define diversification as extended rotations, perennials, horticulture, grazed livestock, and agroforestry practices. We find that a majority or plurality of farmers in the sample believe that diversified systems are superior to non-diversified systems at achieving a range of environmental, agronomic, and economic goals, although many farmers are still forming opinions. Farmers believe that primarily economic barriers stand in the way of diversification, including the lack of affordable land, low short-term returns on investment, and lack of labor. Farmers identified key opportunities to increase diversification through developing processing capacity for local meat and specialty crops, increasing demand for diversified products, and providing more information on returns on investment of diversified systems. Different interventions, however, may be needed to support farmers who are already diversified compared to non-diversified farmers. Building on these initial results, future studies using these data will develop more detailed analyses and recommendations for policymakers, the private sector, and agricultural organizations to support diversification.
Assessing children’s diets is currently challenging and burdensome. Abbreviated FFQ have the potential to assess dietary patterns in a rapid and standardised manner. Using nationally representative UK dietary intake and biomarker data, we developed an abbreviated FFQ to calculate dietary quality scores for pre-school and primary school-aged children. UK National Diet and Nutrition Survey (2008–2016) weekly consumption frequencies of 129 food groups from 4-d diaries were cross-sectionally analysed using principal component analysis. A 129-item score was derived, alongside a 12-item score based on foods with the six highest and six lowest coefficients. Participants included 1069 pre-schoolers and 2565 primary schoolchildren. The first principal component explained 3·4 and 3·0 % of the variation in the original diet variables for pre-school and primary school groups, respectively, and described a prudent diet pattern. Prudent diet scores were characterised by greater consumption of fruit, vegetables and tap water and lower consumption of crisps, manufactured coated chicken/turkey products, purchased chips and soft drinks for both age groups. Correlations between the 129-item and 12-item scores were 0·86 and 0·84 for pre-school and primary school-aged children, respectively. Bland–Altman mean differences between the scores were 0·00 sd; 95 % limits of agreement were −1·05 to 1·05 and −1·10 to 1·10 sd for pre-school and primary school-aged children, respectively. Correlations between dietary scores and nutritional biomarkers showed only minor attenuation for the 12-item compared with the 129-item scores, illustrating acceptable congruence between prudent diet scores. The two 12-item FFQ offer user-friendly tools to measure dietary quality among UK children.
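For readers interested in the mechanics, the following is a minimal sketch, on simulated data, of the approach described above: derive a diet score from the first principal component of standardised food-group frequencies, then build an abbreviated score from the items with the six highest and six lowest loadings. The dataset, dimensions, and sklearn workflow are illustrative assumptions, not the NDNS analysis pipeline.

```python
# Illustrative sketch only: hypothetical food-frequency data, not the NDNS dataset.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.poisson(3, size=(1000, 129)).astype(float)  # weekly frequencies, 129 food groups

Z = StandardScaler().fit_transform(X)          # standardise each food group
pca = PCA(n_components=1).fit(Z)
loadings = pca.components_[0]                  # coefficient for each food group

full_score = Z @ loadings                      # 129-item diet score (PC1)

# Abbreviated score: keep the six most positive and six most negative loadings
top6 = np.argsort(loadings)[-6:]
bottom6 = np.argsort(loadings)[:6]
keep = np.concatenate([top6, bottom6])
short_score = Z[:, keep] @ loadings[keep]      # 12-item score

# Agreement between the full and abbreviated scores (cf. 0.86/0.84 in the study)
print(np.corrcoef(full_score, short_score)[0, 1])
```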
Australian children fall short of national dietary guidelines, with only 63 % consuming adequate fruit and 10 % adequate vegetables. Before school care operates as part of Out of School Hours Care (OSHC) services and provides opportunities to address poor dietary habits in children. The aim of this study was to describe the food and beverages provided in before school care and to explore how service-level factors influence food provision.
Design:
A cross-sectional study was conducted in OSHC services. Each service's before school care session was visited twice between March and June 2021. Direct observation was used to capture food and beverage provision and child and staff behaviour during breakfast. Interviews with staff collected information on service characteristics. Foods were categorised using the Australian Dietary Guidelines, and frequencies were calculated. Fisher's exact test was used to compare food provision with service characteristics.
Setting:
The before school care of OSHC services in New South Wales, Australia.
Participants:
Twenty-five OSHC services.
Results:
Fruit was provided on 22 % (n 11) of days and vegetables on 12 % (n 6). Services with nutrition policies containing specific language on food provision (i.e. measurable) were more likely to provide fruit than those with policies using non-specific language (P = 0·027). Services that reported receiving training in healthy eating provided vegetables more often than those that had not received training (P = 0·037).
Conclusions:
Before school care can be supported to improve food provision through staff professional development and advocating to regulatory bodies for increased specificity requirements in the nutrition policies of service providers.
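As a pointer for readers less familiar with the test used above, Fisher's exact test compares a categorical outcome across two groups when counts are small. The sketch below uses invented counts; the abstract does not report the underlying 2x2 table.

```python
# Hypothetical 2x2 table: fruit provision (yes/no) by policy language (specific/non-specific).
# Counts are invented for illustration.
from scipy.stats import fisher_exact

#                provided fruit   no fruit
table = [[9, 4],    # specific policy language
         [2, 10]]   # non-specific policy language

odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
print(f"OR = {odds_ratio:.2f}, P = {p_value:.3f}")
```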
Task-sharing holds promise for bridging gaps in access to mental healthcare; yet there remain significant challenges to scaling up task-sharing models. This formative study aimed to develop a digital platform for training non-specialist providers without prior experience in mental healthcare to deliver a brief psychosocial intervention for depression in community settings in Texas. A 5-step development approach was employed, consisting of: blueprinting, scripting, video production and digital content creation, uploading digital content to a Learning Management System and user testing. This resulted in the development of two courses, one called Foundational Skills covering the skills to become an effective counselor, and the second called Behavioral Activation covering the skills for addressing adult depression. Twenty-one participants with a range of health-related backgrounds, including 11 with prior training in mental healthcare, completed the training and joined focus group discussions offering qualitative feedback and recommendations for improving the program’s usability. Participant feedback centered around the need to make the content more interactive, to include additional engaging features, and to improve the layout and usability of the platform. The next steps will involve evaluating the training program on developing the skills of non-specialist providers and supporting its uptake and implementation.
The stigma attached to mental health conditions hinders recovery and well-being. The Honest, Open, Proud (HOP) program shows promise in reducing stigma, but there is uncertainty about the feasibility of a randomized trial to evaluate a peer-delivered, individual adaptation of HOP for psychosis (Let's Talk).
Methods
A multi-site feasibility randomized controlled trial (RCT) with a Prospective Randomized Open Blinded Evaluation (PROBE) design, comparing the peer-delivered intervention (Let's Talk) to treatment as usual (TAU). Follow-up was at 2.5 and 6 months. Randomization was via a web-based system, with permuted blocks of random size. Up to 10 sessions of the intervention were offered over 10 weeks. The primary outcome was feasibility data (recruitment, retention, intervention attendance). Primary outcomes were analyzed by intention to treat. Safety outcomes were reported by as-treated status. The study was prospectively registered: https://doi.org/10.1186/ISRCTN17197043.
Results
In total, 149 patients were referred to the study and 70 were recruited: 35 were randomly assigned to intervention + TAU and 35 to TAU. Recruitment reached 93% of the target sample size. Retention was high (81% at the 2.5-month primary endpoint), as was intervention attendance (83%). An adverse event occurred in 21% of 33 patients in Let's Talk + TAU and in 16% of 37 patients in TAU. One serious adverse event (pre-randomization) was partially related and expected.
Conclusions
This is the first trial to show that it is feasible and safe to conduct an RCT of HOP adapted for individual delivery to people with psychosis. An adequately powered trial is required to provide robust evidence.
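A minimal sketch of the permuted-block randomization scheme described in the Methods above, assuming a 1:1 allocation ratio and illustrative block sizes (the trial used a web-based system; its exact block sizes are not reported):

```python
# Permuted-block randomization with random block sizes (illustrative assumptions).
import random

def permuted_block_allocation(n_participants, block_sizes=(2, 4, 6), seed=42):
    rng = random.Random(seed)
    allocation = []
    while len(allocation) < n_participants:
        size = rng.choice(block_sizes)          # pick a random block size
        block = ["Let's Talk + TAU"] * (size // 2) + ["TAU"] * (size // 2)
        rng.shuffle(block)                      # permute arms within the block
        allocation.extend(block)
    return allocation[:n_participants]

print(permuted_block_allocation(70)[:8])        # first eight assignments
```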
It is well established that there is a substantial genetic component to eating disorders (EDs). Polygenic risk scores (PRSs) can be used to quantify cumulative genetic risk for a trait at an individual level. Recent studies suggest PRSs for anorexia nervosa (AN) may also predict risk for other disordered eating behaviors, but no study has examined whether PRS for AN can predict disordered eating as a global continuous measure. This study aimed to investigate whether PRS for AN predicted overall levels of disordered eating, or specific lifetime disordered eating behaviors, in an Australian adolescent female population.
Methods
PRSs were calculated based on summary statistics from the largest Psychiatric Genomics Consortium AN genome-wide association study to date. Analyses were performed using genome-wide complex trait analysis to test the associations between AN PRS and disordered eating global scores, avoidance of eating, objective bulimic episodes, self-induced vomiting, and driven exercise in a sample of Australian adolescent female twins recruited from the Australian Twin Registry (N = 383).
Results
After applying the false-discovery rate correction, the AN PRS was significantly associated with all disordered eating outcomes.
Conclusions
Findings suggest shared genetic etiology across disordered eating presentations and provide insight into the utility of AN PRS for predicting disordered eating behaviors in the general population. In the future, PRSs for EDs may have clinical utility in early disordered eating risk identification, prevention, and intervention.
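For context on the Methods above: a polygenic risk score is, at its core, a weighted sum of risk-allele counts, with weights taken from GWAS summary statistics. The sketch below uses simulated genotypes and effect sizes; real pipelines (e.g. PLINK or PRSice) also perform quality control, clumping, and P-value thresholding, which are omitted here.

```python
# Conceptual PRS sketch on simulated data (not the study's pipeline).
import numpy as np

rng = np.random.default_rng(1)
n_people, n_snps = 383, 5000
genotypes = rng.integers(0, 3, size=(n_people, n_snps))   # 0/1/2 risk-allele counts
betas = rng.normal(0, 0.01, size=n_snps)                  # GWAS effect sizes (log-odds)

prs = genotypes @ betas                                   # one score per person
prs_z = (prs - prs.mean()) / prs.std()                    # standardised PRS
print(prs_z[:5])
```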
Staphylococcus aureus infection patterns in Yuma, Arizona, show a 2.25-fold higher infection rate in non-Hispanics than in Hispanics. Males had higher infection rates in most age classes. These disparities in infection are mostly consistent with previously observed patterns in colonization, suggesting that sex and ethnicity do not differentially impact colonization and infection.
In the UK, food banks and other forms of food aid have become a normalised support mechanism for those living at the sharp end of poverty. Drawing on accounts of those who have used, worked in, and volunteered at two of England’s food banks during the Covid-19 pandemic, this article explores some of the key challenges that emerged for food aid during this unique period. In documenting these experiences, the paper concurs with previous work that has identified the expanding role of food banks in providing core welfare support, suggesting an increasingly extended welfare function of food aid. This has implications for understanding the effectiveness of welfare – and the appropriateness of our reliance on voluntary aid – in the post-pandemic period.
Routine pre-Fontan cardiac catheterization remains standard practice at most centres. However, with advances in non-invasive risk assessment, an invasive haemodynamic assessment may not be necessary for all patients.
Using retrospective data from patients undergoing Fontan palliation at our institution, we developed a multivariable model to predict the likelihood of a composite adverse post-operative outcome including prolonged length of stay ≥ 30 days, hospital readmission within 6 months, and death and/or transplant within 6 months. Our baseline model included non-invasive risk factors obtained from clinical history and echocardiogram. We then incrementally incorporated invasive haemodynamic data to determine if these variables improved risk prediction.
Our baseline model correctly predicted favourable versus adverse post-Fontan outcomes in 118/174 (68%) patients. Covariates associated with adverse outcomes included the presence of a systemic right ventricle (adjusted odds ratio [aOR] 2.9; 95% CI 1.4, 5.8; p = 0.004), earlier surgical era (aOR 3.1 for era 1 vs 2; 95% CI 1.5, 6.5; p = 0.002), and performance of concomitant surgical procedures at the time of Fontan surgery (aOR 2.5; 95% CI 1.1, 5.0; p = 0.026). Incremental addition of invasively acquired haemodynamic data did not improve model performance or the percentage of outcomes predicted.
Invasively acquired haemodynamic data do not add substantially to non-invasive risk stratification in the majority of patients. Pre-Fontan catheterization may still be beneficial for angiographic evaluation of anatomy, for therapeutic intervention, and in select patients with equivocal risk stratification.
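A schematic illustration of the incremental-modelling logic described above, using simulated data and hypothetical variable names (the study's actual covariates are those reported in the Results): fit a baseline logistic model on non-invasive predictors, add an invasive haemodynamic variable, and compare the proportion of outcomes correctly predicted.

```python
# Incremental logistic modelling sketch; data and names are invented.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 174
df = pd.DataFrame({
    "systemic_rv": rng.integers(0, 2, n),
    "early_era": rng.integers(0, 2, n),
    "concomitant_procedure": rng.integers(0, 2, n),
    "mean_pa_pressure": rng.normal(12, 3, n),   # hypothetical invasive variable
})
logit = -1 + 1.05*df.systemic_rv + 1.1*df.early_era + 0.9*df.concomitant_procedure
df["adverse_outcome"] = rng.binomial(1, 1/(1 + np.exp(-logit)))

def accuracy(predictors):
    # aORs would be np.exp(fit.params); here we compare classification accuracy
    X = sm.add_constant(df[predictors])
    fit = sm.Logit(df["adverse_outcome"], X).fit(disp=False)
    return ((fit.predict(X) > 0.5) == df["adverse_outcome"]).mean()

base = ["systemic_rv", "early_era", "concomitant_procedure"]
print("baseline:", accuracy(base))
print("+ invasive:", accuracy(base + ["mean_pa_pressure"]))
```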
A quarter of people with intellectual disabilities (PwID) have epilepsy, compared with 1% of the general population. Epilepsy in PwID is a bellwether for premature mortality, multimorbidity and polypharmacy. This group depends on their care providers to give relevant information for management, especially of epilepsy. There is no research on the relationship between care status and the clinical characteristics of PwID with epilepsy.
Aim
Explore and compare the clinical characteristics of PwID with epilepsy across different care settings.
Method
A retrospective multicentre cohort study across England and Wales collected information on seizure characteristics, intellectual disability severity, neurodevelopmental/biological/psychiatric comorbidities, medication including psychotropics/anti-seizure medication, and care status. Clinical characteristics were compared across different care settings, and between those aged over and under 40 years.
Results
Of 618 adult PwID across six centres (male:female = 61%:39%), 338 (55%) received professional care whereas 258 (42%) lived with family. Significant differences between the care groups existed in intellectual disability severity (P = 0.01), autism presence (P < 0.001), challenging behaviour (P < 0.001) and comorbid physical conditions (P = 0.008). The two groups did not vary in genetic conditions, seizure type and frequency, or psychiatric disorders. The professional care cohort experienced increased polypharmacy (P < 0.001) and antipsychotic/psychotropic use (P < 0.001 and P = 0.008, respectively).
The over-40s cohort had lower autism spectrum disorder (ASD) and attention-deficit hyperactivity disorder (ADHD) comorbidity (P < 0.001 and P = 0.007, respectively), increased psychiatric comorbidity and challenging behaviour (P < 0.05), physical multimorbidity (P < 0.001), polypharmacy (P < 0.001) and antipsychotic use (P < 0.001), but fewer seizures (P = 0.007).
Conclusion
PwID and epilepsy over 40 years in professional care have more complex clinical characteristics, increased polypharmacy and antipsychotic prescribing but fewer seizures.
The Interdisciplinary Home-bAsed Reablement Program (I-HARP) integrates evidence-based rehabilitation strategies into a dementia-specific, person-centred, time-limited, home-based, interdisciplinary rehabilitation package. I-HARP was a 4-month model of care incorporated into community aged care services and hospital-based community geriatric services. I-HARP involved: 8-10 individually tailored home visits by an occupational therapist and a registered nurse; 2-4 optional sessions with other allied health professionals; up to A$1,000 for minor home modifications and/or assistive devices; and three individual carer support sessions. The aim of the study was to determine the effectiveness of I-HARP on the health and wellbeing of people living with dementia and their family carers.
Methods:
A multi-centre pragmatic parallel-arm randomised controlled trial compared I-HARP to usual care in community-dwelling people with mild to moderate dementia and family carers in Sydney, Australia (2018-22). Assessments of the client’s daily activities, mobility and health-related quality of life, caregiver burden and quality of life were conducted at baseline, 4- and 12-month follow-up. Changes from baseline were compared between groups.
Results:
Of 260 people recruited, 232 (116 dyads of clients and their carers; 58 dyads per group) completed the trial to 4-month follow-up (89% retention). Clients were aged 60-97 years; 63% were female; 57% had mild dementia and 43% moderate dementia. The I-HARP group had somewhat better mean results than usual care for most outcome measures at both 4 and 12 months, but the only statistically significant difference was a reduction in home environment hazards at 4 months (reduction: 2.29 on the Home Safety Self-Assessment Tool, 95% CI: 0.52, 4.08; p=.01, effect size [ES] 0.53). Post-hoc sub-group analysis of 66 clients with mild dementia found significantly better functional independence in the intervention group: 11.2 on the Disability Assessment for Dementia (95% CI: 3.4, 19.1; p=.005; ES 0.69) at 4 months and 13.7 (95% CI: 3.7, 23.7; p=.007; ES 0.69) at 12 months.
Conclusion:
The I-HARP model enhanced functional independence in people with mild dementia but not significantly in people with moderate dementia, and so did not result in better outcomes in the group overall. A different type of rehabilitation model, or different strategies, may be required as dementia becomes more severe.
White matter hyperintensity (WMH) burden is greater, has a frontal-temporal distribution, and is associated with proxies of exposure to repetitive head impacts (RHI) in former American football players. These findings suggest that in the context of RHI, WMH might have unique etiologies that extend beyond those of vascular risk factors and normal aging processes. The objective of this study was to evaluate the correlates of WMH in former elite American football players. We examined markers of amyloid, tau, neurodegeneration, inflammation, axonal injury, and vascular health and their relationships to WMH. A group of age-matched asymptomatic men without a history of RHI was included to determine the specificity of the relationships observed in the former football players.
Participants and Methods:
240 male participants aged 45-74 (60 unexposed asymptomatic men, 60 former college football players, 120 former professional football players) underwent semi-structured clinical interviews, magnetic resonance imaging (structural T1, T2 FLAIR, and diffusion tensor imaging), and lumbar puncture to collect cerebrospinal fluid (CSF) biomarkers as part of the DIAGNOSE CTE Research Project. Total WMH lesion volumes (TLV) were estimated using the Lesion Prediction Algorithm from the Lesion Segmentation Toolbox. Structural equation modeling, using full-information maximum likelihood (FIML) to account for missing values, examined the associations between log-TLV and the following variables: total cortical thickness, whole-brain average fractional anisotropy (FA), CSF amyloid β42, CSF p-tau181, CSF sTREM2 (a marker of microglial activation), CSF neurofilament light (NfL), and the modified Framingham stroke risk profile (rFSRP). Covariates included age, race, education, APOE ε4 carrier status, and evaluation site. Bootstrapped 95% confidence intervals assessed statistical significance. Models were fitted separately for football players (college and professional players pooled; n=180) and the unexposed men (n=60). Because of the difference in sample sizes, estimates were compared directly and considered different if the percent change between them exceeded 10%.
Results:
In the former football players (mean age=57.2, 34% Black, 29% APOE ε4 carriers), reduced cortical thickness (B=-0.25, 95% CI [-0.45, -0.08]), lower average FA (B=-0.27, 95% CI [-0.41, -0.12]), higher p-tau181 (B=0.17, 95% CI [0.02, 0.43]), and higher rFSRP score (B=0.27, 95% CI [0.08, 0.42]) were associated with greater log-TLV. Compared to the unexposed men, substantial differences in estimates were observed for rFSRP (Bcontrol=0.02, Bfootball=0.27, 994% difference), average FA (Bcontrol=-0.03, Bfootball=-0.27, 802% difference), and p-tau181 (Bcontrol=-0.31, Bfootball=0.17, -155% difference). In the former football players, rFSRP showed a stronger positive association and average FA a stronger negative association with WMH than in unexposed men. The effect of WMH on cortical thickness was similar between the two groups (Bcontrol=-0.27, Bfootball=-0.25, 7% difference).
Conclusions:
These results suggest that the risk factors and biological correlates of WMH differ between former American football players and asymptomatic individuals unexposed to RHI. In addition to vascular risk factors, white matter integrity on DTI showed a stronger relationship with WMH burden in the former football players. FLAIR WMH serves as a promising measure to further investigate the late multifactorial pathologies of RHI.
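To make the estimate-comparison rule above concrete, here is a simplified stand-in on simulated, complete data: the study fitted structural equation models with FIML, whereas this sketch uses ordinary least squares and applies the >10% percent-change criterion to the group-wise FA coefficients. All numbers are invented.

```python
# OLS stand-in for the study's SEM/FIML analysis; simulated data only.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)

def fit_group(n, fa_effect):
    fa = rng.normal(0, 1, n)                    # whole-brain average FA (z-scored)
    rfsrp = rng.normal(0, 1, n)                 # stroke risk profile (z-scored)
    log_tlv = fa_effect*fa + 0.25*rfsrp + rng.normal(0, 1, n)
    X = sm.add_constant(np.column_stack([fa, rfsrp]))
    return sm.OLS(log_tlv, X).fit().params[1]   # FA coefficient

b_football = fit_group(180, fa_effect=-0.27)
b_control = fit_group(60, fa_effect=-0.03)
# One way to express percent change between estimates (denominator choice varies)
pct_change = 100 * (b_football - b_control) / abs(b_control)
print(f"FA estimate differs by {pct_change:.0f}% (>10% threshold)")
```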
Obesity is associated with adverse effects on brain health, including increased risk for neurodegenerative diseases. Changes in cerebral metabolism may underlie or precede structural and functional brain changes. While bariatric surgery is known to be effective in inducing weight loss and improving obesity-related medical comorbidities, few studies have examined whether it may be able to improve brain metabolism. In the present study, we examined change in cerebral metabolite concentrations in participants with obesity who underwent bariatric surgery.
Participants and Methods:
35 patients with obesity (BMI > 35 kg/m²) were recruited from a bariatric surgery candidate nutrition class. They completed single-voxel proton (1H) magnetic resonance spectroscopy at baseline (pre-surgery) and within one year post-surgery. Spectra were obtained from a large medial frontal brain region. Tissue-corrected absolute concentrations for metabolites including choline-containing compounds (Cho), myo-inositol (mI), N-acetylaspartate (NAA), creatine (Cr), and glutamate and glutamine (Glx) were determined using Osprey. Paired t-tests were used to examine within-subject change in metabolite concentrations, and correlations were used to relate these changes to other health-related outcomes, including weight loss and glycemic control.
Results:
Bariatric surgery was associated with a reduction in cerebral Cho (t(34) = -3.79, p < 0.001, d = -0.64) and mI (t(34) = -2.81, p < 0.01, d = -0.47) concentrations. There were no significant changes in NAA, Glx, or Cr concentrations. Reductions in Cho were associated with greater weight loss (r = 0.40, p < 0.05), and reductions in mI were associated with greater reductions in HbA1c (r = 0.44, p < 0.05).
Conclusions:
Participants who underwent bariatric surgery exhibited reductions in cerebral Cho and mI concentrations, which were associated with improvements in weight loss and glycemic control. Given that elevated levels of Cho and mI have been implicated in neuroinflammation, reduction in these metabolites after bariatric surgery may reflect amelioration of obesity-related neuroinflammatory processes. As such, our results provide evidence that bariatric surgery may improve brain health and metabolism in individuals with obesity.
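A minimal sketch of the within-subject analysis reported above: paired t-tests on pre- versus post-surgery metabolite concentrations (df = 34 for n = 35) and a correlation between metabolite reduction and weight loss. The simulated values are illustrative only.

```python
# Paired-design analysis sketch on simulated data (not the study's data).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 35
cho_pre = rng.normal(2.0, 0.3, n)               # choline, institutional units
cho_post = cho_pre - rng.normal(0.15, 0.2, n)   # post-surgery values

t, p = stats.ttest_rel(cho_post, cho_pre)       # paired t-test, df = n - 1 = 34
d = (cho_post - cho_pre).mean() / (cho_post - cho_pre).std(ddof=1)  # Cohen's d
print(f"t(34) = {t:.2f}, p = {p:.4f}, d = {d:.2f}")

weight_loss = rng.normal(25, 8, n)              # hypothetical kg lost
r, p_r = stats.pearsonr(cho_pre - cho_post, weight_loss)
print(f"r = {r:.2f}, p = {p_r:.3f}")
```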
Children with neurodevelopmental disorders (NDDs) commonly experience attentional and executive function (EF) difficulties that are negatively associated with academic success, psychosocial functioning, and quality of life. Access to early and consistent interventions is a critical protective factor and there are recommendations to deliver cognitive interventions in schools; however, current cognitive interventions are expensive and/or inaccessible, particularly for those with limited resources and/or in remote communities. The current study evaluated the school-based implementation of two game-based interventions in children with NDDs: 1) a novel neurocognitive attention/EF intervention (Dino Island; DI), and 2) a commercial educational intervention (Adventure Academy; AA). DI is a game-based attention/EF intervention specifically developed for children for delivery in community-based settings.
Participants and Methods:
Thirty-five children with NDDs (ages 5-13 years) and 17 educational assistants (EAs) participated. EAs completed online training to deliver the interventions to assigned students at their respective schools (3x/week, 40-60 minutes/session, 8 weeks, 14 hours in total). We gathered baseline child and EA demographic data, completed pre-intervention EA interviews, and conducted regular fidelity checks throughout the interventions. Implementation data included paper-and-pencil tracking forms, computerized game analytics, and online communications.
Results:
Using a mixed methods approach, we evaluated the following implementation outcomes: fidelity, feasibility, acceptability, and barriers. Overall, no meaningful between-group differences were found in EA or child demographics, except for total number of years worked as an EA (M = 17.18 for AA and 9.15 for DI; t(22) = -4.34, p < .01) and EA gender (χ²(1) = 6.11, p < .05). For both groups, EA age was significantly associated with the number of sessions played [DI (r = .847, p < .01), AA (r = .986, p < .05)]. EAs who knew their student better completed longer sessions [DI (r = .646), AA (r = .973); all ps < .05]. The number of years worked as an EA was negatively associated with total intervention hours for both groups. Qualitative interview data indicated that most EAs found DI valuable and feasible to deliver in their classrooms, whereas more implementation challenges were identified with AA. Barriers common to both groups included technical difficulties (e.g., game access, internet firewalls), environmental barriers (e.g., distractions in surroundings, time of the year), child factors (e.g., lack of motivation, attentional difficulties, frustration), and game-specific factors (e.g., difficulty level progression). Barriers specific to DI included greater challenges in motivating children as a function of difficulty level progression. Furthermore, given the comprehensive nature of the training required for delivery, EAs needed longer to complete the training for DI. Nevertheless, many EAs in the DI group found the training helpful, with potential to generalize to other children in the classroom.
Conclusions:
The availability of affordable, accessible, and effective cognitive intervention is important for children with NDDs. We found that delivery of a novel cognitive intervention by EAs was feasible and acceptable, with similarities and differences in implementation facilitators/barriers between the cognitive and commercialized academic intervention. Recommendations regarding strategies for successful school-based implementation of neurocognitive intervention will be elaborated on in the poster.
Subthreshold depressive symptoms are both prevalent and associated with negative outcomes in older adults, including conversion to major depressive disorder and other medical conditions. Antidepressants are not recommended as first-line or sole intervention for subthreshold depression; thus, finding other efficacious interventions is important. In depressed adults, transcranial direct current stimulation (tDCS) applied to the frontal lobe has antidepressant properties and pairing tDCS with cognitive training results in additional benefit due to enhancement of frontal cortical activity. However, these studies have primarily targeted depressed adults under age 65 years and less is known about whether this intervention combination is beneficial or affects subthreshold depressive symptoms in older adults.
Participants and Methods:
We are reporting secondary data analyses from Nissim et al. (2019), who recruited 30 non-demented healthy older adults and randomized them to receive active or sham tDCS in combination with cognitive training for 2 weeks. Active tDCS was delivered bifrontally over F3 (cathode) and F4 (anode) for 20 min at 2 mA intensity through two 5 × 7 cm² saline-saturated sponge electrodes using the Soterix Medical 1x1 tDCS clinical trials device. Sham tDCS had an identical set-up, with 2 mA stimulation for 30 sec with a 30-sec ramp up and down. Cognitive training was administered for 40 min daily using attention/processing speed and working memory modules from BrainHQ. The first 20 min of cognitive training was paired with active or sham tDCS. To allow room for symptom improvement, we only included participants with Beck Depression Inventory, 2nd edition (BDI-II) scores of 5 or greater ("minimal" depression severity). We identified 15 participants who met this cut-off (70.93 ± 5.41 years old, 10 female, 16.4 ± 2.32 years of education, MoCA = 27.27 ± 2.34; 7 active, 8 sham).
Results:
tDCS conditions did not significantly differ in age, sex, years of education, MoCA scores, number of completed intervention days, or baseline BDI-II (active: 7.71 ± 2.93, sham: 11.38 ± 6.44). There were no differences in sensation ratings between groups or in confidence ratings for condition received (suggesting successful blinding). Results indicated the combination of active (and not sham) tDCS with cognitive training was associated with reduced depressive symptoms (2.7 vs. 1.4 points, active vs. sham). Including covariates (age, sex, education, MoCA scores, and number of completed intervention days) in the model further strengthened this discrepancy (3.7 vs. 0.51 points, active vs. sham).
Conclusions:
While preliminary, these results suggest this intervention combination may be a potential method for improving subthreshold depressive symptoms in older adults via targeting prefrontal neural circuitry and promoting neuroplasticity of the underlying neural network. While baseline BDI-II scores did not significantly differ, the active tDCS group had a lower score than sham, but saw greater improvement in BDI-II scores post-intervention despite having less room for change. Adequate treatment of subthreshold depressive symptoms may prevent or reduce negative outcomes associated with depressive symptoms in at-risk older adults. Larger randomized clinical trials are needed to better understand tDCS plus cognitive training antidepressant effects in this age group.
To characterize residential social vulnerability among healthcare personnel (HCP) and evaluate its association with severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) infection.
Design:
Case–control study.
Setting:
This study analyzed data collected in May–December 2020 through sentinel and population-based surveillance in healthcare facilities in Colorado, Minnesota, New Mexico, New York, and Oregon.
Participants:
Data from 2,168 HCP (1,571 cases and 597 controls from the same facilities) were analyzed.
Methods:
HCP residential addresses were linked to the social vulnerability index (SVI) at the census tract level, which represents a ranking of community vulnerability to emergencies based on 15 US Census variables. The primary outcome was SARS-CoV-2 infection, confirmed by positive antigen or real-time reverse-transcriptase–polymerase chain reaction (RT-PCR) test on nasopharyngeal swab. Significant differences by SVI in participant characteristics were assessed using the Fisher exact test. Adjusted odds ratios (aOR) with 95% confidence intervals (CIs) for associations between case status and SVI, controlling for HCP role and patient care activities, were estimated using logistic regression.
Results:
Significantly higher proportions of certified nursing assistants (48.0%) and medical assistants (44.1%) resided in high SVI census tracts, compared to registered nurses (15.9%) and physicians (11.6%). HCP cases were more likely than controls to live in high SVI census tracts (aOR, 1.76; 95% CI, 1.37–2.26).
Conclusions:
These findings suggest that residing in more socially vulnerable census tracts may be associated with SARS-CoV-2 infection risk among HCP and that residential vulnerability differs by HCP role. Efforts to safeguard the US healthcare workforce and advance health equity should address the social determinants that drive racial, ethnic, and socioeconomic health disparities.
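For readers reproducing this kind of analysis, the following is a hedged sketch of estimating an adjusted odds ratio with a 95% CI via logistic regression, as in the Methods above. The variable names and data are hypothetical; the study's actual covariates were HCP role and patient care activities.

```python
# Adjusted odds ratio sketch on simulated data; names are invented.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 2168
df = pd.DataFrame({
    "case": rng.integers(0, 2, n),                 # SARS-CoV-2 case status
    "high_svi": rng.integers(0, 2, n),             # resides in high-SVI tract
    "nursing_role": rng.integers(0, 2, n),         # hypothetical role covariate
    "direct_patient_care": rng.integers(0, 2, n),  # hypothetical activity covariate
})

fit = smf.logit("case ~ high_svi + nursing_role + direct_patient_care", df).fit(disp=False)
aor = np.exp(fit.params["high_svi"])               # exponentiate log-odds -> aOR
ci_low, ci_high = np.exp(fit.conf_int().loc["high_svi"])
print(f"aOR = {aor:.2f} (95% CI {ci_low:.2f}-{ci_high:.2f})")
```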
The nursing associate role was first deployed in England in 2019 to fill a perceived skills gap in the nursing workforce between healthcare assistants and registered nurses and to offer an alternative route into registered nursing. Initially, trainee nursing associates were predominantly based in hospital settings; however, more recently, there has been an increase in trainees based in primary care settings. Early research has focussed on experiences of the role across a range of settings, particularly secondary care; therefore, little is known about the experiences and unique support needs of trainees based in primary care.
Aim:
To explore the experiences and career development opportunities for trainee nursing associates based in primary care.
Methods:
This study used a qualitative exploratory design. Semi-structured interviews were undertaken with 11 trainee nursing associates based in primary care from across England. Data were collected between October and November 2021, transcribed and analysed thematically.
Findings:
Four key themes relating to primary care trainee experiences of training and development were identified. Firstly, nursing associate training provided a ‘valuable opportunity for career progression’. Trainees were frustrated by the ‘emphasis on secondary care’ in both academic content and placement portfolio requirements. They also experienced ‘inconsistency in support’ from their managers and assessors and noted a number of ‘constraints to their learning opportunities’, including the opportunity to progress to become registered nurses.
Conclusion:
This study raises important issues for trainee nursing associates, which may influence the recruitment and retention of the nursing associate workforce in primary care. Educators should consider adjustments to how the curriculum is delivered, including primary care skills and relevant assessments. Employers need to recognise the resource requirements for the programme, in relation to time and support, to avoid undue stress for trainees. Protected learning time should enable trainees to meet the required proficiencies.
Maternal protein restriction is often associated with structural and functional sequelae in offspring, particularly affecting growth and renal-cardiovascular function. However, there is little understanding as to whether hypertension and kidney disease occur because of a primary nephron deficit or whether controlling postnatal growth can result in normal renal-cardiovascular phenotypes. To investigate this, female Sprague-Dawley rats were fed either a low-protein (LP, 8.4% protein) or normal-protein (NP, 19.4% protein) diet prior to mating and until offspring were weaned at postnatal day (PN) 21. Offspring were then fed a non-‘growth’ diet (4.6% fat), which ensured that catch-up growth did not occur. Offspring growth was determined by weight and dual-energy X-ray absorptiometry. Nephron number was determined at PN21 using the disector-fractionator method. Kidney function was measured at PN180 and PN360 using clearance methods. Blood pressure was measured at PN360 using radio-telemetry. Body weight was similar at PN1, but by PN21 LP offspring were 39% smaller than controls (Pdiet < 0.001). This difference was due to proportional changes in lean muscle, fat, and bone content. LP offspring remained smaller than NP offspring until PN360. In LP offspring, nephron number was 26% less in males and 17% less in females than in NP controls (Pdiet < 0.0004). Kidney function was similar across dietary groups and sexes at PN180 and PN360. Blood pressure was similar in LP and NP offspring at PN360. These findings suggest that remaining on a slow growth trajectory after exposure to a suboptimal intrauterine environment does not lead to the development of kidney dysfunction and hypertension.
Recent guidance has called for the reduction of restrictive practice use owing to growing concerns over the harmful physical and psychological effects for both patients and staff. Despite concerns and efforts, these measures continue to be used regularly to manage challenging behaviour in psychiatric in-patient settings.
Aims
To undertake a systematic review of patients’ and staff members’ experiences of restrictive practices in acute psychiatric in-patient settings.
Method
A systematic review and thematic synthesis was conducted using data from 21 qualitative papers identified from a systematic search across three electronic databases (PsycInfo, Embase and MEDLINE) and citation searching. The protocol for the review was pre-registered on PROSPERO (CRD42020176859). The quality of included papers was examined using the Critical Appraisal Skills Programme (CASP).
Results
Four overarching themes emerged from the experiences of patients: the psychological effects, staff communication, loss of human rights and making changes. Likewise, the analysis of staff data produced four themes: the need for restrictive practices, the psychological impact, decision-making and making changes. Patient and staff experiences of restrictive practices were overwhelmingly negative, and their use carried harmful physical and psychological consequences. Lack of support following restraint events was a problem for both groups.
Conclusions
Future programmes seeking to improve or reduce restrictive practices should consider the provision of staff training covering behaviour management and de-escalation techniques, offering psychological support to both patients and staff, the importance of effective staff–patient communication and the availability of alternatives.