New Zealand and Australian governments rely heavily on voluntary industry initiatives to improve population nutrition, such as voluntary front-of-pack nutrition labelling (Health Star Rating [HSR]), industry-led food advertising standards, and optional food reformulation programmes. Research in both countries has shown that food companies vary considerably in their policies and practices on nutrition(1). We aimed to determine if a tailored nutrition support programme for food companies improved their nutrition policies and practices compared with control companies that were not offered the programme. REFORM was a 24-month, two-country, cluster-randomised controlled trial in which 132 major packaged food/drink manufacturers (n=96) and fast-food companies (n=36) were randomly assigned (2:1 ratio) to receive a 12-month tailored support programme or to the control group (no intervention). The intervention group was offered a programme designed and delivered by public health academics, comprising regular meetings, tailored company reports, and recommendations and resources to improve product composition (e.g., reducing nutrients of concern through reformulation), nutrition labelling (e.g., adoption of HSR labels), marketing to children (reducing the exposure of children to unhealthy products and brands), and nutrition policy and corporate sustainability reporting. The primary outcome was the nutrient profile (measured using HSR) of company food and drink products at 24 months. Secondary outcomes were the nutrient content (energy, sodium, total sugar, and saturated fat) of company products, display of HSR labels on packaged products, company nutrition-related policies and commitments, and engagement with the intervention. Eighty-eight eligible intervention companies (9,235 products at baseline) were invited to participate, of whom 21 accepted and were enrolled in the REFORM programme (delivered between September 2021 and December 2022).
Forty-four companies (3,551 products at baseline) were randomised to the control arm. At 24 months, the model-adjusted mean HSR of intervention company products was 2.58 compared with 2.68 for control companies, with no significant difference between groups (mean difference -0.10, 95% CI -0.40 to 0.21, p-value 0.53). A per-protocol analysis of intervention companies that enrolled in the programme compared with control companies with no major protocol violation also found no significant difference (2.93 vs 2.64, mean difference 0.29, 95% CI -0.13 to 0.72, p-value 0.18). We found no significant differences between the intervention and control groups in any secondary outcome, except for total sugar (g/100g), where the sugar content of intervention company products was higher than that of control companies (12.32 vs 6.98, mean difference 5.34, 95% CI 1.73 to 8.96, p-value 0.004). The per-protocol analysis for sugar did not show a significant difference (10.47 vs 7.44, mean difference 3.03, 95% CI -0.48 to 6.53, p-value 0.09). In conclusion, a 12-month tailored nutrition support programme for food companies did not improve the nutrient profile of company products.
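As a consistency check, the reported p-value for the primary outcome can be approximately recovered from the mean difference and its 95% CI by backing out the standard error under a normal (Wald) approximation. A minimal sketch in Python; the standard error is reconstructed here, not reported in the abstract:

```python
import math

# Primary outcome above: mean difference -0.10, 95% CI -0.40 to 0.21.
diff, lo_ci, hi_ci = -0.10, -0.40, 0.21

# A Wald 95% CI spans diff +/- 1.96 * SE, so the SE can be backed out:
se = (hi_ci - lo_ci) / (2 * 1.96)      # reconstructed SE, ~0.156
z = diff / se                          # z-statistic, ~-0.64

# Two-sided p-value from the standard normal CDF (via the error function)
p = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
# p comes out near 0.52, consistent with the reported 0.53 up to rounding
```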
Here, we explore variation in a new record of archaeological house-floor sizes from the southwestern United States relative to spatially explicit time-series estimates of local precipitation. Our results show that inequality becomes more severe during periods of high precipitation. This supports theory suggesting that inequality may emerge where resources are dense, predictable, and clumped within heterogeneous and circumscribed environments. Our findings indicate that wealth inequality may emerge among populations with similar subsistence adaptations as a result of local socioenvironmental variation.
The kinetic stability of collisionless, sloshing beam-ion ($45^\circ$ pitch angle) plasma is studied in a three-dimensional (3-D) simple magnetic mirror, mimicking the Wisconsin high-temperature superconductor axisymmetric mirror experiment. The collisional Fokker–Planck code CQL3D-m provides a slowing-down beam-ion distribution to initialize the kinetic-ion/fluid-electron code Hybrid-VPIC, which then simulates free plasma decay without external heating or fuelling. Over $1$–$10\;\mathrm{\mu s}$, drift-cyclotron loss-cone (DCLC) modes grow and saturate in amplitude. DCLC scatters ions to a marginally stable distribution with gas-dynamic rather than classical-mirror confinement. Sloshing ions can trap cool (low-energy) ions in an electrostatic potential well to stabilize DCLC, but DCLC itself does not scatter sloshing beam-ions into that well. Instead, cool ions must come from external sources such as charge-exchange collisions with a low-density neutral population. Manually adding cool $\sim 1\;\mathrm{keV}$ ions improves beam-ion confinement several-fold in Hybrid-VPIC simulations, which qualitatively corroborates prior measurements from real mirror devices with sloshing ions.
Moral injury is a potentially deleterious mental health outcome that can follow exposure to events that challenge one’s moral code. Theoretical models suggest a multi-faceted self-concept may support adaptation following such events. However, little is known about the relationship between self-concept complexity and outcomes following potentially morally injurious events.
Aims:
This cross-sectional study investigated hypothesized relationships between self-concept complexity and outcomes in adults (n=172) exposed to potentially morally injurious events.
Method:
Participants completed validated measures of event-related distress, traumatic stress, depression and anxiety, and a self-complexity task in which they provided multiple descriptors of their self-concept. Responses were coded for overall diversity, defined as the number of categories of self-descriptors, and role diversity, defined as the number of social and activity-based roles.
Results:
Multiple regression analyses found greater role diversity independently predicted lower event-related distress, while overall self-diversity and total number of self-descriptors did not.
Conclusion:
Findings indicate diversity in active facets of the self (e.g. relational or activity-based roles) may buffer the effects of a potentially morally injurious event.
Inflammation and infections such as malaria affect concentrations of many micronutrient biomarkers and hence estimates of nutritional status. We aimed to assess the relationship between malaria infection and micronutrient biomarker concentrations in pre-school children (PSC), school-age children (SAC) and women of reproductive age (WRA) in Malawi and to examine the potential role of malarial immunity in the relationship between malaria and micronutrient biomarkers. Data from the 2015/2016 Malawi micronutrient survey were used. The associations between current or recent malaria infection, detected by rapid diagnostic test, and concentrations of serum ferritin, soluble transferrin receptor (sTfR), zinc, serum folate, red blood cell folate and vitamin B12 were estimated using multivariable linear regression. Factors related to malarial immunity, including age, altitude and presence of hemoglobinopathies, were examined as effect modifiers. Serum ferritin, sTfR and zinc were adjusted for inflammation using the BRINDA method. Malaria infection was associated with 68 % (95 % CI 51, 86), 28 % (18, 40) and 34 % (13, 45) greater inflammation-adjusted ferritin in PSC, SAC and WRA, respectively (P < 0·001 for each). In PSC, the positive association was stronger in younger children, at higher altitudes, and in children who were not carriers of the sickle cell trait. In PSC and SAC, sTfR was elevated (+ 25 % (16, 29) and + 15 % (9, 22) respectively, P < 0·001). Serum folate and erythrocyte folate were elevated in WRA with malaria (+ 18 % (3, 35) and + 11 % (1, 23), P = 0·01 and P = 0·003 respectively). Malaria affects the interpretation of micronutrient biomarker concentrations, and examining factors related to malarial immunity may be informative.
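Percentage differences such as the 68 % greater ferritin above are typically obtained by exponentiating a regression coefficient fitted on log-transformed biomarker concentrations. A small sketch of that back-transformation; the coefficient value is hypothetical, chosen only to reproduce the 68 % figure:

```python
import math

# For a linear model on log(ferritin), a malaria coefficient beta maps to a
# percent difference in geometric-mean ferritin of (exp(beta) - 1) * 100.
beta = 0.519                        # hypothetical coefficient, illustrative only
pct = (math.exp(beta) - 1) * 100    # ~68 % greater ferritin

# Inverse direction: a reported 68 % increase implies beta = ln(1.68)
beta_back = math.log(1.68)
```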
The problem of evil is an ideal topic for experimental philosophy. Suffering – which is at the heart of most prominent formulations of the problem of evil – is a universal human experience and has been the topic of careful reflection for millennia. However, interpretations of suffering and how it bears on the existence of God are tremendously diverse and nuanced. Why does suffering push some people toward atheism while pushing others toward deeper faith? What cultural, psychological, or sociological differences account for this diversity of responses? And, importantly, what light might this diversity of responses shed on the problem of evil and how it has been formulated by philosophers in recent years? The aim of this article is to highlight how the tools and resources of experimental philosophy might be fruitfully applied to the problem of evil. In the first section, we review some recent work in this area and describe the current state of this emergent body of literature. In the second section, we review the broader and more recent theoretical developments on the problem of evil. In the final section, we outline some potential areas of future empirical research that we see as especially promising given those developments.
Nearly 100 years ago, economist John Maynard Keynes predicted that, by today, technological advancements would allow the workweek to dwindle to just 15 hours, or 3 hours per day, and that the real problem of humanity would be filling their time with leisure. Although much has changed in the world of work since this prediction, such a drastic change has not taken place. In this article, several industrial-organizational psychology scholars discuss why this is the case. Why do we continue to work as much as we do, and how might that change? More fundamentally, what do these trends, contra Keynes’ prediction, tell us about the nature of work itself? We use this discussion to propose several research directions regarding the nature of work and how it might change in the future. We depict the phenomenon of working hours as multilevel in nature, and we consider both the positive and negative possible implications of working less than we do now.
Inflammation and infections such as malaria affect micronutrient biomarker concentrations and hence estimates of nutritional status. It is unknown whether correction for C-reactive protein (CRP) and α1-acid glycoprotein (AGP) fully captures the modification in ferritin concentrations during a malaria infection, or whether environmental and sociodemographic factors modify this association. Cross-sectional data from eight surveys in children aged 6–59 months (Cameroon, Cote d’Ivoire, Kenya, Liberia, Malawi, Nigeria and Zambia; n 6653) from the Biomarkers Reflecting Inflammation and Nutritional Determinants of Anaemia (BRINDA) project were pooled. Ferritin was adjusted using the BRINDA adjustment method, with values < 12 μg/l indicating iron deficiency. The association between current or recent malaria infection, detected by microscopy or rapid test kit, and inflammation-adjusted ferritin was estimated using pooled multivariable linear regression. Age, sex, malaria endemicity profile (defined by the Plasmodium falciparum infection prevalence) and malaria diagnostic methods were examined as effect modifiers. Unweighted pooled malaria prevalence was 26·0 % (95 % CI 25·0, 27·1) and unweighted pooled iron deficiency was 41·9 % (95 % CI 40·7, 43·1). Current or recent malaria infection was associated with a 44 % (95 % CI 39·0, 52·0; P < 0·001) increase in inflammation-adjusted ferritin after adjusting for age and study identifier. In children, ferritin increased less with malaria infection as age and malaria endemicity increased. Adjustment for malaria increased the prevalence of iron deficiency, but the effect was small. Additional information would help elucidate the underlying mechanisms of the role of endemicity and age in the association between malaria and ferritin.
Community-Engaged Research (CEnR) and Community-Based Participatory Research (CBPR) require validated measures and metrics for evaluating research partnerships and outcomes. There is a need to adapt and translate existing measures for practical use with diverse and non-English-speaking communities. This paper describes the Spanish translation and adaptation of Engage for Equity’s Community Engagement Survey (E2 CES), a nationally validated and empirically-supported CEnR evaluation tool, into the full-length “Encuesta Comunitaria,” and a pragmatic shorter version “Fortaleciendo y Uniendo EsfueRzos Transdisciplinarios para Equidad de Salud” (FUERTES).
Methods:
Community and academic partners from the mainland US, Puerto Rico, and Nicaragua participated in translating and adapting E2 CES, preserving the content validity, psychometric properties, and importance to stakeholders of items, scales, and CBPR constructs (contexts, partnership processes, intervention and research actions, and outcomes). Internal consistency was assessed using Cronbach's alpha, and convergent validity was assessed via a correlation matrix among scales.
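The internal-consistency statistic used here, Cronbach's alpha, compares the sum of the item variances with the variance of the scale total. A dependency-free sketch on invented Likert responses (illustrative only, not the survey data):

```python
from statistics import variance

def cronbach_alpha(items):
    """Cronbach's alpha for a list of respondent rows of item scores."""
    k = len(items[0])
    item_var_sum = sum(variance(col) for col in zip(*items))  # per-item variances
    total_var = variance([sum(row) for row in items])         # scale-total variance
    return (k / (k - 1)) * (1 - item_var_sum / total_var)

# Invented 5-respondent, 3-item Likert data
scores = [
    [4, 5, 4],
    [3, 3, 2],
    [5, 5, 5],
    [2, 3, 3],
    [4, 4, 5],
]
alpha = cronbach_alpha(scores)   # ~0.92 for this toy data
```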
Results:
Encuesta Comunitaria respondents (N = 57) self-identified primarily as Latinos/as/x (97%), female (74%), and academics (61%). Cronbach's alpha values ranged from 0.72–0.88 for scales in the context domain to 0.90–0.92 for scales in the intervention/research domain. Correlations among subscales were as expected, with the strongest relationships found for subscales within the same CBPR domain. Results informed the creation of FUERTES.
Conclusions:
Encuesta Comunitaria and FUERTES offer CEnR/CBPR practitioners two validated instruments for assessing their research partnering practices and outcomes. Moreover, FUERTES meets the need for shorter pragmatic tools. These measures can further strengthen CEnR/CBPR involving Latino/a/x communities within the US, Latin America, and globally.
Suicide prevention strategies have shifted in many countries, from a national approach to one that is regionally tailored and responsive to local community needs. Previous Australian studies support this approach. However, most studies have focused on suicide deaths which may not fully capture a complete understanding of prevention needs, and few have focused on the priority population of youth. This was the first nationwide study to examine regional variability of self-harm prevalence and related factors in Australian young people.
Methods
A random sample of Australian adolescents (12–17 years old) was recruited as part of the Young Minds Matter (YMM) survey. Participants completed self-report questions on self-harm (i.e., non-suicidal self-harm and suicide attempts) in the previous 12 months. Using mixed effects regressions, an area-level model was built with YMM and Census data to produce out-of-sample small-area predictions of self-harm prevalence. The spatial unit of analysis was Statistical Area Level 1 (average population 400 people), and all prevalence estimates were updated to 2019.
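The out-of-sample small-area prediction step can be illustrated with a toy calculation: given a fitted mixed-effects logistic model, an area's predicted prevalence is the inverse-logit of its fixed-effect linear predictor plus its estimated area-level random intercept. All coefficients and area values below are invented for illustration, not taken from the YMM model:

```python
import math

# Invented fixed-effect coefficients (per percentage point) and area values
fixed = {
    "intercept": -2.0,
    "pct_single_unemployed_parents": 0.04,
    "pct_parents_born_overseas": -0.015,
}
area = {
    "pct_single_unemployed_parents": 12.0,
    "pct_parents_born_overseas": 35.0,
    "random_intercept": 0.1,   # estimated area-level deviation
}

# Linear predictor on the logit scale, then inverse-logit to a prevalence
logit = (fixed["intercept"]
         + fixed["pct_single_unemployed_parents"] * area["pct_single_unemployed_parents"]
         + fixed["pct_parents_born_overseas"] * area["pct_parents_born_overseas"]
         + area["random_intercept"])
prevalence = 1 / (1 + math.exp(-logit))   # ~0.125 for these invented numbers
```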
Results
Across Australia, there was large variability in youth self-harm prevalence estimates. The Northern Territory, Western Australia, and South Australia had the highest estimated state prevalence. Psychological distress and depression were the factors that best predicted self-harm at an individual level. At an area level, the strongest predictor was a high percentage of single unemployed parents, while being in an area where ≥30% of parents were born overseas was associated with reduced odds of self-harm.
Conclusions
This study identified characteristics of regions with lower and higher youth self-harm risk. These findings should assist governments and communities with developing and implementing regionally appropriate youth suicide prevention interventions and initiatives.
Efforts are critically needed to expand the armamentarium of options that clinicians can use to treat patients with alcohol use disorder (AUD). Numerous preclinical studies support the hypothesis that pharmacological antagonism of the mineralocorticoid receptor (MR) may represent a novel and promising treatment for AUD. Namely, the non-selective MR antagonist spironolactone dose-dependently decreased 1) alcohol intake in a mouse model of binge drinking and 2) alcohol self-administration in dependent and non-dependent rats (Farokhnia, Rentsch, Choung et al., Mol Psychiatry 2022; 27(11):4642-4652). Furthermore, two independent U.S.-based human pharmacoepidemiologic studies using electronic health records data showed that patients treated with spironolactone for any indication reduced their weekly alcohol use in a primary care-type medical setting (Palzes et al., Neuropsychopharmacology 2021; 46(12):2140-2147) and their Alcohol Use Disorders Identification Test-Consumption (AUDIT-C) scores in a Veterans Affairs medical setting (Farokhnia, Rentsch, Choung et al., 2022; 27(11):4642-4652). In both studies, spironolactone-treated patients were compared with matched patients without a spironolactone prescription using propensity score matching.
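The propensity score matching used in those pharmacoepidemiologic studies can be sketched as a two-step procedure: fit a model for treatment given confounders, then pair each treated patient with the control whose score is closest. A dependency-free toy version on simulated data; the covariates, coefficients, and sample size are all invented:

```python
import math
import random

random.seed(0)

# Simulated cohort (invented): two confounders and a confounded binary treatment
n = 200
age = [random.gauss(50, 10) for _ in range(n)]
drink = [random.gauss(5, 2) for _ in range(n)]
treated = [
    random.random() < 1 / (1 + math.exp(-(0.05 * (a - 50) + 0.2 * (d - 5))))
    for a, d in zip(age, drink)
]

def zscore(xs):
    m = sum(xs) / len(xs)
    s = math.sqrt(sum((x - m) ** 2 for x in xs) / (len(xs) - 1))
    return [(x - m) / s for x in xs]

# Step 1: logistic propensity model fit by plain gradient ascent on
# standardized covariates (a stand-in for a standard logistic regression).
za, zd = zscore(age), zscore(drink)
w0 = w1 = w2 = 0.0
for _ in range(2000):
    g0 = g1 = g2 = 0.0
    for i in range(n):
        p = 1 / (1 + math.exp(-(w0 + w1 * za[i] + w2 * zd[i])))
        r = (1.0 if treated[i] else 0.0) - p
        g0 += r
        g1 += r * za[i]
        g2 += r * zd[i]
    w0 += 0.1 * g0 / n
    w1 += 0.1 * g1 / n
    w2 += 0.1 * g2 / n
ps = [1 / (1 + math.exp(-(w0 + w1 * za[i] + w2 * zd[i]))) for i in range(n)]

# Step 2: 1:1 nearest-neighbour matching on the score, without replacement
controls = [i for i in range(n) if not treated[i]]
pairs = []
for t in [i for i in range(n) if treated[i]]:
    if not controls:
        break
    nearest = min(controls, key=lambda c: abs(ps[c] - ps[t]))
    controls.remove(nearest)
    pairs.append((t, nearest))
```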
Objectives
We are conducting a Phase 1b human study to assess the pharmacokinetics and pharmacodynamics of spironolactone-alcohol co-administration and to test the safety and tolerability of spironolactone, alone and combined with alcohol, in individuals with AUD.
Methods
Spironolactone in Alcohol Use Disorder (SAUD) is a double-blind, placebo-controlled, randomized, within-subject, ascending dose study with spironolactone (0, 100, 200, 400 mg/day) PO for 5 days to reach steady-state, followed by oral fixed-dose alcohol administration aimed at reaching a blood alcohol level of approximately 0.08 %. Our sample consists of 12 adults diagnosed with AUD.
Results
The primary endpoint is the pharmacokinetics (PK) of spironolactone and alcohol during concomitant administration. Secondary endpoints are 1) the subjective and cognitive effects of acute alcohol administration during concomitant spironolactone treatment; 2) the number and severity of adverse events (AEs), compared between placebo (0 mg/day) and the three spironolactone doses; and 3) the PK characteristics of the spironolactone active metabolites canrenone, 7-α-thiomethylspironolactone (TMS), and 6β-hydroxy-7α-thiomethylspironolactone (HTMS), before and after alcohol administration. Recruitment is underway.
Conclusions
The above-mentioned preclinical and clinical evidence suggests that spironolactone may be repurposed for the treatment of AUD. Our Phase 1b study is a key step before moving to larger efficacy trials.
Fetal alcohol spectrum disorder (FASD) is a life-long condition, and few interventions have been developed to improve the neurodevelopmental course in this population. Early interventions targeting core neurocognitive deficits have the potential to confer long-term neurodevelopmental benefits. Time-targeted choline supplementation is one such intervention that has been shown to provide neurodevelopmental benefits that emerge with age during childhood. We present a long-term follow-up study evaluating the neurodevelopmental effects of early choline supplementation in children with FASD an average of approximately 7 years after an initial efficacy trial. In this study, we examine treatment group differences in executive function (EF) outcomes and diffusion MRI of the corpus callosum using the Neurite Orientation Dispersion and Density Index (NODDI) biophysical model.
Participants and Methods:
The initial study was a randomized, double-blind, placebo-controlled trial of choline vs. placebo in 2.5- to 5-year-olds with FASD. Participants in this long-term follow-up study included 18 children (9 placebo; 9 choline) seen 7 years on average following initial trial completion. The mean age at follow-up was 11 years old. Diagnoses were 28% fetal alcohol syndrome (FAS), 28% partial FAS, and 44% alcohol-related neurodevelopmental disorder. The follow-up evaluation included measures of executive functioning (WISC-V Picture Span and Digit Span; DKEFS subtests) and diffusion MRI (NODDI).
Results:
Children who received choline early in development outperformed those in the placebo group across a majority of EF tasks at long-term follow-up (effect sizes ranged from -0.09 to 1.27). Children in the choline group demonstrated significantly better performance on several tasks of lower-order executive function skills (i.e., DKEFS Color Naming [Cohen's d = 1.27], DKEFS Word Reading [Cohen's d = 1.13]) and showed potentially better white matter microstructure organization (as indicated by lower orientation dispersion; Cohen's d = -1.26) in the splenium of the corpus callosum compared to the placebo group. In addition, when collapsing across treatment groups, higher white matter microstructural organization was associated with better performance on several EF tasks (WISC-V Digit Span; DKEFS Number Sequencing and DKEFS Word Reading).
Conclusions:
These findings highlight long-term benefits of choline as a neurodevelopmental intervention for FASD and suggest that changes in white matter organization may represent an important target of choline in this population. Unique to this study is the use of contemporary biophysical modeling of diffusion MRI data in youth with FASD. Findings suggest this neuroimaging approach may be particularly useful for identifying subtle white matter differences in FASD as well as neurobiological responses to early intervention associated with important cognitive functions.
Fetal alcohol spectrum disorder (FASD) is a common neurodevelopmental condition associated with deficits in cognitive functioning (executive functioning [EF], attention, working memory, etc.), behavioral impairments, and abnormalities in brain structure including cortical and subcortical volumes. Rates of comorbid attention-deficit/hyperactivity disorder (ADHD) are high in children with FASD and contribute to significant functional impairments. Sluggish cognitive tempo (SCT) includes a cluster of symptoms (e.g. underactive/slow-moving, confusion, fogginess, daydreaming) found to be related to but distinct from ADHD, and previous research suggests that it may be common in FASD. We explored SCT by examining the relationship between SCT and both brain volumes (corpus callosum, caudate, and hippocampus) and objective EF measures in children with FASD vs. typically developing controls.
Participants and Methods:
This is a secondary analysis of a larger longitudinal CIFASD study that consisted of 35 children with prenatal alcohol exposure (PAE) and 30 controls between the ages of 9 and 18 at follow-up. Children completed a set of cognitive assessments (WISC-IV, DKEFS, & NIH Toolbox) and an MRI scan, while parents completed the Child Behavior Checklist (CBCL), which includes an SCT scale. We examined group differences between PAE and controls in SCT symptoms, EF scores, and subcortical volumes. Then, we performed within- and between-group comparisons between SCT and subcortical brain volumes, with and without controlling for total intracranial volume, age, attention problems, and ADHD problems. Finally, we computed correlations between SCT and EF measures for both groups.
Results:
Compared to controls, participants with PAE showed significantly more SCT symptoms on the CBCL (t [57] = 3.66, p = 0.0006), more parent-rated attention problems and ADHD symptoms, lower scores across several EF measures (DKEFS Trail-Making and Verbal Fluency; WISC-IV Digit Span, Symbol Search, and Coding; effect sizes ranging from 0.44 to 1.16), and smaller regional volumes in the caudate, hippocampus, and posterior areas of the corpus callosum. In the PAE group, a smaller hippocampus was associated with more SCT symptoms (controlling for parent-rated attention problems and ADHD problems, age, and intracranial volume). However, in the control group, larger mid-posterior and posterior corpus callosum volumes were significantly associated with more SCT symptoms (controlling for parent-rated attention problems, intracranial volume, and age; r [24] = 0.499, p = 0.009; r [24] = 0.517, p = 0.007). In terms of executive functioning, children in the PAE group with more SCT symptoms performed worse on letter sequencing of the Trail-Making subtest (controlling for attention problems and ADHD symptoms). In comparison, those in the control group with more SCT symptoms performed better on letter sequencing and combined number-letter sequencing of the Trail-Making subtest (controlling for attention problems).
Conclusions:
Findings suggest that children with FASD experience elevated SCT symptoms compared to typically developing controls, which may be associated with worse performance on EF tasks and smaller subcortical volumes (hippocampus) when taking attention difficulties and ADHD symptoms into account. Additional research into the underlying causes and correlates of SCT in FASD could result in improved tailoring of interventions for this population.
The COVID-19 pandemic accelerated the development of decentralized clinical trials (DCTs). DCTs are an important and pragmatic method for assessing health outcomes yet comprise only a minority of clinical trials, and few published methodologies exist. In this report, we detail the operational components of COVID-OUT, a decentralized, multicenter, quadruple-blinded, randomized trial that rapidly delivered study drugs nationwide. The trial examined three medications (metformin, ivermectin, and fluvoxamine) as outpatient treatment of SARS-CoV-2 for their effectiveness in preventing severe or long COVID-19. Decentralized strategies included HIPAA-compliant electronic screening and consenting, prepacking investigational product to accelerate delivery after randomization, and remotely confirming participant-reported outcomes. Of the 1417 individuals in the intention-to-treat sample, the remote nature of the study led 94 participants to not take any doses of study drug. Therefore, 1323 participants formed the modified intention-to-treat sample, which was the a priori primary study sample. Only 1.4% of participants were lost to follow-up. Decentralized strategies facilitated the successful completion of the COVID-OUT trial without any in-person contact by expediting intervention delivery, expanding trial access geographically, limiting contagion exposure, and making it easy for participants to complete follow-up visits. Remotely completed consent and follow-up facilitated enrollment.
OBJECTIVES/GOALS: Supported by the State of Alabama, the Alabama Genomic Health Initiative (AGHI) is aimed at preventing and treating common conditions with a genetic basis. This joint UAB Medicine-HudsonAlpha Institute for Biotechnology effort provides genomic testing, interpretation, and counseling free of charge to residents in each of Alabama's 67 counties. METHODS/STUDY POPULATION: Launched in 2017 as a state-wide population cohort, AGHI (1.0) enrolled 6,331 Alabamians and returned individual risk of disease(s) related to the ACMG SF v2.0 medically actionable genes. In 2021, the cohort was expanded to include a primary care cohort. AGHI (2.0) has enrolled 750 primary care patients, returning individual risk of disease(s) related to the ACMG SF v3.1 gene list and pre-emptive pharmacogenetics (PGx) to guide medication therapy. Genotyping is done on the Illumina Global Diversity Array, with Sanger sequencing to confirm likely pathogenic/pathogenic variants in medically actionable genes and Taqman assays for CYP2D6 copy number variants, resulting in a CLIA-grade report. Disease risk results are returned by genetic counselors, and PGx results are returned by pharmacists. RESULTS/ANTICIPATED RESULTS: We have engaged a statewide community (>7,000 participants), returning 94 disease risk genetic reports and 500 PGx reports. Disease risk reports include increased predisposition to cancers (n=38), cardiac diseases (n=33), metabolic diseases (n=12), and other conditions (n=11). 100% of participants harbor an actionable PGx variant, 70% are on a medication with PGx guidance, and 48% harbor PGx variants and are taking medications affected by them. In 10% of participants, pharmacists sent an active alert to the provider to consider/recommend an alternative medication. The most commonly impacted medications included antidepressants, NSAIDs, proton-pump inhibitors, and tramadol.
To enable the EMR integration of genomic information, we have developed an automated transfer of reports into the EMR with Genetics Reports and PGx reports viewable in Cerner. DISCUSSION/SIGNIFICANCE: We share our experience on pre-emptive implementation of genetic risk and pharmacogenetic actionability at a population and clinic level. Both patients and providers are actively engaged, providing feedback to refine the return of results. Real time alerts with guidance at the time of prescription are needed to ensure future actionability and value.
Cognitive behavioural therapy (CBT) is considered the first-line treatment for obsessive-compulsive disorder (OCD). However, some individuals with OCD remain symptomatic following CBT, and therefore understanding predictors of outcome is important for informing treatment recommendations.
Aims:
The current study aimed to provide the first synthesis of predictors of outcome following CBT for OCD in adults with a primary diagnosis of OCD, as classified by DSM-5.
Method:
Eight studies (n=359; mean age range=29.2–37.7 years; 55.4% female) were included in the systematic review.
Results:
Congruent with past reviews, there was great heterogeneity of predictors measured across the included studies. Therefore, a narrative synthesis of findings was conducted. Findings from this systematic review indicated that some OCD-related pre-treatment variables (i.e. pre-treatment severity, past CBT treatment, and levels of avoidance) and during treatment variables (i.e. poor working alliance and low treatment adherence) may be important to consider when making treatment recommendations. However, the results also indicate that demographic variables and psychological co-morbidities may not be specific predictors of treatment response.
Conclusions:
These findings add to the growing body of literature on predictors of CBT treatment outcomes for individuals with OCD.
The social sciences can help provide a deeper understanding of human-farm animal relations. However, social science research exploring problematic human-farm animal interactions can be of a sensitive nature. Studies that carry risks for participants and the researcher are known methodologically as sensitive research. However, there is little discussion in the animal welfare sciences on how best to conduct research of this nature on animal owners, despite recommendations being made for more interdisciplinary collaboration between the animal welfare sciences and social sciences. Drawing on social science research conducted in 2012 on the human element of on-farm animal welfare incidents in the Republic of Ireland, this short communication presents a case study of the sensitivities and challenges involved in carrying out social science research related to farm animal welfare. This communication details the steps involved in recruiting participants, the methodological challenges encountered, and the approaches used to overcome these challenges. Our experience suggests that when conducting socially sensitive research, careful consideration needs to be applied to the recruitment process, and the study design must aim to minimise the potential risks for all involved. Professionals in the field, such as veterinarians, can play an important role in outlining some of the implications involved, and in overcoming research challenges. Understanding the challenges to this form of research will help to maximise research potential.
We aimed to demonstrate the role of real-time, on-site, whole-genome sequencing (WGS) of severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) in the management of hospital outbreaks of coronavirus disease 2019 (COVID-19).
Design:
This retrospective study was undertaken at our institutions in Sydney, New South Wales, Australia, between July 2021 and April 2022. We included SARS-CoV-2 outbreaks due to SARS-CoV-2 δ (delta) and ο (omicron) variants. All unexpected SARS-CoV-2–positive cases identified within the hospital were managed by the infection control team. An outbreak was defined as 2 or more cases acquired on a single ward. We included only outbreaks with 2 or more suspected transmission events in which WGS was utilized to assist with outbreak assessment and management.
Results:
We studied 8 outbreaks involving 266 patients and 486 staff, of whom 73 (27.4%) and 39 (8.0%), respectively, tested positive for SARS-CoV-2 during the outbreak management. WGS was used to evaluate the source of the outbreak, to establish transmission chains, to highlight deficiencies in infection control practices, and to delineate between community and healthcare acquired infection.
Conclusions:
Real-time, on-site WGS combined with epidemiologic assessment is a useful tool to guide management of hospital SARS-CoV-2 outbreaks. WGS allowed us (1) to establish likely transmission events due to personal protective equipment (PPE) breaches; (2) to detect inadequacies in infection control infrastructure, including ventilation; and (3) to confirm multiple viral introductions during periods of high community SARS-CoV-2 transmission. Insights gained from WGS-guided outbreak management directly influenced policy, including modifying PPE requirements, instituting routine inpatient SARS-CoV-2 surveillance, and requiring confirmatory SARS-CoV-2 testing prior to placing patients in a cohort setting.
This commentary expands the discussion of Cesario's Missing Forces Flaw by identifying and discussing variables that influence police shooting decisions but are often absent from bias-based research. Additionally, the closing identifies novel recommendations for future contextually related research.
Italian ryegrass is a major weed in winter cereals in the south-central United States. Harvest weed seed control (HWSC) tactics, which aim to remove weed seed from crop fields, are a potential avenue to reduce Italian ryegrass seedbank inputs. To this end, a 4-yr, large-plot field study was conducted in College Station, Texas, and Newport, Arkansas, from 2016 to 2019. The treatments were arranged in a split-plot design. The main-plot treatments were (1) no narrow-windrow burning (an HWSC strategy) + disk tillage immediately after harvest, (2) HWSC + disk tillage immediately after harvest, and (3) HWSC + disk tillage 1 mo after harvest. The subplot treatments were (1) pendimethalin (1,065 g ai ha−1; Prowl H2O®) as a delayed preemergence application (herbicide program #1), and (2) a premix of flufenacet (305 g ai ha−1) + metribuzin (76 g ai ha−1; Axiom®) mixed with pyroxasulfone (89 g ai ha−1; Zidua® WG) as an early postemergence application followed by pinoxaden (59 g ai ha−1; Axial® XL) in spring (herbicide program #2). After 4 yr, HWSC alone significantly reduced Italian ryegrass densities compared with no HWSC. Herbicide program #2 was superior to herbicide program #1, and herbicide program #2 combined with HWSC was the most effective treatment. The combination of herbicide program #1 and standard harvest practice (no HWSC; check) led to an increase in fall Italian ryegrass densities from 4 plants m−2 in 2017 to 58 plants m−2 in 2019 at College Station. At wheat harvest, Italian ryegrass densities were 58 and 59 shoots m−2 in check plots at College Station and Newport, respectively, whereas densities were near zero in plots receiving herbicide program #2 plus HWSC at both locations. These results will be useful for developing an improved Italian ryegrass management strategy in this region.