Maladaptive daydreaming is a distinct syndrome in which the main symptom is excessive vivid fantasising that causes clinically significant distress and functional impairment in academic, vocational and social domains. Unlike normal daydreaming, maladaptive daydreaming is persistent, compulsive and detrimental to one’s life. It involves detachment from reality in favour of intense emotional engagement with alternative realities and often includes specific features such as psychomotor stereotypies (e.g. pacing in circles, jumping or shaking one’s hands), mouthing dialogues, facial gestures or enacting fantasy events. Comorbidity is common, but existing disorders do not account for the phenomenology of the symptoms. Whereas non-specific therapy is ineffective, targeted treatment seems promising. Thus, we propose that maladaptive daydreaming be considered a formal syndrome in psychiatric taxonomies, positioned within the dissociative disorders category. Maladaptive daydreaming satisfactorily meets criteria for conceptualisation as a psychiatric syndrome, including reliable discrimination from other disorders and solid interrater agreement. It involves significant dissociative aspects, such as disconnection from perception, behaviour and sense of self, and has some commonalities with but is not subsumed under existing dissociative disorders. Formal recognition of maladaptive daydreaming as a dissociative disorder will encourage awareness of a growing problem and spur theoretical, research and clinical developments.
Stuart Hall stated “the university is a critical institution or it is nothing.” When it comes to the historical study of sexual abuse in Canadian sport, until very recently, it has been very much the latter. Nothing. As part of a larger project on studies of sexual abuse in sport, we reviewed articles across the four leading sport history journals – Sport History Review, Sport in History, Journal of Sport History, and International Journal of the History of Sport – to consider what methods, sports, and demographics received the most analysis. Such an effort proved impossible. There was scholarly silence on the matter. But this raised another question. So what? Would publishing in pay-walled academic journals about so pressing a societal issue make any difference at all? Furthermore, can a PhD-touting academic – including the lead author of this paper – ever enact change via the field of history if their sole purpose is to churn out studies for the ivory tower? We think not. It requires boots on the ground. Engagement and collaboration with those Antonio Gramsci called “organic intellectuals,” so we can tend the flames of knowledge and fuel a movement. History can be the tool one wields. Public, digital history.
An assessment of systemic inflammation and nutritional status may form the basis of a framework to examine the prognostic value of cachexia in patients with advanced cancer. The objective of the study was to examine the prognostic value of the Global Leadership Initiative on Malnutrition criteria, including BMI, weight loss (WL) and systemic inflammation (as measured by the modified Glasgow Prognostic Score (mGPS)), in patients with advanced cancer. Three criteria were examined in a combined cohort of patients with advanced cancer, and their relationship with survival was examined using Cox regression methods. Data were available on 1303 patients. Considering BMI and the mGPS, the 3-month survival rate varied from 74 % (BMI > 28 kg/m2) to 61 % (BMI < 20 kg/m2) and from 84 % (mGPS 0) to 60 % (mGPS 2). Considering WL and the mGPS, the 3-month survival rate varied from 81 % (WL ± 2·4 %) to 47 % (WL ≥ 15 %) and from 93 % (mGPS 0) to 60 % (mGPS 2). Considering BMI/WL grade and mGPS, the 3-month survival rate varied from 86 % (BMI/WL grade 0) to 59 % (BMI/WL grade 4) and from 93 % (mGPS 0) to 63 % (mGPS 2). When these criteria were combined, they better predicted survival. On multivariate survival analysis, the most highly predictive factors were BMI/WL grade 3 (HR 1·454, P = 0·004), BMI/WL grade 4 (HR 2·285, P < 0·001) and mGPS 1 and 2 (HR 1·889, HR 2·545, all P < 0·001). In summary, a high BMI/WL grade and a high mGPS, as outlined in the BMI/WL grade/mGPS framework, were consistently associated with poorer survival in patients with advanced cancer. This framework can be readily incorporated into the routine assessment of patients.
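The stratified 3-month survival rates reported above can be tabulated straightforwardly once patients are grouped by a criterion such as mGPS. The sketch below is purely illustrative (it is not the study's code, and the cohort data are synthetic); it computes the proportion of each group surviving past a landmark time, ignoring censoring before the landmark for brevity.

```python
from collections import defaultdict

def survival_rate_at(patients, months=3):
    """Proportion of patients in each group surviving past `months`.

    `patients` is a list of (group, survival_months) tuples. A patient
    counts as surviving the landmark if their follow-up time reaches it.
    Censoring before the landmark is ignored in this simplified sketch.
    """
    alive = defaultdict(int)
    total = defaultdict(int)
    for group, time in patients:
        total[group] += 1
        if time >= months:
            alive[group] += 1
    return {g: alive[g] / total[g] for g in total}

# Synthetic illustration (not the study's data): mGPS 0 vs mGPS 2 cohorts.
cohort = [("mGPS 0", 10.0), ("mGPS 0", 4.0), ("mGPS 0", 2.0),
          ("mGPS 2", 5.0), ("mGPS 2", 1.5), ("mGPS 2", 2.0)]
rates = survival_rate_at(cohort, months=3)
```

A full analysis, as in the abstract, would instead fit a Cox proportional hazards model to obtain the hazard ratios for each BMI/WL grade and mGPS category.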
The bright radio source, GLEAM J091734$-$001243 (hereafter GLEAM J0917$-$0012), was previously selected as a candidate ultra-high redshift ($z \gt 5$) radio galaxy due to its compact radio size and faint magnitude ($K(\mathrm{AB})=22.7$). Its redshift was not conclusively determined from follow-up millimetre and near-infrared spectroscopy. Here we present new HST WFC3 G141 grism observations which reveal several emission lines including [NeIII]$\lambda$3867, [NeV]$\lambda$3426 and an extended ($\approx 4.8\,$kpc), [OII]$\lambda$3727 line which confirm a redshift of $3.004\pm0.001$. The extended component of the [OII]$\lambda$3727 line is co-spatial with one of two components seen at 2.276 GHz in high resolution ($60\times 20\,$mas) Long Baseline Array data, reminiscent of the alignments seen in local compact radio galaxies. The BEAGLE stellar mass ($\approx 2\times 10^{11}\,\textit{M}_\odot$) and radio luminosity ($L_{\mathrm{500MHz}}\approx 10^{28}\,$W Hz$^{-1}$) put GLEAM J0917$-$0012 within the distribution of the brightest high-redshift radio galaxies at similar redshifts. However, it is more compact than all of them. Modelling of the radio jet demonstrates that this is a young, $\approx 50\,$kyr old, but powerful, $\approx 10^{39}\,$W, compact steep spectrum radio source. The weak constraint on the active galactic nucleus bolometric luminosity from the [NeV]$\lambda$3426 line combined with the modelled jet power tentatively implies a large black hole mass, $\ge 10^9\,\textit{M}_\odot$, and a low, advection-dominated accretion rate, i.e. an Eddington ratio $\le 0.03$. The [NeV]$\lambda$3426/[NeIII]$\lambda$3867 vs [OII]$\lambda$3727/[NeIII]$\lambda$3867 line ratios are most easily explained by radiative shock models with precursor photoionisation. Hence, we infer that the line emission is directly caused by the shocks from the jet and that this radio source is one of the youngest and most powerful known at cosmic noon. 
We speculate that the star-formation in GLEAM J0917$-$0012 could be on its way to becoming quenched by the jet.
Natural selection should favour litter sizes that optimise trade-offs between brood-size and offspring viability. Across the primate order, the modal litter size is one, suggesting a deep history of selection favouring minimal litters in primates. Humans, however – despite having the longest juvenile period and slowest life-history of all primates – still produce twin births at appreciable rates, even though such births are costly. This presents an evolutionary puzzle. Why is twinning still expressed in humans despite its cost? More puzzling still is the discordance between the principal explanations for human twinning and extant empirical data. Such explanations propose that twinning is regulated by phenotypic plasticity in polyovulation, permitting the production of larger sib sets if and when resources are abundant. However, comparative data suggest that twinning rates are actually highest in poorer economies and lowest in richer, more developed economies. We propose that a historical dynamic of gene–culture co-evolution might better explain this geographic patterning. Our explanation distinguishes geminophilous and geminophobic cultural contexts, as those celebrating twins (e.g. through material support) and those hostile to twins (e.g. through sanction of twin-infanticide). Geminophilous institutions, in particular, may buffer the fitness cost associated with twinning, potentially reducing selection pressures against polyovulation. We conclude by synthesising a mathematical and empirical research programme that might test our ideas.
Upon succeeding his brother as king of Iran, Hirasıp gives his son, Qahtarān, the following advice:
Never stop being a champion. Never take an interest in kingship. Be a friend to your friends and an enemy to your enemies. Do not oppose a necessary and just king. Serve him and never go against his wishes. Being a champion is a greater thing than being a king, because a king becomes a king only through the deeds of his champions.
Hirasıp's advice neatly encapsulates the overarching message of the Qahramān-nāma / Qahramān-i Qātil, a medieval Persian romance that, in Turkish translation, circulated in Russia's Volga-Ural Muslim communities from the time of the Golden Horde until the early twentieth century. Central to Qahramān's enduring popularity were its portrayals of masculine conduct in times of conflict and heroic submission to a higher authority. These models were clearly defined but also proved flexible enough to be adapted and reinterpreted as the political, social, and cultural landscape of the Volga-Ural region changed from the fourteenth century to the eighteenth century. In the political world of the Golden Horde and its successor states, Qahramān offered a template for warrior-aristocratic masculinity that reinforced views on cooperation, submission, and service expressed in Chinggisid literature and histories. After the mid-sixteenth century, as the Volga-Ural region's Chinggisid-Muslim political culture was gradually replaced by a Russian Christian one, the titular hero of the Qahramān romance was embraced as an ideal for Muslim masculine self-discipline in a society in which Islam was no longer the politically dominant faith.
Qahramān offers a glimpse of the complexities of masculinity in a medieval and early modern Inner Asian society. In examining these complexities, this essay approaches masculinity as something that individuals were expected to achieve and demonstrate rather than as an inborn trait. The character arcs of Qahramān's male characters and, especially, of the titular hero, center around the achievement of full membership in a warrior-aristocratic elite and the continuing display of the characteristics valued by that elite. At the same time, both the specific masculinity to which Qahramān's men aspire and the aspects of that masculinity that readers of Qahramān emphasized at different historical moments were determined in relation to social class, the political order, and other co-existing masculinities.
Recent conceptualizations of concussion symptoms have begun to shift from a latent perspective (which suggests a common cause, i.e., head injury) to a network perspective (in which symptoms influence and interact with each other throughout injury and recovery). Recent research has examined the network structure of the Post-Concussion Symptom Scale (PCSS) cross-sectionally at pre- and post-concussion, with the most important symptoms including dizziness, sadness, and feeling more emotional. However, within-subject comparisons between network structures at pre- and post-concussion have yet to be made. These analyses can provide invaluable information on whether concussion alters symptom interactions. This study examined within-athlete changes in PCSS network connectivity and centrality (the importance of different symptoms within the networks) from baseline to post-concussion.
Participants and Methods:
Participants were selected from a larger longitudinal database of high school athletes who completed the PCSS in English as part of their standard athletic training protocol (N=1,561). The PCSS is a 22-item self-report measure of common concussion symptoms (e.g., headache, vomiting, dizziness) in which individuals rate symptom severity on a 7-point Likert scale. Participants were excluded if they endorsed a history of brain surgery, neurodevelopmental disorder, or treatment history for epilepsy, migraines, psychiatric disorders, or alcohol/substance use. Network analysis was conducted on PCSS ratings from a baseline and an acute post-concussion (within 72 hours post-injury) assessment. In each network, the nodes represented individual symptoms, and the edges connecting them represented their partial correlations. The regularized partial correlation networks were estimated using the Gaussian graphical model, with the GLASSO algorithm used for regularization. Each symptom’s expected influence (the sum of its partial correlations with other symptoms) was calculated to identify the most central symptoms in each network. Techniques recommended by Epskamp et al. (2018) were used to assess the accuracy of the estimated symptom importance and relationships. Network Comparison Tests were conducted to assess changes in network connectivity, structure, and node influence.
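The core computation here, partial correlations between symptoms and each symptom's expected influence, can be sketched compactly. The example below is an illustrative simplification, not the authors' analysis: it derives partial correlations from the inverse sample covariance without GLASSO regularization (which would additionally shrink weak edges to zero), and the "symptom ratings" are random synthetic data.

```python
import numpy as np

def partial_correlations(data):
    """Partial correlation matrix from the inverse sample covariance.

    Simplified Gaussian graphical model: partial correlation between
    variables i and j is -p_ij / sqrt(p_ii * p_jj), where p is the
    precision (inverse covariance) matrix. No GLASSO regularization.
    """
    prec = np.linalg.inv(np.cov(data, rowvar=False))
    d = np.sqrt(np.diag(prec))
    pcorr = -prec / np.outer(d, d)
    np.fill_diagonal(pcorr, 0.0)  # no self-edges in the network
    return pcorr

def expected_influence(pcorr):
    """Expected influence: signed sum of each node's edge weights."""
    return pcorr.sum(axis=0)

rng = np.random.default_rng(0)
ratings = rng.normal(size=(500, 5))  # synthetic stand-in for PCSS items
pcorr = partial_correlations(ratings)
ei = expected_influence(pcorr)
```

In the published analyses this estimation is regularized (e.g., scikit-learn's `GraphicalLasso` or R's `qgraph`/`glasso`), which is important when the number of symptoms is large relative to the sample.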
Results:
Both baseline and acute post-concussion networks contained negative and positive relationships. The expected influence of symptoms was stable in both networks, with difficulty concentrating having the greatest expected influence in both. The strongest edges in the networks were between symptoms within similar domains of functioning (e.g., sleeping less was associated with trouble falling asleep). Network connectivity was not significantly different between networks (S=0.43), suggesting that the overall degree to which symptoms are related did not differ at acute post-concussion. Network structure significantly differed at acute post-concussion (M=0.305), suggesting that specific relationships in the acute post-concussion network were different than they were at baseline. In the acute post-concussion network, vomiting was less central, whereas sensitivity to noise and feeling mentally foggy were more central.
Conclusions:
PCSS network structure is altered at acute post-concussion, suggesting that a concussive injury may disrupt symptom networks and alter certain symptoms’ associations with the experience of others. Future research should compare PCSS networks later in recovery to examine whether similar structural changes persist or networks return to baseline structure; observing PCSS network structure changes post-concussion could potentially inform symptom resolution trajectories.
Previous studies have found differences between monolingual and bilingual athletes on ImPACT, the most widely used sport-related concussion (SRC) assessment measure. Most recently, results suggest that monolingual English-speaking athletes outperformed bilingual English- and Spanish-speaking athletes on the Visual Motor Speed and Reaction Time composites. Before further investigation of these differences can occur, measurement invariance of ImPACT must be established to ensure that differences are not attributable to measurement error. The current study aimed to 1) replicate a recently identified four-factor model using cognitive subtest scores of ImPACT on baseline assessments in monolingual English-speaking athletes and bilingual English- and Spanish-speaking athletes and 2) establish measurement invariance across groups.
Participants and Methods:
Participants included high school athletes who were administered the ImPACT in English as part of their standard pre-season athletic training protocol. Participants were excluded if they had a self-reported history of concussion, autism, ADHD, or learning disability, or a treatment history of epilepsy/seizures, brain surgery, meningitis, psychiatric disorders, or substance/alcohol use. The final sample included 7,948 monolingual English-speaking athletes and 7,938 bilingual English- and Spanish-speaking athletes with valid baseline assessments. Language variables were based on self-report. As the number of monolingual athletes was substantially larger than the number of bilingual athletes, monolingual athletes were randomly selected from a larger sample to match the bilingual athletes on age, sex, and sport. Confirmatory factor analysis (CFA) was used to test competing models, including one-factor, two-factor, and three-factor models, to determine whether a recently identified four-factor model (Visual Memory, Visual Reaction Time, Verbal Memory, Working Memory) provided the best fit to the data. Eighteen subtest scores from ImPACT were used in the CFAs. Through increasingly restrictive multigroup CFAs (MGCFA), configural, metric, scalar, and residual levels of invariance were assessed by language group.
Results:
CFA indicated that the four-factor model provided the best fit in both the monolingual and bilingual samples compared with competing models. However, some goodness-of-fit statistics were below recommended cutoffs; thus, post-hoc model modifications were made on a theoretical basis and by examination of modification indices. The modified four-factor model had adequate to superior fit, met criteria for all goodness-of-fit indices, and was retained as the configural model to test measurement invariance across language groups. MGCFA revealed that residual invariance, the strictest level of invariance, was achieved across groups.
Conclusions:
This study provides support for a modified four-factor model as representing the latent structure of ImPACT cognitive scores in monolingual English-speaking and bilingual English- and Spanish-speaking high school athletes at baseline assessment. Results further suggest that the differences between monolingual English-speaking and bilingual English- and Spanish-speaking athletes reported in prior ImPACT studies are not caused by measurement error. The reason for these differences remains unclear, but they are consistent with other studies suggesting monolingual advantages. Given the growing number of bilingual individuals in the United States and in high school athletics, future research should investigate other sources of error, such as item bias and predictive validity, to further understand whether group differences reflect real differences between these athletes.
Palliative care necessitates questions about the preferred place for delivering care and location of death. Place is integral to palliative care, as it can impact proximity to family, available resources/support, and patient comfort. Despite the importance of place, there is remarkably little literature exploring its role in pediatric palliative care (PPC).
Objectives
To understand the importance and meaning of place in PPC.
Methods
We conducted a scoping review to understand the importance of place in PPC. Five databases were searched using keywords related to “pediatric,” “palliative,” and “place.” Two reviewers screened results, extracted data, and analyzed emergent themes pertaining to place.
Results
From 3076 search results, we identified and reviewed 25 articles. The literature highlights hospital, home, and hospice as 3 distinct PPC places. Children and their families have place preferences for PPC and place of death, and a growing number prefer death to occur at home. Results also indicate numerous factors influence place preferences (e.g., comfort, grief, cultural/spiritual practices, and socioeconomic status).
Significance of results
Place influences families’ PPC decisions and experiences and thus warrants further study. Greater understanding of the importance and roles of place in PPC could enhance PPC policy and practice, as well as PPC environments.
We have investigated the spectral evolution of H2O and SiO masers associated with 12 “water fountain” sources in our FLASHING (Finest Legacy Acquisitions of SiO-/H2O-maser Ignitions by Nobeyama Generation) project. Our monitoring observations have been conducted with the Nobeyama 45 m telescope every 2 weeks–2 months since 2018 December, except during summer seasons. We have found new extremely high velocity H2O maser components, breaking the jet-speed records for this type of source. Systematic line-of-sight velocity drifts of the H2O maser spectral peaks have also been found, indicating acceleration of the entrained material hosting the masers around the jet. Moreover, by comparing with previous spectral data, we can identify decadal growth and decay of the H2O maser emission. Possible periodic variations of the maser spectra are also being inspected in order to explore the periodicity of the central stellar system (a pulsating star or a binary). We thus expect to see the real-time evolution of the water fountains over decades.
We report new detections of SiO ν = 1 and ν = 2 J = 1 → 0 masers in the “water fountain” source IRAS 16552-3050, which was observed with the Nobeyama 45 m telescope from March 2021 to April 2023. Water fountains are evolved stars whose H2O maser spectra trace high-velocity outflows of >100 km s−1. This is the second known case of SiO masers in a water fountain, after their prototypical source, W 43A. The line-of-sight velocities of the SiO masers are blue-shifted by ∼25 km s−1 from the systemic velocity. This velocity offset implies that the SiO masers are associated with a nozzle structure formed by a jet penetrating the circumstellar envelope, and that new gas blobs of the jet erupted recently. Thus, the SiO masers suggest that this star has entered a new evolutionary stage.
Objective:
To reduce both inappropriate testing for and diagnosis of healthcare-onset (HO) Clostridioides difficile infections (CDIs).
Design:
We performed a retrospective analysis of C. difficile testing from hospitalized children before (October 2017–October 2018) and after (November 2018–October 2020) implementing restrictive computerized provider order entry (CPOE).
Setting:
Study sites included hospital A (a ∼250-bed freestanding children’s hospital) and hospital B (a ∼100-bed children’s hospital within a larger hospital) that are part of the same multicampus institution.
Methods:
In October 2018, we implemented CPOE. No testing was allowed for infants aged ≤12 months, approval of the infectious disease team was required to test children aged 13–23 months, and pathology residents’ approval was required to test all patients aged ≥24 months with recent laxative, stool softener, or enema use. Interrupted time series analysis and Mann-Whitney U test were used for analysis.
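The interrupted time series analysis mentioned above is typically a segmented regression: a level and slope are fit before the intervention, and the model estimates how both change afterwards. The sketch below is a generic illustration with synthetic monthly counts, not the study's specification, fit by ordinary least squares.

```python
import numpy as np

def its_fit(counts, t0):
    """Segmented (interrupted time series) regression via least squares.

    Model: y = b0 + b1*t + b2*post + b3*(t - t0)*post, where `post`
    flags time points at or after the intervention index `t0`;
    b2 is the immediate level change and b3 the slope change.
    """
    t = np.arange(len(counts), dtype=float)
    post = (t >= t0).astype(float)
    X = np.column_stack([np.ones_like(t), t, post, (t - t0) * post])
    beta, *_ = np.linalg.lstsq(X, np.asarray(counts, float), rcond=None)
    return beta  # [intercept, pre-slope, level change, slope change]

# Synthetic example: 20 tests/month pre-intervention, 10/month after month 12.
counts = [20.0] * 12 + [10.0] * 12
b = its_fit(counts, t0=12)
```

With these synthetic data the fitted level change is −10 tests/month, the kind of post-CPOE drop the abstract describes; a published analysis would also report standard errors and account for autocorrelation.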
Results:
An interrupted time series analysis revealed that from October 2017 to October 2020, the numbers of tests ordered and samples sent significantly decreased in all age groups (P < .05). The monthly median number of HO-CDI cases significantly decreased after implementation of the restrictive CPOE in children aged 13–23 months (P < .001) and all ages combined (P = .003).
Conclusion:
Restrictive CPOE for CDI in pediatrics was successfully implemented and sustained. Diagnostic stewardship for CDI is likely cost-saving and could decrease misdiagnosis, unnecessary antibiotic therapy, and overestimation of HO-CDI rates.
Deficits in visuospatial attention, known as neglect, are common following brain injury, but underdiagnosed and poorly treated, resulting in long-term cognitive disability. In clinical settings, neglect is often assessed using simple pen-and-paper tests. While convenient, these cannot characterise the full spectrum of neglect. This protocol reports a research programme that compares traditional neglect assessments with a novel virtual reality attention assessment platform: The Attention Atlas (AA).
Methods/design:
The AA was codesigned by researchers and clinicians to meet the clinical need for improved neglect assessment. The AA uses a visual search paradigm to map the attended space in three dimensions and seeks to identify the optimal parameters that best distinguish neglect from non-neglect, and the spectrum of neglect, by providing near-time feedback to clinicians on system-level behavioural performance. A series of experiments will address procedural, scientific, patient, and clinical feasibility domains.
Results:
Analyses focus on descriptive measures of reaction time, accuracy data for target localisation, and histogram-based raycast attentional mapping analysis, which measures the individual’s orientation in space and inter- and intra-individual variation in visuospatial attention. We will compare neglect and control data using parametric between-subjects analyses. We present example individual-level results produced in near-time during visual search.
Conclusions:
The development and validation of the AA is part of a new generation of translational neuroscience that exploits the latest advances in technology and brain science, including technology repurposed from the consumer gaming market. This approach to rehabilitation has the potential for highly accurate, highly engaging, personalised care.
The place of interfaith dialogue in Orthodox Judaism has been the subject of extensive discussion. This article offers a reading of Rabbi Joseph Soloveitchik's and Rabbi Irving Greenberg's stances on interfaith dialogue that situates them in a Jewish philosophical context. Some scholars have argued that Soloveitchik's refusal to engage in Jewish-Christian theological dialogue must be understood historically; others have argued that his opposition to such dialogue must be understood halakhically. This article, building upon the view articulated by Daniel Rynhold in his 2003 article that Soloveitchik's stance on interfaith dialogue must be understood philosophically, posits that in order for Soloveitchik's stance on interfaith dialogue to be fully understood, it should be studied bearing in mind the influence of Hermann Cohen upon Soloveitchik's religious philosophy. This article, which demonstrates the direct influence of Franz Rosenzweig upon aspects of Greenberg's thought, further argues that in order for Greenberg's stance on interfaith dialogue—as well as his interfaith theology—to be completely grasped, his positions upon these theological matters must be studied with the awareness of Franz Rosenzweig's influence upon his thought. The reading offered in this article of Cohen and Soloveitchik and of Rosenzweig and Greenberg does not purport to minimize the irreconcilable differences between these thinkers; nonetheless, we contend that the substantial resemblances—and, in the case of Rosenzweig and Greenberg, the direct influence—between the views of Christianity held by these pairs of figures are significant and suggest a reconsideration of the role of philosophy in the story of American Jewish theology.
The Immediate Post-Concussion Assessment and Cognitive Testing (ImPACT) is commonly used to assist with post-concussion return-to-play decisions for athletes. Additional investigation is needed to determine whether embedded indicators used to determine the validity of scores are influenced by the presence of neurodevelopmental disorders (NDs).
Method:
This study examined standard and novel ImPACT validity indicators in a large sample of high school athletes (n = 33,772) with or without self-reported ND.
Results:
Overall, 7.1% of athletes’ baselines were judged invalid based on standard ImPACT validity criteria. When analyzed by group (healthy, ND), there were significantly more invalid ImPACT baselines among athletes with an ND diagnosis or special education history (between 9.7% and 54.3% across standard and novel embedded validity criteria) than among athletes without NDs. ND history was a significant predictor of invalid baseline performance above and beyond other demographic characteristics (i.e., age, sex, and sport), although it accounted for only a small percentage of variance. Multivariate base rates are presented stratified by age, sex, and ND.
Conclusions:
These data provide evidence of higher than normal rates of invalid baselines in athletes who report ND (based on both the standard and novel embedded validity indicators). Although ND accounted for a small percentage of variance in the prediction of invalid performance, negative consequences (e.g., extended time out of sports) of incorrect decision-making should be considered for those with neurodevelopmental conditions. Also, reasons for the overall increase noted here, such as decreased motivation, “sandbagging”, or disability-related cognitive deficit, require additional investigation.
Little is known about practices used to disseminate findings to non-research, practitioner audiences. This study describes the perspectives, experience and activities of dissemination & implementation (D&I) scientists around disseminating their research findings.
Methods:
The study explored D&I scientists’ experiences and recommendations for assessment of dissemination activities to non-research audiences. Existing listservs were used to recruit scientists. Respondents were asked three open-ended questions in an Internet survey about dissemination activities, recommendations for changing evaluation systems, and suggestions to improve their own dissemination of their work.
Results:
Surveys were completed by 159 scientists reporting some training, funding and/or publication history in D&I. Three themes emerged for each of the three open-ended questions. Question 1 on evaluation generated the themes of: 1a) promotional review; 1b) funding requirements and 1c) lack of acknowledgement of dissemination activities. Question 2 on recommended changes generated the themes of: 2a) dissemination as a requirement of the academic promotion process; 2b) requirement of a dissemination plan and 2c) dissemination metrics. Question 3 on personal changes to improve dissemination generated the themes of: 3a) allocation of resources for dissemination activities; 3b) emerging dissemination channels and 3c) identifying and addressing issues of priority for stakeholders.
Conclusions:
Our findings revealed different types of issues D&I scientists encounter when disseminating findings to clinical, public health or policy audiences and their suggestions to improve the process. Future research should consider key requirements which determine academic promotion and grant funding as an opportunity to expand dissemination efforts.
Family-based treatment (FBT) is an efficacious intervention for adolescents with an eating disorder. Evaluated to a lesser degree among adolescents, enhanced cognitive-behavior therapy (CBT-E) has shown promising results. This study compared the relative effectiveness of FBT and CBT-E; as per manualized CBT-E, the sample was divided into a lower weight [<90% median body mass index (mBMI)] and a higher weight (⩾90% mBMI) cohort.
Method
Participants (N = 97) aged 12–18 years, with a DSM-5 eating disorder diagnosis (largely restrictive, excluding Avoidant Restrictive Food Intake Disorder), and their parents chose between FBT and CBT-E. Assessments were administered at baseline, end-of-treatment (EOT), and follow-up (6 and 12 months). Treatment comprised 20 sessions over 6 months, except for the lower weight cohort, in which CBT-E comprised 40 sessions over 9–12 months. Primary outcomes were slope of weight gain and change in Eating Disorder Examination (EDE) Global Score at EOT.
Results
Slope of weight gain at EOT was significantly higher for FBT than for CBT-E (lower weight, est. = 0.597, s.e. = 0.096, p < 0.001; higher weight, est. = 0.495, s.e. = 0.83, p < 0.001), but not at follow-up. There were no differences in the EDE Global Score or most secondary outcome measures at any time-point. Several baseline variables emerged as potential treatment effect moderators at EOT. Choosing between FBT and CBT-E resulted in older and less well participants opting for CBT-E.
Conclusions
Results underscore the efficiency of FBT in facilitating weight gain among underweight adolescents. FBT and CBT-E achieved similar outcomes in the other domains assessed, making CBT-E a viable treatment for adolescents with an eating disorder.