Next-generation X-ray satellite telescopes such as XRISM, NewAthena and Lynx will enable observations of exotic astrophysical sources at unprecedented spectral and spatial resolution. Proper interpretation of these data demands that the accuracy of the models is at least within the uncertainty of the observations. One set of quantities that might not currently meet this requirement is transition energies of various astrophysically relevant ions. Current databases are populated with many untested theoretical calculations. Accurate laboratory benchmarks are required to better understand the coming data. We obtained laboratory spectra of X-ray lines from a silicon plasma at an average spectral resolving power of $\sim$7500 with a spherically bent crystal spectrometer on the Z facility at Sandia National Laboratories. Many of the lines in the data are measured here for the first time. We report measurements of 53 transitions originating from the K-shells of He-like to B-like silicon in the energy range between $\sim$1795 and 1880 eV (6.6–6.9 Å). The lines were identified by qualitative comparison against a full synthetic spectrum calculated with ATOMIC. The average fractional uncertainty (uncertainty/energy) for all reported lines is ${\sim}5.4 \times 10^{-5}$. We compare the measured quantities against transition energies calculated with RATS and FAC as well as those reported in the NIST ASD and XSTAR’s uaDB. Average absolute differences relative to experimentally measured values are 0.20, 0.32, 0.17 and 0.38 eV, respectively. All calculations/databases show good agreement with the experimental values; NIST ASD shows the closest match overall.
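As an illustration of the precision being reported, the fractional uncertainty is simply the energy uncertainty divided by the transition energy. A minimal sketch, using hypothetical line energies and uncertainties chosen only to match the quoted ~5.4 × 10⁻⁵ scale (not the paper's actual measurements):

```python
# Sketch only: fractional uncertainty = (energy uncertainty) / (transition energy).
# The energies and uncertainties below are hypothetical, chosen to match the
# ~5.4e-5 average scale quoted in the abstract, not the measured values.
def fractional_uncertainty(energy_ev, sigma_ev):
    return sigma_ev / energy_ev

lines = [(1800.0, 0.10), (1850.0, 0.10), (1875.0, 0.10)]  # (E [eV], sigma [eV])
avg = sum(fractional_uncertainty(e, s) for e, s in lines) / len(lines)
# avg is ~5.4e-5, i.e. roughly 0.1 eV precision on ~1.8 keV lines
```

At this scale, a 0.1 eV uncertainty on a ~1850 eV line is what drives the quoted average.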
Clinical guidelines for personality disorder emphasise the importance of patients being supported to develop psychological skills to help them manage their symptoms and behaviours. But where these mechanisms fail, and hospital admission occurs, little is known about how episodes of acutely disturbed behaviour are managed.
Aims
To explore the clinical characteristics and management of episodes of acutely disturbed behaviour requiring medication in in-patients with a diagnosis of personality disorder.
Method
Analysis of clinical audit data collected in 2024 by the Prescribing Observatory for Mental Health, as part of a quality improvement programme addressing the pharmacological management of acutely disturbed behaviour. Data were collected from clinical records using a bespoke proforma.
Results
Sixty-two mental health Trusts submitted data on 951 episodes of acutely disturbed behaviour involving patients with a personality disorder, with this being the sole psychiatric diagnosis in 471 (50%). Of the total, 782 (82%) episodes occurred in female patients. Compared with males, episodes in females were three times more likely to involve self-harming behaviour or be considered to pose such a risk (22% and 70%, respectively; p < 0.001). Parenteral medication (rapid tranquillisation) was administered twice as often in episodes involving females as in those involving males (64% and 34%, respectively; p < 0.001).
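The "three times more likely" figure can be read as a simple ratio of the two proportions quoted above. A minimal sketch:

```python
def risk_ratio(p_group, p_reference):
    """Ratio of two proportions (expressed as fractions of 1)."""
    return p_group / p_reference

# 70% of episodes in females vs 22% in males involved self-harm or such a risk
rr = risk_ratio(0.70, 0.22)  # ~3.2, i.e. roughly "three times more likely"
```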
Conclusions
Our findings suggest that there are a large number of episodes of acutely disturbed behaviour on psychiatric wards in women with a diagnosis of personality disorder. These episodes are characterised by self-harm and regularly prompt the administration of rapid tranquillisation. This has potential implications for service design, staff training, and research.
Patients with posttraumatic stress disorder (PTSD) exhibit smaller volumes in commonly reported brain regions, including the amygdala and hippocampus, which are associated with fear and memory processing. In the current study, we conducted a voxel-based morphometry (VBM) meta-analysis using whole-brain statistical maps with neuroimaging data from the ENIGMA-PGC PTSD working group.
Methods
T1-weighted structural neuroimaging scans from 36 cohorts (PTSD n = 1309; controls n = 2198) were processed using a standardized VBM pipeline (ENIGMA-VBM tool). We meta-analyzed the resulting statistical maps for voxel-wise differences in gray matter (GM) and white matter (WM) volumes between PTSD patients and controls, performed subgroup analyses considering the trauma exposure of the controls, and examined associations between regional brain volumes and clinical variables including PTSD (CAPS-4/5, PCL-5) and depression severity (BDI-II, PHQ-9).
Results
PTSD patients exhibited smaller GM volumes across the frontal and temporal lobes and cerebellum, with the most significant effect in the left cerebellum (Hedges’ g = 0.22, corrected p = .001), and smaller cerebellar WM volume (peak Hedges’ g = 0.14, corrected p = .008). We observed similar regional differences when comparing patients to trauma-exposed controls, suggesting these structural abnormalities may be specific to PTSD. Regression analyses revealed PTSD severity was negatively associated with GM volumes within the cerebellum (corrected p = .003), while depression severity was negatively associated with GM volumes within the cerebellum and superior frontal gyrus in patients (corrected p = .001).
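For readers unfamiliar with the effect-size metric used here, Hedges' g is Cohen's d with a small-sample bias correction. A minimal generic sketch of the standard formula (not the ENIGMA-VBM pipeline code):

```python
import math

def hedges_g(m1, m2, sd1, sd2, n1, n2):
    """Standardized mean difference with Hedges' small-sample correction."""
    # Pooled standard deviation across the two groups
    sp = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    d = (m1 - m2) / sp                 # Cohen's d
    j = 1 - 3 / (4 * (n1 + n2) - 9)   # small-sample correction factor J
    return d * j
```

With the large samples here (n in the thousands), g and d are nearly identical; the correction mainly matters for small cohorts.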
Conclusions
PTSD patients exhibited widespread, regional differences in brain volumes where greater regional deficits appeared to reflect more severe symptoms. Our findings add to the growing literature implicating the cerebellum in PTSD psychopathology.
DSM-5 specifies bulimia nervosa (BN) severity based on specific thresholds of compensatory behavior frequency. There is limited empirical support for such severity groupings. Limited support could be because the DSM-5’s compensatory behavior frequency cutpoints are inaccurate or because compensatory behavior frequency does not capture true underlying differences in severity. In support of the latter possibility, some work has suggested shape/weight overvaluation or use of single versus multiple purging methods may be better severity indicators. We used structural equation modeling (SEM) Trees to empirically determine the ideal variables and cutpoints for differentiating BN severity, and compared the SEM Tree groupings to alternate severity classifiers: the DSM-5 indicators, single versus multiple purging methods, and a binary indicator of shape/weight overvaluation.
Methods
Treatment-seeking adolescents and adults with BN (N = 1017) completed self-report measures assessing BN and comorbid symptoms. SEM Trees specified an outcome model of BN severity and recursively partitioned this model into subgroups based on shape/weight overvaluation and compensatory behaviors. We then compared groups on clinical characteristics (eating disorder symptoms, depression, anxiety, and binge eating frequency).
Results
SEM Tree analyses resulted in five severity subgroups, all based on shape/weight overvaluation: overvaluation <1.25; overvaluation 1.25–3.74; overvaluation 3.75–4.74; overvaluation 4.75–5.74; and overvaluation ≥5.75. SEM Tree groups explained 1.63–6.41 times the variance explained by other severity schemes.
Conclusions
Shape/weight overvaluation outperformed the DSM-5 severity scheme and single versus multiple purging methods, suggesting the DSM-5 severity scheme should be reevaluated. Future research should examine the predictive utility of this severity scheme.
The Society for Healthcare Epidemiology of America, the Association for Professionals in Infection Control and Epidemiology, the Infectious Diseases Society of America, and the Pediatric Infectious Diseases Society represent the core expertise regarding healthcare infection prevention and infectious diseases and have written a multisociety statement for healthcare facility leaders, regulatory agencies, payors, and patients to strengthen requirements and expectations around facility infection prevention and control (IPC) programs. Based on a systematic literature search and formal consensus process, the authors advocate raising the expectations for facility IPC programs, moving to effective programs that are:
• Foundational and influential parts of the facility’s operational structure
• Resourced with the correct expertise and leadership
• Prioritized to address all potential infectious harms
This document discusses the IPC program’s leadership—a dyad model that includes both physician and infection preventionist leaders—its reporting structure, the expertise and competencies of its members, and the roles and accountability of partnering groups within the healthcare facility. The document outlines a process for identifying minimum IPC program medical director support. It applies to all types of healthcare settings except post-acute long-term care and focuses on resources for the IPC program. Long-term acute care hospital (LTACH) staffing and antimicrobial stewardship programs will be discussed in subsequent documents.
Inadequate recruitment and retention impede clinical trial goals. Emerging decentralized clinical trials (DCTs) leveraging digital health technologies (DHTs) for remote recruitment and data collection aim to address barriers to participation in traditional trials. The ACTIV-6 trial is a DCT using DHTs, but participants’ experiences of such trials remain largely unknown. This study explored participants’ perspectives of the ACTIV-6 DCT that tested outpatient COVID-19 therapeutics.
Methods:
Participants in the ACTIV-6 study were recruited via email to share their day-to-day trial experiences during 1-hour virtual focus groups. Two human factors researchers guided group discussions through a semi-structured script that probed expectations and perceptions of study activities. Qualitative data analysis was conducted using a grounded theory approach with open coding to identify key themes.
Results:
Twenty-eight ACTIV-6 study participants aged 30+ years completed a virtual focus group including 1–4 participants each. Analysis yielded three major themes: perceptions of the DCT experience, study activity engagement, and trust. Participants perceived the use of remote DCT procedures supported by DHTs as an acceptable and efficient method of organizing and tracking study activities, communicating with study personnel, and managing study medications at home. Use of social media was effective in supporting geographically dispersed participant recruitment but also raised issues with trust and study legitimacy.
Conclusions:
While participants in this qualitative study viewed the DCT-with-DHT approach as reasonably efficient and engaging, they also identified challenges to address. Understanding facilitators and barriers to DCT participation and DHT interaction can help improve future research design.
Having a relapse of schizophrenia or recurrent psychosis is feared by patients, can cause social and personal disruption and has been suggested to cause long-term deterioration, possibly because of a toxic biological process.
Aims
To assess whether relapse affected the social and clinical outcomes of people enrolled in a 24-month randomised controlled trial of antipsychotic medication dose reduction versus maintenance treatment.
Methods
The trial involved participants with a diagnosis of schizophrenia or recurrent, non-affective psychosis. Relapse was defined as admission to hospital or significant deterioration (assessed by a blinded end-point committee). We analysed the relationship between relapse during the trial and social functioning, quality of life, symptom scores (Positive and Negative Syndrome Scale) and rates of being in employment, education or training at 24-month follow-up. We also analysed changes in these measures during the trial among those who relapsed and those who did not. Sensitivity analyses were conducted examining the effects of ‘severe’ relapse (i.e. admission to hospital).
Results
During the course of the trial, 82 out of 253 participants relapsed. There was no evidence for a difference between those who relapsed and those who did not on changes in social functioning, quality of life, symptom scores or overall employment rates between baseline and 24-month follow-up. Those who relapsed showed no change in their social functioning or quality of life, and a slight improvement in symptoms compared to baseline. They were more likely than those who did not relapse to have had a change in their employment status (mostly moving out of employment, education or training), although numbers changing status were small. Sensitivity analyses showed the same results for those who experienced a ‘severe’ relapse.
Conclusions
Our data provide little evidence that relapse has a detrimental effect in the long term in people with schizophrenia and recurrent psychosis.
To compare rates of clinical response in children with Clostridioides difficile infection (CDI) treated with metronidazole vs vancomycin.
Design:
A retrospective cohort study was performed as a secondary analysis of a previously established prospective cohort of hospitalized children with CDI. For 187 participants 2–17 years of age who were treated with metronidazole and/or vancomycin, the primary outcome of clinical response (defined as resolution of diarrhea within 5 days of treatment initiation) was identified retrospectively. Baseline variables associated with the primary outcome were included in a logistic regression propensity score model estimating the likelihood of receiving metronidazole vs vancomycin. Logistic regression using inverse probability of treatment weighting (IPTW) was used to estimate the effect of treatment on clinical response.
Results:
One hundred seven subjects received metronidazole and 80 subjects received vancomycin as primary treatment. There was no univariable association between treatment group and clinical response; 78.30% (N = 83) of the metronidazole treatment group and 78.75% (N = 63) of the vancomycin group achieved clinical response (P = 0.941). After adjustment using propensity scores with IPTW, the odds of a clinical response for participants who received metronidazole was 0.554 (95% CI: 0.272, 1.131) times the odds of those who received vancomycin (P = 0.105).
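As background on the adjustment described above: inverse probability of treatment weighting assigns each subject a weight of 1/p if treated and 1/(1 − p) if not, where p is the propensity score. A minimal sketch of the weighting step (illustrative only, not the study's actual model):

```python
def iptw_weights(treated, propensity):
    """IPTW: weight 1/p for treated subjects, 1/(1 - p) for untreated."""
    return [1.0 / p if t else 1.0 / (1.0 - p)
            for t, p in zip(treated, propensity)]

# Hypothetical propensity scores for four subjects
w = iptw_weights([True, False, True, False], [0.25, 0.25, 0.5, 0.8])
# A treated subject with low propensity is up-weighted (1/0.25 = 4.0)
```

The weights create a pseudo-population in which measured baseline covariates are balanced between treatment groups before estimating the treatment effect.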
Conclusions:
In this observational cohort study of pediatric inpatients with CDI, the rate of resolution of diarrhea within 5 days of treatment did not differ between children who received metronidazole and those who received vancomycin.
Accurate diagnosis of bipolar disorder (BPD) is difficult in clinical practice, with an average delay between symptom onset and diagnosis of about 7 years. A depressive episode often precedes the first manic episode, making it difficult to distinguish BPD from unipolar major depressive disorder (MDD).
Aims
We use genome-wide association analyses (GWAS) to identify differential genetic factors and to develop predictors based on polygenic risk scores (PRS) that may aid early differential diagnosis.
Method
Based on individual genotypes from case–control cohorts of BPD and MDD shared through the Psychiatric Genomics Consortium, we compile case–case–control cohorts, applying a careful quality control procedure. In a resulting cohort of 51 149 individuals (15 532 BPD patients, 12 920 MDD patients and 22 697 controls), we perform a variety of GWAS and PRS analyses.
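Conceptually, a polygenic risk score is a weighted sum of risk-allele counts across variants, with weights taken from GWAS effect sizes. A minimal sketch (illustrative only, not the PGC analysis pipeline; the dosages and effect sizes are made up):

```python
def polygenic_risk_score(dosages, effect_sizes):
    """PRS: sum of per-variant risk-allele dosages (0/1/2) times GWAS effect sizes."""
    return sum(d * b for d, b in zip(dosages, effect_sizes))

# Hypothetical dosages at three variants and their (made-up) effect sizes
prs = polygenic_risk_score([2, 1, 0], [0.10, 0.20, 0.30])
```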
Results
Although our GWAS is not well powered to identify genome-wide significant loci, we find significant chip heritability and demonstrate the ability of the resulting PRS to distinguish BPD from MDD, including BPD cases with depressive onset (BPD-D). We replicate our PRS findings in an independent Danish cohort (iPSYCH 2015, N = 25 966). We observe strong genetic correlation between our case–case GWAS and that of case–control BPD.
Conclusions
We find that MDD and BPD, including BPD-D, are genetically distinct. Our findings support the view that controls, MDD patients and BPD patients lie primarily on a continuum of genetic risk. Future studies with larger and richer samples will likely yield a better understanding of these findings and enable the development of better genetic predictors distinguishing BPD and, importantly, BPD-D from MDD.
To understand healthcare workers’ (HCWs) beliefs and practices toward blood culture (BCx) use.
Design:
Cross-sectional electronic survey and semi-structured interviews.
Setting:
Academic hospitals in the United States.
Participants:
HCWs involved in BCx ordering and collection in adult intensive care units (ICU) and wards.
Methods:
We administered an anonymous electronic survey to HCWs and conducted semi-structured interviews with unit staff and quality improvement (QI) leaders in these institutions to understand their perspectives regarding BCx stewardship between February and November 2023.
Results:
Of 314 HCWs who responded to the survey, most (67.4%) were physicians and were involved in BCx ordering (82.3%). Most survey respondents reported that clinicians had a low threshold to culture patients for fever (84.4%) and agreed they could safely reduce the number of BCx obtained in their units (65%). However, only half of them believed BCx was overused. Although most made BCx decisions as a team (74.1%), a minority reported these team discussions occurred daily (42.4%). A third of respondents reported not usually collecting the correct volume per BCx bottle, half were unaware of the improved sensitivity of 2 BCx sets, and most were unsure of the nationally recommended BCx contamination threshold (87.5%). Knowledge regarding the utility of BCx for common infections was limited.
Conclusions:
HCWs’ understanding of best collection practices and yield of BCx was limited.
To examine feasibility, acceptability, and preliminary effectiveness of a novel group-based telemedicine psychoeducation programme aimed at supporting psychological well-being among adolescents with Fontan-palliated CHD.
Study design:
A 5-week telemedicine psychoeducation group-based programme (WE BEAT) was developed for adolescents (N = 20; 13–18 years) with Fontan-palliated CHD aimed at improving resiliency and psychological well-being. Outcome measures included surveys of resilience (Connor–Davidson Resilience Scale), benefit finding (Benefit/Burden Scale for Children), depression, anxiety, peer relationships, and life satisfaction (National Institutes of Health Patient-Reported Outcomes Measurement Information System scales). Within-subject changes in these outcomes were compared pre- to post-intervention using Cohen’s d effect size. In addition, acceptability in the form of satisfaction measures and qualitative feedback was assessed.
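The within-subject effect size referred to above is typically computed as the mean pre-to-post change divided by the standard deviation of the change scores. A minimal sketch with hypothetical scores (illustrative, not the study's code):

```python
import statistics

def cohens_d_paired(pre, post):
    """Within-subject Cohen's d: mean change / SD of the change scores."""
    diffs = [after - before for before, after in zip(pre, post)]
    return statistics.mean(diffs) / statistics.stdev(diffs)

# Hypothetical resilience scores for four participants, pre and post programme
d = cohens_d_paired([20, 22, 19, 25], [22, 23, 21, 28])
```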
Results:
Among eligible patients reached, 68% expressed interest in study participation. Of those who consented, 77% have been scheduled for a group programme to date, with 87% programme completion. Twenty adolescents (mean age 16.1 ± SD 1.6 years) participated across five WE BEAT group cohorts (range: 3–6 participants per group). The majority (80%) attended 4–5 sessions of the 5-session programme, and the median programme rating was 9 out of 10 (10 = most favourable rating). Following WE BEAT participation, resiliency (d = 0.44) and perceptions of purpose in life (d = 0.26) increased, while depressive symptoms reduced (d = 0.36). No other changes in assessed outcome measures were noted.
Conclusions:
These findings provide preliminary support that a group-based, telemedicine delivered psychoeducation programme to support psychological well-being among adolescents with CHD is feasible, acceptable, and effective. Future directions include examining intervention effects across diverse centres, populations, and implementation methods.
In response to the COVID-19 pandemic, we rapidly implemented a plasma coordination center within two months to support transfusion for two outpatient randomized controlled trials. The center design was based on an investigational drug services model and a Food and Drug Administration-compliant database to manage blood product inventory and trial safety.
Methods:
A core investigational team adapted a cloud-based platform to randomize patient assignments and track inventory distribution of control plasma and high-titer COVID-19 convalescent plasma of different blood groups from 29 donor collection centers directly to blood banks serving 26 transfusion sites.
Results:
We performed 1,351 transfusions in 16 months. The transparency of the digital inventory at each site was critical to facilitate qualification, randomization, and overnight shipments of blood group-compatible plasma for transfusions into trial participants. While inventory challenges were heightened with COVID-19 convalescent plasma, the cloud-based system and the flexible approach of the plasma coordination center staff across the blood bank network enabled decentralized procurement and distribution of investigational products to maintain inventory thresholds and overcome local supply chain constraints at the sites.
Conclusion:
The rapid creation of a plasma coordination center for outpatient transfusions is infrequent in the academic setting. Distributing more than 3,100 plasma units to blood banks charged with managing investigational inventory across the U.S. in a decentralized manner posed operational and regulatory challenges while providing opportunities for the plasma coordination center to contribute to research of global importance. This program can serve as a template in subsequent public health emergencies.
Tiafenacil is a new nonselective protoporphyrinogen IX oxidase (PPO)–inhibiting herbicide with both grass and broadleaf activity labeled for preplant application to corn, cotton, soybean, and wheat. Early-season corn emergence and growth often coincide in the mid-South with preplant herbicide application in cotton and soybean, thereby increasing opportunity for off-target herbicide movement from adjacent fields. Field studies were conducted in 2022 to identify the impacts of reduced rates of tiafenacil (12.5% to 0.4% of the lowest labeled application rate of 24.64 g ai ha–1) applied to two- or four-leaf corn. Corn injury 1 wk after treatment (WAT) for the two- and four-leaf growth stages ranged from 31% to 6% and 37% to 9%, respectively, whereas at 2 WAT these respective ranges were 21.7% to 4% and 22.5% to 7.2%. By 4 WAT, visible injury following the two- and four-leaf exposure timing was no greater than 8% in all instances except the highest tiafenacil rate applied at the four-leaf growth stage (13%). Tiafenacil had no negative season-long impact, as the early-season injury observed was not manifested in a reduction in corn height 2 WAT or yield. Application of tiafenacil directly adjacent to corn in early vegetative stages of growth should be avoided. In cases where off-target movement does occur, however, affected corn should be expected to fully recover with no impact on growth and yield, assuming adequate growing conditions and agronomic/pest management practices are provided.
Bioturbation can increase time averaging by downward and upward movements of young and old shells within the entire mixed layer and by accelerating the burial of shells into a sequestration zone (SZ), allowing them to bypass the uppermost taphonomically active zone (TAZ). However, bioturbation can increase shell disintegration concurrently, neutralizing the positive effects of mixing on time averaging. Bioirrigation by oxygenated pore-water promotes carbonate dissolution in the TAZ, and biomixing itself can mill shells weakened by dissolution or microbial maceration, and/or expose them to damage at the sediment–water interface. Here, we fit transition rate matrices to bivalve age–frequency distributions from four sediment cores from the southern California middle shelf (50–75 m) to assess the competing effects of bioturbation on disintegration and time averaging, exploiting a strong gradient in rates of sediment accumulation and bioturbation created by historic wastewater pollution. We find that disintegration covaries positively with mixing at all four sites, in accord with the scenario where bioturbation ultimately fuels carbonate disintegration. Both mixing and disintegration rates decline abruptly at the base of the 20- to 40-cm-thick, age-homogenized surface mixed layer at the three well-bioturbated sites, despite different rates of sediment accumulation. In contrast, mixing and disintegration rates are very low in the upper 25 cm at an effluent site with legacy sediment toxicity, despite recolonization by bioirrigating lucinid bivalves. Assemblages that formed during maximum wastewater emissions vary strongly in time averaging, with millennial scales at the low-sediment accumulation non-effluent sites, a centennial scale at the effluent site where sediment accumulation was high but bioturbation recovered quickly, and a decadal scale at the second high-sedimentation effluent site where bioturbation remained low for decades. 
Thus, even though disintegration rates covary positively with mixing rates, reducing postmortem shell survival, bioturbation has the net effect of increasing the time averaging of skeletal remains on this warm-temperate siliciclastic shelf.
Emerging evidence suggests that routine physical activity may improve exercise capacity, long-term outcomes, and quality of life in individuals with Fontan circulation. Despite this, it is unclear how active these individuals are and what guidance they receive from medical providers regarding physical activity. The aim of this study was to survey Fontan patients on personal physical activity behaviours and their cardiologist-directed physical activity recommendations to set a baseline for future targeted efforts to improve this.
Methods:
An electronic survey assessing physical activity habits and cardiologist-directed guidance was developed in concert with content experts and patients/parents and shared via a social media campaign with Fontan patients and their families.
Results:
A total of 168 individuals completed the survey. The median age of respondents was 10 years, with 51% identifying as male. Overall, 21% of respondents spend > 5 hours per week engaged in low-exertion activity and only 7% spend > 5 hours per week engaged in high-exertion activity. In all domains questioned, pre-adolescents reported higher participation rates than adolescents. Nearly half (43%) of respondents reported that they do not discuss activity recommendations with their cardiologist.
Conclusions:
Despite increasing evidence over the last two decades demonstrating the benefit of exercise for individuals living with Fontan circulation, only a minority of patients report engaging in significant amounts of physical activity or discussing activity goals with their cardiologist. Specific, individualized, and actionable education needs to be provided to patients, families, and providers to promote and support regular physical activity in this patient population.
Underrepresentation of people from racial and ethnic minoritized groups in clinical trials threatens external validity of clinical and translational science, diminishes uptake of innovations into practice, and restricts access to the potential benefits of participation. Despite efforts to increase diversity in clinical trials, children and adults from Latino backgrounds remain underrepresented. Quality improvement concepts, strategies, and tools demonstrate promise in enhancing recruitment and enrollment in clinical trials. To demonstrate this promise, we draw upon our team’s experience conducting a randomized clinical trial that tests three behavioral interventions designed to promote equity in language and social-emotional skill acquisition among Latino parent–infant dyads from under-resourced communities. The recruitment activities took place during the COVID-19 pandemic, which intensified the need for responsive strategies and procedures. We used the Model for Improvement to achieve our recruitment goals. Across study stages, we engaged strategies such as (1) intentional team formation, (2) participatory approaches to setting goals, monitoring achievement, and selecting change strategies, and (3) small iterative tests that informed additional efforts. These strategies helped our team overcome several barriers. These strategies may help other researchers apply quality improvement tools to increase participation in clinical and translational research among people from minoritized groups.
Tiafenacil is a new non-selective protoporphyrinogen IX oxidase (PPO)-inhibiting herbicide with both grass and broadleaf activity labeled for preplant application to corn (Zea mays L.), cotton (Gossypium hirsutum L.), soybean [Glycine max (L.) Merr.], and wheat (Triticum aestivum L.). Early-season soybean emergence and growth often coincide in the U.S. Midsouth with preplant herbicide application in later-planted cotton and soybean, thereby increasing opportunity for off-target herbicide movement from adjacent fields. Field studies were conducted in 2022 to identify any deleterious impacts of reduced rates of tiafenacil (12.5% to 0.4% of the lowest labeled application rate of 24.64 g ai ha−1) applied to 1- to 2-leaf soybean. Visual injury at 1 wk after treatment (WAT) with 1/8×, 1/16×, 1/32×, and 1/64× rate of tiafenacil was 80%, 61%, 39%, and 21%, while at 4 WAT, these respective rates resulted in visual injury of 67%, 33%, 14%, and 4%. Tiafenacil at these respective rates reduced soybean height 55% to 2% and 53% to 5% at 1 and 4 WAT and soybean yield 53%, 24%, 5%, and 1%. Application of tiafenacil directly adjacent to soybean in early vegetative growth should be avoided, as severe visual injury will occur. In cases where off-target movement does occur, impacted soybean should not be expected to fully recover, and negative impact on growth and yield will be observed.
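The fractional drift rates tested translate directly into grams of active ingredient per hectare. A minimal sketch using the labeled rate quoted in the abstract:

```python
LOWEST_LABEL_RATE = 24.64  # g ai/ha, lowest labeled tiafenacil rate (from abstract)

def reduced_rate(fraction):
    """Application rate for a fractional off-target exposure (e.g. 1/8 of label)."""
    return LOWEST_LABEL_RATE * fraction

rate_1_8 = reduced_rate(1 / 8)    # ~3.08 g ai/ha
rate_1_64 = reduced_rate(1 / 64)  # ~0.385 g ai/ha, i.e. the 1/64x drift rate
```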
Medications with anticholinergic properties are associated with a range of adverse effects that tend to be worse in older people.
Aims
To investigate medication regimens with high anticholinergic burden, prescribed for older adults under the care of mental health services.
Method
Clinical audit of prescribing practice, using a standardised data collection tool.
Results
Fifty-seven trusts/healthcare organisations submitted data on medicines prescribed for 7915 patients: two-thirds (66%) were prescribed medication with anticholinergic properties, while just under a quarter (23%) had a medication regimen with high anticholinergic burden (total score ≥3 on the Anticholinergic Effect on Cognition (AEC) scale). Some 16% of patients with a diagnosis of dementia or mild cognitive impairment were prescribed medication regimens with a high anticholinergic burden, compared with 35% of those without such diagnoses. A high anticholinergic burden was mostly because of combinations of commonly prescribed psychotropic medications, principally antidepressant and antipsychotic medications with individual AEC scores of 1 or 2.
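The high-burden criterion used in this audit is a simple sum over the patient's regimen. A minimal sketch (the per-drug scores in the example are hypothetical):

```python
def high_anticholinergic_burden(aec_scores, threshold=3):
    """True if the summed AEC scores of a regimen meet the high-burden cutoff (>=3)."""
    return sum(aec_scores) >= threshold

# Two modest-scoring drugs (AEC 1 and 2) are enough to cross the threshold,
# mirroring the inadvertent-combination pattern described in the results.
flag = high_anticholinergic_burden([1, 2])  # True
```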
Conclusions
Adults under the care of older people's mental health services are commonly prescribed multiple medications for psychiatric and physical disorders; these medication regimens can have a high anticholinergic burden, often an inadvertent consequence of the co-prescription of medications with modest anticholinergic activity. Prescribers for older adults should assess the anticholinergic burden of medication regimens, assiduously check for adverse anticholinergic effects and consider alternative medications with less anticholinergic effect where indicated. The use of a scale, such as the AEC, which identifies the level of central anticholinergic activity of relevant medications, can be a helpful clinical guide.
DSM-5 differentiates avoidant/restrictive food intake disorder (ARFID) from other eating disorders (EDs) by a lack of overvaluation of body weight/shape driving restrictive eating. However, clinical observations and research demonstrate that ARFID and shape/weight motivations sometimes co-occur. To inform classification, we: (1) derived profiles underlying restriction motivation and examined their validity; and (2) described diagnostic characterizations of individuals in each profile to explore whether findings support current diagnostic schemes. We expected, consistent with DSM-5, that profiles would comprise individuals endorsing solely ARFID or solely restraint (i.e. trying to eat less to control shape/weight) motivations.
Methods
We applied latent profile analysis to 202 treatment-seeking individuals (ages 10–79 years [M = 26, s.d. = 14], 76% female) with ARFID or a non-ARFID ED, using the Nine-Item ARFID Screen (Picky, Appetite, and Fear subscales) and the Eating Disorder Examination-Questionnaire Restraint subscale as indicators.
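Latent profile analysis is typically run in dedicated statistical software; as a rough sketch of the underlying idea, a Gaussian mixture model over the indicator scores can be used as a stand-in to recover well-separated profiles. The synthetic data, profile means, and two-profile setup below are assumptions for illustration only, not the study's data or its five-profile solution:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Synthetic indicator scores (columns: NIAS Picky, Appetite, Fear; EDE-Q Restraint).
# The means are invented to mimic an "ARFID-like" and a "Restraint-like" group.
arfid_like = rng.normal([4.0, 3.5, 1.0, 0.5], 0.3, size=(60, 4))
restraint_like = rng.normal([0.5, 0.8, 0.4, 4.5], 0.3, size=(60, 4))
X = np.vstack([arfid_like, restraint_like])

# Fit a two-component Gaussian mixture and assign each case to a profile.
gm = GaussianMixture(n_components=2, random_state=0).fit(X)
labels = gm.predict(X)
# With clusters this well separated, each synthetic group maps to one component.
```

In practice the number of profiles is chosen by comparing solutions on fit indices (e.g. BIC), which is how a five-profile solution such as the one reported here would be selected.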
Results
A 5-profile solution emerged: Restraint/ARFID-Mixed (n = 24; 8% [n = 2] with ARFID diagnosis); ARFID-2 (with Picky/Appetite; n = 56; 82% ARFID); ARFID-3 (with Picky/Appetite/Fear; n = 40; 68% ARFID); Restraint (n = 45; 11% ARFID); and Non-Endorsers (n = 37; 2% ARFID). Two profiles comprised individuals endorsing solely ARFID motivations (ARFID-2, ARFID-3) and one comprised individuals endorsing solely restraint motivations (Restraint), consistent with DSM-5. However, the Restraint/ARFID-Mixed profile (92% non-ARFID ED diagnoses, comprising 18% of those with non-ARFID ED diagnoses in the full sample) endorsed both ARFID and restraint motivations.
Conclusions
The heterogeneous profiles identified suggest ARFID and restraint motivations for dietary restriction may overlap somewhat and that individuals with non-ARFID EDs can also endorse high ARFID symptoms. Future research should clarify diagnostic boundaries between ARFID and non-ARFID EDs.
Understanding characteristics of healthcare personnel (HCP) with SARS-CoV-2 infection supports the development and prioritization of interventions to protect this important workforce. We report detailed characteristics of HCP who tested positive for SARS-CoV-2 from April 20, 2020 through December 31, 2021.
Methods:
CDC collaborated with Emerging Infections Program sites in 10 states to interview HCP with SARS-CoV-2 infection (case-HCP) about their demographics, underlying medical conditions, healthcare roles, exposures, personal protective equipment (PPE) use, and COVID-19 vaccination status. We grouped case-HCP by healthcare role. To describe residential social vulnerability, we merged geocoded HCP residential addresses with CDC/ATSDR Social Vulnerability Index (SVI) values at the census tract level. We defined highest and lowest SVI quartiles as high and low social vulnerability, respectively.
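The SVI linkage step described above can be sketched as a tract-level merge followed by quartile labeling. The column names, tract identifiers, and SVI values below are assumptions for illustration; the actual analysis used geocoded residential addresses and CDC/ATSDR SVI data at the census tract level:

```python
import pandas as pd

# Hypothetical geocoded case-HCP records (tract FIPS codes are placeholders).
hcp = pd.DataFrame({"hcp_id": [1, 2, 3, 4],
                    "tract_fips": ["A", "B", "C", "D"]})
# Hypothetical tract-level SVI values (real values come from CDC/ATSDR SVI data).
svi = pd.DataFrame({"tract_fips": ["A", "B", "C", "D"],
                    "svi": [0.10, 0.40, 0.60, 0.95]})

# Merge each person's residential tract with its SVI value.
merged = hcp.merge(svi, on="tract_fips", how="left")

# Split SVI into quartiles; highest quartile = high, lowest = low vulnerability.
merged["svi_quartile"] = pd.qcut(merged["svi"], 4, labels=["Q1", "Q2", "Q3", "Q4"])
merged["vulnerability"] = (merged["svi_quartile"].astype(str)
                           .map({"Q1": "low", "Q4": "high"})
                           .fillna("middle"))
```

In the study's terms, "high" and "low" correspond to the highest and lowest SVI quartiles; the middle two quartiles are left unlabeled here.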
Results:
Our analysis included 7,531 case-HCP. Most case-HCP with roles as certified nursing assistant (CNA) (444, 61.3%), medical assistant (252, 65.3%), or home healthcare worker (HHW) (225, 59.5%) reported their race and ethnicity as either non-Hispanic Black or Hispanic. More than one third of HHWs (166, 45.2%), CNAs (283, 41.7%), and medical assistants (138, 37.9%) reported a residential address in the high social vulnerability category. The proportion of case-HCP who reported using recommended PPE at all times when caring for patients with COVID-19 was lowest among HHWs.
Conclusions:
To mitigate SARS-CoV-2 infection risk in healthcare settings, infection prevention and control interventions should be specific to HCP roles and educational backgrounds. Additional interventions are needed to address high social vulnerability among HHWs, CNAs, and medical assistants.