Next-generation X-ray satellite telescopes such as XRISM, NewAthena and Lynx will enable observations of exotic astrophysical sources at unprecedented spectral and spatial resolution. Proper interpretation of these data demands that the accuracy of the models is at least within the uncertainty of the observations. One set of quantities that might not currently meet this requirement is transition energies of various astrophysically relevant ions. Current databases are populated with many untested theoretical calculations. Accurate laboratory benchmarks are required to better understand the coming data. We obtained laboratory spectra of X-ray lines from a silicon plasma at an average spectral resolving power of $\sim$7500 with a spherically bent crystal spectrometer on the Z facility at Sandia National Laboratories. Many of the lines in the data are measured here for the first time. We report measurements of 53 transitions originating from the K-shells of He-like to B-like silicon in the energy range between $\sim$1795 and 1880 eV (6.6–6.9 Å). The lines were identified by qualitative comparison against a full synthetic spectrum calculated with ATOMIC. The average fractional uncertainty (uncertainty/energy) for all reported lines is ${\sim}5.4 \times 10^{-5}$. We compare the measured quantities against transition energies calculated with RATS and FAC as well as those reported in the NIST ASD and XSTAR’s uaDB. Average absolute differences relative to experimentally measured values are 0.20, 0.32, 0.17 and 0.38 eV, respectively. All calculations/databases show good agreement with the experimental values; NIST ASD shows the closest match overall.
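For readers unfamiliar with the two figures of merit quoted above, they can be sketched as follows. The values here are hypothetical placeholders, not the measured silicon line list:

```python
# Illustration of the two figures of merit quoted above, computed on
# hypothetical line energies (eV) -- not the measured silicon line list.
measured = [1795.2, 1839.4, 1853.7]   # hypothetical measured energies (eV)
uncert = [0.10, 0.09, 0.11]           # hypothetical 1-sigma uncertainties (eV)
database = [1795.0, 1839.8, 1853.5]   # hypothetical tabulated energies (eV)

# Average fractional uncertainty: mean of uncertainty/energy over all lines.
avg_frac = sum(u / e for u, e in zip(uncert, measured)) / len(measured)

# Average absolute difference of database values from measured values (eV).
avg_abs_diff = sum(abs(d - e) for d, e in zip(database, measured)) / len(measured)
```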
Patients with posttraumatic stress disorder (PTSD) exhibit smaller brain volumes in commonly reported regions, including the amygdala and hippocampus, which are associated with fear and memory processing. In the current study, we conducted a voxel-based morphometry (VBM) meta-analysis using whole-brain statistical maps with neuroimaging data from the ENIGMA-PGC PTSD working group.
Methods
T1-weighted structural neuroimaging scans from 36 cohorts (PTSD n = 1309; controls n = 2198) were processed using a standardized VBM pipeline (ENIGMA-VBM tool). We meta-analyzed the resulting statistical maps for voxel-wise differences in gray matter (GM) and white matter (WM) volumes between PTSD patients and controls, performed subgroup analyses considering the trauma exposure of the controls, and examined associations between regional brain volumes and clinical variables including PTSD (CAPS-4/5, PCL-5) and depression severity (BDI-II, PHQ-9).
Results
PTSD patients exhibited smaller GM volumes across the frontal and temporal lobes and cerebellum, with the most significant effect in the left cerebellum (Hedges’ g = 0.22, corrected p = .001), and smaller cerebellar WM volume (peak Hedges’ g = 0.14, corrected p = .008). We observed similar regional differences when comparing patients to trauma-exposed controls, suggesting these structural abnormalities may be specific to PTSD. Regression analyses revealed that PTSD severity was negatively associated with GM volumes within the cerebellum (corrected p = .003), while depression severity was negatively associated with GM volumes within the cerebellum and superior frontal gyrus in patients (corrected p = .001).
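The standardized effect sizes reported here (Hedges’ g) can be computed for a single two-group comparison as below. This is a generic sketch on made-up numbers, not the ENIGMA-VBM pipeline itself, which meta-analyzes effects across cohorts:

```python
from math import sqrt

def hedges_g(x, y):
    """Hedges' g: pooled-SD standardized mean difference with the
    small-sample bias correction J = 1 - 3 / (4N - 9)."""
    nx, ny = len(x), len(y)
    mx, my = sum(x) / nx, sum(y) / ny
    vx = sum((v - mx) ** 2 for v in x) / (nx - 1)   # sample variance, group x
    vy = sum((v - my) ** 2 for v in y) / (ny - 1)   # sample variance, group y
    pooled = sqrt(((nx - 1) * vx + (ny - 1) * vy) / (nx + ny - 2))
    d = (mx - my) / pooled                          # Cohen's d
    j = 1 - 3 / (4 * (nx + ny) - 9)                 # small-sample correction
    return j * d
```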
Conclusions
PTSD patients exhibited widespread, regional differences in brain volumes where greater regional deficits appeared to reflect more severe symptoms. Our findings add to the growing literature implicating the cerebellum in PTSD psychopathology.
DSM-5 specifies bulimia nervosa (BN) severity based on specific thresholds of compensatory behavior frequency. There is limited empirical support for such severity groupings. Limited support could be because the DSM-5’s compensatory behavior frequency cutpoints are inaccurate or because compensatory behavior frequency does not capture true underlying differences in severity. In support of the latter possibility, some work has suggested shape/weight overvaluation or use of single versus multiple purging methods may be better severity indicators. We used structural equation modeling (SEM) Trees to empirically determine the ideal variables and cutpoints for differentiating BN severity, and compared the SEM Tree groupings to alternate severity classifiers: the DSM-5 indicators, single versus multiple purging methods, and a binary indicator of shape/weight overvaluation.
Methods
Treatment-seeking adolescents and adults with BN (N = 1017) completed self-report measures assessing BN and comorbid symptoms. SEM Trees specified an outcome model of BN severity and recursively partitioned this model into subgroups based on shape/weight overvaluation and compensatory behaviors. We then compared groups on clinical characteristics (eating disorder symptoms, depression, anxiety, and binge eating frequency).
Results
SEM Tree analyses resulted in five severity subgroups, all based on shape/weight overvaluation: overvaluation <1.25; overvaluation 1.25–3.74; overvaluation 3.75–4.74; overvaluation 4.75–5.74; and overvaluation ≥5.75. SEM Tree groups explained 1.63–6.41 times the variance explained by other severity schemes.
Conclusions
Shape/weight overvaluation outperformed the DSM-5 severity scheme and single versus multiple purging methods, suggesting the DSM-5 severity scheme should be reevaluated. Future research should examine the predictive utility of this severity scheme.
Recent changes to US research funding are having far-reaching consequences that imperil the integrity of science and the provision of care to vulnerable populations. Resisting these changes, the BJPsych Portfolio reaffirms its commitment to publishing mental science and advancing psychiatric knowledge that improves the mental health of one and all.
The Society for Healthcare Epidemiology of America, the Association of Professionals in Infection Control and Epidemiology, the Infectious Diseases Society of America, and the Pediatric Infectious Diseases Society represent the core expertise regarding healthcare infection prevention and infectious diseases and have written a multisociety statement for healthcare facility leaders, regulatory agencies, payors, and patients to strengthen requirements and expectations around facility infection prevention and control (IPC) programs. Based on a systematic literature search and formal consensus process, the authors advocate raising the expectations for facility IPC programs, moving to effective programs that are:
• Foundational and influential parts of the facility’s operational structure
• Resourced with the correct expertise and leadership
• Prioritized to address all potential infectious harms
This document discusses the IPC program’s leadership—a dyad model that includes both physician and infection preventionist leaders—its reporting structure, expertise, and competencies of its members, and the roles and accountability of partnering groups within the healthcare facility. The document outlines a process for identifying minimum IPC program medical director support. It applies to all types of healthcare settings except post-acute long-term care and focuses on resources for the IPC program. Long-term acute care hospital (LTACH) staffing and antimicrobial stewardship programs will be discussed in subsequent documents.
Acquired chylothorax is an established complication of CHD surgery, affecting 2–9% of patients. CHD places a child at risk for failure to thrive, with subsequent chylothorax imposing additional risk.
Objective:
We conducted a retrospective chart review to ascertain quantitative markers of nutrition and growth in children affected by chylothorax following CHD surgery between 2018 and 2022 compared to controls.
Methods:
We utilised the electronic medical record system, EPIC, at Children’s Hospital, New Orleans, targeting subjects < 18 years old who underwent CHD surgery between 2018 and 2022 and developed a subsequent chylothorax. Study subjects were identified using the 10th revision of the International Classification of Diseases codes (ICD-10 codes: J94.0, I89.8, and J90.0). Each chylothorax case (n = 20) was matched by procedure type and age to a control with no chylothorax (n = 20). Data were recorded in REDCap and analysed using SPSS.
Results:
After removal of outliers, we analysed 19 total matched pairs. There was no statistical difference in growth velocity (p = 0.12), weight change (operation to discharge) (p = 0.95), weight change (admission to discharge) (p = 0.35), Z-score change (operation to discharge) (p = 0.90), Z-score change (admission to discharge) (p = 0.21), serum protein (p = 0.88), or serum albumin (p = 0.82). Among cases, linear regression demonstrated no significant association between maximum chylous output and growth velocity (p = 0.91), weight change (operation to discharge) (p = 0.15), or weight change (admission to discharge) (p = 0.98).
Conclusions:
We did not observe statistically significant differences in markers of growth or nutrition between children with chylothorax post-CHD surgery and those without chylothorax. Multisite data collection and analysis are required to better ascertain the clinical impact and guide clinical practice.
Accurate diagnosis of bipolar disorder (BPD) is difficult in clinical practice, with an average delay between symptom onset and diagnosis of about 7 years. A depressive episode often precedes the first manic episode, making it difficult to distinguish BPD from unipolar major depressive disorder (MDD).
Aims
We use genome-wide association analyses (GWAS) to identify differential genetic factors and to develop predictors based on polygenic risk scores (PRS) that may aid early differential diagnosis.
Method
Based on individual genotypes from case–control cohorts of BPD and MDD shared through the Psychiatric Genomics Consortium, we compile case–case–control cohorts, applying a careful quality control procedure. In a resulting cohort of 51 149 individuals (15 532 BPD patients, 12 920 MDD patients and 22 697 controls), we perform a variety of GWAS and PRS analyses.
Results
Although our GWAS is not well powered to identify genome-wide significant loci, we find significant chip heritability and demonstrate the ability of the resulting PRS to distinguish BPD from MDD, including BPD cases with depressive onset (BPD-D). We replicate our PRS findings in an independent Danish cohort (iPSYCH 2015, N = 25 966). We observe strong genetic correlation between our case–case GWAS and that of case–control BPD.
Conclusions
We find that MDD and BPD, including BPD-D, are genetically distinct. Our findings support the view that controls, MDD patients and BPD patients lie primarily on a continuum of genetic risk. Future studies with larger and richer samples will likely yield a better understanding of these findings and enable the development of better genetic predictors distinguishing BPD and, importantly, BPD-D from MDD.
In response to the COVID-19 pandemic, we implemented a plasma coordination center within two months to support transfusions for two outpatient randomized controlled trials. The center design was based on an investigational drug services model and a Food and Drug Administration-compliant database to manage blood product inventory and trial safety.
Methods:
A core investigational team adapted a cloud-based platform to randomize patient assignments and track inventory distribution of control plasma and high-titer COVID-19 convalescent plasma of different blood groups from 29 donor collection centers directly to blood banks serving 26 transfusion sites.
Results:
We performed 1,351 transfusions in 16 months. The transparency of the digital inventory at each site was critical to facilitate qualification, randomization, and overnight shipments of blood group-compatible plasma for transfusion into trial participants. While inventory challenges were heightened with COVID-19 convalescent plasma, the cloud-based system and the flexible approach of the plasma coordination center staff across the blood bank network enabled decentralized procurement and distribution of investigational products to maintain inventory thresholds and overcome local supply chain constraints at the sites.
Conclusion:
The rapid creation of a plasma coordination center for outpatient transfusions is infrequent in the academic setting. Distributing more than 3,100 plasma units to blood banks charged with managing investigational inventory across the U.S. in a decentralized manner posed operational and regulatory challenges while providing opportunities for the plasma coordination center to contribute to research of global importance. This program can serve as a template in subsequent public health emergencies.
Early nutritional and growth experiences can impact development, metabolic function, and reproductive outcomes in adulthood, influencing health trajectories in the next generation. The insulin-like growth factor (IGF) axis regulates growth, metabolism, and energetic investment, but whether it plays a role in the pathway linking maternal experience with offspring prenatal development is unclear. To test this, we investigated patterns of maternal developmental weight gain (a proxy of early nutrition), young adult energy stores, age, and parity as predictors of biomarkers of the pregnancy IGF axis (n = 36) using data from the Cebu Longitudinal Health and Nutrition Survey in Metro Cebu, Philippines. We analyzed maternal conditional weight measures at 2, 8, and 22 years of age and leptin at age 22 (a marker of body fat/energy stores) in relation to free IGF-1 and IGFBP-3 in mid/late pregnancy (mean age = 27). Maternal IGF axis measures were also assessed as predictors of offspring fetal growth. Maternal age, parity, and age 22 leptin were associated with pregnancy free IGF-1, offspring birth weight, and offspring skinfold thickness. We find that free IGF-1 levels in pregnancy are more closely related to nutritional status in early adulthood than to preadult developmental nutrition and demonstrate significant effects of young adult leptin on offspring fetal fat mass deposition. We suggest that the previously documented finding that maternal developmental nutrition predicts offspring birth size likely operates through pathways other than the maternal IGF axis, which reflects more recent energy status.
NASA’s all-sky survey mission, the Transiting Exoplanet Survey Satellite (TESS), is specifically engineered to detect exoplanets that transit bright stars. Thus far, TESS has successfully identified approximately 400 transiting exoplanets, in addition to roughly 6,000 candidate exoplanets pending confirmation. In this study, we present the results of our ongoing project, the Validation of Transiting Exoplanets using Statistical Tools (VaTEST). Our dedicated effort is focused on the confirmation and characterisation of new exoplanets through the application of statistical validation tools. Through a combination of ground-based telescope data, high-resolution imaging, and the statistical validation tool known as TRICERATOPS, we have successfully discovered eight potential super-Earths. These planets bear the designations: TOI-238b (1.61$^{+0.09} _{-0.10}$ R$_\oplus$), TOI-771b (1.42$^{+0.11} _{-0.09}$ R$_\oplus$), TOI-871b (1.66$^{+0.11} _{-0.11}$ R$_\oplus$), TOI-1467b (1.83$^{+0.16} _{-0.15}$ R$_\oplus$), TOI-1739b (1.69$^{+0.10} _{-0.08}$ R$_\oplus$), TOI-2068b (1.82$^{+0.16} _{-0.15}$ R$_\oplus$), TOI-4559b (1.42$^{+0.13} _{-0.11}$ R$_\oplus$), and TOI-5799b (1.62$^{+0.19} _{-0.13}$ R$_\oplus$). Six of these planets fall within the region known as ‘keystone planets’, which makes them particularly interesting for study. Based on the location of TOI-771b and TOI-4559b below the radius valley, we characterised them as likely super-Earths, though radial-velocity mass measurements will provide more detail about their characterisation. Notably, planets within the size range investigated herein are absent from our own Solar System, making their study crucial for gaining insights into the evolutionary stages between Earth and Neptune.
The oxidation of As(III) to As(V) by K-birnessite was examined at different temperatures, pHs, and birnessite/As(III) ratios. Experiments ranged in duration from 5 to 64 hr, and solution and solid products were determined at several intervals. All experiments showed that the reaction released large amounts of K+ to solution and very little Mn2+. As(V) was released to solution and incorporated into the K-birnessite. The oxidation was initially rapid and then slowed. The oxidation of As(III) was probably facilitated initially by autocatalytic Mn-As(V) reactions occurring mostly in the interlayer, from which large amounts of As(V) and K+ could be easily released to solution. The reaction also slowed when interlayer Mn was exhausted by forming Mn-As(V) complexes, since Mn(IV) could then only be acquired from the octahedral sheets of the birnessite. The two-stage reaction process proposed here depended on the layered structure of birnessite, the specific surface area, and the presence of exchangeable cations in K-birnessite.
Blood-based biomarkers offer a more feasible alternative to current in vivo measures for Alzheimer’s disease (AD) detection, management, and study of disease mechanisms. Given their novelty, these plasma biomarkers must be assessed against postmortem neuropathological outcomes for validation. Research has shown utility in plasma markers of the proposed AT(N) framework; however, recent studies have stressed the importance of expanding this framework to include other pathways. There are promising data supporting the usefulness of plasma glial fibrillary acidic protein (GFAP) in AD, but GFAP-to-autopsy studies are limited. Here, we tested the association between plasma GFAP and AD-related neuropathological outcomes in participants from the Boston University (BU) Alzheimer’s Disease Research Center (ADRC).
Participants and Methods:
This sample included 45 participants from the BU ADRC who had a plasma sample within 5 years of death and donated their brain for neuropathological examination. The most recent plasma samples were analyzed using the Simoa platform. Neuropathological examinations followed the National Alzheimer’s Coordinating Center procedures and diagnostic criteria. The NIA-Reagan Institute criteria were used for the neuropathological diagnosis of AD. Measures of GFAP were log-transformed. Binary logistic regression analyses tested the association between GFAP and autopsy-confirmed AD status, as well as with semi-quantitative ratings of regional atrophy (none/mild versus moderate/severe). Ordinal logistic regression analyses tested the association between plasma GFAP and Braak stage and CERAD neuritic plaque score. Area under the curve (AUC) statistics from receiver operating characteristics (ROC) using predicted probabilities from binary logistic regression examined the ability of plasma GFAP to discriminate autopsy-confirmed AD status. All analyses controlled for sex, age at death, years between last blood draw and death, and APOE e4 status.
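For reference, the AUC statistic named above can be obtained directly from predicted probabilities via the rank (Mann-Whitney) formulation. This is a stdlib-only sketch with hypothetical labels and probabilities, not the study's analysis code:

```python
def auc_from_scores(labels, scores):
    """AUC = probability that a randomly chosen positive case receives a
    higher predicted probability than a randomly chosen negative case
    (ties count as 0.5)."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical autopsy-confirmed AD labels and model-predicted probabilities.
labels = [1, 1, 1, 0, 0, 0]
probs = [0.9, 0.8, 0.4, 0.7, 0.3, 0.2]
```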
Results:
Of the 45 brain donors, 29 (64.4%) had autopsy-confirmed AD. The mean (SD) age of the sample at the time of blood draw was 80.76 (8.58) and there were 2.80 (1.16) years between the last blood draw and death. The sample included 20 (44.4%) females, 41 (91.1%) were White, and 20 (44.4%) were APOE e4 carriers. Higher GFAP concentrations were associated with increased odds for having autopsy-confirmed AD (OR=14.12, 95% CI [2.00, 99.88], p=0.008). ROC analysis showed plasma GFAP accurately discriminated those with and without autopsy-confirmed AD on its own (AUC=0.75) and strengthened as the above covariates were added to the model (AUC=0.81). Increases in GFAP levels corresponded to increases in Braak stage (OR=2.39, 95% CI [0.71-4.07], p=0.005), but not CERAD ratings (OR=1.24, 95% CI [0.004, 2.49], p=0.051). Higher GFAP levels were associated with greater temporal lobe atrophy (OR=10.27, 95% CI [1.53,69.15], p=0.017), but this was not observed with any other regions.
Conclusions:
The current results show that antemortem plasma GFAP is associated with non-specific AD neuropathological changes at autopsy. Plasma GFAP could be a useful and practical biomarker for assisting in the detection of AD-related changes, as well as for study of disease mechanisms.
Hippocampal pathology is a consistent feature in persons with temporal lobe epilepsy (TLE) and a strong biomarker of memory impairment. Histopathological studies have identified selective patterns of cell loss across hippocampal subfields in TLE, the most common being cellular loss in the cornu ammonis 1 (CA1) and dentate gyrus (DG). Structural neuroimaging provides a non-invasive method to understand hippocampal pathology, but traditionally only at a whole-hippocampal level. However, recent methodological advances have enabled the non-invasive quantification of subfield pathology in patients, enabling potential integration into clinical workflow. In this study, we characterize patterns of hippocampal subfield atrophy in patients with TLE and examine the associations between subfield atrophy and clinical characteristics.
Participants and Methods:
High-resolution T2- and T1-weighted MRI scans were collected from 31 participants (14 left TLE; 6 right TLE; 11 healthy controls [HC], aged 18-61 years). Reconstructions of hippocampal subfields and estimates of their volumes were derived using the Automated Segmentation of Hippocampal Subfields (ASHS) pipeline. Total hippocampal volume was calculated by combining estimates of the subfields CA1-3, DG, and subiculum. To control for variations in head size, all volume estimates were divided by estimates of total brain volume. To assess disease effects on hippocampal atrophy, hippocampi were recoded as either ipsilateral or contralateral to the side of seizure focus. Two-sample t-tests at a whole-hippocampus level were used to test for ipsilateral and contralateral volume loss in patients relative to HC. To assess whether we replicated the selective histopathological patterns of subfield atrophy, we carried out a mixed-effects ANOVA, coding for an interaction between diagnostic group and hippocampal subfield. Finally, to assess effects of disease load, non-parametric correlations were performed between subfield volume and age of first seizure and duration of illness.
Results:
Patients had significantly smaller total ipsilateral hippocampal volume compared with HC (d=1.23, p<.005). Contralateral hippocampal volume did not significantly differ between TLE and HC. Examining individual subfields for the ipsilateral hemisphere revealed significant main effects of group (F(1, 29)=8.2, p<0.01) and subfield (F(4, 115)=550.5, p<0.005), and a significant interaction (F(4, 115)=8.1, p<0.001). Post-hoc tests revealed that TLE had significantly smaller volume in the ipsilateral CA1 (d=-2.0, p<0.001) and DG (d=-1.4, p<0.005). Longer duration of illness was associated with smaller volume of ipsilateral CA2 (ρ=-0.492, p<0.05) and larger volume of contralateral whole-hippocampus (ρ=0.689, p<0.001), CA1 (ρ=0.614, p<0.005), and DG (ρ=0.450, p<0.05).
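The non-parametric correlations reported here (Spearman's ρ) can be sketched in a few lines. This is a generic, tie-free implementation for illustration, not the study's analysis code:

```python
def spearman_rho(x, y):
    """Spearman rank correlation via the classic formula
    rho = 1 - 6 * sum(d^2) / (n * (n^2 - 1)).
    Assumes all values are distinct (no tie correction)."""
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0] * len(v)
        for rank, i in enumerate(order, start=1):
            r[i] = rank
        return r
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n ** 2 - 1))
```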
Conclusions:
Histopathological characterization after surgery has revealed important associations between hippocampal subfield cell loss and memory impairments in patients with TLE. Here we demonstrate that non-invasive neuroimaging can detect a pattern of subfield atrophy in TLE (i.e., CA1/DG) that matches the most common form of histopathologically-observed hippocampal sclerosis in TLE (HS Type 1) and has been linked directly to both verbal and visuospatial memory impairment. Finally, we found evidence that longer disease duration is associated with larger contralateral hippocampal volume, driven by increases in CA1 and DG. This may reflect subfield-specific functional reorganization to the unaffected brain tissue, a compensatory effect which may have important implications for patient function and successful treatment outcomes.
The recent achievement of fusion ignition with laser-driven technologies at the National Ignition Facility sets a historic accomplishment in fusion energy research. This accomplishment paves the way for using laser inertial fusion as a viable approach for future energy production. Europe has a unique opportunity to empower research in this field internationally, and the scientific community is eager to engage in this journey. We propose establishing a European programme on inertial-fusion energy with the mission to demonstrate laser-driven ignition in the direct-drive scheme and to develop pathway technologies for the commercial fusion reactor. The proposed roadmap is based on four complementary axes: (i) the physics of laser–plasma interaction and burning plasmas; (ii) high-energy high repetition rate laser technology; (iii) fusion reactor technology and materials; and (iv) reinforcement of the laser fusion community by international education and training programmes. We foresee collaboration with universities, research centres and industry and establishing joint activities with the private sector involved in laser fusion. This project aims to stimulate a broad range of high-profile industrial developments in laser, plasma and radiation technologies along with the expected high-level socio-economic impact.
To assess the safety and efficacy of a novel beta-lactam allergy assessment algorithm managed by an antimicrobial stewardship program (ASP) team.
Design:
Retrospective analysis.
Setting:
One quaternary referral teaching hospital and one tertiary care teaching hospital in a large western Pennsylvania health network.
Patients or participants:
Patients who received a beta-lactam challenge dose under the beta-lactam allergy assessment algorithm.
Interventions:
A beta-lactam allergy assessment protocol was designed and implemented by an ASP team. The protocol risk stratified patients’ reported allergies to identify patients appropriate for a challenge with a beta-lactam antibiotic. This retrospective analysis assessed the safety and efficacy of this protocol among patients receiving a challenge dose from November 2017 to July 2021.
Results:
Over a 45-month period, 119 total patients with either penicillin or cephalosporin allergies entered the protocol. Following a challenge dose, 106 (89.1%) patients were treated with a beta-lactam. Eleven patients had adverse reactions to a challenge dose, one of which required escalation of care to the intensive care unit. Of the patients with an unknown or low-risk reported allergy, 7/66 (10.6%) had an observed adverse reaction compared to 3/42 (7.1%) who had an observed reaction with a reported high-risk or anaphylactic allergy.
Conclusions:
Our implemented protocol was safe and effective, with over 90% of patients tolerating the challenge without incident and many going on to receive indicated beta-lactam therapy. This protocol may serve as a framework for other inpatient ASP teams to implement a low-barrier allergy assessment led by ASP teams.
The social networks surrounding intimate couples provide them with bonding and bridging social capital and have been theorized to be associated with their well-being and relationship quality. These networks are multidimensional, featuring compositional (e.g., the proportion of family members vs. friends) and structural characteristics (e.g., density, degree of overlap between spouses’ networks). Most previous studies of couple networks are based on partners’ global ratings of their network characteristics or network data collected from one member of the dyad. This study presents the analysis of “duocentric networks”, or the combined personal networks of both members of a couple, collected from 207 mixed-sex newlywed couples living in low-income neighborhoods of Harris County, TX. We conducted a pattern-centric analysis of compositional and structural features to identify distinct types of couple networks. We identified five qualitatively distinct network types (wife family-focused, husband family-focused, shared friends, wife friend-focused, and extremely disconnected). Couples’ network types were associated with the quality of the relationships between couples and their network contacts (e.g., emotional support) but not with the quality of the couples’ relationship with each other. We argue that duocentric networks provide appropriate data for measuring bonding and bridging capital in couple networks.
Attacks on minoritized communities and increasing awareness of the societal causes of health disparities have combined to highlight deep systemic inequities. In response, academic health centers have prioritized justice, equity, diversity, and inclusion (JEDI) in their strategic goals. To have a sustained impact, JEDI efforts cannot be siloed; rather, they must be woven into the fabric of our work and systematically assessed to promote meaningful outcomes and accountability. To this end, the University of Pittsburgh’s Institute for Clinical Research Education assembled a task force to create and apply a rubric to identify short and long-term JEDI goals, assess the current state of JEDI at our Institute, and make recommendations for immediate action. To ensure deep buy-in, we gathered input from diverse members of our academic community, who served on targeted subcommittees. We then applied a three-step process to ensure rapid forward progress. We emerged with concrete actions for priority focus and a plan for ongoing assessment of JEDI institutionalization. We believe our process and rubric offer a scalable and adaptable model for other institutions and departments to follow as we work together across academic medical institutions to put our justice, equity, diversity, and inclusion goals into meaningful action.
Current scholarship suggests that Neo-Eneolithic systems of settlement and subsistence in Eastern Europe were defined by short-to-medium range migration, while sparsely populated land in peripheral regions allowed for the continual colonization of new territories. We address the Eastern Tripolye Culture (ETC), a sub-group of the Cucuteni-Tripolye cultural complex that flourished ca. 4300–2950 BC by expanding into the forest-steppe ecozone of Central Ukraine. While a general lack of multi-layer sites complicates regional chronology, we resolve several longstanding questions in Ukrainian archaeological discourse by combining traditional relative chronologies of ceramic types with high-precision AMS dating of material from key sites. We offer a revision of the chronology of Tripolye BI and BI-II, which, rather than consisting of distinct “early” and “late” temporal periods, instead constitute a single period characterized by stylistic diversity in material culture. With an absolute chronology established, we then analyze the space-time distribution of sites, revealing a southwest-to-northeast migratory vector across Central Ukraine characterized by punctuated episodes of “leapfrog” colonization. The establishment of this vector by the ETC presages larger-scale population movements by the Western Tripolye Culture (WTC), which led to the establishment of the giant-settlement phenomenon during the first part of the 4th millennium BC.