Posttraumatic stress disorder (PTSD) has been associated with advanced epigenetic age cross-sectionally, but the association between these variables over time is unclear. This study conducted meta-analyses to test whether new-onset PTSD diagnosis and changes in PTSD symptom severity over time were associated with changes in two metrics of epigenetic aging over two time points.
Methods
We conducted meta-analyses of the association between change in PTSD diagnosis and symptom severity and change in epigenetic age acceleration/deceleration (age-adjusted DNA methylation age residuals as per the Horvath and GrimAge metrics) using data from 7 military and civilian cohorts participating in the Psychiatric Genomics Consortium PTSD Epigenetics Workgroup (total N = 1,367).
Results
Meta-analysis revealed that the interaction between Time 1 (T1) Horvath age residuals and new-onset PTSD over time was significantly associated with Horvath age residuals at T2 (meta β = 0.16, meta p = 0.02, p-adj = 0.03). The interaction between T1 Horvath age residuals and changes in PTSD symptom severity over time was significantly related to Horvath age residuals at T2 (meta β = 0.24, meta p = 0.05). No associations were observed for GrimAge residuals.
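As a purely illustrative aside, the minimal sketch below shows one way the kind of interaction model described above could be specified, assuming simulated data and hypothetical column names (this is not the workgroup's actual analysis pipeline); the '*' in the formula expands to both main effects plus the T1 residual × new-onset-PTSD interaction whose pooled coefficient the meta β summarizes.

```python
# Minimal sketch of the interaction model described above -- hypothetical
# column names and simulated data, not the PGC workgroup's actual code.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 200
df = pd.DataFrame({
    "horvath_resid_t1": rng.normal(size=n),          # age-adjusted DNAm age residual at T1
    "new_onset_ptsd":   rng.integers(0, 2, size=n),  # 0 = no new diagnosis, 1 = new-onset PTSD
    "sex":              rng.integers(0, 2, size=n),
})
# T2 residuals simulated with a small interaction effect for illustration only
df["horvath_resid_t2"] = (0.5 * df["horvath_resid_t1"]
                          + 0.16 * df["horvath_resid_t1"] * df["new_onset_ptsd"]
                          + rng.normal(scale=0.5, size=n))

# The '*' operator expands to main effects plus the interaction term.
model = smf.ols("horvath_resid_t2 ~ horvath_resid_t1 * new_onset_ptsd + sex", data=df).fit()
print(model.summary())
```

In the meta-analysis itself, each cohort would fit such a model separately and the interaction coefficients would then be pooled across the seven cohorts.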
Conclusions
Results indicated that individuals who developed new-onset PTSD or showed increased PTSD symptom severity over time evidenced greater epigenetic age acceleration at follow-up than would be expected based on baseline age acceleration. This suggests that PTSD may accelerate biological aging over time and highlights the need for intervention studies to determine if PTSD treatment has a beneficial effect on the aging methylome.
In response to the COVID-19 pandemic, we implemented a plasma coordination center within two months to support transfusions for two outpatient randomized controlled trials. The center design was based on an investigational drug services model and a Food and Drug Administration-compliant database to manage blood product inventory and trial safety.
Methods:
A core investigational team adapted a cloud-based platform to randomize patient assignments and track inventory distribution of control plasma and high-titer COVID-19 convalescent plasma of different blood groups from 29 donor collection centers directly to blood banks serving 26 transfusion sites.
Results:
We performed 1,351 transfusions in 16 months. The transparency of the digital inventory at each site was critical to facilitate qualification, randomization, and overnight shipments of blood group-compatible plasma for transfusions into trial participants. While inventory challenges were heightened with COVID-19 convalescent plasma, the cloud-based system and the flexible approach of the plasma coordination center staff across the blood bank network enabled decentralized procurement and distribution of investigational products to maintain inventory thresholds and overcome local supply chain constraints at the sites.
Conclusion:
The rapid creation of a plasma coordination center for outpatient transfusions is infrequent in the academic setting. Distributing more than 3,100 plasma units to blood banks charged with managing investigational inventory across the U.S. in a decentralized manner posed operational and regulatory challenges while providing opportunities for the plasma coordination center to contribute to research of global importance. This program can serve as a template in subsequent public health emergencies.
Depression is an independent risk factor for cardiovascular disease (CVD), but it is unknown if successful depression treatment reduces CVD risk.
Methods
Using eIMPACT trial data, we examined the effect of modernized collaborative care for depression on indicators of CVD risk. A total of 216 primary care patients with depression and elevated CVD risk were randomized to 12 months of the eIMPACT intervention (internet cognitive-behavioral therapy [CBT], telephonic CBT, and select antidepressant medications) or usual primary care. CVD-relevant health behaviors (self-reported CVD prevention medication adherence, sedentary behavior, and sleep quality) and traditional CVD risk factors (blood pressure and lipid fractions) were assessed over 12 months. Incident CVD events were tracked over four years using a statewide health information exchange.
Results
The intervention group exhibited greater improvement in depressive symptoms (p < 0.01) and sleep quality (p < 0.01) than the usual care group, but there was no intervention effect on systolic blood pressure (p = 0.36), low-density lipoprotein cholesterol (p = 0.38), high-density lipoprotein cholesterol (p = 0.79), triglycerides (p = 0.76), CVD prevention medication adherence (p = 0.64), or sedentary behavior (p = 0.57). There was an intervention effect on diastolic blood pressure that favored the usual care group (p = 0.02). The likelihood of an incident CVD event did not differ between the intervention (13/107, 12.1%) and usual care (9/109, 8.3%) groups (p = 0.39).
Conclusions
Successful depression treatment alone is not sufficient to lower the heightened CVD risk of people with depression. Alternative approaches are needed.
Threat avoidance is a prominent symptom of affective disorders, yet its biological basis remains poorly understood. Here, we used a validated task, the Joystick Operated Runway Task (JORT), combined with fMRI, to explore whether abnormal function in neural circuits responsible for avoidance underlies these symptoms. Eighteen individuals with major depressive disorder (MDD) and 17 unaffected controls underwent the task, which involved using physical effort to avoid threatening stimuli, paired with mild electric shocks on certain trials. Activity during anticipation and avoidance of threats was explored and compared between groups. Anticipation of aversive stimuli was associated with significant activation in the dorsal anterior cingulate cortex, superior frontal gyrus, and striatum, while active avoidance of aversive stimuli was associated with activity in dorsal anterior cingulate cortex, insula, and prefrontal cortex. There were no significant group differences in neural activity or behavioral performance on the JORT; however, participants with depression reported more dread while being chased on the task. The JORT effectively identified neural systems involved in avoidance and anticipation of aversive stimuli. However, the absence of significant differences in behavioral performance and activation between depressed and non-depressed groups suggests that MDD is not associated with abnormal function in these networks. Future research should investigate the basis of passive avoidance in major depression. Further, the JORT should be explored in patients with anxiety disorders, where threat avoidance may be a more prominent characteristic of the disorder.
To determine the incidence of severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) infection among healthcare personnel (HCP) and to assess occupational risks for SARS-CoV-2 infection.
Design:
Prospective cohort of healthcare personnel (HCP) followed for 6 months from May through December 2020.
Setting:
Large academic healthcare system including 4 hospitals and affiliated clinics in Atlanta, Georgia.
Participants:
HCP, including those with and without direct patient-care activities, working during the coronavirus disease 2019 (COVID-19) pandemic.
Methods:
Incident SARS-CoV-2 infections were determined through serologic testing for SARS-CoV-2 IgG at enrollment, at 3 months, and at 6 months. HCP completed monthly surveys regarding occupational activities. Multivariable logistic regression was used to identify occupational factors that increased the risk of SARS-CoV-2 infection.
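For readers unfamiliar with the modeling step, the sketch below illustrates a multivariable logistic regression of seroconversion on occupational exposures, assuming simulated data and hypothetical column names (not the study's analysis code); exponentiated coefficients are the adjusted odds ratios reported in the Results.

```python
# Minimal sketch of a multivariable logistic regression for seroconversion risk --
# hypothetical column names and simulated data, not the study's analysis code.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 304
df = pd.DataFrame({
    "seroconverted":   rng.integers(0, 2, size=n),   # SARS-CoV-2 IgG seroconversion by 6 months
    "bedside_gt50pct": rng.integers(0, 2, size=n),   # >50% of a typical shift at the bedside
    "covid_unit":      rng.integers(0, 2, size=n),
    "agp_exposure":    rng.integers(0, 2, size=n),   # performed/present for AGPs
    "race_black":      rng.integers(0, 2, size=n),
})

fit = smf.logit(
    "seroconverted ~ bedside_gt50pct + covid_unit + agp_exposure + race_black",
    data=df,
).fit(disp=False)

# Exponentiated coefficients are the adjusted odds ratios with 95% CIs.
odds_ratios = np.exp(fit.params)
conf_int = np.exp(fit.conf_int())
print(pd.concat([odds_ratios.rename("OR"), conf_int], axis=1))
```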
Results:
Of the 304 evaluable HCP who were seronegative at enrollment, 26 (9%) seroconverted for SARS-CoV-2 IgG by 6 months. Overall, 219 participants (73%) self-identified as White race, 119 (40%) were nurses, and 121 (40%) worked on inpatient medical-surgical floors. In a multivariable analysis, HCP who identified as Black race were more likely to seroconvert than HCP who identified as White (odds ratio, 4.5; 95% confidence interval, 1.3–14.2). Increased risk for SARS-CoV-2 infection was not identified for any occupational activity, including spending >50% of a typical shift at a patient’s bedside, working in a COVID-19 unit, or performing or being present for aerosol-generating procedures (AGPs).
Conclusions:
In our study cohort of HCP working in an academic healthcare system, <10% had evidence of SARS-CoV-2 infection over 6 months. No specific occupational activities were identified as increasing risk for SARS-CoV-2 infection.
Copy number variants (CNVs) have been associated with the risk of schizophrenia, autism and intellectual disability. However, little is known about their spectrum of psychopathology in adulthood.
Methods
We investigated the psychiatric phenotypes of adult CNV carriers and compared probands, who were ascertained through clinical genetics services, with carriers who were not. One hundred twenty-four adult participants (age 18–76), each bearing one of 15 rare CNVs, were recruited through a variety of sources including clinical genetics services, charities for carriers of genetic variants, and online advertising. A battery of psychiatric assessments was used to determine psychopathology.
Results
The frequencies of psychopathology were consistently higher for the CNV group compared to general population rates. We found particularly high rates of neurodevelopmental disorders (NDDs) (48%), mood disorders (42%), anxiety disorders (47%) and personality disorders (73%) as well as high rates of psychiatric multimorbidity (median number of diagnoses: 2 in non-probands, 3 in probands). NDDs [odds ratio (OR) = 4.67, 95% confidence interval (CI) 1.32–16.51; p = 0.017] and psychotic disorders (OR = 6.8, 95% CI 1.3–36.3; p = 0.025) occurred significantly more frequently in probands (N = 45; NDD: 39 [87%]; psychosis: 8 [18%]) than non-probands (N = 79; NDD: 20 [25%]; psychosis: 3 [4%]). Participants also had somatic diagnoses pertaining to all organ systems, particularly conotruncal cardiac malformations (in individuals with 22q11.2 deletion syndrome specifically), musculoskeletal, immunological, and endocrine diseases.
Conclusions
Adult CNV carriers had a markedly increased rate of anxiety and personality disorders not previously reported and high rates of psychiatric multimorbidity. Our findings support in-depth psychiatric and medical assessments of carriers of CNVs and the establishment of multidisciplinary clinical services.
Seed retention, and ultimately seed shatter, are extremely important for the efficacy of harvest weed seed control (HWSC) and are likely influenced by various agroecological and environmental factors. Field studies investigated seed-shattering phenology of 22 weed species across three soybean [Glycine max (L.) Merr.]-producing regions in the United States. We further evaluated the potential drivers of seed shatter in terms of weather conditions, growing degree days, and plant biomass. Based on the results, weather conditions had no consistent impact on weed seed shatter. However, there was a positive correlation between individual weed plant biomass and delayed weed seed–shattering rates during harvest. This work demonstrates that HWSC can potentially reduce weed seedbank inputs of plants that have escaped early-season management practices and retained seed through harvest. However, smaller individuals of plants within the same population that shatter seed before harvest pose a risk of escaping early-season management and HWSC.
To estimate prior severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) infection among skilled nursing facility (SNF) staff in the state of Georgia and to identify risk factors for seropositivity as of fall 2020.
Design:
Baseline survey and seroprevalence assessment of the ongoing longitudinal Coronavirus Disease 2019 (COVID-19) Prevention in Nursing Homes study.
Setting:
The study included 14 SNFs in the state of Georgia.
Participants:
In total, 792 SNF staff employed or contracted with participating SNFs were included in this study. The analysis included 749 participants with SARS-CoV-2 serostatus results who provided age, sex, and complete survey information.
Methods:
We estimated unadjusted odds ratios (ORs) and 95% confidence intervals (95% CIs) for potential risk factors and SARS-CoV-2 serostatus. We estimated adjusted ORs using a logistic regression model including age, sex, community case rate, SNF resident infection rate, working at other facilities, and job role.
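As an illustration of the unadjusted step, the short sketch below computes an odds ratio and Wald 95% CI from a 2×2 exposure-by-serostatus table, using made-up counts (not the study's data); the adjusted ORs then come from a logistic model that adds the covariates listed above.

```python
# Minimal sketch of an unadjusted odds ratio with a Wald 95% CI from a 2x2 table --
# illustrative counts only, not the study's data.
import numpy as np

# rows: exposed (high-infection SNF) / unexposed; columns: seropositive / seronegative
a, b = 120, 180   # hypothetical exposed: positive, negative
c, d = 60, 190    # hypothetical unexposed: positive, negative

or_unadj = (a * d) / (b * c)
se_log_or = np.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
ci_low, ci_high = np.exp(np.log(or_unadj) + np.array([-1.96, 1.96]) * se_log_or)
print(f"OR = {or_unadj:.2f} (95% CI, {ci_low:.2f}-{ci_high:.2f})")
```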
Results:
Staff working in high-infection SNFs were twice as likely (unadjusted OR, 2.08; 95% CI, 1.45–3.00) to be seropositive as those in low-infection SNFs. Certified nursing assistants and nurses were 3 times more likely to be seropositive than administrative, pharmacy, or nonresident care staff: unadjusted OR, 2.93 (95% CI, 1.58–5.78) and unadjusted OR, 3.08 (95% CI, 1.66–6.07), respectively. Logistic regression yielded similar adjusted ORs.
Conclusions:
Working at high-infection SNFs was a risk factor for SARS-CoV-2 seropositivity. Even after accounting for resident infections, certified nursing assistants and nurses had a 3-fold higher risk of SARS-CoV-2 seropositivity than nonclinical staff. This knowledge can guide prioritized implementation of safer ways for caregivers to provide necessary care to SNF residents.
We aimed to evaluate how coronavirus disease 2019 (COVID-19) restrictions altered individuals’ drinking behaviours, including consumption, hangover experiences, and motivations to drink, as well as associated changes in levels of depression and anxiety.
Method
We conducted an online cross-sectional self-report survey. Whole group analysis compared pre- versus post-COVID restrictions. A correlation coefficient matrix evaluated the associations between all outcome scores. Self-report data were compared with Alcohol Use Disorders Identification Test (AUDIT) scores from the 2014 Adult Psychiatric Morbidity Survey. Multiple linear modelling (MLM) was used to identify factors associated with increased AUDIT scores and with post-restriction AUDIT scores.
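For illustration only, the following minimal sketch shows how such a correlation matrix and multiple linear model could be set up, assuming simulated data and hypothetical column names (not the survey's analysis code).

```python
# Minimal sketch of the correlation matrix and multiple linear model described above --
# hypothetical column names and simulated data, not the survey's analysis code.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 336
df = pd.DataFrame({
    "audit_change":      rng.normal(size=n),   # post- minus pre-restriction AUDIT score
    "depression_change": rng.normal(size=n),
    "anxiety_change":    rng.normal(size=n),
    "drink_to_cope":     rng.normal(size=n),
    "age":               rng.integers(18, 70, size=n),
    "has_garden":        rng.integers(0, 2, size=n),
})

# Pairwise correlations between the outcome scores (Pearson by default).
print(df[["audit_change", "depression_change", "anxiety_change", "drink_to_cope"]].corr())

# Multiple linear model of AUDIT score change on candidate explanatory factors.
fit = smf.ols(
    "audit_change ~ depression_change + anxiety_change + drink_to_cope + age + has_garden",
    data=df,
).fit()
print(fit.summary())
```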
Results
In total, 346 individuals completed the survey, of whom 336 reported drinking and were therefore analysed. After COVID-19 restrictions, 23.2% of respondents reported an increased AUDIT score and 60.1% a decreased score. AUDIT score change was positively correlated with change in depression (P < 0.01, r = 0.15), anxiety (P < 0.01, r = 0.15) and drinking-to-cope scores (P < 0.0001, r = 0.35). MLM revealed that higher AUDIT scores were associated with age, mental illness, lack of a garden, being self-employed or furloughed, a confirmed COVID-19 diagnosis and smoking status.
Conclusions
COVID-19 restrictions decreased alcohol consumption for the majority of individuals in this study. However, a small proportion increased their consumption; this related to drinking to cope and increased depression and anxiety.
Studying phenotypic and genetic characteristics of age at onset (AAO) and polarity at onset (PAO) in bipolar disorder can provide new insights into disease pathology and facilitate the development of screening tools.
Aims
To examine the genetic architecture of AAO and PAO and their association with bipolar disorder disease characteristics.
Method
Genome-wide association studies (GWASs) and polygenic score (PGS) analyses of AAO (n = 12 977) and PAO (n = 6773) were conducted in patients with bipolar disorder from 34 cohorts and a replication sample (n = 2237). The association of onset with disease characteristics was investigated in two of these cohorts.
Results
Earlier AAO was associated with a higher probability of psychotic symptoms, suicidality, lower educational attainment, not living together and fewer episodes. Depressive onset correlated with suicidality and manic onset correlated with delusions and manic episodes. Systematic differences in AAO between cohorts and continents of origin were observed. This was also reflected in single-nucleotide variant-based heritability estimates, with higher heritabilities for stricter onset definitions. Increased PGS for autism spectrum disorder (β = −0.34 years, s.e. = 0.08), major depression (β = −0.34 years, s.e. = 0.08), schizophrenia (β = −0.39 years, s.e. = 0.08), and educational attainment (β = −0.31 years, s.e. = 0.08) were associated with an earlier AAO. The AAO GWAS identified one significant locus, but this finding did not replicate. Neither GWAS nor PGS analyses yielded significant associations with PAO.
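For illustration only, the sketch below shows how a polygenic score association with AAO of the kind reported above could be estimated, assuming simulated data and hypothetical variable names (real analyses would also adjust for ancestry principal components and cohort); standardizing the score lets the coefficient be read as years of earlier onset per standard deviation of PGS.

```python
# Minimal sketch of a polygenic score (PGS) association with age at onset (AAO) --
# hypothetical variable names and simulated data, not the consortium's pipeline.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 5000
df = pd.DataFrame({
    "pgs_mdd": rng.normal(size=n),            # major depression PGS
    "sex":     rng.integers(0, 2, size=n),
})
# AAO simulated with a small negative PGS effect for illustration only
df["aao_years"] = 25 - 0.34 * df["pgs_mdd"] + rng.normal(scale=8, size=n)

# Standardize the score so the coefficient reads as years of onset shift per SD of PGS.
df["pgs_mdd_z"] = (df["pgs_mdd"] - df["pgs_mdd"].mean()) / df["pgs_mdd"].std()

fit = smf.ols("aao_years ~ pgs_mdd_z + sex", data=df).fit()
print(fit.params["pgs_mdd_z"], fit.bse["pgs_mdd_z"])  # slope per SD of PGS and its s.e.
```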
Conclusions
AAO and PAO are associated with indicators of bipolar disorder severity. Individuals with an earlier onset show an increased polygenic liability for a broad spectrum of psychiatric traits. Systematic differences in AAO across cohorts, continents and phenotype definitions introduce significant heterogeneity, affecting analyses.
Ethnohistoric accounts indicate that the people of Australia's Channel Country engaged in activities rarely recorded elsewhere on the continent, including food storage, aquaculture and possible cultivation, yet there has been little archaeological fieldwork to verify these accounts. Here, the authors report on a collaborative research project initiated by the Mithaka people addressing this lack of archaeological investigation. The results show that Mithaka Country has a substantial and diverse archaeological record, including numerous large stone quarries, multiple ritual structures and substantial dwellings. Our archaeological research revealed unknown aspects, such as the scale of Mithaka quarrying, which could stimulate re-evaluation of Aboriginal socio-economic systems in parts of ancient Australia.
Among 353 healthcare personnel in a longitudinal cohort in 4 hospitals in Atlanta, Georgia (May–June 2020), 23 (6.5%) had severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) antibodies. Spending >50% of a typical shift at the bedside (OR, 3.4; 95% CI, 1.2–10.5) and Black race (OR, 8.4; 95% CI, 2.7–27.4) were associated with SARS-CoV-2 seropositivity.
Potential effectiveness of harvest weed seed control (HWSC) systems depends upon seed shatter of the target weed species at crop maturity, enabling its collection and processing at crop harvest. However, seed retention likely is influenced by agroecological and environmental factors. In 2016 and 2017, we assessed seed-shatter phenology in 13 economically important broadleaf weed species in soybean [Glycine max (L.) Merr.] from crop physiological maturity to 4 wk after physiological maturity at multiple sites spread across 14 states in the southern, northern, and mid-Atlantic United States. Greater proportions of seeds were retained by weeds in southern latitudes and shatter rate increased at northern latitudes. Amaranthus spp. seed shatter was low (0% to 2%), whereas shatter varied widely in common ragweed (Ambrosia artemisiifolia L.) (2% to 90%) over the weeks following soybean physiological maturity. Overall, the broadleaf species studied shattered less than 10% of their seeds by soybean harvest. Our results suggest that some of the broadleaf species with greater seed retention rates in the weeks following soybean physiological maturity may be good candidates for HWSC.
Seed shatter is an important weediness trait on which the efficacy of harvest weed seed control (HWSC) depends. The level of seed shatter in a species is likely influenced by agroecological and environmental factors. In 2016 and 2017, we assessed seed shatter of eight economically important grass weed species in soybean [Glycine max (L.) Merr.] from crop physiological maturity to 4 wk after maturity at multiple sites spread across 11 states in the southern, northern, and mid-Atlantic United States. From soybean maturity to 4 wk after maturity, cumulative percent seed shatter was lowest in the southern U.S. regions and increased moving north through the states. At soybean maturity, the percent of seed shatter ranged from 1% to 70%. That range had shifted to 5% to 100% (mean: 42%) by 25 d after soybean maturity. There were considerable differences in seed-shatter onset and rate of progression between sites and years in some species that could impact their susceptibility to HWSC. Our results suggest that many summer annual grass species are likely not ideal candidates for HWSC, although HWSC could substantially reduce their seed output during certain years.
In this chapter, the authors discuss the issues related to post-operative neonatal apnea with an example of an infant hernia repair. Neonatal apnea, its etiology, and associated risk factors are reviewed. The use of infant spinal anesthesia versus general anesthesia and its relationship to neonatal post-operative apnea is discussed.
As demonstrated by neuroimaging data, the human brain contains systems that control responses to threat. The revised Reinforcement Sensitivity Theory of personality predicts that individual differences in the reactivity of these brain systems produce anxiety and fear-related personality traits. Here we discuss some of the challenges in testing this theory and, as an example, present a pilot study that aimed to dissociate brain activity during pursuit by threat and goal conflict. We did this by translating the Mouse Defense Test Battery for human fMRI use. In this version, dubbed the Joystick Operated Runway Task (JORT), we repeatedly exposed 24 participants to pursuit and goal conflict, with and without threat of electric shock. The runway design of JORT allowed the effect of threat distance on brain activation to be evaluated independently of context. Goal conflict plus threat of electric shock caused deactivation in a network of brain areas that included the fusiform and middle temporal gyri, as well as the default mode network core, including medial frontal regions, precuneus and posterior cingulate gyrus, and laterally the inferior parietal and angular gyri. Consistent with earlier research, we also found that imminent threat activated the midbrain and that this effect was significantly stronger during the simple pursuit condition than during goal conflict. Also consistent with earlier research, we found significantly greater hippocampal activation during goal conflict than pursuit by imminent threat. In conclusion, our results contribute knowledge to theories linking anxiety disorders to altered functioning in defensive brain systems and also highlight challenges in this research domain.
The initial classic Fontan utilising a direct right atrial appendage to pulmonary artery anastomosis led to numerous complications. Adults with such complications may benefit from conversion to a total cavo-pulmonary connection, the current standard palliation for children with univentricular hearts.
Methods:
A single institution, retrospective chart review was conducted for all Fontan conversion procedures performed from July, 1999 through January, 2017. Variables analysed included age, sex, reason for Fontan conversion, age at Fontan conversion, and early mortality or heart transplant within 1 year after Fontan conversion.
Results:
A total of 41 Fontan conversion patients were identified. Average age at Fontan conversion was 24.5 ± 9.2 years. Dominant left ventricular physiology was present in 37/41 (90.2%) patients. Right-sided heart failure occurred in 39/41 (95.1%) patients and right atrial dilation was present in 33/41 (80.5%) patients. The most common causes for Fontan conversion included atrial arrhythmia in 37/41 (90.2%), NYHA class II HF or greater in 31/41 (75.6%), ventricular dysfunction in 23/41 (56.1%), and cirrhosis or fibrosis in 7/41 (17.1%) patients. Median post-surgical follow-up was 6.2 ± 4.9 years. Survival rates at 30 days, 1 year, and greater than 1-year post-Fontan conversion were 95.1, 92.7, and 87.8%, respectively. Two patients underwent heart transplant: the first within 1 year of Fontan conversion for heart failure and the second at 5.3 years for liver failure.
Conclusions:
Fontan conversion should be considered early, when atrial arrhythmias become common, rather than waiting for severe heart failure to ensue; it can be accomplished with an acceptable risk profile.
To determine the impact of recurrent Clostridium difficile infection (RCDI) on patient behaviors following illness.
METHODS
Using a computer algorithm, we searched the electronic medical records of 7 Chicago-area hospitals to identify patients with RCDI (2 episodes of CDI within 15 to 56 days of each other). RCDI was validated by medical record review. Patients were asked to complete a telephone survey. The survey included questions regarding general health, social isolation, symptom severity, emotional distress, and prevention behaviors.
RESULTS
In total, 119 patients completed the survey (32%). On average, respondents were 57.4 years old (standard deviation, 16.8); 57% were white, and ~50% reported hospitalization for CDI. At the time of their most recent illness, patients rated their diarrhea as high severity (58.5%) and their exhaustion as extreme (30.7%). Respondents indicated that they were very worried about getting sick again (41.5%) and about infecting others (31%). Almost 50% said that they have washed their hands more frequently (47%) and have increased their use of soap and water (45%) since their illness. Some of these patients (22%–32%) reported eating out less, avoiding certain medications and public areas, and increasing probiotic use. Most behavioral changes were unrelated to disease severity.
CONCLUSION
Having had RCDI appears to increase prevention-related behaviors in some patients. While some behaviors are appropriate (eg, handwashing), others are not supported by evidence of decreased risk and may negatively impact patient quality of life. Providers should discuss appropriate prevention behaviors with their patients and should clarify that other behaviors (eg, eating out less) will not affect their risk of future illness.
Laboratory creep deformation experiments have been conducted on initially isotropic laboratory-made samples of polycrystalline ice. Steady-state tertiary creep rates, ε̇_t, were determined at strains exceeding 10% in either uniaxial-compression or simple-shear experiments. Isotropic minimum strain rates, ε̇_min, determined at ~1% strain, provide a reference for comparing the relative magnitude of tertiary creep rates in shear and compression through the use of strain-rate enhancement factors, E, defined as the ratio of corresponding tertiary and isotropic minimum creep rates, i.e. E = ε̇_t/ε̇_min. The magnitude of strain-rate enhancement in simple shear was found to exceed that in uniaxial compression by a constant factor of 2.3. Results of experiments conducted at octahedral shear stresses of τ_o = 0.04–0.80 MPa indicate a creep power-law stress exponent of n = 3 for isotropic minimum creep rates and n = 3.5 for tertiary creep rates. The difference in stress exponents for the minimum and tertiary creep regimes can be interpreted as a stress-dependent level of strain-rate enhancement, i.e. E ∝ τ_o^0.5. The implications of these results for deformation in complex multicomponent stress configurations and at stresses below those used in the current experiments are discussed.
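Stated explicitly under the notation reconstructed here, the two power laws quoted above imply the stress dependence of the enhancement factor:

```latex
\dot{\varepsilon}_{\min} \propto \tau_o^{3}, \qquad
\dot{\varepsilon}_{t} \propto \tau_o^{3.5}
\;\;\Longrightarrow\;\;
E(\tau_o) = \frac{\dot{\varepsilon}_{t}}{\dot{\varepsilon}_{\min}} \propto \tau_o^{0.5}
```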
The northwestern sector of the Amery Ice Shelf, East Antarctica, has a layered structure, due to the presence of both meteoric ice and a marine ice layer resulting from sub-shelf freezing processes. Crystal orientation fabric and grain-size data are presented for ice cores obtained from two boreholes ~70 km apart on approximately the same flowline. Multiple-maxima crystal orientation fabrics and large mean grain sizes in the meteoric ice are indicative of stress relaxation and subsequent grain growth in ice that has flowed into the Amery Ice Shelf. Strongly anisotropic single-maximum crystal orientation fabrics and rectangular textures near the base of the ~200 m thick marine ice layer suggest accretion occurs by the accumulation of frazil ice platelets. Crystal orientation fabrics in older marine ice exhibit vertical large circle girdle patterns, influenced by the complex stress configurations that exist towards the margins of the ice shelf. Post-accumulation grain growth and fabric development in the marine ice layer are restricted by a high concentration of brine and insoluble particulate inclusions. Differences in the meteoric and marine ice crystallography are indicative of the contrasting rheological properties of these layers, which must be considered in relation to large-scale ice-shelf dynamics.