Efficient evidence generation to assess the clinical and economic impact of medical therapies is critical amid rising healthcare costs and aging populations. However, drug development and clinical trials remain far too expensive and inefficient for all stakeholders. On October 25–26, 2023, the Duke Clinical Research Institute brought together leaders from academia, industry, government agencies, patient advocacy, and nonprofit organizations to explore how different entities and influencers in drug development and healthcare can realign incentive structures to efficiently accelerate evidence generation that addresses the highest public health needs. Prominent themes surfaced, including competing research priorities and incentives, inadequate representation of patient populations in clinical trials, opportunities to better leverage existing technology and infrastructure in trial design, and a need for heightened transparency and accountability in research practices. The group determined that, together, these elements contribute to an inefficient and costly clinical research enterprise, amplifying disparities in population health and sustaining gaps in evidence that impede advancements in equitable healthcare delivery and outcomes. The goal of addressing the identified challenges is to ultimately make clinical trials faster, more inclusive, and more efficient across diverse communities and settings.
Several hypotheses may explain the association between substance use, posttraumatic stress disorder (PTSD), and depression. However, few studies have utilized a large multisite dataset to understand this complex relationship. Our study assessed the relationship between alcohol and cannabis use trajectories and PTSD and depression symptoms across 3 months in recently trauma-exposed civilians.
Methods
In total, 1618 (1037 female) participants provided self-report data on past 30-day alcohol and cannabis use and PTSD and depression symptoms during their emergency department (baseline) visit. We reassessed participants' substance use and clinical symptoms 2, 8, and 12 weeks posttrauma. Latent class mixture modeling determined alcohol and cannabis use trajectories in the sample. Changes in PTSD and depression symptoms were assessed across alcohol and cannabis use trajectories via a mixed-model repeated-measures analysis of variance.
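As a rough illustration of the trajectory-classification step, the sketch below approximates latent class mixture modeling with a Gaussian mixture fitted to per-participant use trajectories, selecting the number of classes by BIC. It is a minimal sketch under stated assumptions, not the study's analysis: the simulated data, class structure, and variable names are hypothetical.

```python
# Illustrative sketch only: approximating latent class trajectory modeling with a
# Gaussian mixture over per-participant use trajectories. The study itself used
# latent class mixture modeling; the data and class labels below are hypothetical.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)

# Hypothetical past-30-day use (e.g., days used) at baseline and weeks 2, 8, and 12.
n = 300
low = rng.poisson(1, size=(n, 4))
high = rng.poisson(12, size=(n, 4))
increasing = np.cumsum(rng.poisson(3, size=(n, 4)), axis=1)
X = np.vstack([low, high, increasing]).astype(float)

# Fit candidate models with 1-4 classes and pick the best by BIC.
models = {k: GaussianMixture(n_components=k, random_state=0).fit(X) for k in range(1, 5)}
best_k = min(models, key=lambda k: models[k].bic(X))
classes = models[best_k].predict(X)

print(f"best number of trajectory classes by BIC: {best_k}")
for c in range(best_k):
    print(f"class {c}: mean trajectory {models[best_k].means_[c].round(1)}")
```

A dedicated growth mixture modeling package would additionally model within-class change over time; the mixture over raw trajectories here only conveys the general idea.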
Results
Three trajectory classes (low, high, and increasing use) provided the best model fit for both alcohol and cannabis use. The low alcohol use class exhibited lower PTSD symptoms at baseline than the high use class; the low cannabis use class exhibited lower PTSD and depression symptoms at baseline than the high and increasing use classes, and these symptoms increased markedly at week 8 before declining at week 12. Participants already using alcohol and cannabis exhibited greater PTSD and depression symptoms at baseline, which increased at week 8 and decreased at week 12.
Conclusions
Our findings suggest that alcohol and cannabis use trajectories are associated with the intensity of posttrauma psychopathology. These findings could potentially inform the timing of therapeutic strategies.
Analyses of macroscopic charcoal, sediment geochemistry (%C, %N, C/N, δ¹³C, δ¹⁵N), and fossil pollen were conducted on a sediment core recovered from Stella Lake, Nevada, establishing a 2000-year record of fire history and vegetation change for the Great Basin. Charcoal accumulation rates (CHAR) indicate that fire activity, which was minimal from the beginning of the first millennium to AD 750, increased slightly at the onset of the Medieval Climate Anomaly (MCA). Observed changes in catchment vegetation were driven by hydroclimate variability during the early MCA. Two notable increases in CHAR, which occurred during the Little Ice Age (LIA), were identified as major fire events within the catchment. Increased C/N, enriched δ¹⁵N, and depleted δ¹³C values correspond with these events, providing additional evidence for the occurrence of catchment-scale fire events during the late fifteenth and late sixteenth centuries. Shifts in the vegetation community composition and structure accompanied these fires, with Pinus and Picea decreasing in relative abundance and Poaceae increasing in relative abundance following the fire events. During the LIA, the vegetation change and lacustrine geochemical response were most directly influenced by the occurrence of catchment-scale fires, not regional hydroclimate.
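For readers unfamiliar with CHAR, the following minimal sketch shows how a charcoal accumulation rate is typically derived by combining charcoal concentration with the sediment accumulation rate from an age-depth model. The depths, ages, counts, and volumes are hypothetical placeholders, not values from the Stella Lake core.

```python
# Minimal sketch of a charcoal accumulation rate (CHAR) calculation, assuming
# hypothetical counts, sample volumes, and an age-depth model; the Stella Lake
# study's actual values are not reproduced here.
import numpy as np

depth_cm = np.array([10.0, 20.0, 30.0])        # sample depths
age_yr_bp = np.array([500.0, 1100.0, 1700.0])  # calibrated ages from the age-depth model
counts = np.array([42, 15, 88])                # macroscopic charcoal pieces counted
volume_cm3 = np.array([1.0, 1.0, 1.0])         # subsample volume

concentration = counts / volume_cm3            # pieces cm^-3
sed_rate = np.gradient(depth_cm, age_yr_bp)    # cm yr^-1
char = concentration * sed_rate                # pieces cm^-2 yr^-1

for d, c in zip(depth_cm, char):
    print(f"{d:5.1f} cm: CHAR = {c:.3f} pieces cm^-2 yr^-1")
```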
This is the first report on the association between trauma exposure and depression from the Advancing Understanding of RecOvery afteR traumA (AURORA) multisite longitudinal study of adverse post-traumatic neuropsychiatric sequelae (APNS) among participants seeking emergency department (ED) treatment in the aftermath of a traumatic life experience.
Methods
We focus on participants presenting at EDs after a motor vehicle collision (MVC), which characterizes most AURORA participants, and examine associations of participant socio-demographics and MVC characteristics with 8-week depression as mediated through peritraumatic symptoms and 2-week depression.
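The sketch below illustrates, with simulated data, the general product-of-coefficients logic behind a mediation analysis of this kind (exposure → mediator → 8-week depression). It is not the AURORA analysis; the variable names, effect sizes, and the use of ordinary least squares are assumptions made purely for illustration.

```python
# Illustrative product-of-coefficients mediation sketch with simulated data; it is
# not the AURORA analysis, and all variable names and effect sizes are hypothetical.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 1000
passenger = rng.integers(0, 2, n)                       # exposure: passenger vs. driver
peritraumatic = 0.5 * passenger + rng.normal(size=n)    # candidate mediator
dep_8wk = 0.3 * peritraumatic + 0.2 * passenger + rng.normal(size=n)

# Path a: exposure -> mediator
a = sm.OLS(peritraumatic, sm.add_constant(passenger)).fit().params[1]
# Paths b and c': mediator and exposure -> 8-week depression
X = sm.add_constant(np.column_stack([passenger, peritraumatic]))
fit = sm.OLS(dep_8wk, X).fit()
c_prime, b = fit.params[1], fit.params[2]

print(f"indirect effect (a*b) = {a * b:.3f}, direct effect (c') = {c_prime:.3f}")
```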
Results
Eight-week depression prevalence was relatively high (27.8%) and associated with several MVC characteristics (being a passenger v. driver; injuries to other people). Peritraumatic distress was associated with 2-week but not 8-week depression. Most of these associations held when controlling for peritraumatic symptoms and, to a lesser degree, depressive symptoms at 2 weeks post-trauma.
Conclusions
These observations, coupled with substantial variation in the relative strength of the mediating pathways across predictors, raise the possibility of diverse and potentially complex underlying biological and psychological processes that remain to be elucidated. More in-depth analyses of the rich and evolving AURORA database may identify new targets for intervention and new tools for risk-based stratification following trauma exposure.
Individuals with tardive dyskinesia (TD) who completed a long-term study (KINECT 3 or KINECT 4) of valbenazine (40 or 80 mg/day, once-daily for up to 48 weeks followed by 4-week washout) were enrolled in a subsequent study (NCT02736955) that was primarily designed to further evaluate the long-term safety of valbenazine.
Methods
Participants were initiated at 40 mg/day (following prior valbenazine washout). At week 4, dosing was escalated to 80 mg/day based on tolerability and clinical assessment of TD; reduction to 40 mg/day was allowed for tolerability. The study was planned for 72 weeks or until termination due to commercial availability of valbenazine. Assessments included the Clinical Global Impression of Severity-TD (CGIS-TD), Patient Satisfaction Questionnaire (PSQ), and treatment-emergent adverse events (TEAEs).
Results
At study termination, 85.7% (138/161) of participants were still active. Four participants had reached week 60, and none reached week 72. The percentage of participants with a CGIS-TD score ≤2 (normal/not ill or borderline ill) increased from study baseline (14.5% [23/159]) to week 48 (64.3% [36/56]). At baseline, 98.8% (158/160) of participants rated their prior valbenazine experience with a PSQ score ≤2 (very satisfied or somewhat satisfied). At week 48, 98.2% (55/56) remained satisfied. Before week 4 (dose escalation), 9.4% of participants had ≥1 TEAE. After week 4, the TEAE incidence was 49.0%. No TEAE occurred in ≥5% of participants during treatment (before or after week 4).
Conclusions
Valbenazine was well tolerated, and persistent improvements in TD were observed in adults who received once-daily treatment for >1 year.
Drawing on a landscape analysis of existing data-sharing initiatives, in-depth interviews with expert stakeholders, and public deliberations with community advisory panels across the U.S., we describe features of the evolving medical information commons (MIC). We identify participant-centricity and trustworthiness as the most important features of an MIC and discuss the implications for those seeking to create a sustainable, useful, and widely available collection of linked resources for research and other purposes.
To evaluate whole-genome sequencing (WGS) as a molecular typing tool for MRSA outbreak investigation.
Design
Investigation of MRSA colonization/infection in a neonatal intensive care unit (NICU) over 3 years (2014–2017).
Setting
Single-center level IV NICU.
Patients
NICU infants and healthcare workers (HCWs).
Methods
Infants were screened for MRSA using a swab of the anterior nares, axilla, and groin, initially by targeted (ring) screening, and later by universal weekly screening. Clinical cultures were collected as indicated. HCWs were screened once using swabs of the anterior nares. MRSA isolates were typed using WGS with core-genome multilocus sequence typing (cgMLST) analysis and by pulsed-field gel electrophoresis (PFGE). Colonized and infected infants and HCWs were decolonized. Control strategies included reinforcement of hand hygiene, use of contact precautions, cohorting, enhanced environmental cleaning, and remodeling of the NICU.
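To illustrate how cgMLST-based relatedness is commonly assessed, the sketch below counts pairwise allele differences between isolates and groups isolates under a distance threshold with single-linkage clustering. The allele profiles and the threshold of two differing loci are hypothetical and are not drawn from the study's data.

```python
# Hedged sketch of cgMLST-style cluster detection: count pairwise allele differences
# between isolates and group isolates whose distance falls under a chosen threshold.
# The allele profiles and threshold below are hypothetical, not the study's data.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

# Rows = isolates, columns = core-genome loci, values = allele numbers.
profiles = np.array([
    [1, 4, 2, 7, 3],   # isolate A
    [1, 4, 2, 7, 5],   # isolate B (1 allele different from A)
    [9, 8, 6, 2, 1],   # isolate C (unrelated)
    [1, 4, 2, 6, 5],   # isolate D (close to B)
])

# Hamming distance scaled back to an absolute count of differing loci.
allele_diffs = pdist(profiles, metric="hamming") * profiles.shape[1]
clusters = fcluster(linkage(allele_diffs, method="single"), t=2, criterion="distance")

print("cluster assignment per isolate:", clusters)
```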
Results
We identified 64 MRSA-positive infants: 53 (83%) by screening and 11 (17%) by clinical cultures. Of 85 screened HCWs, 5 (6%) were MRSA positive. WGS of MRSA isolates identified 2 large clusters (WGS groups 1 and 2), 1 small cluster (WGS group 3), and 8 unrelated isolates. PFGE failed to distinguish WGS group 2 and 3 isolates. WGS groups 1 and 2 were codistributed over time. HCW MRSA isolates were primarily in WGS group 1. New infant MRSA cases declined after implementation of the control interventions.
Conclusion
We identified 2 contemporaneous MRSA outbreaks alongside sporadic cases in a NICU. WGS was used to determine strain relatedness at a higher resolution than PFGE and was useful in guiding efforts to control MRSA transmission.
This chapter on marital satisfaction begins with the historical origins of such research. The first major section of the chapter reviews research on marital satisfaction, starting with five key features of this research (e.g., using self-report measures, largely non-theoretical) and then provides findings in multiple domains: behavior, cognition, and emotion. A middle section distinguishes two approaches: (a) an interpersonal approach that typically looks at patterns of interaction (e.g., communication, companionship, conflict) and tends to use terms such as adjustment, and (b) an intrapersonal approach that focuses on individual judgments, namely subjective evaluations of relationships, and tends to use terms such as satisfaction and happiness. At the operational level, much past research has relied on imprecise measures developed on the basis of classical test theory; Item Response Theory (IRT) analysis is now being used to develop relationship satisfaction measures. Instead of seeing marital satisfaction as a bipolar, unidimensional construct with a positive and a negative end on the same continuum, the authors argue the field will advance by using a two-dimensional conceptualization and measurement approach: the experiences of positive and negative affect are substantively distinct yet related phenomena, best assessed separately. The chapter concludes with seven issues needing to be resolved.
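For readers unfamiliar with IRT, the brief sketch below shows the two-parameter logistic (2PL) item response function and the item information it implies, which is the machinery such measure development builds on. The discrimination and difficulty values are hypothetical, not parameters of any published satisfaction measure.

```python
# Minimal numpy sketch of the two-parameter logistic (2PL) model that underlies
# much IRT-based scale development; the discrimination (a) and difficulty (b)
# values are hypothetical, not estimates from any satisfaction measure.
import numpy as np

def p_2pl(theta, a, b):
    """Probability of endorsing an item given latent satisfaction theta."""
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

def item_information(theta, a, b):
    """Fisher information the item provides at each level of theta."""
    p = p_2pl(theta, a, b)
    return a**2 * p * (1 - p)

theta = np.linspace(-3, 3, 7)
a, b = 1.8, 0.5   # hypothetical discrimination and difficulty
print("P(endorse):", p_2pl(theta, a, b).round(2))
print("information:", item_information(theta, a, b).round(2))
```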
Our current knowledge of star formation and accretion luminosity at high redshift (z > 3–4), as well as the possible connections between them, relies mostly on observations in the rest-frame ultraviolet, which are strongly affected by dust obscuration. Due to the lack of sensitivity of past and current infrared instrumentation, so far it has not been possible to get a glimpse into the early phases of the dust-obscured Universe. Among the next generation of infrared observatories, SPICA, observing in the 12–350 µm range, will be the only facility that can enable us to trace the evolution of the obscured star-formation rate and black-hole accretion rate densities over cosmic time, from the peak of their activity back to the reionisation epoch (i.e., 3 < z ≲ 6–7), where its predecessors had severe limitations. Here, we discuss the potential of photometric surveys performed with the SPICA mid-infrared instrument, enabled by the very low level of impact of dust obscuration in a band centred at 34 µm. These unique unbiased photometric surveys that SPICA will perform will fully characterise the evolution of AGNs and star-forming galaxies after reionisation.
We acquired center-line surface elevations from glaciers in the St Elias Mountains of Alaska/northwestern Canada using aircraft laser altimetry during 2000–05, and compared these with repeat measurements acquired in 2007. The resulting elevation changes were used to estimate the mass balance of 32 900 km² of glaciers in the St Elias Mountains during September 2003 to August 2007, yielding a value of −21.2 ± 3.8 Gt a⁻¹, equivalent to an area-averaged mass balance of −0.64 ± 0.12 m a⁻¹ water equivalent (w.e.). High-resolution (2 arc-degrees spatial and 10-day temporal) Gravity Recovery and Climate Experiment (GRACE) mass-balance estimates during this time period were scaled to glaciers of the St Elias Mountains, yielding a value of −20.6 ± 3.0 Gt a⁻¹, or an area-averaged mass balance of −0.63 ± 0.09 m a⁻¹ w.e. The difference in balance estimates (altimetry minus GRACE) was −0.6 ± 4.8 Gt a⁻¹, well within the estimated errors. Differences likely resulted from uncertainties in subgrid sampling of the GRACE mass concentration (mascon) solutions, and from errors in assigning an appropriate near-surface density in the altimetry estimates. The good correspondence between GRACE and aircraft altimetry data suggests that high-resolution GRACE mascon solutions can be used to accurately assess mass-balance trends of mountain glacier regions that are undergoing large changes.
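The quoted totals can be cross-checked with simple arithmetic: the area-averaged balance in m w.e. a⁻¹, multiplied by the glacierized area and the density of water, reproduces the regional mass balance in Gt a⁻¹. The short sketch below uses only numbers stated in the abstract.

```python
# Back-of-the-envelope check, using only numbers quoted in the abstract, of how an
# area-averaged balance in m w.e. a^-1 converts to a total mass balance in Gt a^-1.
area_km2 = 32_900            # glacierized area of the St Elias Mountains
balance_m_we = -0.64         # area-averaged mass balance (m w.e. a^-1)
rho_water = 1000.0           # kg m^-3

mass_kg_per_yr = area_km2 * 1e6 * balance_m_we * rho_water
mass_gt_per_yr = mass_kg_per_yr / 1e12     # 1 Gt = 1e12 kg

print(f"{mass_gt_per_yr:.1f} Gt a^-1")     # ~ -21 Gt a^-1, consistent with -21.2 +/- 3.8
```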
The mass changes of the Gulf of Alaska (GoA) glaciers are computed from the Gravity Recovery and Climate Experiment (GRACE) inter-satellite range-rate data for the period April 2003–September 2007. Through the application of unique processing techniques and a surface mass concentration (mascon) parameterization, the mass variations in the GoA glacier regions have been estimated at high temporal (10-day) and spatial (2 × 2 arc-degrees) resolution. The mascon solutions are directly estimated from a reduction of the GRACE K-band inter-satellite range-rate data and, unlike previous GRACE solutions for the GoA glaciers, do not exhibit contamination by leakage from mass change occurring outside the region of interest. The mascon solutions reveal considerable temporal and spatial variation within the GoA glacier region, with the largest negative mass balances observed in the St Elias Mountains, including the Yakutat and Glacier Bay regions. The most rapid losses occurred during the 2004 melt season due to record temperatures in Alaska during that year. The total mass balance of the GoA glacier region was −84 ± 5 Gt a⁻¹, contributing 0.23 ± 0.01 mm a⁻¹ to global sea-level rise from April 2003 through March 2007. Highlighting the large seasonal and interannual variability of the GoA glaciers, the rate determined over the period April 2003–March 2006 is −102 ± 5 Gt a⁻¹, which includes the anomalously high temperatures of 2004 and does not include the large 2007 winter balance-year snowfall. The mascon solutions agree well with regional patterns of glacier mass loss determined from aircraft altimetry and in situ measurements.
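Similarly, the quoted sea-level contribution follows from dividing the mass loss by the global ocean area; the sketch below assumes a conventional ocean area of about 3.618 × 10⁸ km², which is an assumption rather than a value stated by the authors.

```python
# Quick arithmetic check of the quoted sea-level contribution: converting the GoA
# mass loss to equivalent global sea-level rise, assuming a standard ocean area of
# ~3.618e14 m^2 (an assumption; the authors' exact conversion factor is not given).
mass_loss_gt_per_yr = 84.0
ocean_area_m2 = 3.618e14
rho_water = 1000.0           # kg m^-3

volume_m3 = mass_loss_gt_per_yr * 1e12 / rho_water
slr_mm_per_yr = volume_m3 / ocean_area_m2 * 1000.0

print(f"{slr_mm_per_yr:.2f} mm a^-1")   # ~0.23 mm a^-1, matching the quoted 0.23 +/- 0.01
```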
Several application modes and methods (schemes) of using herbicides are available to control undesirable vegetation on electric transmission line rights-of-way (ROW). Preferential use of a management scheme can be based on its cost effectiveness, i.e., degree of vegetation control and treatment cost. A treatment that increases/maintains desirable plants, decreases/maintains undesirable plants, and has relatively low cost can be considered cost effective. Three common herbicides, 2,4-D, picloram, and triclopyr, were applied in the field to test treatment mode (selective and nonselective) and method (cut stump, basal, and stem-foliar) effects on cost effectiveness during initial clearing and first and second conversion cycles on one electric transmission line ROW in Upstate New York. Clear or selective cutting with no herbicide was most cost effective during initial clearing. Nonselective and selective stem-foliar schemes were most cost effective during the first and second conversion cycles, respectively.
Tephra-fall deposits from Cook Inlet volcanoes were detected in sediment cores from Tustumena and Paradox Lakes, Kenai Peninsula, Alaska, using magnetic susceptibility and petrography. The ages of tephra layers were estimated using 21 ¹⁴C ages on macrofossils. Tephra layers are typically fine, gray ash, 1–5 mm thick, and composed of varying proportions of glass shards, pumice, and glass-coated phenocrysts. Of the two lakes, Paradox Lake contained a higher frequency of tephra (0.8 tephra/100 yr; 109 over the 13,200-yr record). The unusually large number of tephra in this lake relative to others previously studied in the area is attributed to the lake's physiography, sedimentology, and limnology. The frequency of ash fall was not constant through the Holocene. In Paradox Lake, tephra layers are absent between ca. 800–2200, 3800–4800, and 9000–10,300 cal yr BP, despite continuously layered lacustrine sediment. In contrast, between 5000 and 9000 cal yr BP, an average of 1.7 tephra layers is present per 100 yr. The peak period of tephra fall (7000–9000 cal yr BP; 2.6 tephra/100 yr) in Paradox Lake is consistent with the increase in volcanism between 7000 and 9000 yr ago recorded in the Greenland ice cores.
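The reported tephra frequency is simple arithmetic on the numbers given in the abstract, as the short sketch below shows.

```python
# Arithmetic behind the quoted tephra frequency for Paradox Lake, using only the
# numbers given in the abstract.
n_tephra = 109
record_length_yr = 13_200

freq_per_100yr = n_tephra / record_length_yr * 100
print(f"{freq_per_100yr:.1f} tephra per 100 yr")   # ~0.8, as reported
```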