Are you or someone you know struggling with hoarding disorder, feeling ashamed or guilty about your belongings, and afraid to let them go? It's more common than you might think, affecting up to 6% of the general population. But despite its prevalence, seeking help can be challenging. This new book provides a clear description of hoarding, exploring it as a symptom of other issues as well as a condition in its own right. You'll learn about different treatment options and find step-by-step guidance and tools for recovery in the self-help section. Personal narratives and case studies make this guide accessible and relatable for those affected by hoarding, as well as their loved ones and health professionals. Don't let hoarding disorder control your life - take the first step towards recovery today with this invaluable resource.
Migraine and post-traumatic stress disorder (PTSD) are both twice as common in women as men. Cross-sectional studies have shown associations between migraine and several psychiatric conditions, including PTSD. PTSD is disproportionally common among patients in headache clinics, and individuals with migraine and PTSD report greater disability from migraines and more frequent medication use. To further clarify the nature of the relationship between PTSD and migraine, we conducted bidirectional analyses of the association between (1) migraine and incident PTSD and (2) PTSD and incident migraine.
Methods
We used longitudinal data from 1989–2020 among the 33,327 Nurses’ Health Study II respondents to the 2018 stress questionnaire. We used log-binomial models to estimate the relative risk of developing PTSD among women with migraine and the relative risk of developing migraine among individuals with PTSD, trauma-exposed individuals without PTSD, and individuals unexposed to trauma, adjusting for race, education, marital status, high blood pressure, high cholesterol, alcohol intake, smoking, and body mass index.
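The adjusted relative risks reported below come from log-binomial regression; as a minimal illustration of how an unadjusted relative risk and its Wald confidence interval are computed from raw counts (the numbers here are hypothetical, not study data), one might write:

```python
import math

def relative_risk(exposed_cases, exposed_total, unexposed_cases, unexposed_total):
    """Unadjusted relative risk with a 95% Wald CI on the log scale."""
    risk_exposed = exposed_cases / exposed_total
    risk_unexposed = unexposed_cases / unexposed_total
    rr = risk_exposed / risk_unexposed
    # Standard error of log(RR) for two independent binomial samples
    se = math.sqrt(
        1 / exposed_cases - 1 / exposed_total
        + 1 / unexposed_cases - 1 / unexposed_total
    )
    lo = math.exp(math.log(rr) - 1.96 * se)
    hi = math.exp(math.log(rr) + 1.96 * se)
    return rr, lo, hi

# Hypothetical counts, for illustration only
rr, lo, hi = relative_risk(120, 1000, 100, 1000)
print(f"RR = {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

A log-binomial model generalizes this to multiple covariates by fitting log(risk) as a linear function of exposure and confounders.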
Results
Overall, 48% of respondents reported ever experiencing migraine, 82% reported experiencing trauma and 9% met the Diagnostic and Statistical Manual of Mental Disorders-5 criteria for PTSD. Of those reporting migraine and trauma, 67% reported trauma before migraine onset, 2% reported trauma and migraine onset in the same year and 31% reported trauma after migraine onset. We found that migraine was associated with incident PTSD (adjusted relative risk [RR]: 1.26, 95% confidence interval [CI]: 1.14–1.39). PTSD, but not trauma without PTSD, was associated with incident migraine (adjusted RR: 1.20, 95% CI: 1.14–1.27). Findings were consistently stronger in both directions among those experiencing migraine with aura.
Conclusions
Our study provides further evidence that migraine and PTSD are strongly comorbid and found associations of similar magnitude between migraine and incident PTSD and PTSD and incident migraine.
In response to the COVID-19 pandemic, we implemented a plasma coordination center within two months to support transfusion for two outpatient randomized controlled trials. The center design was based on an investigational drug services model and a Food and Drug Administration-compliant database to manage blood product inventory and trial safety.
Methods:
A core investigational team adapted a cloud-based platform to randomize patient assignments and track inventory distribution of control plasma and high-titer COVID-19 convalescent plasma of different blood groups from 29 donor collection centers directly to blood banks serving 26 transfusion sites.
Results:
We performed 1,351 transfusions in 16 months. The transparency of the digital inventory at each site was critical to facilitate qualification, randomization, and overnight shipments of blood group-compatible plasma for transfusions into trial participants. While inventory challenges were heightened with COVID-19 convalescent plasma, the cloud-based system and the flexible approach of the plasma coordination center staff across the blood bank network enabled decentralized procurement and distribution of investigational products to maintain inventory thresholds and overcome local supply chain constraints at the sites.
Conclusion:
The rapid creation of a plasma coordination center for outpatient transfusions is infrequent in the academic setting. Distributing more than 3,100 plasma units to blood banks charged with managing investigational inventory across the U.S. in a decentralized manner posed operational and regulatory challenges while providing opportunities for the plasma coordination center to contribute to research of global importance. This program can serve as a template in subsequent public health emergencies.
Previous studies in rodents suggest that mismatch between fetal and postnatal nutrition predisposes individuals to metabolic diseases. We hypothesized that in nonhuman primates (NHP), fetal programming by maternal undernutrition (MUN) persists postnatally, with a dietary mismatch altering metabolic molecular systems before standard clinical measures change. We used unbiased molecular approaches to examine the response to a high-fat, high-carbohydrate diet plus sugar drink (HFCS) challenge in NHP juvenile offspring of MUN pregnancies compared with controls (CON). Pregnant baboons were fed ad libitum (CON) or with a 30% calorie reduction (MUN) from 0.16 gestation through lactation; weaned offspring were fed chow ad libitum. MUN offspring were growth restricted at birth. Liver, omental fat, and skeletal muscle gene expression, as well as liver glycogen, muscle mitochondria, and fat cell size, were quantified. Before challenge, MUN offspring had lower body mass index (BMI) and liver glycogen, and consumed more sugar drink than CON. After HFCS challenge, MUN and CON BMIs were similar. Molecular analyses showed HFCS response differences between CON and MUN in muscle and liver, including hepatic splicing and the unfolded protein response. Altered liver signaling pathways and glycogen content between MUN and CON at baseline indicate that in utero programming persists in MUN juveniles. MUN catch-up growth during HFCS consumption suggests increased risk of obesity, diabetes, and cardiovascular disease. Greater sugar drink consumption in MUN demonstrates altered appetitive drive due to programming. Differences in blood leptin, liver glycogen, and tissue-specific molecular responses to HFCS suggest that MUN significantly impacts juvenile offspring's ability to manage an energy-rich diet.
Selenium (Se) deficiency among populations in Ethiopia is consistent with low concentrations of Se in soil and crops and could be addressed partly by Se-enriched fertilisers. This study examines the disease burden of Se deficiency in Ethiopia and evaluates the cost-effectiveness of Se agronomic biofortification. A disability-adjusted life years (DALY) framework was used, considering goiter, anaemia, and cognitive dysfunction among children and women. The potential efficiency of Se agronomic biofortification was calculated from baseline crop composition and response to Se fertilisers, based on an application of 10 g/ha Se fertiliser, under optimistic and pessimistic scenarios. The calculated cost per DALY was compared against gross domestic product (GDP), with costs below 1–3 times national GDP considered cost-effective. The existing national food basket supplies a total of 28·2 µg of Se for adults and 11·3 µg of Se for children, and the risk of inadequate dietary Se reaches 99·1 %–100 %. Cereals account for 61 % of the dietary Se supply. Human Se deficiency contributes to 0·164 million DALYs among children and women. Of these DALYs, 52 %, 43 % and 5 % are attributed to anaemia, goiter and cognitive dysfunction, respectively. Application of Se fertilisers to soils could avert an estimated 21·2–67·1 %, 26·6–67·5 % and 19·9–66·1 % of DALYs via maize, teff and wheat at a cost of US$129·6–226·0, US$149·6–209·1 and US$99·3–181·6 per DALY, respectively. Soil Se fertilisation of cereals could therefore be a cost-effective strategy to help alleviate Se deficiency in Ethiopia, with precedents in Finland.
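The cost-effectiveness logic in this kind of DALY framework reduces to simple arithmetic: DALYs averted times the averted fraction, and a cost-per-DALY check against a GDP-based threshold. A sketch (the GDP figure below is an assumed placeholder, not a value from the study):

```python
TOTAL_DALYS = 0.164e6  # DALYs lost to Se deficiency among children and women (from the study)

def evaluate(avert_fraction, cost_per_daly, gdp_threshold_base):
    """Return DALYs averted and whether cost per DALY clears a 3x-GDP threshold."""
    averted = TOTAL_DALYS * avert_fraction
    cost_effective = cost_per_daly < 3 * gdp_threshold_base
    return averted, cost_effective

# Maize, pessimistic scenario from the abstract (21.2% averted at US$226.0/DALY);
# the threshold base of 1000 is an illustrative assumption
averted, ok = evaluate(0.212, 226.0, 1000)
print(f"{averted:,.0f} DALYs averted; cost-effective: {ok}")
```

Scaling the same calculation across crops and scenarios reproduces the range of outcomes the abstract summarizes.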
Inflammation and infections such as malaria affect micronutrient biomarker concentrations and hence estimates of nutritional status. It is unknown whether correction for C-reactive protein (CRP) and α1-acid glycoprotein (AGP) fully captures the modification in ferritin concentrations during a malaria infection, or whether environmental and sociodemographic factors modify this association. Cross-sectional data from eight surveys in children aged 6–59 months (Cameroon, Cote d’Ivoire, Kenya, Liberia, Malawi, Nigeria and Zambia; n 6653) from the Biomarkers Reflecting Inflammation and Nutritional Determinants of Anaemia (BRINDA) project were pooled. Ferritin was adjusted using the BRINDA adjustment method, with values < 12 μg/l indicating iron deficiency. The association between current or recent malaria infection, detected by microscopy or rapid test kit, and inflammation-adjusted ferritin was estimated using pooled multivariable linear regression. Age, sex, malaria endemicity profile (defined by the Plasmodium falciparum infection prevalence) and malaria diagnostic methods were examined as effect modifiers. Unweighted pooled malaria prevalence was 26·0 % (95 % CI 25·0, 27·1) and unweighted pooled iron deficiency was 41·9 % (95 % CI 40·7, 43·1). Current or recent malaria infection was associated with a 44 % (95 % CI 39·0, 52·0; P < 0·001) increase in inflammation-adjusted ferritin after adjusting for age and study identifier. In children, ferritin increased less with malaria infection as age and malaria endemicity increased. Adjustment for malaria increased the prevalence of iron deficiency, but the effect was small. Additional information would help elucidate the underlying mechanisms of the role of endemicity and age in the association between malaria and ferritin.
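As a rough sketch of regression-based inflammation adjustment in the spirit of the BRINDA method (the coefficients and reference values below are placeholders for illustration, not the published survey-specific values), ferritin is corrected on the log scale only when CRP or AGP exceed their reference levels:

```python
import math

def adjust_ferritin(ferritin, crp, agp, beta_crp, beta_agp, crp_ref, agp_ref):
    """Regression-corrected ferritin, BRINDA-style sketch.

    Subtracts the estimated inflammation effect on the log scale;
    no correction is applied when CRP/AGP are at or below reference.
    """
    ln_f = math.log(ferritin)
    if crp > crp_ref:
        ln_f -= beta_crp * (math.log(crp) - math.log(crp_ref))
    if agp > agp_ref:
        ln_f -= beta_agp * (math.log(agp) - math.log(agp_ref))
    return math.exp(ln_f)

# Placeholder coefficients and references, for illustration only
adjusted = adjust_ferritin(30.0, crp=5.0, agp=2.0,
                           beta_crp=0.3, beta_agp=0.4,
                           crp_ref=0.2, agp_ref=0.6)
print(f"adjusted ferritin: {adjusted:.1f} ug/L")
```

An adjusted value below 12 μg/l would then be classified as iron deficiency, as in the pooled analysis above.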
Background: After a transient ischemic attack (TIA) or minor stroke, the long-term risk of subsequent stroke is uncertain. Methods: Electronic databases were searched for observational studies reporting subsequent stroke during a minimum follow-up of 1 year in patients with TIA or minor stroke. Unpublished data on the number of stroke events and exact person-time at risk contributed by all patients during discrete time intervals of follow-up were requested from the authors of included studies. This information was used to calculate the incidence of stroke in individual studies, and results across studies were pooled using random-effects meta-analysis. Results: Fifteen independent cohorts involving 129,794 patients were included in the analysis. The pooled incidence rate of subsequent stroke per 100 person-years was 6.4 events in the first year and 2.0 events in the second through tenth years, with cumulative incidences of 14% at 5 years and 21% at 10 years. Based on 10 studies with information available on fatal stroke, the pooled case fatality rate of subsequent stroke was 9.5% (95% CI, 5.9–13.8). Conclusions: One in five patients is expected to experience a subsequent stroke within 10 years after a TIA or minor stroke, with every tenth patient expected to die from their subsequent stroke.
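The reported cumulative incidences can be approximated from the pooled rates under an exponential survival model; the quick check below is a simplification (it ignores competing risks and between-study heterogeneity) but lands close to the 14% and 21% figures:

```python
import math

def cumulative_incidence(annual_rates_per_100py):
    """Cumulative incidence from piecewise-constant annual rates,
    assuming exponential survival within each year."""
    cumulative_hazard = sum(r / 100 for r in annual_rates_per_100py)
    return 1 - math.exp(-cumulative_hazard)

# Pooled rates from the meta-analysis: 6.4/100 py in year 1, 2.0/100 py in years 2-10
five_year = cumulative_incidence([6.4] + [2.0] * 4)
ten_year = cumulative_incidence([6.4] + [2.0] * 9)
print(f"5-year: {five_year:.1%}, 10-year: {ten_year:.1%}")
```

The small discrepancy from the published values is expected, since the pooled estimates come from study-level person-time data rather than this back-of-envelope model.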
Lechuguilla Cave is a hypogene cave formed by oxidation of ascending hydrogen sulfide from the Delaware Basin. A unique sediment deposit with characteristics suggesting derivation from the land surface, some 285 m above, was investigated. At this location, the observed stratigraphy (oldest to youngest) was: bedrock floor (limestone), cave clouds (secondary calcite), calcite-cemented siltstone, finely laminated clay, and calcite rafts. Grain-size analysis indicates that the laminated clay deposits are composed of 59–82% clay-size minerals. The major minerals of the clay were determined by X-ray diffraction analysis and consist of interstratified illite-smectite, kaolinite, illite, goethite, and quartz. Scanning electron microscopy observations show that most of the clay deposit is composed of densely packed, irregular-shaped clay-size flakes. One sample from the top of the deposit was detrital, containing well-rounded, silt-size particles.
Surface soils are probably the source of the clay minerals. The small amount of sand- and silt-size particles suggests that detrital particles were transported in suspension. The lack of endellite and alunite is evidence that the clays were emplaced after the sulfuric-acid dissolution stage of cave formation. Fossil evidence also suggests a previously existing link to the surface.
Operational Risk is one of the most difficult risks to model. It is a large and diverse category covering anything from cyber losses to mis-selling fines; and from processing errors to HR issues. Data is usually lacking, particularly for low frequency, high impact losses, and consequently there can be a heavy reliance on expert judgement. This paper seeks to help actuaries and other risk professionals tasked with the challenge of validating models of operational risks. It covers the loss distribution and scenario-based approaches most commonly used to model operational risks, as well as Bayesian Networks. It aims to give a comprehensive yet practical guide to how one may validate each of these and provide assurance that the model is appropriate for a firm’s operational risk profile.
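A minimal sketch of the loss distribution approach mentioned above, assuming Poisson event frequency and lognormal severity with purely illustrative parameters (not calibrated to any firm's operational risk profile):

```python
import math
import random

def poisson_sample(rng, lam):
    """Draw from a Poisson distribution (Knuth's algorithm, fine for small lambda)."""
    threshold = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= threshold:
            return k
        k += 1

def simulate_annual_losses(n_sims, freq_lambda, sev_mu, sev_sigma, seed=1):
    """Loss distribution approach: simulate total annual loss as a
    Poisson-distributed count of lognormally distributed severities."""
    rng = random.Random(seed)
    totals = []
    for _ in range(n_sims):
        n_events = poisson_sample(rng, freq_lambda)
        totals.append(sum(rng.lognormvariate(sev_mu, sev_sigma)
                          for _ in range(n_events)))
    return totals

def value_at_risk(totals, level=0.995):
    """Empirical quantile of the simulated annual loss distribution."""
    return sorted(totals)[int(level * len(totals)) - 1]

losses = simulate_annual_losses(10_000, freq_lambda=2.0, sev_mu=10.0, sev_sigma=1.5)
print(f"99.5% VaR: {value_at_risk(losses):,.0f}")
```

Validation of such a model would typically probe the tail behaviour of the severity distribution and the sensitivity of the VaR to the expert-judged parameters, which is where the scarcity of low-frequency, high-impact data bites hardest.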
Understanding the factors contributing to optimal cognitive function throughout the aging process is essential to better understand successful cognitive aging. Processing speed is an age-sensitive cognitive domain that usually declines early in the aging process; however, this cognitive skill is essential for other cognitive tasks and everyday functioning. Evaluating brain network interactions in cognitively healthy older adults can help us understand how variations in brain characteristics affect cognitive functioning. Functional connections among groups of brain areas give insight into the brain’s organization, and the cognitive effects of aging may relate to this large-scale organization. To follow up on our prior work, we sought to replicate our findings regarding network segregation’s relationship with processing speed. In order to address possible influences of node location or network membership, we replicated the analysis across 4 different node sets.
Participants and Methods:
Data were acquired as part of a multi-center study of cognitively normal individuals aged 85+, the McKnight Brain Aging Registry (MBAR). For this analysis, we included 146 community-dwelling, cognitively unimpaired older adults, ages 85-99, who had undergone structural and BOLD resting-state MRI scans and a battery of neuropsychological tests. Exploratory factor analysis identified the processing speed factor of interest. We preprocessed BOLD scans using the fmriprep, Ciftify, and XCPEngine tools. We used 4 different connectivity-based parcellations: 1) MBAR data used to define nodes and the Power (2011) atlas used to determine node network membership; 2) younger adults’ data used to define nodes (Chan 2014) and the Power (2011) atlas used to determine node network membership; 3) older adults’ data from a different study (Han 2018) used to define nodes and the Power (2011) atlas used to determine node network membership; and 4) MBAR data used to define nodes and MBAR-based community detection used to determine node network membership.
Segregation (the balance of within-network and between-network connections) was measured within the association system and three well-characterized networks: Default Mode Network (DMN), Cingulo-Opercular Network (CON), and Fronto-Parietal Network (FPN). Correlations between processing speed and segregation of the association system and each network were computed for all 4 node sets.
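Segregation of this kind is commonly computed as the difference between mean within-network and mean between-network connectivity, normalized by the within-network mean (as in Chan et al., 2014). A toy sketch, assuming a symmetric connectivity matrix and hypothetical network labels:

```python
def network_segregation(conn, labels, network):
    """System segregation: (mean within - mean between) / mean within,
    for the nodes belonging to `network` in a symmetric connectivity matrix."""
    within, between = [], []
    n = len(conn)
    for i in range(n):
        if labels[i] != network:
            continue
        for j in range(n):
            if i == j:
                continue
            (within if labels[j] == network else between).append(conn[i][j])
    mean_within = sum(within) / len(within)
    mean_between = sum(between) / len(between)
    return (mean_within - mean_between) / mean_within

# Toy 4-node example: nodes 0,1 in "DMN", nodes 2,3 in "FPN" (hypothetical values)
conn = [
    [0.0, 0.8, 0.2, 0.1],
    [0.8, 0.0, 0.1, 0.2],
    [0.2, 0.1, 0.0, 0.7],
    [0.1, 0.2, 0.7, 0.0],
]
labels = ["DMN", "DMN", "FPN", "FPN"]
print(network_segregation(conn, labels, "DMN"))
```

Higher values indicate a more differentiated network, with stronger connectivity inside the network relative to connections crossing network boundaries.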
Results:
We replicated prior work: segregation of the cortical association system, and of the FPN and DMN, had a consistent relationship with processing speed across all node sets (association system range of correlations: r=.294 to .342, FPN: r=.254 to .272, DMN: r=.263 to .273). Additionally, compared to parcellations created with older adults, the parcellation created from younger individuals showed attenuated and less robust findings (association system r=.263, FPN r=.255, DMN r=.263).
Conclusions:
This study shows that network segregation in the oldest-old brain is closely linked with processing speed, and that this relationship is replicable across different node sets created from varied datasets. This work adds to the growing body of knowledge about age-related dedifferentiation by demonstrating the replicability and consistency of the finding that an essential cognitive skill, processing speed, is associated with differentiated functional networks even in very old individuals experiencing successful cognitive aging.
Bayesian statistical approaches are extensively used in new statistical methods but have not been adopted at the same rate in clinical and translational (C&T) research. The goal of this paper is to accelerate the transition of new methods into practice by improving the C&T researcher’s ability to gain confidence in interpreting and implementing Bayesian analyses.
Methods:
We developed a Bayesian data analysis plan and implemented that plan for a two-arm clinical trial comparing the effectiveness of a new opioid in reducing time to discharge from the post-operative anesthesia unit and nerve block usage in surgery. Through this application, we offer a brief tutorial on Bayesian methods and exhibit how to apply four Bayesian statistical packages from STATA, SAS, and RStan to conduct linear and logistic regression analyses in clinical research.
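This is not the paper's actual analysis, but a conjugate-prior sketch conveys the flavor of a Bayesian two-arm comparison of time to discharge (all numbers below are hypothetical, and the data standard deviations are assumed known for tractability):

```python
import math

def posterior_normal_mean(prior_mu, prior_sd, data, data_sd):
    """Conjugate normal update for a mean when the data SD is treated as known."""
    n = len(data)
    prior_precision = 1 / prior_sd**2
    data_precision = n / data_sd**2
    post_var = 1 / (prior_precision + data_precision)
    post_mu = post_var * (prior_precision * prior_mu
                          + data_precision * (sum(data) / n))
    return post_mu, math.sqrt(post_var)

# Hypothetical minutes-to-discharge in each arm
new_opioid = [52, 61, 47, 58, 55]
control = [68, 72, 63, 70, 66]

# Same weakly informative prior (mean 60 min, SD 20) for both arms
mu_new, sd_new = posterior_normal_mean(60, 20, new_opioid, 10)
mu_ctl, sd_ctl = posterior_normal_mean(60, 20, control, 10)

# Posterior for the between-arm difference (independent posteriors)
diff_mu = mu_new - mu_ctl
diff_sd = math.sqrt(sd_new**2 + sd_ctl**2)
print(f"Posterior difference: {diff_mu:.1f} +/- {diff_sd:.1f} min")
```

Packages such as STATA's bayes prefix, SAS PROC MCMC, and RStan replace this closed-form update with MCMC sampling, which is what makes them extendable to the regression models described below.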
Results:
The analysis results in our application were robust to the choice of statistical package and consistent across a wide range of prior distributions. STATA was the most approachable package for linear regression but was more limited in the models that could be fitted and easily summarized. SAS and R offered more straightforward documentation and data management for the posteriors. They also offered direct programming of the likelihood, making them more easily extendable to complex problems.
Conclusion:
Bayesian analysis is now accessible to a broad range of data analysts and should be considered in more C&T research analyses. This will allow C&T research teams to adopt and interpret Bayesian methodology in more complex problems where Bayesian approaches are often needed.
We identify a set of essential recent advances in climate change research with high policy relevance, across natural and social sciences: (1) looming inevitability and implications of overshooting the 1.5°C warming limit, (2) urgent need for a rapid and managed fossil fuel phase-out, (3) challenges for scaling carbon dioxide removal, (4) uncertainties regarding the future contribution of natural carbon sinks, (5) intertwinedness of the crises of biodiversity loss and climate change, (6) compound events, (7) mountain glacier loss, (8) human immobility in the face of climate risks, (9) adaptation justice, and (10) just transitions in food systems.
Technical summary
The Intergovernmental Panel on Climate Change Assessment Reports provide the scientific foundation for international climate negotiations and constitute an unmatched resource for researchers. However, the assessment cycles take multiple years. As a contribution to cross- and interdisciplinary understanding of climate change across diverse research communities, we have streamlined an annual process to identify and synthesize significant research advances. We collected input from experts in various fields using an online questionnaire and prioritized a set of 10 key research insights with high policy relevance. This year, we focus on: (1) the looming overshoot of the 1.5°C warming limit, (2) the urgency of fossil fuel phase-out, (3) challenges in scaling up carbon dioxide removal, (4) uncertainties regarding future natural carbon sinks, (5) the need for joint governance of biodiversity loss and climate change, (6) advances in understanding compound events, (7) accelerated mountain glacier loss, (8) human immobility amidst climate risks, (9) adaptation justice, and (10) just transitions in food systems. We present a succinct account of these insights, reflect on their policy implications, and offer an integrated set of policy-relevant messages. This science synthesis and science communication effort is also the basis for a policy report that helps elevate climate science each year in time for the United Nations Climate Change Conference.
Social media summary
We highlight recent and policy-relevant advances in climate change research – with input from more than 200 experts.
Since the initial publication of A Compendium of Strategies to Prevent Healthcare-Associated Infections in Acute Care Hospitals in 2008, the prevention of healthcare-associated infections (HAIs) has continued to be a national priority. Progress in healthcare epidemiology, infection prevention, antimicrobial stewardship, and implementation science research has led to improvements in our understanding of effective strategies for HAI prevention. Despite these advances, HAIs continue to affect ∼1 of every 31 hospitalized patients,1 leading to substantial morbidity, mortality, and excess healthcare expenditures,1 and persistent gaps remain between what is recommended and what is practiced.
The widespread impact of the coronavirus disease 2019 (COVID-19) pandemic on HAI outcomes2 in acute-care hospitals has further highlighted the essential role of infection prevention programs and the critical importance of prioritizing efforts that can be sustained even in the face of resource requirements from COVID-19 and future infectious diseases crises.3
The Compendium: 2022 Updates document provides acute-care hospitals with up-to-date, practical expert guidance to assist in prioritizing and implementing HAI prevention efforts. It is the product of a highly collaborative effort led by the Society for Healthcare Epidemiology of America (SHEA), the Infectious Disease Society of America (IDSA), the Association for Professionals in Infection Control and Epidemiology (APIC), the American Hospital Association (AHA), and The Joint Commission, with major contributions from representatives of organizations and societies with content expertise, including the Centers for Disease Control and Prevention (CDC), the Pediatric Infectious Disease Society (PIDS), the Society for Critical Care Medicine (SCCM), the Society for Hospital Medicine (SHM), the Surgical Infection Society (SIS), and others.
Partial anomalous venous connection with sinus venosus atrial septal defect is repaired with different approaches, including the Warden procedure. Complications include stenosis of the superior caval vein and pulmonary venous baffle; however, cyanosis is rarely seen post-operatively. We report a patient who presented with cyanosis 5 years after a Warden procedure and was treated with a transcatheter approach.
New technologies and disruptions related to Coronavirus disease-2019 have led to expansion of decentralized approaches to clinical trials. Remote tools and methods hold promise for increasing trial efficiency and reducing burdens and barriers by facilitating participation outside of traditional clinical settings and taking studies directly to participants. The Trial Innovation Network, established in 2016 by the National Center for Advancing Clinical and Translational Science to address critical roadblocks in clinical research and accelerate the translational research process, has consulted on over 400 research study proposals to date. Its recommendations for decentralized approaches have included eConsent, participant-informed study design, remote intervention, study task reminders, social media recruitment, and return of results for participants. Some clinical trial elements have worked well when decentralized, while others, including remote recruitment and patient monitoring, need further refinement and assessment to determine their value. Partially decentralized, or “hybrid” trials, offer a first step to optimizing remote methods. Decentralized processes demonstrate potential to improve urban-rural diversity, but their impact on inclusion of racially and ethnically marginalized populations requires further study. To optimize inclusive participation in decentralized clinical trials, efforts must be made to build trust among marginalized communities, and to ensure access to remote technology.