Objectives/Goals: We describe the prevalence of individuals with household exposure to SARS-CoV-2, who subsequently report symptoms consistent with COVID-19, while having PCR results persistently negative for SARS-CoV-2 (S[+]/P[-]). We assess whether paired serology can assist in identifying the true infection status of such individuals. Methods/Study Population: In a multicenter household transmission study, index patients with SARS-CoV-2 were identified and enrolled together with their household contacts within 1 week of the index's illness onset. For 10 consecutive days, enrolled individuals provided daily symptom diaries and nasal specimens for polymerase chain reaction (PCR) testing. Contacts were categorized into 4 groups based on presence of symptoms (S[+/-]) and PCR positivity (P[+/-]). Acute and convalescent blood specimens from these individuals (30 days apart) were subjected to quantitative serologic analysis for SARS-CoV-2 anti-nucleocapsid, spike, and receptor-binding domain antibodies. The antibody change in S[+]/P[-] individuals was assessed by thresholds derived from receiver operating characteristic (ROC) analysis of S[+]/P[+] (infected) versus S[-]/P[-] (uninfected) individuals. Results/Anticipated Results: Among 1,433 contacts, 67% had ≥1 SARS-CoV-2 PCR[+] result, while 33% remained PCR[-]. Among the latter, 55% (n = 263) reported symptoms for at least 1 day, most commonly congestion (63%), fatigue (63%), headache (62%), cough (59%), and sore throat (50%). A history of both previous infection and vaccination was present in 37% of S[+]/P[-] individuals, 38% of S[-]/P[-], and 21% of S[+]/P[+] (P<0.05). Vaccination alone was present in 37%, 41%, and 52%, respectively. ROC analyses of paired serologic testing of S[+]/P[+] (n = 354) vs. S[-]/P[-] (n = 103) individuals found anti-nucleocapsid data had the highest area under the curve (0.87).
Based on the 30-day antibody change, 6.9% of S[+]/P[-] individuals demonstrated an increased convalescent antibody signal, although a similar seroresponse was observed in 7.8% of the S[-]/P[-] group. Discussion/Significance of Impact: Reporting respiratory symptoms was common among household contacts with persistent PCR[-] results. Paired serology analyses found similar seroresponses between S[+]/P[-] and S[-]/P[-] individuals. The symptomatic-but-PCR-negative phenomenon, while frequent, is unlikely attributable to true SARS-CoV-2 infections that are missed by PCR.
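The ROC-based threshold derivation described above can be sketched in outline. The cutoff rule used here (Youden's J, i.e. sensitivity + specificity − 1) is a common choice rather than the study's confirmed method, and the antibody fold-change values are illustrative, not study data.

```python
# Sketch: choosing an antibody-change cutoff by maximising Youden's J
# across candidate thresholds, in the spirit of the ROC analysis of
# S[+]/P[+] (infected) vs S[-]/P[-] (uninfected) groups.
# All values below are illustrative, not study data.

def best_threshold(infected, uninfected):
    candidates = sorted(set(infected) | set(uninfected))
    best_j, best_cut = -1.0, None
    for cut in candidates:
        sens = sum(x >= cut for x in infected) / len(infected)     # sensitivity
        spec = sum(x < cut for x in uninfected) / len(uninfected)  # specificity
        j = sens + spec - 1
        if j > best_j:
            best_j, best_cut = j, cut
    return best_cut, best_j

infected = [4.0, 3.2, 5.1, 2.8, 6.0, 1.1]    # hypothetical fold-changes
uninfected = [0.9, 1.0, 1.2, 0.8, 1.3, 2.9]
cut, j = best_threshold(infected, uninfected)
```

Individuals whose convalescent fold-change meets or exceeds the chosen cutoff would be classified as showing a seroresponse.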
Water hyacinth is a highly invasive aquatic species in the southern United States that requires intensive management through frequent herbicide applications. Quantifying management success in large-scale operations is challenging with traditional survey methods that rely on boat-based teams and can be time-consuming and labor-intensive. In contrast, an unmanned aerial system (UAS) allows a single operator to survey a waterbody more efficiently and rapidly, enhancing both coverage and data collection. Therefore, the objective of this research was to develop remote sensing techniques to assess herbicide efficacy for water hyacinth control in an outdoor mesocosm study. Experiments were conducted in spring and summer 2023 to compare and correlate data from visual evaluations of herbicide efficacy against nine vegetation indices (VIs) derived from UAS-based red-green-blue imagery. Penoxsulam, carfentrazone, diquat, 2,4-D, florpyrauxifen-benzyl, and glyphosate were applied at two rates, and experimental units were evaluated for 6 wk. The carotenoid reflectance index (CRI) had the highest Spearman’s correlation coefficient with visually evaluated efficacy for 2,4-D, diquat, and florpyrauxifen-benzyl (> −0.77). The visible atmospherically resistant index (VARI) had the highest correlation with carfentrazone and penoxsulam treatments (> −0.70), and the excess greenness minus redness index had the highest correlation for glyphosate treatments (> −0.83). CRI had the highest correlation coefficient with the most herbicide treatments, and it was the only VI tested that did not include the red band. These VIs were satisfactory predictors of mid-range visually evaluated herbicide efficacy values but were poorly correlated with extremely low and high values, corresponding to nontreated and necrotic plants. Future research should focus on applying findings to real-world (nonexperimental) field conditions and testing imagery with spectral bands beyond the visible range.
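Two of the RGB-derived vegetation indices named above can be computed per pixel from normalised band values. The formulas below follow the widely used definitions (VARI; excess green minus excess red); the pixel values are illustrative, not study imagery.

```python
# Sketch: two RGB-based vegetation indices from the study, computed per
# pixel from normalised red/green/blue values. Formulas are the commonly
# used definitions; band values below are illustrative.

def vari(r, g, b):
    # Visible atmospherically resistant index
    return (g - r) / (g + r - b)

def exg_minus_exr(r, g, b):
    # Excess greenness (2g - r - b) minus excess redness (1.4r - g)
    return (2 * g - r - b) - (1.4 * r - g)

healthy = vari(0.2, 0.5, 0.1)    # green-dominant pixel -> positive VARI
necrotic = vari(0.5, 0.3, 0.2)   # red-dominant pixel -> negative VARI
```

Healthy, green-dominant vegetation scores high on both indices, while necrotic, red-dominant plants score low, which is the contrast the correlation analysis exploits.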
There is mounting interest in the dual health and environmental benefits of plant-based diets. Such diets prioritise whole foods of plant origin and moderate (though occasionally exclude) animal-sourced foods. However, the evidence base on plant-based diets and health outcomes in Australasia is limited and diverse, making it unsuitable for systematic review. This review aimed to assess the current state of play, identify research gaps, and suggest good practice recommendations. The consulted evidence base included key studies on plant-based diets and cardiometabolic health or mortality outcomes in Australian and New Zealand adults. Most studies were observational, conducted in Australia, published within the last decade, and relied on a single dietary assessment about 10–30 years ago. Plant-based diets were often examined using categories of vegetarianism, intake of plant or animal protein, or dietary indices. Health outcomes included mortality, type 2 diabetes and insulin resistance, obesity, cardiovascular disease, and metabolic syndrome. While Australia has an emerging and generally favourable evidence base on plant-based diets and health outcomes, New Zealand’s evidence base is still nascent. The lack of similar studies hinders the ability to judge the overall certainty of evidence, which could otherwise inform public health policies and strategies without relying on international studies with unconfirmed applicability. The proportional role of plant- and animal-sourced foods in healthy, sustainable diets in Australasia is an underexplored research area with potentially far-reaching implications, especially concerning nutrient adequacy and the combined health and environmental impacts.
Spatial analysis and disease mapping have the potential to enhance understanding of tuberculosis (TB) dynamics, which may be complicated by a mix of short- and long-range transmission and long latency periods. TB notifications in Nam Dinh Province for individuals aged 15 and older from 2013 to 2022 were analyzed with a variety of spatio-temporal methods. The study commenced with an analysis of spatial autocorrelation to identify clustering patterns, followed by the evaluation of several candidate Bayesian spatio-temporal models. These models varied from simple assessments of spatial heterogeneity to more complex configurations incorporating covariates and interactions. The findings highlighted a peak in the TB notification rate in 2017, with 98 cases per 100,000 population, followed by a sharp decline in 2021. Significant spatial autocorrelation at the commune level was detected over most of the 10-year period. The Bayesian model that best balanced goodness-of-fit and complexity indicated that TB trends were associated with poverty: each percentage point increase in the proportion of poor households was associated with a 1.3% increase in TB notifications, emphasizing a significant socioeconomic factor in TB transmission dynamics. The integration of local socioeconomic data with spatio-temporal analysis could further enhance our understanding of TB epidemiology.
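The reported poverty association can be expressed as a worked calculation: a 1.3% increase in notifications per percentage-point rise corresponds to a log-scale coefficient of ln(1.013), and effects compound multiplicatively over larger changes. A minimal sketch, with the 5-point rise as a hypothetical example:

```python
import math

# Worked example: converting the reported rate ratio into a log-scale
# coefficient and compounding it over a larger poverty change.
# A 1.3% rise in TB notifications per percentage-point increase in the
# proportion of poor households implies beta = ln(1.013) on the log scale.
beta = math.log(1.013)

# Relative notification rate for a hypothetical 5-point rise in poverty
five_point_rise = math.exp(5 * beta)  # equals 1.013 ** 5, about a 6.7% rise
```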
We present the first results from a new backend on the Australian Square Kilometre Array Pathfinder, the Commensal Realtime ASKAP Fast Transient COherent (CRACO) upgrade. CRACO records millisecond time resolution visibility data and searches for dispersed fast transient signals, including fast radio bursts (FRBs), pulsars, and ultra-long period objects (ULPOs). With the visibility data, CRACO can localise transient events to arcsecond-level precision after detection. Here, we describe the CRACO system and report results from a sky survey carried out by CRACO at 110-ms resolution during its commissioning phase. During the survey, CRACO detected two FRBs (including one discovered solely with CRACO, FRB 20231027A), reported more precise localisations for four pulsars, discovered two new RRATs, and detected one known ULPO, GPM J1839$-$10, through its sub-pulse structure. We present a sensitivity calibration of CRACO, finding that it achieves the expected sensitivity of 11.6 Jy ms to bursts of 110 ms duration or less. CRACO is currently running at a 13.8 ms time resolution and aims at a 1.7 ms time resolution before the end of 2024. At its planned 1.7 ms resolution, CRACO has an expected sensitivity of 1.5 Jy ms to bursts of 1.7 ms duration or less and can detect $10\times$ more FRBs than the current CRAFT incoherent sum system (i.e. 0.5$-$2 localised FRBs per day), enabling us to better constrain models for FRBs and use them as cosmological probes.
Peripheral inflammatory markers, including serum interleukin 6 (IL-6), are associated with depression, but less is known about how these markers associate with depression at different stages of the life course.
Methods
We examined the associations between serum IL-6 levels at baseline and subsequent depression symptom trajectories in two longitudinal cohorts: ALSPAC (age 10–28 years; N = 4,835) and UK Biobank (39–86 years; N = 39,613) using multilevel growth curve modeling. Models were adjusted for sex, BMI, and socioeconomic factors. Depressive symptoms were measured using the Short Moods and Feelings Questionnaire in ALSPAC (max time points = 11) and the Patient Health Questionnaire-2 in UK Biobank (max time points = 8).
Results
Higher baseline IL-6 was associated with worse depression symptom trajectories in both cohorts (largest effect size: 0.046 [ALSPAC, age 16 years]). These associations were stronger in the younger ALSPAC cohort, where, additionally, higher IL-6 levels at age 9 years were associated with worse depression symptom trajectories in females compared to males. Weaker sex differences were observed in the older cohort, UK Biobank. However, statistically significant associations (pFDR <0.05) were of smaller effect sizes, typical of large cohort studies.
Conclusions
These findings suggest that systemic inflammation may influence the severity and course of depressive symptoms across the life course, an effect apparent regardless of age and despite differences in measures and number of time points between these large, population-based cohorts.
Accurate diagnosis of bipolar disorder (BPD) is difficult in clinical practice, with an average delay between symptom onset and diagnosis of about 7 years. A depressive episode often precedes the first manic episode, making it difficult to distinguish BPD from unipolar major depressive disorder (MDD).
Aims
We use genome-wide association analyses (GWAS) to identify differential genetic factors and to develop predictors based on polygenic risk scores (PRS) that may aid early differential diagnosis.
Method
Based on individual genotypes from case–control cohorts of BPD and MDD shared through the Psychiatric Genomics Consortium, we compile case–case–control cohorts, applying a careful quality control procedure. In a resulting cohort of 51 149 individuals (15 532 BPD patients, 12 920 MDD patients and 22 697 controls), we perform a variety of GWAS and PRS analyses.
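A polygenic risk score of the kind analysed here is, at its core, a weighted allele count: each variant's GWAS effect size multiplied by the individual's dosage of the effect allele. A minimal sketch, with effect sizes and genotypes that are illustrative rather than PGC estimates:

```python
# Sketch of a polygenic risk score: sum over variants of GWAS effect
# size times allele dosage (0, 1 or 2 copies of the effect allele).
# Effect sizes and genotypes below are illustrative, not study data.

def polygenic_risk_score(effect_sizes, dosages):
    return sum(beta * d for beta, d in zip(effect_sizes, dosages))

betas = [0.12, -0.05, 0.30]   # hypothetical per-variant weights
person = [2, 1, 0]            # hypothetical allele dosages for one individual
prs = polygenic_risk_score(betas, person)  # 0.12*2 - 0.05*1 + 0 = 0.19
```

In practice such scores are computed over many thousands of variants after clumping and thresholding, then compared between diagnostic groups.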
Results
Although our GWAS is not well powered to identify genome-wide significant loci, we find significant chip heritability and demonstrate the ability of the resulting PRS to distinguish BPD from MDD, including BPD cases with depressive onset (BPD-D). We replicate our PRS findings in an independent Danish cohort (iPSYCH 2015, N = 25 966). We observe strong genetic correlation between our case–case GWAS and that of case–control BPD.
Conclusions
We find that MDD and BPD, including BPD-D, are genetically distinct. Our findings support the view that controls, MDD patients, and BPD patients primarily lie on a continuum of genetic risk. Future studies with larger and richer samples will likely yield a better understanding of these findings and enable the development of better genetic predictors distinguishing BPD and, importantly, BPD-D from MDD.
Deutetrabenazine is a vesicular monoamine transporter type 2 inhibitor (VMAT2i) for treatment of adults with tardive dyskinesia (TD) and Huntington disease (HD)-related chorea. A 4-week patient titration kit was launched (July 2021) to assist patients in titrating to optimal deutetrabenazine dosages.
Methods
START is an ongoing, routine-care, 2-cohort (TD and HD) study evaluating deutetrabenazine dosing patterns, effectiveness, and treatment satisfaction when initiated using a 4-week patient titration kit, with further titration allowed based on effectiveness and tolerability. Patient satisfaction with the kit was assessed via questionnaire at week 8. Results from the first 50 patients enrolled in the TD cohort are presented in this interim analysis.
Results
50 patients in the TD cohort were included (mean age, 58.7 years; 66% female; 74% White; mean baseline Abnormal Involuntary Movement Scale [AIMS] total motor score, 13.8). 39 of 50 (78%) patients successfully completed the titration kit (completed within 5 weeks or reached optimal dose [≥24 mg/day] within 4 weeks; mean [SE] days, 27.5 [0.32]). Mean (SE) time to reach optimal dosage for the 38 (76%) patients who reached it was 46.3 (5.48) days. Mean (SE) deutetrabenazine dosages were 27.7 (0.92) mg/day at week 4, 32.5 (1.00) mg/day at week 8, and 32.8 (1.18) mg/day at week 12. After completion of the kit, mean (SE) dosage was 31.8 (1.24) mg/day, and 95% of patients reaching week 12 had a maintenance dosage ≥24 mg/day. Mean (SE) adherence with the kit was 97.2% (1.39%). 22% of patients had an adverse event (AE); AEs led to dose reduction for 2%, drug interruption for 2%, and study discontinuation for 6% of patients. Serious and treatment-related adverse events were reported for 2% and 6% of patients, respectively. 24 of 49 (49%) patients achieved treatment success (“much”/“very much” improved) at week 12 per Clinical Global Impression of Change (GIC); 23 of 49 (47%) per Patient GIC. Total motor AIMS scores were reduced by 4.8 points at week 12. Among the 39 (78%) patients who responded to the questionnaire, 72% found it easy to understand when/which dosage to take, 77% easy to remember to take their medication, 74% easy to change the dose weekly, 69% easy to follow kit instructions, and 77% easy to use the kit overall.
Conclusions
78% of patients with TD successfully completed the 4-week titration kit in approximately 4 weeks, with an adherence rate of 97.2%. 95% of patients reaching week 12 had a maintenance dosage ≥24 mg/day. 49% of patients achieved treatment success based on Clinical GIC. Patients reported high levels of satisfaction with the titration kit, and 77% found it easy to use. The 4-week patient titration kit enabled patients to titrate deutetrabenazine to an optimal dosage and experience effectiveness similar to the pivotal clinical trials.
This systematic review evaluates the use of Normothermic Machine Perfusion (NMP) as a testbed for developing peripheral nerve and muscle interfaces for bionic prostheses. Our findings suggest that NMP offers a viable alternative to traditional models, with significant implications for future research and clinical applications. A literature search was performed using Ovid MEDLINE (1946 to October 2023), revealing 559 abstracts.
No studies using nerve and/or muscle electrodes for the testing or development of bionic interface technologies were identified, except for one conference abstract. NMP could serve as a test bed for future development of interface biocompatibility, selectivity, stability, and data transfer, whilst complying with ethical practices and potentially offering greater relevance for human translation. Implementation of machine perfusion requires experienced personnel. Incorporating artificial intelligence and machine learning could contribute significantly to advancing interface technologies for multiple neurological disorders.
A Bayes estimation procedure is introduced that allows the nature and strength of prior beliefs to be easily specified and modal posterior estimates to be obtained as easily as maximum likelihood estimates. The procedure is based on constructing posterior distributions that are formally identical to likelihoods, but are based on sampled data as well as artificial data reflecting prior information. Improvements in performance of modal Bayes procedures relative to maximum likelihood estimation are illustrated for Rasch-type models. Improvements range from modest to dramatic, depending on the model and the number of items being considered.
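For a simple binomial case, the construction described above can be sketched as follows: a conjugate prior acts like artificial successes and failures appended to the sampled data, so the posterior mode is just the ML estimate computed on the augmented data. The numbers below are illustrative, not drawn from the paper.

```python
# Sketch for a binomial parameter: prior beliefs enter as artificial
# successes/failures, and the modal posterior estimate is the ML
# estimate on the real-plus-artificial data. Illustrative numbers only.

def modal_bayes(successes, trials, prior_successes, prior_failures):
    # ML estimate applied to sampled data augmented with artificial data
    return (successes + prior_successes) / (trials + prior_successes + prior_failures)

ml = modal_bayes(9, 10, 0, 0)      # no artificial data: plain ML estimate
shrunk = modal_bayes(9, 10, 2, 2)  # artificial data pull the mode toward 0.5
```

Strengthening the prior simply means adding more artificial observations, which makes the nature and strength of prior beliefs easy to specify, as the abstract notes.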
This abstract was presented as the Nutrition and Optimum Life Course Theme Highlight.
The n-3 polyunsaturated fatty acids (n-3 PUFA), eicosapentaenoic acid (EPA; 20:5n-3) and docosahexaenoic acid (DHA; 22:6n-3), are known for their beneficial roles in regulating inflammation(1). The omega 3 index (O3I) refers to the percentage of EPA+DHA within the erythrocyte membrane with respect to total fatty acids and is a recognised biomarker for cardiovascular disease(2). An O3I >8% is proposed to confer the greatest level of cardioprotection(2). Fish is the richest dietary source of n-3 PUFAs and has been noted as one of the main predictors of a higher O3I(3). Current UK dietary guidelines recommend the consumption of two portions of fish per week; albeit the efficacy of these recommendations in raising the O3I is unknown(4). The aim of this study was to investigate the influence of consuming two portions of fish per week on the O3I amongst low fish consuming women of childbearing age.
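The O3I definition above reduces to simple arithmetic. The sketch below uses the risk categories described in this study; the fatty-acid amounts are illustrative, not measured values.

```python
# Sketch: the omega-3 index (O3I) - EPA + DHA as a percentage of total
# erythrocyte membrane fatty acids - with the study's risk categories
# (<4% at risk, 4-8% intermediate, >8% low risk).
# Fatty-acid amounts below are illustrative, not measured values.

def omega3_index(epa, dha, total_fatty_acids):
    return 100 * (epa + dha) / total_fatty_acids

def risk_category(o3i):
    if o3i < 4:
        return "at risk"
    if o3i <= 8:
        return "intermediate risk"
    return "low risk"

o3i = omega3_index(epa=1.2, dha=4.5, total_fatty_acids=100.0)
```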
Data were analysed from the iFish study(5), an 8-week randomised controlled trial in which low fish consuming women were randomly assigned to consume either no fish (n = 18) or 2 portions of tuna (n = 8) or sardines (n = 9) per week. Total n-3 PUFA concentrations of the fish provided in the intervention were 6.47g/100g for sardines and 4.57g/100g for tuna. Fasting blood samples were collected at baseline and post-intervention. The O3I was determined in red blood cells in the control and two portions of fish groups by OmegaQuant Europe. Analysis of covariance, adjusting for age, BMI, and baseline O3I, examined the effect of the fish intervention on the O3I. A chi-square test was used to compare the O3I between groups when categorised as at risk (<4%), intermediate risk (4–8%), and low risk (>8%).
Participants had a mean ± SD age of 25.5 ± 6.4 years. Baseline median (IQR) O3I of the cohort was 5.7 (5.2, 6.7) %. There was no significant difference in the O3I between treatment groups at baseline. Consumption of two portions of fish significantly increased the O3I when compared to the consumption of no fish [6.73 (5.41, 7.38) % vs 5.58 (5.12, 6.49) %, respectively, p = 0.034]. Those consuming two portions of sardines, an oily fish high in n-3 PUFAs, had a significantly greater O3I when compared to those consuming two portions of tuna [7.38 (6.83, 8.37) % vs 5.61 (5.29, 6.79) %, respectively, p<0.001]. Post-intervention, the proportion of participants in the low risk O3I category (>8%) was greater in the two portions of fish group when compared to the control group; albeit this did not reach statistical significance (p = 0.104).
In support of the current dietary guidelines, increasing fish consumption of low consumers to two portions of any fish per week will increase the O3I. Future research should determine the potential cardioprotective properties of a higher O3I as a result of consuming fish.
Despite dietary guidance in over 90 countries and resources like the UK’s Eatwell guide, most individuals do not adhere to or achieve dietary aims(1,2). Specifically in the UK, population intakes of free sugars remain above the <5% recommendation, at ∼10% of total energy intake(3). To improve adherence to messages such as ‘reducing free sugars’, it may be helpful to identify barriers and facilitators to adherence whilst individuals attempt to modify their dietary patterns.
Participants were randomly selected from a randomised controlled trial investigating the effects of three different types of advice to reduce free sugars vs control on reducing free sugar intakes(4). A semi-structured interview explored barriers and facilitators to dietary adherence. Covariate adaptive randomisation ensured an equal number of interviews at each timepoint across the 12-week study period and from participants in each trial arm. Data were analysed using framework analysis(5).
Sixty-two interviews were conducted across a 12-month period between 2021 and 2022. Seven themes for barriers and facilitators to recommendation adherence, encompassing 14 subthemes, were identified: 1) Proof and impact; 2) Realities of life; 3) Personal balance and empowerment; 4) Habitual approach; 5) Is it possible?; 6) Extensive awareness and viewpoint; and 7) Power of knowledge. Emergent themes sit within a context in which individuals were challenged to reduce their intakes of free sugars and/or accurately record dietary intakes; thus they relate specifically to a dietary recording and free sugar reducing scenario. Participant interviews detected both internal and external environmental factors contributing to approaches to change. These factors were interrelated with self and community awareness, describing how individuals may utilise knowledge and understanding. Intervention participants reported all themes more than control participants, except for the subtheme ‘limited impact’. There were no observable reporting differences between the three intervention groups. Over the 12-week study period, the positive subtheme ‘enables’ within the theme ‘power of knowledge’ was more prominent at intervention delivery (week 1) than at week 12. Additionally, the subthemes ‘active’ and ‘empower’ were reported more in those with higher adherence scores. These results suggest that dietary recommendations may need to be adapted to incorporate the stage at which dietary behavioural change takes place, with some focus on maintenance as well as change. Overall, participant reports revealed that dietary advice needs to be appropriate for the person receiving it, easily understood, applicable, and actively engaging.
Our findings, when considered with the wider literature, may help us to better understand attempts to make dietary changes based on dietary advice, and support an individualised approach to dietary management. This greater understanding will help future advice to reduce free sugar intakes, including policy and public health initiatives.
Daily sodium intake in England is ∼3.3 g/day(1), with government and scientific advice to reduce intake for cardiovascular health purposes having varying success(2). Eccrine sweat is produced during exercise or exposure to warm environments to maintain body temperature through evaporative cooling. Sweat is primarily water, but also contains appreciable amounts of electrolytes, particularly sodium, meaning sweat sodium losses could reduce daily sodium balance without the need for dietary manipulation. However, the effects of sweat sodium losses on 24-h sodium balance are unclear.
Fourteen active participants (10 males, 4 females; 23±2 years, 45±9 mL/kg/min) completed a preliminary trial and two 24-h randomised, counterbalanced experimental trials. Participants arrived fasted for baseline (0-h) measures (blood/urine samples, blood pressure, nude body mass) followed by breakfast and low-intensity intermittent cycling in the heat (∼36°C, ∼50% humidity) to turnover ∼2.5% body mass in sweat (EX), or the same duration of room temperature seated rest (REST). Further blood samples were collected post-EX/REST (1.5-3 h post-baseline). During EX, sweat was collected from 5 sites and water consumed to fully replace sweat losses. During REST, participants drank 100 mL/h. Food intake was individually standardised over the 24-h, with bottled water available ad libitum. Participants collected all urine produced over the 24-h and returned the following morning to repeat baseline measures fasted (24-h). Sodium balance was estimated over the 24-h using sweat/urine losses and dietary intake. Data were analysed using 2-way ANOVA followed by Shapiro-Wilk and paired t-tests/Wilcoxon signed-rank tests. Data are mean (standard deviation).
Dietary sodium intake was 2.3 (0.3) g and participants lost 2.8 (0.3) % body mass in sweat (containing 2.5 (0.9) g sodium). Sodium balance was lower for EX (-2.0 (1.6) g vs -1.0 (1.6) g; P = 0.022), despite lower 24-h urine sodium losses in EX (1.8 (1.2) g vs 3.3 (1.7) g; P = 0.001). Post-EX/REST blood sodium concentration was lower in EX (137.6 (2.3) mmol/L vs 139.9 (1.0) mmol/L; P = 0.002) but did not differ at 0-h (P = 0.906) or 24-h (P = 0.118). There was no difference in plasma volume change (P = 0.423), urine specific gravity (P = 0.495), or systolic (P = 0.324) or diastolic (P = 0.274) blood pressure between trials over the 24-h. Body mass change over 24-h was not different between trials (REST +0.25 (1.10) %; EX +0.40 (0.68) %; P = 0.663).
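The 24-h sodium balance reported above is dietary intake minus sweat and urine losses; substituting the mean values from the Results reproduces the reported balances. A sketch, in grams of sodium:

```python
# Sketch: 24-h sodium balance as dietary intake minus sweat and urine
# losses, using the mean values (grams of sodium) reported in the study.

def sodium_balance(intake_g, sweat_loss_g, urine_loss_g):
    return intake_g - (sweat_loss_g + urine_loss_g)

ex_balance = sodium_balance(2.3, 2.5, 1.8)    # exercise trial: about -2.0 g
rest_balance = sodium_balance(2.3, 0.0, 3.3)  # rest trial: about -1.0 g
```

The arithmetic shows why the lower urine output in EX (1.8 g vs 3.3 g) could not fully offset the 2.5 g sweat sodium loss.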
Sweat loss through low-intensity exercise resulted in a lower sodium balance compared to rest. Although urine sodium output reduced with EX, it was not sufficient to offset exercise-induced sodium losses. Despite this, body mass, plasma volume and blood sodium concentration were not different between trials, suggesting sodium may have been lost from non-osmotic sodium stores. This suggests sweat sodium losses could be used to reduce sodium balance, although longer studies are required to confirm this hypothesis.
Findings from animal models suggest early exposure to polyunsaturated fatty acids (PUFAs) during pregnancy may influence developmental plasticity, including adiposity(1). Birth cohort studies examining associations between offspring weight and maternal n-3 PUFA status or maternal fish intake, the richest dietary source of n-3 PUFAs, have been few and have yielded inconsistent findings. Some have reported lower weight at birth and throughout childhood with increasing maternal fish intakes and n-3 PUFA status(2), whilst others have observed positive or null associations(3,4). These studies have focused on the first few years of life and have been conducted within low fish-consuming populations. Our study provides novel data by examining associations between maternal fish consumption and prenatal PUFA (n-3 & n-6) status and offspring weight at birth and throughout childhood (7 & 13 years) in a high fish-eating population.
Pregnant women were enrolled in the Seychelles Child Development Study Nutrition Cohort 2 between 2008-2011. Serum PUFAs were quantified in maternal blood collected at 28-weeks’ gestation and in cord blood collected at delivery using gas-chromatography tandem mass spectrometry. Maternal fish consumption was assessed at 28-weeks’ gestation using a Fish Use Questionnaire. Childbirth weight (kg) was measured at delivery and classified according to WHO growth standards(5) (n = 1185). Child height (m), weight (kg), waist and hip circumference (cm) were recorded at 7 (n = 1167) and 13 (n = 878) years. Statistical analysis was conducted using logistic and multiple linear regression adjusting for child sex, gestational age, maternal age, BMI, alcohol use, socioeconomic status, and parity. Models at 7 & 13 years were additionally adjusted for child height and fish intakes.
Women were consuming on average 8.49 ± 4.51 fish meals/week during pregnancy. No significant associations were found between maternal fish intakes and anthropometric outcomes at birth, 7 & 13 years. No significant associations were observed between maternal PUFAs and offspring weight at birth. At both 7 & 13 years, however, higher maternal total n-6 PUFAs were associated with increased child weight [7yr; β = 0.070, p = 0.003, 13yr; β = 0.097, p = 0.004], waist circumference [7yr; β = 0.086, p = 0.003, 13yr; β = 0.105, p = 0.004], and hip circumference [7yr; β = 0.062, p = 0.027, 13yr; β = 0.090, p = 0.013]. No significant associations were found between cord n-6 PUFAs and birth weight. In quartile analysis, cord docosahexaenoic acid (DHA; C22:6n-3) concentrations <0.071mg/ml were associated with a higher risk of large for gestational age (LGA; >90th percentile) when compared to cord DHA concentrations >0.129mg/ml [OR 4.17, p = 0.017]. There were no significant associations between cord PUFAs and anthropometric outcomes at 7 & 13 years.
These findings suggest lower cord DHA, an n-3 PUFA, may be associated with higher risk of LGA at birth whilst higher n-6 PUFAs during pregnancy may be associated with adiposity development throughout childhood. Future work is needed to determine the potential long-term metabolic consequences of such associations.
In response to the COVID-19 pandemic, we rapidly implemented a plasma coordination center, within two months, to support transfusion for two outpatient randomized controlled trials. The center design was based on an investigational drug services model and a Food and Drug Administration-compliant database to manage blood product inventory and trial safety.
Methods:
A core investigational team adapted a cloud-based platform to randomize patient assignments and track inventory distribution of control plasma and high-titer COVID-19 convalescent plasma of different blood groups from 29 donor collection centers directly to blood banks serving 26 transfusion sites.
Results:
We performed 1,351 transfusions in 16 months. The transparency of the digital inventory at each site was critical to facilitate qualification, randomization, and overnight shipments of blood group-compatible plasma for transfusions into trial participants. While inventory challenges were heightened with COVID-19 convalescent plasma, the cloud-based system and the flexible approach of the plasma coordination center staff across the blood bank network enabled decentralized procurement and distribution of investigational products to maintain inventory thresholds and overcome local supply chain constraints at the sites.
Conclusion:
The rapid creation of a plasma coordination center for outpatient transfusions is infrequent in the academic setting. Distributing more than 3,100 plasma units to blood banks charged with managing investigational inventory across the U.S. in a decentralized manner posed operational and regulatory challenges while providing opportunities for the plasma coordination center to contribute to research of global importance. This program can serve as a template in subsequent public health emergencies.
Growing evidence suggests that direct oral anticoagulants (DOACs) may be suitable for cerebral venous thrombosis (CVT). The optimal strategy regarding lead-in parenteral anticoagulation (PA) prior to DOAC is unknown.
Methods:
In this post hoc analysis of the retrospective ACTION-CVT study, we compared patients treated with DOACs as part of routine care: those given DOAC "very early" (no PA), "early" (<5 days of PA), and "delayed" (5–21 days of PA). We compared baseline characteristics and outcomes between the very early/early and delayed groups. The primary outcome was a composite of day-30 CVT recurrence/extension, new peripheral venous thromboembolism, cerebral edema, and intracranial hemorrhage.
Results:
Of 231 patients, 11.7% had very early DOAC, 64.5% early (median [IQR] 2 [1–2] days of PA), and 23.8% delayed (5 [5–6] days). More patients in the delayed group had severe clinical/radiological presentations; more patients in the very early/early group had isolated headaches. Outcomes were better in the very early/early groups (90-day modified Rankin Scale of 0–2: 94.3% vs. 83.9%). Primary outcome events were rare and did not differ significantly between groups (2.4% very early/early vs. 2.1% delayed; adjusted HR 1.49 [95% CI 0.17–13.11]).
Conclusions:
In this cohort of patients receiving DOAC for CVT as part of routine care, >75% had <5 days of PA. Those with very early/early initiation of DOAC had less severe clinical presentations. Low event rates and baseline differences between groups preclude conclusions about safety or effectiveness. Further prospective data will inform care.
We conducted a systematic review of the medical, nursing, forensic, and social science literature describing events and processes associated with what happens after a traumatic death in the socio-cultural context of largely Western and high-income societies. These include death notification, why survivors choose to view or not view the body, forensic practices affecting viewing the body, alternatives to viewing, and social and cultural practices following the death. We also describe how elements of these processes may act to increase or lessen some of the negative cognitive and emotional consequences for both survivors and providers. The information presented is applicable to those who may be faced with traumatic deaths, including those who work in medicine, nursing, and law enforcement, as well as first responders, forensic investigators, funeral directors, and the families of the deceased.
Social determinants of health (SDOH) are an important contributor to health status and health outcomes. In this analysis, we compare SDOH measured at both the individual and population levels in patients with high comorbidity who receive primary care at Federally Qualified Health Centers in New York and Chicago and are enrolled in the Tipping Points trial.
Methods:
We analyzed individual- and population-level measures of SDOH in 1,488 patients with high comorbidity (Charlson Comorbidity Index ≥ 4) enrolled in Tipping Points. At the individual level, we used a standardized patient-reported questionnaire. At the population level, we used patient addresses to calculate the Social Deprivation Index (SDI) and Area Deprivation Index. We conducted multivariable regressions and collected qualitative feedback from stakeholders.
Results:
Individual-level SDOH are distinct from population-level measures. Significant component predictors of population SDI are being unhoused, being unable to pay for utilities, and having difficulty accessing medical transportation. Qualitative findings mirrored these results. Patients with high comorbidity report significant SDOH challenges at the individual level. In a binomial generalized linear model, the comorbidity score was significantly predicted by the composite individual SDOH index (p < 0.0001), controlling for age and race/ethnicity.
Conclusions:
Individual- and population-level SDOH measures provide different risk assessments. The use of community-level SDI data is informative in the aggregate but should not be used to identify patients with individual unmet social needs. Health systems should implement a standardized individualized assessment of unmet SDOH needs and build strong, enduring partnerships with community-based organizations that can provide those services.
The typological, technological, and use-wear analyses of obsidian artifacts from Terminal Classic Pook's Hill (AD 830–950+) provide opportunities to better reconstruct socioeconomic activities in this plazuela group, including long-distance trade, tool production, subsistence practices, domestic tasks, and the organization of craft production. Based on visual sourcing, most of the obsidian originated from highland Guatemala, specifically El Chayal. The majority of obsidian artifacts were prismatic blades, although both casual and bipolar reduction of blade cores and the recycling of blades from earlier occupations occurred at the site. Use-wear analysis reveals that obsidian tools were mainly used for subsistence and domestic household activities; however, the concentrations of tools with specific wear patterns (bone, ceramic, plants, and shell) at some locations in the plazuela provide evidence for local craft production among the population. Further support for craft production is provided by comparable use-wear on chert/chalcedony tools from these same locations. The products of low-level craft production were used within Pook's Hill itself and may have been distributed to neighboring communities within the Roaring Creek and Upper Belize River Valleys. Despite the sociopolitical and socioeconomic disruptions to lifeways that accompanied the Terminal Classic period, the Pook's Hill Maya seem to have experienced minimal upheaval in their daily lives and continued local low-level craft production. However, one important change in the Terminal Classic appears to be the increased difficulty in obtaining obsidian at Pook's Hill and the growing need for tool recycling and raw material conservation.
Plant growth requires the integration of internal and external cues, perceived and transduced into a developmental programme of cell division, elongation, and wall thickening. Mechanical forces contribute to this regulation, and thigmomorphogenesis typically includes reduced stem height, increased stem diameter, and a canonical transcriptomic response. We present data on a bZIP transcription factor involved in this process in grasses. Brachypodium distachyon SECONDARY WALL INTERACTING bZIP (SWIZ) protein translocated into the nucleus following mechanostimulation. Classical touch-responsive genes were upregulated in B. distachyon roots following touch, including significant induction of the glycoside hydrolase 17 family, which may be unique to grass thigmomorphogenesis. SWIZ protein binding to an E-box variant in exons and introns was associated with immediate activation followed by repression of gene expression. SWIZ overexpression resulted in plants with reduced stem and root elongation. These data further define plant touch-responsive transcriptomics and physiology, offering insights into grass mechanotransduction dynamics.