Current clinical guidelines for people at risk of heart disease in Australia recommend nutrition intervention in conjunction with pharmacotherapy(1). However, Australians living in rural and remote regions have less access to medical nutrition therapy (MNT) provided by Accredited Practising Dietitians (APDs) than their urban counterparts(2). The aim of the HealthyRHearts study was to trial the delivery of MNT by APDs using telehealth to eligible patients of General Practitioners (GPs) located in small to large rural towns in the Hunter New England region(3) of New South Wales, Australia. The study design was a 12-month pragmatic randomised controlled trial. The key outcome was reduced total cholesterol. The study was place-based, meaning many of the research team and APDs were based rurally, to ensure the context of the GPs and patients was already known. Eligible participants were those assessed by their GP as being at moderate-to-high risk of cardiovascular disease (CVD). People in the intervention group received five MNT consults (totalling two hours) delivered via telehealth by APDs, and also answered a personalised nutrition questionnaire to guide their priorities and to support personalised dietary behaviour change during the counselling. Both intervention and control groups received usual care from their GP and were provided access to the Australian Eating Survey (Heart version), a 242-item online food frequency questionnaire with technology-supported personalised nutrition reports that evaluated intake relative to heart-healthy eating principles. Of the 192 people who consented to participate, 132 were eligible due to their moderate-to-high risk. Pre-post participant medication use with a registered indication(4) for hypercholesterolemia, hypertension and glycemic control was documented according to class and strength (defined daily dose: DDD)(5). Nine GP practices (with 91 participants recruited) were randomised to the intervention group and seven practices (41 participants) were randomised to control. Intervention participants attended 4.3 ± 1.4 of the 5 dietetic consultations offered. Of the 132 people with baseline clinical chemistry, 103 also provided a 12-month sample. Mean total cholesterol at baseline was 4.97 ± 1.13 mmol/L for both groups, with a 12-month reduction of 0.26 ± 0.77 mmol/L for intervention and 0.28 ± 0.79 mmol/L for control (p = 0.90, unadjusted). Median (IQR) number of medications for the intervention group was 2 (1–3) at both baseline and 12 months (p = 0.78), compared with 2 (1–3) and 3 (2–3), respectively, for the control group. Combined DDD of all medications was 2.1 (0.5–3.8) and 2.5 (0.75–4.4) at baseline and 12 months (p = 0.77) for the intervention group and 2.7 (1.5–4.0) and 3.0 (2.0–4.5) for the control group (p = 0.30). Results suggest that medications were a significant contributor to the management of total cholesterol. Further analysis is required to evaluate changes in total cholesterol attributable to medication prescription relative to the MNT counselling received by the intervention group.
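To make the dose metric above concrete, the following is a minimal sketch of how a combined defined daily dose could be calculated for one participant. The drug names, prescribed doses and WHO reference values are illustrative assumptions, not data from the HealthyRHearts study.

```python
# Minimal sketch: each medication's prescribed daily dose is expressed as a multiple of
# its WHO-assigned defined daily dose (DDD), then summed across the participant's
# medications. All names, doses and reference values here are illustrative only.

WHO_DDD_MG = {          # WHO ATC/DDD reference values in mg/day (illustrative subset)
    "atorvastatin": 20,
    "ramipril": 2.5,
    "metformin": 2000,
}

def combined_ddd(prescriptions):
    """prescriptions: list of (drug_name, prescribed_mg_per_day) tuples."""
    return sum(mg_per_day / WHO_DDD_MG[drug] for drug, mg_per_day in prescriptions)

# Hypothetical participant on 40 mg atorvastatin and 5 mg ramipril per day
print(combined_ddd([("atorvastatin", 40), ("ramipril", 5)]))  # -> 4.0
```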
Traditional foods are increasingly being incorporated into modern diets. This is largely driven by consumers seeking alternative food sources that have superior nutritional and functional properties. Within Australia, Aboriginal and Torres Strait Islander peoples are looking to develop their traditional foods for commercial markets. However, supporting evidence to suggest these foods are safe for consumption within the wider general population is limited. At the 2022 NSA conference, a keynote presentation titled ‘Decolonising food regulatory frameworks to facilitate First Peoples food sovereignty’ was delivered. This presentation was followed by a manuscript titled ‘Decolonising food regulatory frameworks: Importance of recognising traditional culture when assessing dietary safety of traditional foods’, which was published in the conference proceedings journal(1). These pieces examined the current regulatory frameworks that are used to assess traditional foods and proposed a way forward that would allow Traditional Custodians to successfully develop their foods for modern markets. Building upon the previously highlighted works, this presentation will showcase best-practice Indigenous engagement and collaboration principles in the development of traditionally used food products. To achieve this, we collaborated with a collective of Gamilaraay peoples who are looking to reignite their traditional grain practices and develop grain-based food products. To meet the current food safety regulatory requirements, we needed to understand how this grain would fit into modern diets, which included understanding the history of use, elucidating the nutritional and functional properties that can be attributed to the grain, and developing a safety dossier(2) so that the Traditional Custodians can confidently take their product to market. To aid the Traditional Custodians in performing their due diligence, we have systematically analysed the dietary safety of the selected native grain and compared it side-by-side with commonly consumed wheat in a range of in vitro bioassays and chemical analyses. From a food safety perspective, we show that the native grain is equivalent to commonly consumed wheat. The native grain has been shown to be no more toxic than wheat within our biological screening systems. Chemical analysis showed that contaminant levels are below tolerable limits, and we were not able to identify any chemical classes of concern. Our initial findings support the history of safe use and suggest that the tested native grain species would be no less safe than commonly consumed wheat. This risk assessment, together with a previously published nutritional study(3), provides an overall indication that the grain is nutritionally superior and viable for commercial development. The learnings from this project can direct the future risk assessment of traditional foods and therefore facilitate the safe market access of a broader range of traditionally used foods. Importantly, the methods presented are culturally safe and financially viable for the small businesses hoping to enter the market.
Posttraumatic stress disorder (PTSD) has been associated with advanced epigenetic age cross-sectionally, but the association between these variables over time is unclear. This study conducted meta-analyses to test whether new-onset PTSD diagnosis and changes in PTSD symptom severity over time were associated with changes in two metrics of epigenetic aging over two time points.
Methods
We conducted meta-analyses of the association between change in PTSD diagnosis and symptom severity and change in epigenetic age acceleration/deceleration (age-adjusted DNA methylation age residuals as per the Horvath and GrimAge metrics) using data from 7 military and civilian cohorts participating in the Psychiatric Genomics Consortium PTSD Epigenetics Workgroup (total N = 1,367).
Results
Meta-analysis revealed that the interaction between Time 1 (T1) Horvath age residuals and new-onset PTSD over time was significantly associated with Horvath age residuals at T2 (meta β = 0.16, meta p = 0.02, p-adj = 0.03). The interaction between T1 Horvath age residuals and changes in PTSD symptom severity over time was significantly related to Horvath age residuals at T2 (meta β = 0.24, meta p = 0.05). No associations were observed for GrimAge residuals.
Conclusions
Results indicated that individuals who developed new-onset PTSD or showed increased PTSD symptom severity over time evidenced greater epigenetic age acceleration at follow-up than would be expected based on baseline age acceleration. This suggests that PTSD may accelerate biological aging over time and highlights the need for intervention studies to determine if PTSD treatment has a beneficial effect on the aging methylome.
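A minimal sketch of the kind of cohort-level regression and fixed-effect meta-analysis described in the Methods above, assuming one data frame per cohort. The column names (t1_resid, t2_resid, ptsd_change, age, sex) are hypothetical, and the workgroup's actual models may include further covariates (e.g., cell-type proportions); this illustrates the interaction test, not the consortium pipeline.

```python
# Sketch: per-cohort regression of T2 epigenetic age residuals on the interaction of
# T1 residuals with change in PTSD, followed by an inverse-variance-weighted
# fixed-effect meta-analysis of the interaction coefficient across cohorts.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

def cohort_interaction_beta(df: pd.DataFrame):
    """Return (beta, SE) for the T1-residual x PTSD-change interaction term."""
    fit = smf.ols("t2_resid ~ t1_resid * ptsd_change + age + sex", data=df).fit()
    return fit.params["t1_resid:ptsd_change"], fit.bse["t1_resid:ptsd_change"]

def fixed_effect_meta(betas, ses):
    """Inverse-variance-weighted meta-analytic estimate and its standard error."""
    w = 1.0 / np.square(ses)
    return np.sum(w * betas) / np.sum(w), np.sqrt(1.0 / np.sum(w))

# Usage: results = [cohort_interaction_beta(df) for df in cohort_dataframes]
# betas, ses = map(np.array, zip(*results)); print(fixed_effect_meta(betas, ses))
```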
Scholarly and practitioner interest in authentic leadership has grown at an accelerating rate over the last decade, resulting in a proliferation of publications across diverse social science disciplines. Accompanying this interest has been criticism of authentic leadership theory and the methods used to explore it. We conducted a systematic review of 303 scholarly articles published from 2010 to 2023 to critically assess the conceptual and empirical strengths and limitations of this literature and map the nomological network of the authentic leadership construct. Results indicate that much of the extant research does not follow best practices in terms of research design and analysis. Based on the findings obtained, an agenda for advancing authentic leadership theory and research that embraces a signaling theory perspective is proposed.
Childhood bullying is a public health priority. We evaluated the effectiveness and costs of KiVa, a whole-school anti-bullying program that targets the peer context.
Methods
A two-arm pragmatic multicenter cluster randomized controlled trial with embedded economic evaluation. Schools were randomized to KiVa-intervention or usual practice (UP), stratified on school size and Free School Meals eligibility. KiVa was delivered by trained teachers across one school year. Follow-up was at 12 months post randomization. Primary outcome: student-reported bullying-victimization; secondary outcomes: self-reported bullying-perpetration, participant roles in bullying, empathy and teacher-reported Strengths and Difficulties Questionnaire. Outcomes were analyzed using multilevel linear and logistic regression models.
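The following is a minimal sketch of a school-clustered analysis of the primary outcome, assuming a student-level data frame. A generalized estimating equation with school as the cluster is used here as a simple stand-in for the multilevel logistic models reported in the trial; the column names (victimized_12m, arm, victimized_baseline, school_id) are hypothetical, and the stratification variables (school size, Free School Meals eligibility) are omitted.

```python
# Sketch: cluster-aware logistic model for student-reported victimization at follow-up,
# with schools as clusters (a stand-in for the trial's multilevel logistic regression).
import numpy as np
import statsmodels.api as sm
import statsmodels.formula.api as smf

def fit_primary_outcome(df):
    fit = smf.gee(
        "victimized_12m ~ arm + victimized_baseline",   # arm: KiVa vs usual practice
        groups="school_id",                             # students clustered in schools
        data=df,
        family=sm.families.Binomial(),
        cov_struct=sm.cov_struct.Exchangeable(),
    ).fit()
    return fit, np.exp(fit.params)   # exponentiated coefficients give odds ratios
```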
Findings
Between 8/11/2019 and 12/02/2021, 118 primary schools were recruited across four trial sites, with 11 111 students in the primary analysis (KiVa-intervention: n = 5944, 49.6% female; UP: n = 5167, 49.0% female). At baseline, 21.6% of students reported being bullied in the UP group and 20.3% in the KiVa-intervention group, reducing to 20.7% in the UP group and 17.7% in the KiVa-intervention group at follow-up (odds ratio 0.87; 95% confidence interval 0.78 to 0.97, p value = 0.009). Students in the KiVa group had significantly higher empathy and reduced peer problems. We found no differences in bullying perpetration, school wellbeing, emotional or behavioral problems. A priori subgroup analyses revealed no differences in effectiveness by socioeconomic gradient or by gender. KiVa cost £20.78 more per pupil than usual practice in the first year and £1.65 more per pupil in subsequent years.
Interpretation
The KiVa anti-bullying program is effective at reducing bullying victimization, with small-to-moderate effects of public health importance.
Funding
The study was funded by the UK National Institute for Health and Care Research (NIHR) Public Health Research program (17-92-11). Intervention costs were funded by the Rayne Foundation, GwE North Wales Regional School Improvement Service, Children's Services, Devon County Council and HSBC Global Services (UK) Ltd.
In decision-making, especially for sustainability, choosing the right assessment tools is crucial but challenging due to the abundance of options. A new method is introduced to streamline this process, aiding policymakers and managers. This method involves four phases: scoping, cataloging, selection, and validation, combining data analysis with stakeholder engagement. Using the food system as an example, the approach demonstrates how practitioners can select tools effectively based on input variables and desired outcomes to address sustainability risks. This method can be applied across various sectors, offering a systematic way to enhance decision-making and manage sustainability effectively.
Technical Summary
Decision-making frequently entails the selection and application of assessment tools. For sustainability decisions there is a plethora of tools available for environmental assessment, yet no clear, established approach to determine which tools are appropriate and resource-efficient for application. Here we present an extensive inventory of tools and a novel taxonomic method which enables efficient, effective tool selection to improve decision-making for policymakers and managers. The tool selection methodology follows four main phases based on divergence–convergence logic: a scoping phase, a cataloging phase, a selection phase and a validation phase. This approach combines elements of data-driven analysis with participatory techniques for stakeholder engagement to achieve buy-in and to ensure efficient management of progress and agile course correction when needed. It builds on the current limited range and scope of approaches to tool selection, and is flexible and Artificial Intelligence-ready in order to facilitate more rapid integration and uptake. Using the food system as a case study, we demonstrate how practitioners can use available input variables and desired output metrics to select the most appropriate tools to manage sustainability risks, with the approach having wide applicability to other sectors.
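To make the cataloging and selection phases concrete, here is a minimal sketch of the underlying matching idea: describe each tool by the input variables it requires and the output metrics it produces, then shortlist tools whose inputs are available and whose outputs intersect the decision-makers' desired metrics. The tool names and variables are hypothetical placeholders, not entries from the inventory.

```python
# Sketch: represent tools by required inputs and produced outputs, then filter the
# catalog against what data are available and which metrics are wanted.
from dataclasses import dataclass

@dataclass
class Tool:
    name: str
    required_inputs: set
    output_metrics: set

CATALOG = [  # hypothetical entries for illustration
    Tool("LCA-lite", {"ingredient_masses", "energy_use"}, {"ghg_emissions", "water_use"}),
    Tool("Diet-quality-index", {"food_frequency"}, {"nutrient_adequacy"}),
    Tool("Land-footprint", {"ingredient_masses", "yield_data"}, {"land_use"}),
]

def shortlist(available_inputs: set, desired_outputs: set):
    """Tools that can run on the available data and report at least one desired metric."""
    return [
        t for t in CATALOG
        if t.required_inputs <= available_inputs and t.output_metrics & desired_outputs
    ]

print([t.name for t in shortlist({"ingredient_masses", "energy_use"}, {"ghg_emissions"})])
# -> ['LCA-lite']
```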
Social Media Summary
New method simplifies tool selection for sustainable decisions, aiding policymakers & managers. #Sustainability #DecisionMaking
Inhibitory control plays an important role in children’s cognitive and socioemotional development, including their psychopathology. It has been established that contextual factors such as socioeconomic status (SES) and parents’ psychopathology are associated with children’s inhibitory control. However, the relations between the neural correlates of inhibitory control and contextual factors have been rarely examined in longitudinal studies. In the present study, we used both event-related potential (ERP) components and time-frequency measures of inhibitory control to evaluate the neural pathways between contextual factors, including prenatal SES and maternal psychopathology, and children’s behavioral and emotional problems in a large sample of children (N = 560; 51.75% females; mean age = 7.13 years; age range = 4–11 years). Results showed that theta power, which was positively predicted by prenatal SES and was negatively related to children’s externalizing problems, mediated the longitudinal and negative relation between them. ERP amplitudes and latencies did not mediate the longitudinal association between prenatal risk factors (i.e., prenatal SES and maternal psychopathology) and children’s internalizing and externalizing problems. Our findings increase our understanding of the neural pathways linking early risk factors to children’s psychopathology.
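A minimal sketch of the mediation logic tested above (prenatal SES → theta power → externalizing problems), using the product-of-coefficients approach with a bootstrap confidence interval. Column names (ses, theta_power, externalizing, age, sex) are hypothetical, and the study's actual models likely include additional covariates and structure; this is only an illustration of the mediation test.

```python
# Sketch: indirect effect a*b, where a is the SES -> theta-power path and b is the
# theta-power -> externalizing path (controlling for SES), with a percentile bootstrap.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

def indirect_effect(df: pd.DataFrame) -> float:
    a = smf.ols("theta_power ~ ses + age + sex", data=df).fit().params["ses"]
    b = smf.ols("externalizing ~ theta_power + ses + age + sex",
                data=df).fit().params["theta_power"]
    return a * b

def bootstrap_ci(df: pd.DataFrame, n_boot: int = 2000, seed: int = 0):
    rng = np.random.default_rng(seed)
    n = len(df)
    boots = [
        indirect_effect(df.iloc[rng.integers(0, n, size=n)].reset_index(drop=True))
        for _ in range(n_boot)
    ]
    return np.percentile(boots, [2.5, 97.5])   # 95% CI excluding 0 -> mediation supported
```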
Magnetic reconnection is an important process in astrophysical environments, as it reconfigures magnetic field topology and converts magnetic energy into thermal and kinetic energy. In extreme astrophysical systems, such as black hole coronae and pulsar magnetospheres, radiative cooling modifies the energy partition by radiating away internal energy, which can lead to the radiative collapse of the reconnection layer. In this paper, we perform two- and three-dimensional simulations to model the MARZ (Magnetic Reconnection on Z) experiments, which are designed to access cooling rates in the laboratory necessary to investigate reconnection in a previously unexplored radiatively cooled regime. These simulations are performed in GORGON, an Eulerian two-temperature resistive magnetohydrodynamic code, which models the experimental geometry comprising two exploding wire arrays driven by 20 MA of current on the Z machine (Sandia National Laboratories). Radiative losses are implemented using non-local thermodynamic equilibrium tables computed using the atomic code Spk, and we probe the effects of radiation transport by implementing both a local radiation loss model and $P_{1/3}$ multi-group radiation transport. The load produces highly collisional, super-Alfvénic (Alfvén Mach number $M_A \approx 1.5$), supersonic (Sonic Mach number $M_S \approx 4-5$) strongly driven plasma flows which generate an elongated reconnection layer (Aspect Ratio $L/\delta \approx 100$, Lundquist number $S_L \approx 400$). The reconnection layer undergoes radiative collapse when the radiative losses exceed the rates of ohmic and compressional heating (cooling rate/hydrodynamic transit rate = $\tau _{\text {cool}}^{-1}/\tau _{H}^{-1}\approx 100$); this generates a cold strongly compressed current sheet, leading to an accelerated reconnection rate, consistent with theoretical predictions. Finally, the current sheet is also unstable to the plasmoid instability, but the magnetic islands are extinguished by strong radiative cooling before ejection from the layer.
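For orientation, the dimensionless numbers quoted above follow from standard definitions: the Alfvén Mach number $M_A = v_{\text{in}}/V_A$ with $V_A = B/\sqrt{\mu_0 \rho}$, the sonic Mach number $M_S = v_{\text{in}}/c_s$, and the Lundquist number $S_L = \mu_0 L V_A/\eta$. The short sketch below evaluates these expressions for hypothetical placeholder parameters chosen only to land in the quoted regime; they are not the MARZ simulation inputs.

```python
# Back-of-envelope evaluation of M_A, M_S and S_L from their standard definitions.
# All numerical inputs are hypothetical placeholders, not the simulation parameters.
import numpy as np

MU0 = 4e-7 * np.pi                       # vacuum permeability [H/m]

def alfven_speed(B, rho):
    """V_A = B / sqrt(mu0 * rho); B in tesla, rho in kg/m^3."""
    return B / np.sqrt(MU0 * rho)

def lundquist_number(L, v_alfven, eta):
    """S_L = mu0 * L * V_A / eta; eta is the resistivity in ohm-metres."""
    return MU0 * L * v_alfven / eta

B, rho, v_in, c_s, L, eta = 3.0, 3.2e-3, 7.0e4, 1.5e4, 1.0e-2, 1.5e-6
V_A = alfven_speed(B, rho)
print(f"M_A = {v_in / V_A:.2f}, M_S = {v_in / c_s:.1f}, "
      f"S_L = {lundquist_number(L, V_A, eta):.0f}")
# -> roughly M_A ~ 1.5, M_S ~ 4.7, S_L ~ 400 for these placeholder values
```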
This work examines the ability of commercial zeolite Y to act as a slow-release agent for a number of anthelmintic drugs. Administration of pyrantel and/or fenbendazole loaded onto zeolite Y to rats dosed with Nippostrongylus brasiliensis, and of dichlorvos (DDVP) loaded onto zeolite Y to pigs dosed with Ascaris and Oesophagostomum, was more successful in killing adult worms than administration of the pure drug alone. The zeolite Y was used as supplied for initial studies and then dealuminated for further studies. The drug loadings were monitored by thermal analysis and the loaded zeolites were used in several field trials. The results indicate that zeolite Y is a suitable vehicle for the slow release of some anthelmintics. The slow release of drug from the zeolite matrix improved its efficacy.
OBJECTIVES/GOALS: Contingency management (CM) procedures yield measurable reductions in cocaine use. This poster describes a trial aimed at using CM as a vehicle to show the biopsychosocial health benefits of reduced use, rather than total abstinence, the currently accepted metric for treatment efficacy. METHODS/STUDY POPULATION: In this 12-week, randomized controlled trial, CM was used to reduce cocaine use and evaluate associated improvements in cardiovascular, immune, and psychosocial well-being. Adults aged 18 and older who sought treatment for cocaine use (N=127) were randomized into three groups in a 1:1:1 ratio: High Value ($55) or Low Value ($13) CM incentives for cocaine-negative urine samples, or a non-contingent control group. They completed outpatient sessions three days per week across the 12-week intervention period, totaling 36 clinic visits and four post-treatment follow-up visits. During each visit, participants provided observed urine samples and completed several assays of biopsychosocial health. RESULTS/ANTICIPATED RESULTS: Preliminary findings from generalized linear mixed effect modeling demonstrate the feasibility of the CM platform. Abstinence rates from cocaine use were significantly greater in the High Value group (47% negative; OR = 2.80; p = 0.01) relative to the Low Value (23% negative) and Control groups (24% negative). In the planned primary analysis, the level of cocaine use reduction based on cocaine-negative urine samples will serve as the primary predictor of cardiovascular (e.g., endothelin-1 levels), immune (e.g., IL-10 levels) and psychosocial (e.g., Addiction Severity Index) outcomes using results from the fitted models. DISCUSSION/SIGNIFICANCE: This research will advance the field by prospectively and comprehensively demonstrating the beneficial effects of reduced cocaine use. These outcomes can, in turn, support the adoption of reduced cocaine use as a viable alternative endpoint in cocaine treatment trials.
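As a quick consistency check on the numbers reported above, the odds ratio for the High Value versus Control contrast can be reproduced from the group-level abstinence rates alone. The trial's generalized linear mixed models account for repeated urine samples per participant, so this two-proportion calculation is illustrative only.

```python
# Back-of-envelope odds ratio from the reported cocaine-negative rates:
# 47% in the High Value group vs. 24% in the non-contingent control group.
def odds(p):
    return p / (1 - p)

print(round(odds(0.47) / odds(0.24), 2))  # -> 2.81, in line with the reported OR = 2.80
```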
Radiotherapy for pediatric brain tumor has been associated with late cognitive effects. Compared to conventional photon radiotherapy (XRT), proton radiotherapy (PRT) delivers less radiation to healthy brain tissue. PRT has been associated with improved long-term cognitive outcomes compared to XRT. However, there is limited research comparing the effects of XRT and PRT on verbal memory outcomes.
Participants and Methods:
Survivors of pediatric brain tumor treated with either XRT (n = 29) or PRT (n = 51) completed neuropsychological testing > 1 year following radiotherapy. XRT and PRT groups were similar with respect to sex, handedness, race, age at diagnosis, age at evaluation, tumor characteristics, and treatment history (i.e., craniospinal irradiation, craniotomy, shunting, chemotherapy, radiation dose). Verbal learning and memory were assessed using the age-appropriate version of the California Verbal Learning Test (CVLT-II/CVLT-C). Measures of intellectual functioning, executive functioning, attention and adaptive behavior were also collected. Performance on neuropsychological measures was compared between treatment groups (XRT vs. PRT) using analysis of covariance (ANCOVA). On the CVLT, each participant was classified as having an encoding deficit profile (i.e., impaired learning, recall, and recognition), retrieval deficit profile (i.e., impaired recall but intact recognition), intact profile, or other profile. Chi-squared tests of independence were used to compare the probability of each memory profile between treatment groups. Pearson correlation was used to examine associations between memory performance and strategy use, intellectual functioning, adaptive behavior, attention, and executive functioning.
Results:
Overall, patients receiving PRT demonstrated superior verbal learning (CVLT Trials 1-5; t(76) = 2.61, p = .011), recall (CVLT Long Delay Free; t(76) = 3.57, p = .001) and strategy use (CVLT Semantic Clustering; t(76) = 2.29, p = .025) compared to those treated with XRT. Intact performance was more likely in the PRT group than the XRT group (71% PRT, 38% XRT; X2 = 8.14, p = .004). Encoding and retrieval deficits were both more common in the XRT group, with encoding problems being most prevalent (Encoding Deficits: 31% XRT, 12% PRT, X2 = 4.51, p = .034; Retrieval Deficits: 17% XRT, 4% PRT, X2 = 4.11, p = .043). Across all participants, semantic clustering predicted better encoding (r = .28, p = .011) and retrieval (r = .26, p = .022). Better encoding predicted higher intellectual (r = .56, p < .001) and adaptive functioning (r = .30, p = .011), and fewer parent-reported concerns about day-to-day attention (r = -.36, p = .002) and cognitive regulation (r = -.35, p = .002).
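The intact-profile comparison above can be reconstructed approximately from the reported percentages (71% of 51 PRT patients ≈ 36; 38% of 29 XRT patients ≈ 11). The counts below are inferred from rounded percentages rather than taken from the study data, so this is only an illustrative check.

```python
# Chi-squared test of independence on the inferred 2x2 table (intact vs. not intact).
from scipy.stats import chi2_contingency

table = [[36, 51 - 36],   # PRT: intact, not intact
         [11, 29 - 11]]   # XRT: intact, not intact
chi2, p, dof, expected = chi2_contingency(table, correction=False)
print(round(chi2, 2), round(p, 3))  # -> 8.14 0.004, matching the reported X2 and p
```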
Conclusions:
Results suggest that PRT is associated with superior verbal memory outcomes compared to XRT, which may be driven by encoding skills and use of learning strategies. Moreover, encoding ability predicted general intellectual ability and day-to-day functioning. Future work may help to clarify underlying neural mechanisms associated with verbal memory decline following radiotherapy, which will better inform treatment approaches for survivors of pediatric brain tumor.
Schizophrenia (SCZ) is a neuropsychiatric disorder with strong genetic heritability and predicted genetic heterogeneity, but limited knowledge regarding the underlying genetic risk variants. Classification into phenotype-driven subgroups or endophenotypes is expected to facilitate genetic analysis. Here, we report a teen boy with chronic psychosis and cerebellar hypoplasia (CBLH) and analyze data on 16 reported individuals with SCZ or chronic psychosis not otherwise specified associated with cerebellar hypoplasia to look for shared features.
Participants and Methods:
We evaluated an 18-year-old boy with neurodevelopmental deficits from early childhood and onset of hallucinations and other features of SCZ at 10 years who had mild vermis-predominant CBLH on brain imaging. This prompted us to review prior reports of chronic psychosis or SCZ with cerebellar malformations using paired search terms including (1) cerebellar hypoplasia, Dandy-Walker malformation, Dandy-Walker variant, or mega-cisterna magna with (2) psychosis or SCZ. We identified 16 affected individuals from 13 reports. We reviewed clinical features focusing on demographic information, prenatal-perinatal history and neuropsychiatric and neurodevelopmental phenotypes, and independently reviewed brain imaging features.
Results:
All 17 individuals had classic psychiatric features of SCZ or chronic psychosis as well as shared neurodevelopmental features not previously highlighted, including a downward shift in IQ of about 20 points, memory impairment, speech-language deficits, attention deficits and sleep disturbances. The brain imaging findings among these individuals consistently showed posterior vermis-predominant CBLH with variable cerebellar hemisphere hypoplasia and enlarged posterior fossa (a.k.a. mega-cisterna magna). None had features of classic Dandy-Walker malformation (DWM).
Conclusions:
In 17 individuals with chronic psychosis or SCZ and cerebellar malformation, we found a high frequency of neurodevelopmental disorders, a consistent brain malformation consisting of posterior vermis-predominant (and usually symmetric) CBLH, and no evidence of prenatal risk factors. The consistent phenotype and lack of prenatal risk factors for CBLH leads us to hypothesize that psychosis or schizophrenia associated with vermis predominant CBLH comprises a homogeneous subgroup of individuals with chronic psychosis/schizophrenia that is likely to have an underlying genetic basis. No comprehensive targeted gene panel for CBLH has yet been defined, leading us to recommend trio-based exome sequencing for individuals who present with this combination of features.
White matter hyperintensity (WMH) burden is greater, has a frontal-temporal distribution, and is associated with proxies of exposure to repetitive head impacts (RHI) in former American football players. These findings suggest that in the context of RHI, WMH might have unique etiologies that extend beyond those of vascular risk factors and normal aging processes. The objective of this study was to evaluate the correlates of WMH in former elite American football players. We examined markers of amyloid, tau, neurodegeneration, inflammation, axonal injury, and vascular health and their relationships to WMH. A group of age-matched asymptomatic men without a history of RHI was included to determine the specificity of the relationships observed in the former football players.
Participants and Methods:
240 male participants aged 45-74 (60 unexposed asymptomatic men, 60 male former college football players, 120 male former professional football players) underwent semi-structured clinical interviews, magnetic resonance imaging (structural T1, T2 FLAIR, and diffusion tensor imaging), and lumbar puncture to collect cerebrospinal fluid (CSF) biomarkers as part of the DIAGNOSE CTE Research Project. Total WMH lesion volumes (TLV) were estimated using the Lesion Prediction Algorithm from the Lesion Segmentation Toolbox. Structural equation modeling, using Full-Information Maximum Likelihood (FIML) to account for missing values, examined the associations between log-TLV and the following variables: total cortical thickness, whole-brain average fractional anisotropy (FA), CSF amyloid β42, CSF p-tau181, CSF sTREM2 (a marker of microglial activation), CSF neurofilament light (NfL), and the modified Framingham stroke risk profile (rFSRP). Covariates included age, race, education, APOE ε4 carrier status, and evaluation site. Bootstrapped 95% confidence intervals assessed statistical significance. Models were performed separately for football players (college and professional players pooled; n=180) and the unexposed men (n=60). Due to differences in sample size, estimates were compared and were considered different if the percent change in the estimates exceeded 10%.
Results:
In the former football players (mean age=57.2, 34% Black, 29% APOE ε4 carriers), reduced cortical thickness (B=-0.25, 95% CI [-0.45, -0.08]), lower average FA (B=-0.27, 95% CI [-0.41, -0.12]), higher p-tau181 (B=0.17, 95% CI [0.02, 0.43]), and higher rFSRP score (B=0.27, 95% CI [0.08, 0.42]) were associated with greater log-TLV. Compared to the unexposed men, substantial differences in estimates were observed for rFSRP (Bcontrol=0.02, Bfootball=0.27, 994% difference), average FA (Bcontrol=-0.03, Bfootball=-0.27, 802% difference), and p-tau181 (Bcontrol=-0.31, Bfootball=0.17, -155% difference). In the former football players, rFSRP showed a stronger positive association and average FA showed a stronger negative association with WMH compared to unexposed men. The effect of WMH on cortical thickness was similar between the two groups (Bcontrol=-0.27, Bfootball=-0.25, 7% difference).
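The "% difference" figures above appear to be percent changes of the football-group estimate relative to the unexposed-group estimate; the simple formula below reproduces the reported values for average FA and p-tau181 from the rounded coefficients (the published 994% for rFSRP presumably reflects unrounded estimates). This is an inference about the reporting convention, not a statement from the study.

```python
# Percent change of the football-group coefficient relative to the unexposed group.
def pct_change(b_football, b_control):
    return (b_football - b_control) / b_control * 100

print(round(pct_change(-0.27, -0.03)))  # average FA -> 800  (reported: 802%)
print(round(pct_change(0.17, -0.31)))   # p-tau181  -> -155 (reported: -155%)
```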
Conclusions:
These results suggest that the risk factor and biological correlates of WMH differ between former American football players and asymptomatic individuals unexposed to RHI. In addition to vascular risk factors, white matter integrity on DTI showed a stronger relationship with WMH burden in the former football players. FLAIR WMH serves as a promising measure to further investigate the late multifactorial pathologies of RHI.
Patients with Parkinson’s disease (PD) commonly exhibit executive dysfunction early in the disease course, which may or may not predict further cognitive decline over time. Early emergence of visuospatial and memory impairments, in contrast, is a more consistent predictor of an evolving dementia syndrome. Most prior studies using fMRI have focused on mechanisms of executive dysfunction and have demonstrated that PD patients exhibit hyperactivation that is dependent on the degree of cognitive impairment, suggestive of compensatory strategies. No study has evaluated whether PD patients with normal cognition (PD-NC) and PD patients with Mild Cognitive Impairment (PD-MCI) exhibit compensatory activation patterns during visuospatial task performance.
Participants and Methods:
10 PD-NC, 12 PD-MCI, and 14 age- and sex-matched healthy controls (HC) participated in the study. PD participants were diagnosed with MCI based on the Movement Disorders Society Task Force, Level II assessment (comprehensive assessment). Functional magnetic resonance imaging (fMRI) was performed during a motion discrimination task that required participants to identify the direction of horizontal global coherent motion embedded within dynamic visual noise under Low and High coherence conditions. Behavioral accuracy and functional activation were evaluated using 3 × 2 analyses of covariance (ANCOVAs) (Group [HC, PD-NC, PD-MCI] × Coherence [High vs. Low]), accounting for age, sex, and education. Analyses were performed in R (v4.1.2; R Core Team, 2013).
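For readers working in Python rather than R, a minimal sketch of one way to approximate the 3 × 2 analysis is a linear mixed model with a random intercept per participant to accommodate the within-subject coherence factor. The data-frame column names (accuracy, group, coherence, age, sex, education, subject) are hypothetical, and this is an approximation of the ANCOVA described above, not the study's analysis script.

```python
# Sketch: Group x Coherence model of behavioral accuracy with covariates and a
# per-participant random intercept (coherence is a within-subject factor).
import statsmodels.formula.api as smf

def fit_accuracy_model(df):
    fit = smf.mixedlm(
        "accuracy ~ C(group) * C(coherence) + age + sex + education",
        data=df,
        groups=df["subject"],
    ).fit()
    return fit   # inspect fit.summary() for group, coherence and interaction terms
```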
Results:
PD-MCI (0.702 ± 0.269) patients exhibited significantly lower accuracy on the motion discrimination task than HC (0.853 ± 0.241; p = 0.033) and PD-NC (0.880 ± 0.208; p = 0.039). A Group × Coherence interaction was identified in which several regions, including orbitofrontal, posterior parietal and occipital cortex, showed increased activation during High relative to Low coherence trials in the PD patient groups but not in the HC group. HC showed default mode deactivation and frontal-parietal activation during Low relative to High coherence trials that was not evident in the patient groups.
Conclusions:
PD-MCI patients exhibited worse visuospatial performance on a motion discrimination task than PD-NC and HC participants and exhibited hyperactivation of the posterior parietal and occipital regions during motion discrimination, suggesting possible compensatory activation.
This article examines the relationship between legislative civility and legislative productivity in US state legislatures. The research employs data from the National Survey of State Legislative Lobbyists and from the State Policy Innovation and Diffusion (SPID) database. The former dataset is used to generate an overall civility index for each state as developed by Kettler et al. The SPID database allows one to measure the legislative productivity of a state legislature. Employing these data, negative binomial and Poisson regression models reveal that state legislatures rated as more civil by their own lobbyists produced significantly more pieces of noteworthy legislation than those legislative bodies rated as less civil. These results suggest that the quality of internal legislative dynamics matters for legislative productivity.
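A minimal sketch of the count-model setup described above, regressing the number of noteworthy bills adopted per state on the lobbyist-rated civility index, with a Poisson model alongside the negative binomial as a robustness check. The column names (n_bills, civility_index) are hypothetical placeholders, and any additional controls used in the published models are omitted here.

```python
# Sketch: negative binomial and Poisson regressions of legislative productivity on civility.
import statsmodels.formula.api as smf

def fit_productivity_models(df):
    nb = smf.negativebinomial("n_bills ~ civility_index", data=df).fit()
    pois = smf.poisson("n_bills ~ civility_index", data=df).fit()
    return nb, pois   # a positive, significant civility_index coefficient supports the claim
```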
We present and evaluate the prospects for detecting coherent radio counterparts to gravitational wave (GW) events using Murchison Widefield Array (MWA) triggered observations. The MWA rapid-response system, combined with its buffering mode ($\sim$4 min negative latency), enables us to catch any radio signals produced from seconds prior to hours after a binary neutron star (BNS) merger. The large field of view of the MWA ($\sim$$1\,000\,\textrm{deg}^2$ at 120 MHz) and its location under the high-sensitivity sky region of the LIGO-Virgo-KAGRA (LVK) detector network forecast a high chance of being on-target for a GW event. We consider three observing configurations for the MWA to follow up GW BNS merger events, including a single dipole per tile, the full array, and four sub-arrays. We then perform a population synthesis of BNS systems to predict the radio-detectable fraction of GW events using these configurations. We find that the configuration with four sub-arrays is the best compromise between sky coverage and sensitivity, as it is capable of placing meaningful constraints on the radio emission from 12.6% of GW BNS detections. Based on the timescales of four BNS merger coherent radio emission models, we propose an observing strategy that involves triggering the buffering mode to target coherent signals emitted prior to, during or shortly following the merger, which is then followed by continued recording for up to three hours to target later-time post-merger emission. We expect MWA to trigger on $\sim$$5-22$ BNS merger events during the LVK O4 observing run, which could potentially result in two detections of predicted coherent emission.
This study investigated sex differences in Fe status, and associations between Fe status and endurance and musculoskeletal outcomes, in military training. In total, 2277 British Army trainees (581 women) participated. Fe markers and endurance performance (2·4 km run) were measured at the start (week 1) and end (week 13) of training. Whole-body areal bone mineral density (aBMD) and markers of bone metabolism were measured at week 1. Injuries during training were recorded. Training decreased Hb in men and women (mean change –0·1 (95 % CI –0·2, –0·0) and –0·7 (95 % CI –0·9, –0·6) g/dl, both P < 0·001) but more so in women (P < 0·001). Ferritin decreased in men and women (–27 (95 % CI –28, –23) and –5 (95 % CI –8, –1) µg/l, both P ≤ 0·001) but more so in men (P < 0·001). Soluble transferrin receptor increased in men and women (2·9 (95 % CI 2·3, 3·6) and 3·8 (95 % CI 2·7, 4·9) nmol/l, both P < 0·001), with no difference between sexes (P = 0·872). Erythrocyte distribution width increased in men (0·3 (95 % CI 0·2, 0·4)%, P < 0·001) but not in women (0·1 (95 % CI –0·1, 0·2)%, P = 0·956). Mean corpuscular volume decreased in men (–1·5 (95 % CI –1·8, –1·1) fL, P < 0·001) but not in women (0·4 (95 % CI –0·4, 1·3) fL, P = 0·087). Lower ferritin was associated with slower 2·4 km run time (P = 0·018), sustaining a lower limb overuse injury (P = 0·048), lower aBMD (P = 0·021) and higher beta C-telopeptide cross-links of type 1 collagen and procollagen type 1 N-terminal propeptide (both P < 0·001), controlling for sex. Improving Fe stores before training may protect Hb in women and improve endurance and protect against injury.
The U.S. Department of Agriculture–Agricultural Research Service (USDA-ARS) has been a leader in weed science research covering topics ranging from the development and use of integrated weed management (IWM) tactics to basic mechanistic studies, including biotic resistance of desirable plant communities and herbicide resistance. ARS weed scientists have worked in agricultural and natural ecosystems, including agronomic and horticultural crops, pastures, forests, wild lands, aquatic habitats, wetlands, and riparian areas. Through strong partnerships with academia, state agencies, private industry, and numerous federal programs, ARS weed scientists have made contributions to discoveries in the newest fields of robotics and genetics, as well as the traditional and fundamental subjects of weed–crop competition and physiology and integration of weed control tactics and practices. Weed science at ARS is often overshadowed by other research topics; thus, few are aware of the long history of ARS weed science and its important contributions. This review is the result of a symposium held at the Weed Science Society of America’s 62nd Annual Meeting in 2022 that included 10 separate presentations in a virtual Weed Science Webinar Series. The overarching themes of management tactics (IWM, biological control, and automation), basic mechanisms (competition, invasive plant genetics, and herbicide resistance), and ecosystem impacts (invasive plant spread, climate change, conservation, and restoration) represent core ARS weed science research that is dynamic and efficacious and has been a significant component of the agency’s national and international efforts. This review highlights current studies and future directions that exemplify the science and collaborative relationships both within and outside ARS. Given the constraints of weeds and invasive plants on all aspects of food, feed, and fiber systems, there is an acknowledged need to face new challenges, including agriculture and natural resources sustainability, economic resilience and reliability, and societal health and well-being.