Edited by
Marietta Auer, Max Planck Institute for Legal History and Legal Theory; Paul B. Miller, University of Notre Dame, Indiana; Henry E. Smith, Harvard Law School, Massachusetts; James Toomey, University of Iowa
Private law theory is pulled in opposite directions: internal and external perspectives on law; holistic and reductionist methodologies; conceptualist and nominalist views; and deontological and consequentialist approaches. Relatedly, theories tend to focus on the micro or the macro scales – interpersonal relations or societal effects – but face difficulties in connecting them. In this paper, we examine these problems in private law theory through the lens of the legal phenomenology of Adolf Reinach. According to Reinach, the law presupposes a realm of real, timeless entities and their workings that are synthetic a priori: they are neither conventional nor contingent. Nor are they inherently moral or customary. We argue that regardless of the ontological status of what Reinach identifies as a priori, it points toward something more robust than most current theories would countenance. We illustrate the usefulness of this perspective through Reinach’s analysis of property, transfer, and representation. Reinach captures features and generalizations that have eluded analysis, as, for example, when he treats the principle of nemo dat quod non habet (‘one cannot transfer what one does not own’) as underlying all transfer even if displaced by positive rules such as good faith purchase. His views also point toward the importance of accessibility for legal concepts, including cases of tacit knowledge. Whatever its exact source, this “deep structure” of the law has the potential to partially reconcile some of the fissures in private law theory and to connect the micro and the macro through a better understanding of system in law.
Background: Our prior six-year review (n=2165) revealed 24% of patients undergoing posterior decompression surgeries (laminectomy or discectomy) sought emergency department (ED) care within three months post-surgery. We established an integrated Spine Assessment Clinic (SAC) to enhance patient outcomes and minimize unnecessary ED visits through pre-operative education, targeted QI interventions, and early post-operative follow-up. Methods: We reviewed 13 months of posterior decompression data (n=205) following SAC implementation. These patients received individualized, comprehensive pre-operative education and follow-up phone calls within 7 days post-surgery. ED visits within 90 days post-surgery were tracked using provincial databases and compared to our pre-SAC implementation data. Results: Out of 205 patients, 24 (11.6%) accounted for 34 ED visits within 90 days post-op, showing a significant reduction in ED visits from 24% to 11.6%, and decreased overall ED utilization from 42.1% to 16.6% (when accounting for multiple visits by the same patient). Early interventions, including wound monitoring, outpatient bloodwork, and prescription adjustments for pain management, helped mitigate ED visits. Patient satisfaction surveys (n=62) indicated 92% were “highly satisfied” and 100% would recommend the SAC. Conclusions: The SAC reduced ED visits after posterior decompression surgery by over 50% through pre-operative education, focused QI initiatives, and an individualized, proactive approach.
Posttraumatic stress disorder (PTSD) has been associated with advanced epigenetic age cross-sectionally, but the association between these variables over time is unclear. This study conducted meta-analyses to test whether new-onset PTSD diagnosis and changes in PTSD symptom severity over time were associated with changes in two metrics of epigenetic aging over two time points.
Methods
We conducted meta-analyses of the association between change in PTSD diagnosis and symptom severity and change in epigenetic age acceleration/deceleration (age-adjusted DNA methylation age residuals as per the Horvath and GrimAge metrics) using data from 7 military and civilian cohorts participating in the Psychiatric Genomics Consortium PTSD Epigenetics Workgroup (total N = 1,367).
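The "age-adjusted DNA methylation age residuals" used as the outcome above can be sketched as follows. This is a minimal illustration with made-up numbers, not the consortium's data or pipeline: epigenetic (DNAm) age is regressed on chronological age, and the residuals serve as the age acceleration/deceleration values.

```python
import numpy as np

# Hypothetical example values: chronological ages and DNAm-predicted
# (e.g. Horvath-clock) ages for six subjects. Illustrative only.
chron_age = np.array([45.0, 52.0, 60.0, 38.0, 71.0, 55.0])
dnam_age = np.array([47.0, 50.0, 65.0, 36.0, 78.0, 54.0])

# Regress DNAm age on chronological age; the residuals are the
# age-adjusted "age acceleration" values: positive = epigenetically
# older than predicted for one's chronological age.
slope, intercept = np.polyfit(chron_age, dnam_age, 1)
residuals = dnam_age - (slope * chron_age + intercept)
```

Computing such residuals at both time points allows the T1 residual to be modeled as a baseline covariate (and interaction term) when predicting the T2 residual, as in the analyses below.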
Results
Meta-analysis revealed that the interaction between Time 1 (T1) Horvath age residuals and new-onset PTSD over time was significantly associated with Horvath age residuals at T2 (meta β = 0.16, meta p = 0.02, p-adj = 0.03). The interaction between T1 Horvath age residuals and changes in PTSD symptom severity over time was significantly related to Horvath age residuals at T2 (meta β = 0.24, meta p = 0.05). No associations were observed for GrimAge residuals.
Conclusions
Results indicated that individuals who developed new-onset PTSD or showed increased PTSD symptom severity over time evidenced greater epigenetic age acceleration at follow-up than would be expected based on baseline age acceleration. This suggests that PTSD may accelerate biological aging over time and highlights the need for intervention studies to determine if PTSD treatment has a beneficial effect on the aging methylome.
Objectives/Goals: We describe the prevalence of individuals with household exposure to SARS-CoV-2, who subsequently report symptoms consistent with COVID-19, while having PCR results persistently negative for SARS-CoV-2 (S[+]/P[-]). We assess whether paired serology can assist in identifying the true infection status of such individuals. Methods/Study Population: In a multicenter household transmission study, index patients with SARS-CoV-2 were identified and enrolled together with their household contacts within 1 week of index’s illness onset. For 10 consecutive days, enrolled individuals provided daily symptom diaries and nasal specimens for polymerase chain reaction (PCR). Contacts were categorized into 4 groups based on presence of symptoms (S[+/-]) and PCR positivity (P[+/-]). Acute and convalescent blood specimens from these individuals (30 days apart) were subjected to quantitative serologic analysis for SARS-CoV-2 anti-nucleocapsid, spike, and receptor-binding domain antibodies. The antibody change in S[+]/P[-] individuals was assessed by thresholds derived from receiver operating characteristic (ROC) analysis of S[+]/P[+] (infected) versus S[-]/P[-] (uninfected). Results/Anticipated Results: Among 1,433 contacts, 67% had ≥1 SARS-CoV-2 PCR[+] result, while 33% remained PCR[-]. Among the latter, 55% (n = 263) reported symptoms for at least 1 day, most commonly congestion (63%), fatigue (63%), headache (62%), cough (59%), and sore throat (50%). A history of both previous infection and vaccination was present in 37% of S[+]/P[-] individuals, 38% of S[-]/P[-], and 21% of S[+]/P[+] (P<0.05). Vaccination alone was present in 37%, 41%, and 52%, respectively. ROC analyses of paired serologic testing of S[+]/P[+] (n = 354) vs. S[-]/P[-] (n = 103) individuals found anti-nucleocapsid data had the highest area under the curve (0.87).
Based on the 30-day antibody change, 6.9% of S[+]/P[-] individuals demonstrated an increased convalescent antibody signal, although a similar seroresponse was observed in 7.8% of the S[-]/P[-] group. Discussion/Significance of Impact: Reporting respiratory symptoms was common among household contacts with persistent PCR[-] results. Paired serology analyses found similar seroresponses between S[+]/P[-] and S[-]/P[-] individuals. The symptomatic-but-PCR-negative phenomenon, while frequent, is unlikely to be attributable to true SARS-CoV-2 infections missed by PCR.
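The ROC-derived classification step described above can be sketched in miniature. This is an illustrative toy with invented antibody fold-change values, not the study's data: the AUC is computed via the Mann-Whitney pairwise comparison, and a classification threshold is chosen by maximizing Youden's J (sensitivity + specificity − 1).

```python
import numpy as np

# Hypothetical antibody fold-change values for known-infected (S+/P+)
# and known-uninfected (S-/P-) contacts. Illustrative numbers only.
infected = np.array([4.1, 3.3, 5.0, 2.8, 6.2])
uninfected = np.array([1.0, 1.2, 0.9, 1.5, 2.9])

# AUC via the Mann-Whitney statistic: the probability that a randomly
# chosen infected value exceeds a randomly chosen uninfected value.
diff = infected[:, None] - uninfected[None, :]
auc = ((diff > 0).sum() + 0.5 * (diff == 0).sum()) / diff.size

# Youden's J selects the threshold maximizing sensitivity + specificity - 1;
# values at or above the threshold are called seropositive responses.
best_t, best_j = None, -1.0
for t in np.sort(np.concatenate([infected, uninfected])):
    sens = (infected >= t).mean()
    spec = (uninfected < t).mean()
    if sens + spec - 1 > best_j:
        best_j, best_t = sens + spec - 1, t
```

The resulting threshold would then be applied to the 30-day antibody change in the S[+]/P[-] group to classify probable infections.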
Accurate diagnosis of bipolar disorder (BPD) is difficult in clinical practice, with an average delay between symptom onset and diagnosis of about 7 years. A depressive episode often precedes the first manic episode, making it difficult to distinguish BPD from unipolar major depressive disorder (MDD).
Aims
We use genome-wide association analyses (GWAS) to identify differential genetic factors and to develop predictors based on polygenic risk scores (PRS) that may aid early differential diagnosis.
Method
Based on individual genotypes from case–control cohorts of BPD and MDD shared through the Psychiatric Genomics Consortium, we compile case–case–control cohorts, applying a careful quality control procedure. In a resulting cohort of 51 149 individuals (15 532 BPD patients, 12 920 MDD patients and 22 697 controls), we perform a variety of GWAS and PRS analyses.
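The core of the PRS analyses mentioned above is a weighted sum of risk-allele dosages. The sketch below uses invented effect sizes and genotypes purely to show the arithmetic; real pipelines add clumping/thresholding or shrinkage of GWAS weights, which is omitted here.

```python
import numpy as np

# Hypothetical per-allele effect sizes (e.g. log odds ratios) for four
# variants from a discovery GWAS, and allele dosages (0, 1, or 2 copies
# of the effect allele) for two individuals. Illustrative values only.
weights = np.array([0.12, -0.05, 0.30, 0.08])
dosages = np.array([[2, 1, 0, 1],   # individual 1
                    [0, 2, 1, 2]])  # individual 2

# A polygenic risk score is the dosage-weighted sum across variants.
prs = dosages @ weights
```

In a case–case design like the one above, such scores (here trained to separate BPD from MDD) would be evaluated for their ability to discriminate the two patient groups in held-out or independent cohorts.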
Results
Although our GWAS is not well powered to identify genome-wide significant loci, we find significant chip heritability and demonstrate the ability of the resulting PRS to distinguish BPD from MDD, including BPD cases with depressive onset (BPD-D). We replicate our PRS findings in an independent Danish cohort (iPSYCH 2015, N = 25 966). We observe strong genetic correlation between our case–case GWAS and that of case–control BPD.
Conclusions
We find that MDD and BPD, including BPD-D, are genetically distinct. Our findings support the view that controls, MDD patients and BPD patients primarily lie on a continuum of genetic risk. Future studies with larger and richer samples will likely yield a better understanding of these findings and enable the development of better genetic predictors distinguishing BPD and, importantly, BPD-D from MDD.
Trace amine-associated receptor 1 (TAAR1) agonists offer a new approach, but there is uncertainty regarding their effects, exact mechanism of action and potential role in treating psychosis.
Aims
To evaluate the available evidence on TAAR1 agonists in psychosis, using triangulation of the output of living systematic reviews (LSRs) of animal and human studies, and provide recommendations for future research prioritisation.
Method
This study is part of GALENOS (Global Alliance for Living Evidence on aNxiety, depressiOn and pSychosis). In the triangulation process, a multidisciplinary group of experts, including those with lived experience, met and appraised the first co-produced living systematic reviews from GALENOS, on TAAR1 agonists.
Results
The animal data suggested a potential antipsychotic effect, as TAAR1 agonists reduced locomotor activity induced by pro-psychotic drug treatment. Human studies showed few differences for ulotaront and ralmitaront compared with placebo in improving overall symptoms in adults with acute schizophrenia (four studies, n = 1291 participants, standardised mean difference (SMD) 0.15, 95% CI −0.05 to 0.34). Large placebo responses were seen in phase 3 trials of ulotaront. Ralmitaront was less efficacious than risperidone (one study, n = 156 participants, SMD = −0.53, 95% CI −0.86 to −0.20). The side-effect profile of TAAR1 agonists was favourable compared with existing antipsychotics. Priorities for future studies included (a) using different animal models of psychosis with greater translational validity; (b) animal and human studies with wider outcomes, including cognitive and affective symptoms; and (c) mechanistic studies and investigations of other potential applications, such as adjunctive treatments and long-term outcomes. Recommendations for future iterations of the LSRs included (a) meta-analysis of individual human participant data; (b) including studies that used different methodologies; and (c) assessing other disorders and symptoms.
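The pooled SMDs reported above come from a standard inverse-variance meta-analysis, which can be sketched as follows. The per-study numbers here are invented for illustration (not the GALENOS data), and a fixed-effect model is shown for simplicity; living systematic reviews typically use random-effects models.

```python
import numpy as np

# Hypothetical per-study standardised mean differences and their
# standard errors. Illustrative values only.
smd = np.array([0.10, 0.25, 0.05, 0.20])
se = np.array([0.12, 0.15, 0.10, 0.18])

# Fixed-effect inverse-variance pooling: each study is weighted by the
# inverse of its sampling variance (1 / SE^2).
w = 1.0 / se**2
pooled = np.sum(w * smd) / np.sum(w)
pooled_se = np.sqrt(1.0 / np.sum(w))
ci = (pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se)
```

A 95% CI that crosses zero, as for the placebo comparison above, indicates the pooled effect is not statistically distinguishable from no effect.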
Conclusions
This co-produced, international triangulation examined the available evidence and developed recommendations for future research and clinical applications for TAAR1 agonists in psychosis. Broader challenges included difficulties in assessing the risk of bias, reproducibility, translation and interpretability of animal models to clinical outcomes, and a lack of individual and clinical characteristics in the human data. The research will inform a separate, independent prioritisation process, led by lived experience experts, to prioritise directions for future research.
The global population and status of Snowy Owls Bubo scandiacus are particularly challenging to assess because individuals are irruptive and nomadic, and the breeding range is restricted to the remote circumpolar Arctic tundra. The International Union for Conservation of Nature (IUCN) uplisted the Snowy Owl to “Vulnerable” in 2017 because the suggested population estimates appeared considerably lower than historical estimates, and it recommended actions to clarify the population size, structure, and trends. Here we present a broad review and status assessment, an effort led by the International Snowy Owl Working Group (ISOWG) and researchers from around the world, to estimate population trends and the current global status of the Snowy Owl. We use long-term breeding data, genetic studies, satellite-GPS tracking, and survival estimates to assess current population trends at several monitoring sites in the Arctic and we review the ecology and threats throughout the Snowy Owl range. An assessment of the available data suggests that current estimates of a worldwide population of 14,000–28,000 breeding adults are plausible. Our assessment of population trends at five long-term monitoring sites suggests that breeding populations of Snowy Owls in the Arctic have decreased by more than 30% over the past three generations and the species should continue to be categorised as Vulnerable under the IUCN Red List Criterion A2. We offer research recommendations to improve our understanding of Snowy Owl biology and future population assessments in a changing world.
Rift propagation, rather than basal melt, drives the destabilization and disintegration of the Thwaites Eastern Ice Shelf. Since 2016, rifts have episodically advanced throughout the central ice-shelf area, with rapid propagation events occurring during austral spring. The ice shelf's speed has increased by ~70% during this period, transitioning from a rate of 1.65 m d−1 in 2019 to 2.85 m d−1 by early 2023 in the central area. The increase in longitudinal strain rates near the grounding zone has led to full-thickness rifts and melange-filled gaps since 2020. A recent sea-ice break out has accelerated retreat at the western calving front, effectively separating the ice shelf from what remained of its northwestern pinning point. Meanwhile, a distributed set of phase-sensitive radar measurements indicates that the basal melting rate is generally small, likely due to a widespread robust ocean stratification beneath the ice–ocean interface that suppresses basal melt despite the presence of substantial oceanic heat at depth. These observations in combination with damage modeling show that, while ocean forcing is responsible for triggering the current West Antarctic ice retreat, the Thwaites Eastern Ice Shelf is experiencing dynamic feedbacks over decadal timescales that are driving ice-shelf disintegration, now independent of basal melt.
The 7th edition of the Canadian Stroke Best Practice Recommendations (CSBPR) is a comprehensive summary of current evidence-based recommendations, appropriate for use by healthcare providers and system planners, and intended to drive healthcare excellence, improved outcomes and more integrated health systems. This edition includes a new module on the management of cerebral venous thrombosis (CVT). Cerebral venous thrombosis is defined as thrombosis of the veins of the brain, including the dural venous sinuses and/or cortical or deep veins. Cerebral venous thrombosis is a rare but potentially life-threatening type of stroke, representing 0.5–1.0% of all stroke admissions. The reported rates of CVT are approximately 10–20 per million and appear to be increasing over time. The risk of CVT is higher in women and often associated with oral contraceptive use and with pregnancy and the puerperium. This guideline addresses care for adult individuals who present to the healthcare system with current or recent symptoms of CVT. The recommendations cover the continuum of care from diagnosis and initial clinical assessment of symptomatic CVT, to acute treatment of symptomatic CVT, post-acute management, person-centered care, special considerations in the long-term management of CVT, including pregnancy and considerations related to CVT in special circumstances such as trauma and vaccination. This module also includes supporting materials such as implementation resources to facilitate the adoption of evidence into practice and performance measures to enable monitoring of uptake and effectiveness of recommendations.
Background: Canadian Emergency Departments (EDs) are overburdened. Understanding the drivers for postoperative patients to attend the ED allows for targeted interventions, thereby reducing demand. We sought to identify “bounce back” patterns for subsequent QI initiatives. Methods: From April 1, 2016 to March 31, 2022, provincial ED datasets (EDIS, STAR, Meditech) were used to identify patients presenting within 90 days post-spine surgery. Among procedures classified by Canadian Classification of Health Interventions codes, laminectomies (1SC80) and discectomies (1SE87) demonstrated the highest ED visit rates. Comprehensive chart reviews were conducted identifying surgical and medical reasons for presentation within this timeframe. Results: Reviewing a cohort of 2165 post-decompression patients, 42.1% presented to the ED (n=912), with 62.8% of these visits directly related to surgery. Primary reasons included wound care (31.6%), pain management (31.6%), and bladder issues (retention or UTI, 11.0%). Simple wound evaluation constituted 49.7% of wound-related visits, with surgical site infection (37.6%) and dehiscence (6.6%) accounting for the remainder. Pain-related presentations resulted in 72.3% discharge with additional medications, and 27.7% necessitating hospital admission. New or worsening neurologic deficits were reported in 8.9% of ED visits. Conclusions: These findings illuminate crucial aspects of postoperative care and ED utilization patterns. Prioritizing patient education, pain management, and wound care could help alleviate the national ED crisis.
The brain can be represented as a network, with nodes as brain regions and edges as region-to-region connections. Nodes with the most connections (hubs) are central to efficient brain function. Current findings on structural differences in Major Depressive Disorder (MDD) identified using network approaches remain inconsistent, potentially due to small sample sizes. It is still uncertain at what level of the connectome hierarchy differences may exist, and whether they are concentrated in hubs, disrupting fundamental brain connectivity.
Methods
We utilized two large cohorts, UK Biobank (UKB, N = 5104) and Generation Scotland (GS, N = 725), to investigate MDD case–control differences in brain network properties. Network analysis was done across four hierarchical levels: (1) global, (2) tier (nodes grouped into four tiers based on degree) and rich club (between-hub connections), (3) nodal, and (4) connection.
Results
In UKB, reductions in network efficiency were observed in MDD cases globally (d = −0.076, pFDR = 0.033), across all tiers (d = −0.069 to −0.079, pFDR = 0.020), and in hubs (d = −0.080 to −0.113, pFDR = 0.013–0.035). No differences in rich club organization and region-to-region connections were identified. The effect sizes and direction for these associations were generally consistent in GS, albeit not significant in our lower-N replication sample.
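The pFDR values above reflect false discovery rate correction across the many network-level tests. A minimal sketch of the standard Benjamini-Hochberg adjustment is shown below; the p-values are invented for illustration, not the UKB results.

```python
import numpy as np

def bh_adjust(pvals):
    """Benjamini-Hochberg adjusted p-values (FDR q-values)."""
    p = np.asarray(pvals, dtype=float)
    n = p.size
    order = np.argsort(p)
    # Scale each sorted p-value by n / rank...
    ranked = p[order] * n / np.arange(1, n + 1)
    # ...then enforce monotonicity from the largest rank downward.
    adj = np.minimum.accumulate(ranked[::-1])[::-1]
    out = np.empty(n)
    out[order] = np.clip(adj, 0, 1)
    return out

# Hypothetical raw p-values from several network comparisons.
adj = bh_adjust([0.001, 0.008, 0.039, 0.041, 0.20])
```

Reporting the adjusted values (as the abstract does) controls the expected proportion of false positives among the reported discoveries rather than the per-test error rate.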
Conclusion
Our results suggest that the brain's fundamental rich club structure is similar in MDD cases and controls, but subtle topological differences exist across the brain. Consistent with recent large-scale neuroimaging findings, our findings offer a connectomic perspective on a similar scale and support the idea that minimal differences exist between MDD cases and controls.
Lode, a dioctahedral illite-rich clay from Latvia belonging to the mica group of clay minerals, undergoes thermal transformation via a series of structurally disordered intermediate phases. Despite containing high levels of paramagnetic Fe substituted into the octahedral sites, 29Si and 27Al magic angle spinning nuclear magnetic resonance (MAS NMR) spectra of sufficient quality are obtained to resolve different structural units, showing clearly defined structural changes which occur in the sample during calcination to 1200 °C. However, Fe plays a significant role in broadening the Al signal, with integrated peak intensities decreasing as temperature increases. Significant differences are revealed in the thermal decomposition process by NMR spectra between pyrophyllite, Ca-montmorillonite and illite clays, possibly due to the different cations present in the interlayer. It has also been shown for illite that no structural differences at the atomic level occur when the dwell time at a particular temperature is varied and no difference is observed between samples that have different thermal histories; however, a minor effect of particle size and surface area is visible in the NMR data.
Cognitive training has shown promise for improving cognition in older adults. Aging involves a variety of neuroanatomical changes that may affect response to cognitive training. White matter hyperintensities (WMH) are one common age-related brain change, visible on T2-weighted and Fluid Attenuated Inversion Recovery (FLAIR) MRI. WMH are associated with older age, are suggestive of cerebral small vessel disease, and reflect decreased white matter integrity. Higher WMH load is associated with a reduced threshold for clinical expression of cognitive impairment and dementia. The effects of WMH on response to cognitive training interventions are relatively unknown. The current study assessed (a) proximal cognitive training performance following a 3-month randomized control trial and (b) the contribution of baseline whole-brain WMH load, defined as total lesion volume (TLV), on pre-post proximal training change.
Participants and Methods:
Sixty-two healthy older adults ages 65-84 completed either adaptive cognitive training (CT; n=31) or educational training control (ET; n=31) interventions. Participants assigned to CT completed 20 hours of attention/processing speed training and 20 hours of working memory training delivered through commercially-available Posit Science BrainHQ. ET participants completed 40 hours of educational videos. All participants also underwent sham or active transcranial direct current stimulation (tDCS) as an adjunctive intervention, although not a variable of interest in the current study. Multimodal MRI scans were acquired during the baseline visit. T1- and T2-weighted FLAIR images were processed using the Lesion Segmentation Tool (LST) for SPM12. The Lesion Prediction Algorithm of LST automatically segmented brain tissue and calculated lesion maps. A lesion threshold of 0.30 was applied to calculate TLV. A log transformation was applied to TLV to normalize the distribution of WMH. Repeated-measures analysis of covariance (RM-ANCOVA) assessed pre/post change in proximal composite (Total Training Composite) and sub-composite (Processing Speed Training Composite, Working Memory Training Composite) measures in the CT group compared to their ET counterparts, controlling for age, sex, years of education and tDCS group. Linear regression assessed the effect of TLV on post-intervention proximal composite and sub-composite, controlling for baseline performance, intervention assignment, age, sex, years of education, multisite scanner differences, estimated total intracranial volume, and binarized cardiovascular disease risk.
Results:
RM-ANCOVA revealed two-way group*time interactions such that those assigned cognitive training demonstrated greater improvement on proximal composite (Total Training Composite) and sub-composite (Processing Speed Training Composite, Working Memory Training Composite) measures compared to their ET counterparts. Multiple linear regression showed higher baseline TLV associated with lower pre-post change on the Processing Speed Training sub-composite (β = -0.19, p = 0.04) but not other composite measures.
Conclusions:
These findings demonstrate the utility of cognitive training for improving post-intervention proximal performance in older adults. Additionally, pre-post change in proximal processing speed training appears to be more sensitive to white matter hyperintensity load than working memory training change. These data suggest that TLV may serve as an important factor for consideration when planning processing speed-based cognitive training interventions for remediation of cognitive decline in older adults.
Aging is associated with disruptions in functional connectivity within the default mode (DMN), frontoparietal control (FPCN), and cingulo-opercular (CON) resting-state networks. Greater within-network connectivity predicts better cognitive performance in older adults. Therefore, strengthening network connectivity, through targeted intervention strategies, may help prevent age-related cognitive decline or progression to dementia. Small studies have demonstrated synergistic effects of combining transcranial direct current stimulation (tDCS) and cognitive training (CT) on strengthening network connectivity; however, this association has yet to be rigorously tested on a large scale. The current study leverages longitudinal data from the first-ever Phase III clinical trial for tDCS to examine the efficacy of an adjunctive tDCS and CT intervention on modulating network connectivity in older adults.
Participants and Methods:
This sample included 209 older adults (mean age = 71.6) from the Augmenting Cognitive Training in Older Adults multisite trial. Participants completed 40 hours of CT over 12 weeks, which included 8 attention, processing speed, and working memory tasks. Participants were randomized into active or sham stimulation groups, and tDCS was administered during CT daily for two weeks, then weekly for 10 weeks. For both stimulation groups, two electrodes in saline-soaked 5 × 7 cm sponges were placed at F3 (cathode) and F4 (anode) using the 10-20 measurement system. The active group received 2 mA of current for 20 minutes. The sham group received 2 mA for 30 seconds, then no current for the remaining 20 minutes.
Participants underwent resting-state fMRI at baseline and post-intervention. CONN toolbox was used to preprocess imaging data and conduct region of interest (ROI-ROI) connectivity analyses. The Artifact Detection Toolbox, using intermediate settings, identified outlier volumes. Two participants were excluded for having greater than 50% of volumes flagged as outliers. ROI-ROI analyses modeled the interaction between tDCS group (active versus sham) and occasion (baseline connectivity versus postintervention connectivity) for the DMN, FPCN, and CON controlling for age, sex, education, site, and adherence.
Results:
Compared to sham, the active group demonstrated ROI-ROI increases in functional connectivity within the DMN following intervention (left temporal to right temporal [T(202) = 2.78, pFDR < 0.05] and left temporal to right dorsal medial prefrontal cortex [T(202) = 2.74, pFDR < 0.05]). In contrast, compared to sham, the active group demonstrated ROI-ROI decreases in functional connectivity within the FPCN following intervention (left dorsal prefrontal cortex to left temporal [T(202) = -2.96, pFDR < 0.05] and left dorsal prefrontal cortex to left lateral prefrontal cortex [T(202) = -2.77, pFDR < 0.05]). There were no significant interactions detected for CON regions.
Conclusions:
These findings (a) demonstrate the feasibility of modulating network connectivity using tDCS and CT and (b) provide important information regarding the pattern of connectivity changes occurring at these intervention parameters in older adults. Importantly, the active stimulation group showed increases in connectivity within the DMN (a network particularly vulnerable to aging and implicated in Alzheimer’s disease) but decreases in connectivity between left frontal and temporal FPCN regions. Future analyses from this trial will evaluate the association between these changes in connectivity and cognitive performance post-intervention and at a one-year timepoint.
Nonpathological aging has been linked to decline in both verbal and visuospatial memory abilities in older adults. Disruptions in resting-state functional connectivity within well-characterized, higher-order cognitive brain networks have also been coupled with poorer memory functioning in healthy older adults and in older adults with dementia. However, there is a paucity of research on the association between higher-order functional connectivity and verbal and visuospatial memory performance in the older adult population. The current study examines the association between resting-state functional connectivity within the cingulo-opercular network (CON), frontoparietal control network (FPCN), and default mode network (DMN) and verbal and visuospatial learning and memory in a large sample of healthy older adults. We hypothesized that greater within-network CON and FPCN functional connectivity would be associated with better immediate verbal and visuospatial memory recall. Additionally, we predicted that within-network DMN functional connectivity would be associated with improvements in delayed verbal and visuospatial memory recall. This study provides insight into whether within-network CON, FPCN, or DMN functional connectivity is associated with verbal and visuospatial memory abilities in later life.
Participants and Methods:
330 healthy older adults between 65 and 89 years old (mean age = 71.6 ± 5.2) were recruited at the University of Florida (n = 222) and the University of Arizona (n = 108). Participants underwent resting-state fMRI and completed verbal memory (Hopkins Verbal Learning Test - Revised [HVLT-R]) and visuospatial memory (Brief Visuospatial Memory Test - Revised [BVMT-R]) measures. Immediate (total) and delayed recall scores on the HVLT-R and BVMT-R were calculated using each test manual’s scoring criteria. Learning ratios on the HVLT-R and BVMT-R were quantified by dividing the number of stimuli (verbal or visuospatial) learned between the first and third trials by the number of stimuli not recalled after the first learning trial. CONN Toolbox was used to extract average within-network connectivity values for CON, FPCN, and DMN. Hierarchical regressions were conducted, controlling for sex, race, ethnicity, years of education, number of invalid scans, and scanner site.
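The learning ratio described above has a simple closed form: items gained between trials 1 and 3, divided by the items still available to learn after trial 1. A minimal sketch (the function name and example scores are illustrative, not from the test manuals):

```python
def learning_ratio(trial1_recalled, trial3_recalled, total_items):
    """Items gained between trials 1 and 3, divided by the items
    not recalled on trial 1 (i.e., the remaining room to learn)."""
    return (trial3_recalled - trial1_recalled) / (total_items - trial1_recalled)

# Example: 5 of 12 words recalled on trial 1, 10 of 12 by trial 3:
# 5 items gained out of 7 available to learn.
ratio = learning_ratio(5, 10, 12)
```

Unlike a raw learning slope, this ratio normalizes gains by initial performance, so participants who start high are not penalized for having little room to improve.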
Results:
Greater CON connectivity was significantly associated with better HVLT-R immediate (total) recall (β = 0.16, p = 0.01), HVLT-R learning ratio (β = 0.16, p = 0.01), BVMT-R immediate (total) recall (β = 0.14, p = 0.02), and BVMT-R delayed recall performance (β = 0.15, p = 0.01). Greater FPCN connectivity was associated with better BVMT-R learning ratio (β = 0.13, p = 0.04). HVLT-R delayed recall performance was not associated with connectivity in any network, and DMN connectivity was not significantly related to any measure.
Conclusions:
Connectivity within CON demonstrated a robust relationship with multiple components of memory function across both the verbal and visuospatial domains. In contrast, FPCN connectivity evidenced a relationship only with visuospatial learning, and DMN connectivity was not significantly associated with any memory measure. These data suggest that CON may be not only a valuable target in longitudinal studies of age-related memory changes but also a possible target for future non-invasive interventions to attenuate memory decline in older adults.
The Eighth World Congress of Pediatric Cardiology and Cardiac Surgery (WCPCCS) will be held in Washington DC, USA, from Saturday, 26 August, 2023 to Friday, 1 September, 2023, inclusive. It will be the largest and most comprehensive scientific meeting dedicated to paediatric and congenital cardiac care ever held. At the time of the writing of this manuscript, the Congress has 5,037 registered attendees (and rising) from 117 countries, a truly diverse and international faculty of over 925 individuals from 89 countries, over 2,000 individual abstract and poster presenters from 101 countries, and a Best Abstract Competition featuring 153 oral abstracts from 34 countries. For information about the Eighth World Congress of Pediatric Cardiology and Cardiac Surgery, please visit the following website: www.WCPCCS2023.org. The purpose of this manuscript is to review the activities related to global health and advocacy that will occur at the Congress.
Acknowledging the need for urgent change, we took the opportunity to bring a common voice to the global community and issue the Washington DC WCPCCS Call to Action on Addressing the Global Burden of Pediatric and Congenital Heart Diseases. A copy of the Call to Action is provided in the Appendix of this manuscript. This initiative aims to increase awareness of the global burden, promote the development of sustainable care systems, and improve access to high-quality and equitable healthcare for children with heart disease, as well as for adults with congenital heart disease, worldwide.
The U.S. Department of Agriculture–Agricultural Research Service (USDA-ARS) has been a leader in weed science research covering topics ranging from the development and use of integrated weed management (IWM) tactics to basic mechanistic studies, including biotic resistance of desirable plant communities and herbicide resistance. ARS weed scientists have worked in agricultural and natural ecosystems, including agronomic and horticultural crops, pastures, forests, wild lands, aquatic habitats, wetlands, and riparian areas. Through strong partnerships with academia, state agencies, private industry, and numerous federal programs, ARS weed scientists have contributed to discoveries in the newest fields of robotics and genetics, as well as to the traditional and fundamental subjects of weed–crop competition and physiology and the integration of weed control tactics and practices. Weed science at ARS is often overshadowed by other research topics; thus, few are aware of the long history of ARS weed science and its important contributions. This review is the result of a symposium held at the Weed Science Society of America’s 62nd Annual Meeting in 2022, which included 10 separate presentations in a virtual Weed Science Webinar Series. The overarching themes of management tactics (IWM, biological control, and automation), basic mechanisms (competition, invasive plant genetics, and herbicide resistance), and ecosystem impacts (invasive plant spread, climate change, conservation, and restoration) represent core ARS weed science research that is dynamic and efficacious and has been a significant component of the agency’s national and international efforts. This review highlights current studies and future directions that exemplify the science and collaborative relationships both within and outside ARS.
Given the constraints of weeds and invasive plants on all aspects of food, feed, and fiber systems, there is an acknowledged need to face new challenges, including agriculture and natural resources sustainability, economic resilience and reliability, and societal health and well-being.