Clinical high risk for psychosis (CHR) is often managed with antipsychotic medications, but their effects on neurocognitive performance and clinical outcomes remain insufficiently explored. This study investigates the association between aripiprazole and olanzapine use and cognitive and clinical outcomes in CHR individuals, compared to those receiving no antipsychotic treatment.
Methods
A retrospective analysis was conducted on 127 participants from the Shanghai At Risk for Psychosis (SHARP) cohort, categorized into three groups: aripiprazole, olanzapine, and no antipsychotic treatment. Neurocognitive performance was evaluated using the MATRICS Consensus Cognitive Battery (MCCB), while clinical symptoms were assessed through the Structured Interview for Prodromal Syndromes (SIPS) at baseline, 8 weeks, and one year.
Results
The non-medicated group demonstrated greater improvements in cognitive performance, clinical symptoms, and functional outcomes compared to the medicated groups. Among the antipsychotic groups, aripiprazole was associated with better visual learning outcomes than olanzapine. Improvements in neurocognition correlated significantly with clinical symptom relief and overall functional gains at follow-up assessments.
Conclusions
These findings suggest potential associations between antipsychotic use and cognitive outcomes in CHR populations while recognizing that observed differences may reflect baseline illness severity rather than medication effects alone. Aripiprazole may offer specific advantages over olanzapine, underscoring the importance of individualized risk-benefit evaluations in treatment planning. Randomized controlled trials are needed to establish causality.
Dietary intervention is a key strategy for preventing and managing chronic kidney disease (CKD). However, evidence on the effects of specific foods on CKD is limited. This study aims to clarify the impact of various foods on CKD risk. We used two-sample Mendelian randomisation to analyse the causal relationships between the intake of eighteen foods (e.g., cheese, processed meat, poultry, beef and non-oily fish) and CKD risk, as well as creatinine-based and cystatin C-based estimated glomerular filtration rate (eGFRcr and eGFRcys) levels. The inverse-variance weighted (IVW) method, weighted median method, MR-Egger regression, simple mode and weighted mode were employed. The sensitivity analysis included Cochran’s Q test and the Egger intercept test. The IVW results, as the main method, indicated that frequent alcohol intake was linked to higher CKD risk (P = 0·007, 0·048). Protective factors included cheese (OR = 0·71; 95 % CI: 0·53, 0·94; P = 0·017), tea (OR = 0·66; 95 % CI: 0·43, 1·00; P = 0·048) and dried fruit (OR = 0·78; 95 % CI: 0·63, 0·98; P = 0·033). Oily fish (β = 0·051; 95 % CI: 0·001, 0·102; P = 0·046) and dried fruit (β = 0·082; 95 % CI: 0·016, 0·149; P = 0·014) were associated with elevated eGFRcys. Salad/raw vegetables (β = 0·024; 95 % CI: 0·003, 0·045; P = 0·028) and dried fruit (β = 0·013; 95 % CI: 0·001, 0·031; P = 0·014) were linked to higher eGFRcr, while cereal intake (β = −0·021; 95 % CI: −0·033, −0·010; P < 0·001) was associated with lower eGFRcr. These findings provide insights for optimising dietary strategies for CKD patients.
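The core of the IVW estimator named above is a weighted average of per-variant Wald ratios. A minimal sketch, assuming per-SNP summary statistics and first-order inverse-variance weights; the function name and inputs are illustrative, not taken from the study:

```python
import numpy as np

def ivw_estimate(beta_exp, beta_out, se_out):
    """Fixed-effect inverse-variance weighted (IVW) MR estimate.

    beta_exp : per-SNP effect on the exposure (e.g. food intake)
    beta_out : per-SNP effect on the outcome (e.g. CKD risk, log-odds)
    se_out   : standard error of the outcome association
    """
    beta_exp = np.asarray(beta_exp, float)
    beta_out = np.asarray(beta_out, float)
    se_out = np.asarray(se_out, float)

    ratio = beta_out / beta_exp          # per-SNP Wald ratio estimate
    weights = (beta_exp / se_out) ** 2   # first-order inverse-variance weights
    beta_ivw = np.sum(weights * ratio) / np.sum(weights)
    se_ivw = np.sqrt(1.0 / np.sum(weights))
    return beta_ivw, se_ivw
```

For a binary outcome on the log-odds scale, exponentiating the estimate gives an odds ratio (e.g. exp(−0·34) ≈ 0·71).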
Post-traumatic stress disorder (PTSD) is a mental health condition caused by the dysregulation or overgeneralization of memories related to traumatic events. Investigating the interplay between explicit narrative and implicit emotional memory contributes to a better understanding of the mechanisms underlying PTSD.
Methods
This case–control study focused on two groups: unmedicated patients with PTSD and a trauma-exposed control (TEC) group who did not develop PTSD. Experiments included real-time measurements of blood oxygenation changes using functional near-infrared spectroscopy (fNIRS) during trauma narration and processing of emotional and linguistic data through natural language processing (NLP).
Results
Real-time fNIRS monitoring showed that PTSD patients (mean [SD] Oxy-Hb activation, 0.153 [0.084], 95% CI 0.124 to 0.182) had significantly higher brain activity in the left anterior medial prefrontal cortex (L-amPFC) within 10 s after expressing negative emotional words compared with the control group (0.047 [0.026], 95% CI 0.038 to 0.056; p < 0.001). In the control group, there was a significant time-series correlation between the use of negative emotional memory words and activation of the L-amPFC (latency 3.82 s, slope = 0.0067, peak value = 0.184, difference = 0.273; Spearman’s r = 0.727, p < 0.001). In contrast, the left anterior cingulate prefrontal cortex of PTSD patients remained in a state of high activation (peak value = 0.153, difference = 0.084) with no apparent latency period.
Conclusions
PTSD patients display overactivity in pathways associated with rapid emotional responses and diminished regulation in cognitive processing areas. Interventions targeting these pathways may alleviate symptoms of PTSD.
The emotion regulation network (ERN) in the brain provides a framework for understanding the neuropathology of affective disorders. Although previous neuroimaging studies have investigated the neurobiological correlates of the ERN in major depressive disorder (MDD), whether patients with MDD exhibit abnormal functional connectivity (FC) patterns in the ERN and whether the abnormal FC in the ERN can serve as a therapeutic response signature remain unclear.
Methods
A large functional magnetic resonance imaging dataset comprising 709 patients with MDD and 725 healthy controls (HCs) recruited across five sites was analyzed. Using a seed-based FC approach, we first investigated the group differences in whole-brain resting-state FC of the 14 ERN seeds between participants with and without MDD. Furthermore, an independent sample (45 MDD patients) was used to evaluate the relationship between the aforementioned abnormal FC in the ERN and symptom improvement after 8 weeks of antidepressant monotherapy.
Results
Compared to the HCs, patients with MDD exhibited aberrant FC between 7 ERN seeds and several cortical and subcortical areas, including the bilateral middle temporal gyrus, bilateral occipital gyrus, right thalamus, calcarine cortex, middle frontal gyrus, and the bilateral superior temporal gyrus. In an independent sample, these aberrant FCs in the ERN were negatively correlated with the reduction rate of the HAMD17 score among MDD patients.
Conclusions
These results might extend our understanding of the neurobiological underpinnings underlying unadaptable or inflexible emotional processing in MDD patients and help to elucidate the mechanisms of therapeutic response.
Glucagon-like peptide-1 receptor agonists (GLP-1RAs) are widely used due to their profound efficacy in glycemic control and weight management. Real-world observations have revealed potential neuropsychiatric adverse events (AEs) associated with GLP-1RAs. This study aimed to comprehensively investigate and characterize these neuropsychiatric AEs with GLP-1RAs.
Methods
We analyzed GLP-1RA adverse reaction reports using the FDA Adverse Event Reporting System database. Disproportionality analysis using reporting odds ratio (ROR) identified eight categories of neuropsychiatric AEs associated with GLP-1RAs. We conducted descriptive and time-to-onset (TTO) analyses and explored neuropsychiatric AE signals among individual GLP-1RAs for weight loss and diabetes mellitus (DM) indications.
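The reporting odds ratio used in the disproportionality analysis is computed from a 2×2 table of spontaneous reports. A minimal sketch with the standard table layout and a Wald-type confidence interval; the counts in the usage example are hypothetical, not FAERS data:

```python
import math

def reporting_odds_ratio(a, b, c, d):
    """ROR from a 2x2 contingency table of spontaneous reports.

    a: target drug & target event    b: target drug & all other events
    c: other drugs & target event    d: other drugs & all other events
    """
    ror = (a / b) / (c / d)
    se_log = math.sqrt(1/a + 1/b + 1/c + 1/d)  # SE of log(ROR)
    lo = math.exp(math.log(ror) - 1.96 * se_log)
    hi = math.exp(math.log(ror) + 1.96 * se_log)
    return ror, (lo, hi)

# Hypothetical counts for illustration only
ror, (lo, hi) = reporting_odds_ratio(100, 900, 50, 950)
```

A signal is conventionally flagged when the lower bound of the 95% CI exceeds 1.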
Results
We identified 25,110 cases of GLP-1RA-related neuropsychiatric AEs. GLP-1RAs showed an association with headache (ROR 1.74, 95% confidence interval [CI] 1.65–1.84), migraine (ROR 1.28, 95% CI 1.06–1.55), and olfactory and sensory nerve abnormalities (ROR 2.44, 95% CI 1.83–3.25; ROR 1.69, 95% CI 1.54–1.85). Semaglutide showed a moderate suicide-related AE signal in the weight-loss population (ROR 2.55, 95% CI 1.97–3.31). The median TTO was 16 days (interquartile range: 3–66 days).
Conclusions
In this study, we identified eight potential neuropsychiatric adverse events (AEs) associated with GLP-1RAs and, for the first time, detected positive signals for migraine, olfactory abnormalities, and sensory abnormalities. We also observed a positive suicide-related signal for semaglutide in the weight-loss population. This study provides a reliable basis for further investigation of GLP-1RA-related neuropsychiatric AEs. However, as an exploratory study, our findings require confirmation through large-scale prospective studies.
Covariance-based structural equation modeling (CB-SEM) has become one of the most prominent statistical analysis techniques for understanding latent phenomena such as students’ and teachers’ perceptions, attitudes, or intentions and their influence on learning or teaching outcomes. This chapter introduces an alternative SEM technique, variance-based partial least squares SEM (PLS-SEM), which has multiple advantages over CB-SEM in several situations commonly encountered in social sciences research. A case study in the English Medium Instruction (EMI) context is demonstrated to facilitate comprehension of the method. The chapter concludes with a discussion of potential applications to other EMI-related contexts and lines of inquiry.
To meet the high-precision positioning requirements of hybrid machining units, this article presents a geometric error modeling and source error identification methodology for a serial–parallel hybrid kinematic machining unit (HKMU) with five axes. A minimal kinematic error model of the serial–parallel HKMU is established with a screw-based method after elimination of redundant errors. A set of composite error indices is formulated to describe the terminal accuracy distribution characteristics in a quantitative manner. A modified projection method is proposed to determine the actual compensable and noncompensable source errors of the HKMU by identifying such transformable source errors. On this basis, error compensation and comparison analyses are carried out on the exemplary HKMU to numerically verify the effectiveness of the proposed modified projection method. The geometric error evaluations reveal that the parallel module has a larger impact on the terminal accuracy of the platform of the HKMU than the serial module. The error compensation results show that the modified projection method can find additional compensable source errors and significantly reduce the average and maximum values of the geometric errors of the HKMU. Hence, the proposed methodology can be applied to improve the accuracy of kinematic calibration of compensable source errors and can reduce the difficulty and workload of tolerance design for noncompensable source errors of such serial–parallel hybrid mechanisms.
In contemporary neuroimaging studies, it has been observed that patients with major depressive disorder (MDD) exhibit aberrant spontaneous neural activity, commonly quantified through the amplitude of low-frequency fluctuations (ALFF). However, the substantial individual heterogeneity among patients poses a challenge to reaching a unified conclusion.
Methods
To address this variability, our study adopts a novel framework to parse individualized ALFF abnormalities. We hypothesize that individualized ALFF abnormalities can be portrayed as a unique linear combination of shared differential factors. Our study involved two large multi-center datasets, comprising 2424 patients with MDD and 2183 healthy controls. In patients, individualized ALFF abnormalities were derived through normative modeling and further deconstructed into differential factors using non-negative matrix factorization.
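The factorization step described above can be sketched with scikit-learn. Since NMF requires non-negative input while deviation maps contain both signs, positive and negative deviations are factorized separately here; this split is an assumption chosen to mirror the two positive and two negative factors the study reports, and all shapes, parameters, and the toy data are illustrative:

```python
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(0)
Z = rng.normal(size=(100, 200))  # toy deviation maps: patients x brain regions

# Split deviations by sign so each part is non-negative (assumption).
Z_pos, Z_neg = np.clip(Z, 0, None), np.clip(-Z, 0, None)

pos_model = NMF(n_components=2, init="nndsvda", max_iter=500, random_state=0)
W_pos = pos_model.fit_transform(Z_pos)  # per-patient loadings on positive factors
H_pos = pos_model.components_           # factor maps over regions

neg_model = NMF(n_components=2, init="nndsvda", max_iter=500, random_state=0)
W_neg = neg_model.fit_transform(Z_neg)
H_neg = neg_model.components_
```

Each patient's abnormality map is then approximated as a non-negative combination of the shared factor maps, and the loadings (rows of `W_pos`/`W_neg`) can feed subtype clustering.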
Results
Two positive and two negative factors were identified. These factors were closely linked to clinical characteristics and explained group-level ALFF abnormalities in the two datasets. Moreover, these factors exhibited distinct associations with the distribution of neurotransmitter receptors/transporters, transcriptional profiles of inflammation-related genes, and connectome-informed epicenters, underscoring their neurobiological relevance. Additionally, factor compositions facilitated the identification of four distinct depressive subtypes, each characterized by unique abnormal ALFF patterns and clinical features. Importantly, these findings were successfully replicated in another dataset with different acquisition equipment, protocols, preprocessing strategies, and medication statuses, validating their robustness and generalizability.
Conclusions
This research identifies shared differential factors underlying individual spontaneous neural activity abnormalities in MDD and contributes novel insights into the heterogeneity of spontaneous neural activity abnormalities in MDD.
Predicting epidemic trends of coronavirus disease 2019 (COVID-19) remains a key global public health concern. However, in previous transmission dynamics studies the severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) reinfection rate was mostly treated as a fixed value. We therefore proposed a meta-Susceptible-Exposed-Infectious-Recovered-Susceptible (SEIRS) model that adds a time-varying SARS-CoV-2 reinfection rate to the transmission dynamics model to more accurately characterize changes in the number of infected persons. The time-varying reinfection rate was estimated using random-effect multivariate meta-regression based on published literature reports of SARS-CoV-2 reinfection rates. The meta-SEIRS model was constructed to predict the epidemic trend of COVID-19 from February to December 2023 in Sichuan province. According to an online questionnaire survey, the SARS-CoV-2 infection rate at the end of December 2022 in Sichuan province was 82.45%. The time-varying effective reproduction number in Sichuan province had two peaks from July to December 2022, with a maximum peak value of about 15. The predictions based on the meta-SEIRS model showed that the second wave of COVID-19 in Sichuan province would peak in late May 2023, with up to 2.6 million new infections per day at the peak. The predicted trend was consistent with the trend of SARS-CoV-2 positivity in China. Therefore, a meta-SEIRS model parameterized with evidence-based data can better reflect the actual situation and thus more accurately predict future trends in the number of infections.
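The structural change the abstract describes is that the flow from R back to S is governed by a time-varying reinfection (immunity-waning) rate ω(t). A minimal sketch of such an SEIRS integrator; every parameter value and the shape of ω(t) here are illustrative placeholders, not the fitted values from the Sichuan data:

```python
import numpy as np

def simulate_seirs(days, beta=0.4, sigma=1/3, gamma=1/7,
                   omega=lambda t: 0.002 + 0.001 * np.sin(2 * np.pi * t / 180),
                   N=1.0, I0=1e-4, dt=0.1):
    """Forward-Euler integration of an SEIRS model whose reinfection
    (immunity-waning) rate omega(t) varies over time.

    Compartments: S (susceptible), E (exposed), I (infectious), R (recovered).
    """
    S, E, I, R = N - I0, 0.0, I0, 0.0
    traj = []
    for step in range(int(days / dt)):
        t = step * dt
        w = omega(t)                      # time-varying reinfection rate
        new_inf = beta * S * I / N        # force of infection
        dS = -new_inf + w * R             # waning immunity returns R -> S
        dE = new_inf - sigma * E
        dI = sigma * E - gamma * I
        dR = gamma * I - w * R
        S += dS * dt; E += dE * dt; I += dI * dt; R += dR * dt
        traj.append((t, S, E, I, R))
    return np.array(traj)

traj = simulate_seirs(365)
peak_day = traj[np.argmax(traj[:, 3]), 0]  # day of peak infectious prevalence
```

In the study, ω(t) would be supplied by the meta-regression of published reinfection rates rather than the toy sinusoid used here.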
Purple nutsedge (Cyperus rotundus L.) is one of the world’s most resilient upland weeds, primarily spreading through its tubers. Its emergence in rice (Oryza sativa L.) fields has been increasing, likely due to changing paddy-farming practices. This study aimed to investigate how C. rotundus, an upland weed, can withstand soil flooding and become a problematic weed in rice fields. The first comparative analysis focused on the survival and recovery characteristics of growing and mature tubers of C. rotundus exposed to soil-flooding conditions. Notably, mature tubers exhibited significant survival and recovery abilities in these environments. Based on this observation, further investigation was carried out to explore the morphological structure, nonstructural carbohydrates, and respiratory mechanisms of mature tubers in response to prolonged soil flooding. Over time, the mature tubers did not form aerenchyma but instead gradually accumulated lignified sclerenchymal fibers, with lignin content also increasing. After 90 d, the lignified sclerenchymal fiber and lignin contents were 4.0 and 1.1 times higher than those in the no-soil-flooding treatment. Concurrently, soluble sugar content decreased while starch content increased, providing energy storage, and alcohol dehydrogenase activity rose to support anaerobic respiration via alcohol fermentation. These results indicated that mature tubers survived in soil-flooding conditions by adopting a low-oxygen quiescence strategy, which involves morphological adaptations through the development of lignified sclerenchymal fibers, increased starch reserves for energy storage, and enhanced anaerobic respiration. This mechanism likely underpins the flooding tolerance of mature C. rotundus tubers, allowing them to endure unfavorable conditions and subsequently germinate and grow once flooding subsides. This study provides a preliminary explanation of the mechanism by which mature tubers of C. rotundus from upland areas confer flooding tolerance, shedding light on the reasons behind this weed’s increasing presence in rice fields.
This study aimed to characterize polymorphisms of the potassium voltage-gated channel, KQT-like subfamily, member 1 (KCNQ1) gene in a rural elderly population in a county in Guangxi and to explore the possible relationship between these polymorphisms and blood glucose. Six SNP loci were typed in blood DNA samples from 4355 individuals using the imLDR™ Multiple SNP Typing Kit (Shanghai Tianhao Biotechnology Co.). Genotyping results were combined with epidemiological information (baseline questionnaire and physical examination results) and statistically analyzed using GMDR 0.9 and SPSS 22.0 software. A total of 4355 elderly people aged 60 years and above were surveyed, and the overall rate of abnormal glucose metabolism was 16·11 % (699/4355). The male-to-female ratio was 1:1·48; the 60–69 years age group accounted for the highest proportion (2337 people, 53·66 %). Multivariate analysis showed that usually not doing farm work (OR 1·26; 95 % CI 1·06, 1·50), TAG ≥ 1·70 mmol/l (OR 1·19; 95 % CI 1·11, 1·27), hyperuricaemia (OR 1·034; 95 % CI 1·01, 1·66) and BMI ≥ 24 kg/m2 (OR 1·06; 95 % CI 1·03, 1·09) may be risk factors for abnormal glucose metabolism. Among all participants, carriers of the AA genotype and of the A allele (AA + AC) at the rs151290 locus were 0·70 (95 % CI 0·54, 0·91) and 0·82 (95 % CI 0·70, 0·97) times as likely, respectively, to develop abnormal glucose metabolism as CC genotype carriers. Carriers of the T allele (CT + TT) at the rs2237892 locus were 0·85 (95 % CI 0·72, 0·99) times as likely to have abnormal glucose metabolism as carriers of the CC genotype. At the rs2237897 locus, carriers of the CT genotype, the TT genotype and the T allele (CT + TT) were 0·79 (95 % CI 0·67, 0·94), 0·74 (95 % CI 0·55, 0·99) and 0·78 (95 % CI 0·66, 0·92) times as likely, respectively, to have abnormal glucose metabolism as CC genotype carriers.
The results of multifactor dimensionality reduction showed that the optimal interaction model was a three-factor model consisting of farm work, TAG and rs2237897. The dendrogram of the best model showed that the interaction between TAG and rs2237897 had the strongest effect on fasting blood glucose in the rural elderly, and that the two factors were mutually antagonistic. Environment–gene interaction is an important factor affecting abnormal glucose metabolism in the elderly of a county in Hechi City, Guangxi.
We aimed to provide an overview of trends in suicide mortality and years of life lost (YLLs) among adolescents and young adults aged 10–24 years by sex, age group, Socio-demographic Index (SDI), region and country from 1990 to 2021, as well as age, period and birth cohort effects on suicide mortality.
Methods
Estimates and 95% uncertainty intervals for suicide mortality and YLLs were extracted from the Global Burden of Disease Study 2021. Joinpoint analysis was used to calculate the annual percentage change (APC) and average annual percentage change (AAPC) to describe trends in mortality and the rate of YLLs. An age-period-cohort model was utilized to disentangle age, period and birth cohort effects on suicide mortality trends.
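The APC/AAPC quantities can be sketched directly: within each joinpoint segment the rate is modeled log-linearly in calendar year, and the AAPC is a segment-length-weighted average of the log-scale slopes transformed back to a percentage. A minimal sketch with illustrative inputs (not the GBD estimates):

```python
import numpy as np

def annual_percent_change(years, rates):
    """APC from a log-linear fit: log(rate) = b0 + b1*year,
    so APC = 100 * (exp(b1) - 1)."""
    b1, b0 = np.polyfit(np.asarray(years, float), np.log(rates), 1)
    return 100.0 * (np.exp(b1) - 1.0)

def aapc(segment_lengths, segment_slopes):
    """AAPC: average the per-segment log-scale slopes weighted by
    segment length (in years), then transform back to a percentage."""
    w = np.asarray(segment_lengths, float)
    b = np.asarray(segment_slopes, float)
    return 100.0 * (np.exp(np.sum(w * b) / np.sum(w)) - 1.0)
```

For example, a rate series declining by a constant 1.6% per year yields an APC of −1.6, matching the scale of the AAPC values quoted in the Results.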
Results
Globally, suicide mortality and the rate of YLLs among adolescents and young adults both declined from 1990 to 2021 (AAPC: −1.6 [−2.1 to −1.2]). In 2021, there were 112.9 thousand [103.9–122.2 thousand] suicide deaths globally, leading to 7.9 million [7.2–8.6 million] YLLs. A significant reduction in suicide mortality was observed in all sexes and age groups. By SDI quintiles, the high SDI region (AAPC: −0.3 [−0.6 to 0.0]) had the slowest declining trend, and the low-middle SDI region had the highest suicide mortality through 2021 (7.8 per 100,000 population [6.9–8.6]). Most SDI regions showed generally lower period and cohort effects during the study period, whereas the high SDI region showed more unfavourable risks, especially period and cohort effects in females. Regionally, Central Latin America (AAPC: 1.7 [1.1–2.3]), Tropical Latin America (AAPC: 1.5 [0.9–2.0]), High-income Asia Pacific (AAPC: 1.2 [0.7–1.7]) and Southern sub-Saharan Africa (AAPC: 0.8 [0.4–1.2]) showed significant increases in suicide mortality. In 2021, Southern sub-Saharan Africa had the highest mortality (10.5 per 100,000 population [8.6–12.5]). Nationally, 29 countries showed a significant upward trend in suicide mortality and rate of YLLs over the past three decades, and certain countries in the low-middle and middle SDI regions exhibited an extremely high burden of suicide.
Conclusions
Global suicide mortality and the rate of YLLs among adolescents and young adults both declined from 1990 to 2021, but obvious variability was observed across regions and countries. Earlier mental health education and targeted management are urgently required for adolescents and young adults in certain areas.
Thanks to its real-time computation efficiency, deep reinforcement learning (DRL) has been widely applied in motion planning for mobile robots. In DRL-based methods, a DRL model computes an action for a robot based on the states of its surrounding obstacles, including other robots that may communicate with it. These methods always assume that the environment is attack-free and the obtained obstacles’ states are reliable. However, in the real world, a robot may suffer from obstacle localization attacks (OLAs), such as sensor attacks, communication attacks, and remote-control attacks, which cause the robot to retrieve inaccurate positions of the surrounding obstacles. In this paper, we propose a robust motion planning method ObsGAN-DRL, integrating a generative adversarial network (GAN) into DRL models to mitigate OLAs in the environment. First, ObsGAN-DRL learns a generator based on the GAN model to compute the approximation of obstacles’ accurate positions in benign and attack scenarios. Therefore, no detectors are required for ObsGAN-DRL. Second, by using the approximation positions of the surrounding obstacles, ObsGAN-DRL can leverage the state-of-the-art DRL methods to compute collision-free motion commands (e.g., velocity) efficiently. Comprehensive experiments show that ObsGAN-DRL can mitigate OLAs effectively and guarantee safety. We also demonstrate the generalization of ObsGAN-DRL.
Chinese nurses working under immense stress may be at risk of burnout during COVID-19 regular prevention and control. A few studies have investigated the status of burnout and associated factors among Chinese nurses, but the relationships remained unclear.
Objectives
To investigate status and associated factors of nurses’ burnout during COVID-19 regular prevention and control.
Methods
784 nurses completed questionnaires including demographics, Generalized Anxiety Disorder-7, Patient Health Questionnaire-9, Insomnia Severity Index, Impact of Event Scale-revised, Perceived Social Support Scale, Connor–Davidson Resilience Scale, General Self-efficacy Scale and Maslach Burnout Inventory.
Results
310 (39.5%), 393 (50.1%) and 576 (73.5%) of respondents were at high risk of emotional exhaustion (EE), depersonalization (DP) and reduced personal accomplishment (PA), respectively. The risks of EE, DP and reduced PA were moderate, high and high, respectively. Nurses with intermediate or senior professional rank and title and those who worked >40 h per week had lower scores in EE. Those who worked in low-risk departments reported lower scores in PA. Anxiety, post-traumatic stress disorder (PTSD), self-efficacy and social support were influencing factors for EE and DP, while social support and resilience were associated factors for PA.
Conclusion
Chinese nurses’ burnout during COVID-19 regular prevention and control was serious. Professional rank and title, working unit, weekly working hours, anxiety, PTSD, self-efficacy, social support and resilience were associated factors of burnout.
Mild cognitive deficits (MCD) emerge before the first episode of psychosis (FEP) and persist in the clinical high-risk (CHR) stage. This study aims to refine risk prediction by developing MCD models optimized for specific early psychosis stages and target populations.
Methods
A comprehensive neuropsychological battery assessed 1059 individuals with FEP, 794 CHR, and 774 matched healthy controls (HCs). CHR subjects, followed up for 2 years, were categorized into converters (CHR-C) and non-converters (CHR-NC). The MATRICS Consensus Cognitive Battery standardized neurocognitive tests were employed.
Results
Both the CHR and FEP groups exhibited significantly poorer performance compared to the HC group across all neurocognitive tests (all p < 0.001). The CHR-C group performed worse than the CHR-NC group on three sub-tests: visuospatial memory (p < 0.001), mazes (p = 0.005), and symbol coding (p = 0.023). After adjusting for sex and age, the MCD model was excellent in differentiating FEP from HC, with an area under the receiver operating characteristic curve (AUC) of 0.895 (p < 0.001). However, when applied to the CHR group to predict conversion (AUC = 0.581, p = 0.008), performance was unsatisfactory. To optimize the efficiency of psychotic risk assessment, three distinct MCD models were developed to distinguish FEP from HC, predict CHR-C from CHR-NC, and identify CHR from HC, achieving accuracies of 89.3%, 65.6%, and 80.2%, respectively.
Conclusions
The MCD exhibits variations in domains, patterns, and weights across different stages of early psychosis and diverse target populations. Emphasizing precise risk assessment, our findings highlight the importance of tailored MCD models for different stages and risk levels.
Researchers have long been committed to developing alternative, low-cost nanomaterials that have comparable capacity to carbon nanotubes. Halloysite nanotubes (HNTs) are naturally hollow, multi-walled, tubular structures that have high porosity, enlarged volumes and surface areas, and hydroxyl groups ready for modification. In addition, HNTs are non-toxic, biocompatible, inexpensive, abundant in nature, and easy to obtain. Magnetic nanocomposites have aroused widespread attention for their diverse potential applications in analytical fields and so magnetic halloysite nanotubes (MHNTs) have emerged as outstanding magnetic nano-adsorbent materials. Owing to their superparamagnetism, selective adsorption ability, and easy separation and surface modification, these captivating nanomaterials excel at extracting and enriching various analytes from environmental, biological, and food samples. The current review article gives an insight into recent advances in the design, functionalization, characterization, and application of MHNTs as magnetic, solid-phase extraction sorbents for separation of antibiotics, pesticides, proteins, carcinogens such as polycyclic aromatic hydrocarbons (PAHs), dyes, radioactive ions, and heavy-metal ions in complex matrices.
Constraining the timing and extent of Quaternary glaciations on the Tibetan Plateau (TP) is significant for reconstructing the paleoclimatic environment and understanding the interrelationships among climate, tectonics, and glacial systems. We investigated the late Quaternary glacial history of the Qinggulong and Juequ valleys in the Taniantaweng Mountains, southeastern TP, using cosmogenic 10Be surface exposure dating. Four major glacial events were identified based on 26 10Be ages. The exposure ages of the oldest late Quaternary glaciation correspond to Marine Oxygen Isotope Stage (MIS) 6. The maximum glacial extent was dated to 48.5–41.1 ka (MIS 3), during the last glaciation, and was more advanced than that of the Last Glacial Maximum (LGM). Geochronological and geomorphological evidence indicates that multiple glacial fluctuations occurred in the study area during the Early–Middle Holocene. These glacial fluctuations were likely driven by North Atlantic climate oscillations, summer solar insolation variability, Asian summer monsoon intensity, and CO2 concentration.
Studies on sentence processing in inflectional languages suggest that syntactic structure building functionally precedes semantic processing. Conversely, most EEG studies of Chinese sentence processing do not support the priority of syntax. One possible explanation is that the Chinese language lacks morphological inflections. Another is that the presentation of separate sentence components on individual screens in EEG studies disrupts syntactic framework construction during sentence reading. The present study investigated this explanation using a self-paced reading experiment mimicking the rapid serial visual presentation used in EEG studies and an eye-tracking experiment reflecting natural reading. In both experiments, Chinese ‘ba’ sentences were presented to Chinese young adults in four conditions that differed across the dimensions of syntactic and semantic congruency. Evidence supporting the functional priority of syntax over semantics was limited to the natural reading context, in which syntactic violations blocked the processing of semantics. Additionally, we observed a later stage of integrating plausible semantics with a failed syntax. Together, our findings extend the functional priority of syntax to the Chinese language and highlight the importance of adopting more ecologically valid methods when investigating sentence reading.
Purple nutsedge (Cyperus rotundus L.) is a globally distributed noxious weed that poses a significant challenge for control due to its fast and efficient propagation through the tuber, which is the primary reproductive organ. Gibberellic acid (GA3) has proven to be crucial for tuberization in tuberous plants. Therefore, understanding the relationship between GA3 and tuber development and propagation of C. rotundus will provide valuable information for controlling this weed. This study shows that the GA3 content decreases with tuber development, which corresponds to lower expression of bioactive GA3 synthesis genes (CrGA20ox, two CrGA3ox genes) and two upregulated GA3 catabolism genes (CrGA2ox genes), indicating that GA3 is involved in tuber development. Simultaneously, the expression of two CrDELLA genes and CrGID1 declines with tuber growth and decreased GA3, and yeast two-hybrid assays confirm that the GA3 signaling is DELLA-dependent. Furthermore, exogenous application of GA3 markedly reduces the number and the width of tubers and represses the growth of the tuber chain, further confirming the negative impact that GA3 has on tuber development and propagation. Taken together, these results demonstrate that GA3 is involved in tuber development and regulated by the DELLA-dependent pathway in C. rotundus and plays a negative role in tuber development and propagation.