Internet addiction (IA) refers to excessive internet use that causes cognitive impairment or distress. Understanding the neurophysiological mechanisms underpinning IA is crucial for accurate diagnosis and for informing treatment and prevention strategies. Despite the recent increase in studies examining the neurophysiological traits of IA, their findings often vary. To enhance the accuracy of identifying key neurophysiological characteristics of IA, this study used the phase lag index (PLI) and weighted PLI (WPLI) methods, which minimize volume conduction effects, to analyze resting-state electroencephalography (EEG) functional connectivity. We further evaluated the reliability of the identified features for IA classification using various machine learning methods.
Methods
Ninety-two participants (42 with IA and 50 healthy controls (HCs)) were included. PLI and WPLI values for each participant were computed, and values exhibiting significant differences between the two groups were selected as features for the subsequent classification task.
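For readers unfamiliar with these metrics, the sketch below illustrates how PLI and WPLI can be computed for one pair of band-filtered EEG channels. The function name and the use of SciPy's Hilbert transform are illustrative assumptions; the paper does not specify its implementation.

```python
# Minimal sketch of PLI and WPLI for two band-filtered EEG channels
# (1-D numpy arrays of equal length); not the authors' exact pipeline.
import numpy as np
from scipy.signal import hilbert

def pli_wpli(x, y):
    """Phase lag index and weighted PLI between two signals."""
    # Analytic signals give instantaneous phase and amplitude.
    ax, ay = hilbert(x), hilbert(y)
    # Imaginary part of the cross-spectrum is insensitive to
    # zero-lag (volume-conducted) coupling.
    imag = np.imag(ax * np.conj(ay))
    pli = np.abs(np.mean(np.sign(imag)))
    wpli = np.abs(np.mean(imag)) / np.mean(np.abs(imag))
    return pli, wpli
```

Both measures depend only on the imaginary part of the cross-spectrum, which is why they discount instantaneous coupling attributable to volume conduction.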
Results
A support vector machine (SVM) achieved 83% classification accuracy using PLI features and an improved 86% accuracy using WPLI features. The t-test results showed analogous topographical patterns for WPLI and PLI. Numerous connections in the delta and gamma frequency bands differed significantly between the two groups, with the IA group showing elevated phase synchronization.
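A minimal sketch of the classification step follows, assuming a feature matrix X (participants × selected PLI/WPLI connections) and group labels y. The kernel choice and cross-validation scheme are assumptions, as the abstract does not specify them.

```python
# Hypothetical sketch of the SVM classification step; kernel and
# cross-validation settings are assumptions, not the paper's.
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score

def classify(X, y):
    # Standardize features, then fit an RBF-kernel SVM.
    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
    scores = cross_val_score(clf, X, y, cv=10)  # 10-fold cross-validation
    return scores.mean()
```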
Conclusions
Functional connectivity analysis and machine learning algorithms can jointly distinguish participants with IA from HCs based on EEG data. PLI and WPLI have substantial potential as biomarkers for identifying the neurophysiological traits of IA.
Hand, foot, and mouth disease (HFMD) shows spatiotemporal heterogeneity in China. A spatiotemporal filtering model was constructed and applied to HFMD data to explore the underlying spatiotemporal structure of the disease and to determine the impact of different spatiotemporal weight matrices on the results. HFMD case and covariate data in East China were collected between 2009 and 2015. The spatiotemporal weight matrices formed by combining Rook, K-nearest neighbour (KNN; K = 1), distance, and second-order spatial weight matrices (SO-SWM) with first-order temporal weight matrices, in contemporaneous and lagged forms, were decomposed, and the spatiotemporal filtering model was constructed by selecting eigenvectors according to the Moran coefficient (MC) and the Akaike information criterion (AIC). We used Moran's I (MI), the standard deviation of the regression coefficients, and five indices (AIC, BIC, DIC, R2, and MSE) to compare the spatiotemporal filtering model with a Bayesian spatiotemporal model. The eigenvectors effectively removed spatial correlation from the model residuals (Moran's I < 0.2, p > 0.05). Among the Bayesian spatiotemporal models, the one with the Rook weight matrix performed best. The spatiotemporal filtering model with SO-SWM was superior, as shown by lower AIC (92,029.60), BIC (92,681.20), and MSE (418,022.7) values and a higher R2 (0.56) value. All contemporaneous spatiotemporal structures outperformed the lagged structures. Additionally, eigenvector maps from the Rook and SO-SWM matrices closely resembled the incidence patterns of HFMD.
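The eigenvector-filtering step can be sketched generically as below, using the textbook Moran eigenvector construction (eigenvectors of the doubly centred spatial weight matrix) together with Moran's I for residual checking. The authors' spatiotemporal weight matrices and their MC/AIC selection thresholds are not reproduced here.

```python
# Generic sketch of Moran eigenvector spatial filtering; assumes a
# symmetric spatial weight matrix W (n x n), e.g. Rook contiguity.
import numpy as np

def moran_eigenvectors(W):
    """Eigenvectors of M W M, where M mean-centres the weight matrix."""
    n = W.shape[0]
    M = np.eye(n) - np.ones((n, n)) / n
    eigvals, eigvecs = np.linalg.eigh(M @ W @ M)
    order = np.argsort(eigvals)[::-1]   # strongest positive-autocorrelation patterns first
    return eigvals[order], eigvecs[:, order]

def morans_i(z, W):
    """Moran's I of a residual vector z under weights W."""
    z = z - z.mean()
    return (len(z) / W.sum()) * (z @ W @ z) / (z @ z)
```

Candidate eigenvectors enter the regression until the residual Moran's I is negligible, with the subset chosen to minimize an information criterion such as the AIC.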
Little is known about the association between iodine nutrition status and bone health. The present study aimed to explore the connections between iodine nutrition status, bone metabolism parameters, and bone disease (osteopenia and osteoporosis). A cross-sectional survey was conducted involving 391, 395, and 421 adults from iodine fortification areas (IFA), iodine adequate areas (IAA), and iodine excess areas (IEA) of China. Iodine nutrition status, bone metabolism parameters, and BMD were measured. Our results showed that the urine iodine concentrations (UIC) and serum iodine concentrations (SIC) in IEA were significantly higher than in IAA. BMD and Ca2+ levels differed significantly across iodine nutrition levels, and BMD was negatively correlated with UIC and SIC. Univariate linear regression showed that gender, age, BMI, menopausal status, smoking status, alcohol consumption, UIC, SIC, free thyroxine, TSH, and alkaline phosphatase were associated with BMD. The prevalence of osteopenia was significantly increased in the IEA, UIC ≥ 300 µg/l, and SIC > 90 µg/l groups. UIC ≥ 300 µg/l and SIC > 90 µg/l were risk factors for a BMD T value < –1·0 sd. In conclusion, excess iodine not only leads to changes in bone metabolism parameters and BMD but is also a risk factor for osteopenia and osteoporosis.
Evidence suggests the crucial role of dysfunctional default mode (DMN), salience and frontoparietal (FPN) networks, collectively termed the triple network model, in the pathophysiology of treatment-resistant depression (TRD).
Aims
Using graph theory- and seed-based functional connectivity analyses, we attempted to elucidate the role of low-dose ketamine in the triple networks, namely the DMN, salience and FPN.
Method
Resting-state functional connectivity magnetic resonance imaging (rs–fcMRI) data derived from two previous clinical trials of a single, low-dose ketamine infusion were analysed. In clinical trial 1 (Trial 1), patients with TRD were randomised to either a ketamine or a normal saline group; in clinical trial 2 (Trial 2), patients with TRD and pronounced suicidal symptoms received a single infusion of either 0.05 mg/kg ketamine or 0.045 mg/kg midazolam. All participants underwent rs–fcMRI before the infusion and again on day 3 after it. Graph theory- and seed-based functional connectivity analyses were performed independently.
Results
Trial 1 demonstrated significant group-by-time effects on degree centrality and the clustering coefficient in the right posterior cingulate cortex (PCC) ventral 23a and b (DMN) and on the clustering coefficient in the right supramarginal gyrus perisylvian language area (salience). Trial 2 found a significant group-by-time effect on the characteristic path length in the left PCC 7Am (DMN). In addition, both ketamine and normal saline infusions exerted a time effect on the clustering coefficient in the right dorsolateral prefrontal cortex a9-46v (FPN) in Trial 1.
Conclusions
These findings may support the utility of the triple-network model in elucidating ketamine’s antidepressant effect. Alterations in DMN, salience and FPN function may underlie this effect.
Patients with chronic insomnia are characterized by alterations in default mode network and alpha oscillations, for which the medial parietal cortex (MPC) is a key node and thus a potential target for interventions.
Methods
Fifty-six adults with chronic insomnia were randomly assigned to 2 mA, alpha-frequency (10 Hz), 30-min active or sham transcranial alternating current stimulation (tACS) applied over the MPC for 10 sessions completed within two weeks, followed by 4- and 6-week follow-up visits. The connectivity of the dorsal and ventral posterior cingulate cortex (vPCC) was calculated from resting-state functional MRI.
Results
For the primary outcome, the active group showed a higher response rate (≥ 50% reduction in the Pittsburgh Sleep Quality Index (PSQI)) at week 6 than the sham group (71.4% versus 3.6%; risk ratio 20.0, 95% confidence interval 2.9 to 139.0, p = 0.0025). For the secondary outcomes, active therapy induced greater and sustained improvements (versus sham) in PSQI, depression (17-item Hamilton Depression Rating Scale), anxiety (Hamilton Anxiety Rating Scale), and cognitive deficit (Perceived Deficits Questionnaire-Depression) scores. The response rate in the active group decreased to 42.9%–57.1% at weeks 8–14. Improvement in sleep was associated with connectivity between the vPCC and the superior frontal gyrus and inferior parietal lobe, whereas vPCC-to-middle frontal gyrus connectivity was associated with cognitive benefits and vPCC-to-ventromedial prefrontal cortex connectivity with alleviation of rumination.
Conclusions
Targeting the MPC with alpha-tACS appears to be an effective treatment for chronic insomnia, and vPCC connectivity represents a prognostic marker of treatment outcome.
This paper provides an overview of the current status of ultrafast, ultra-intense lasers with peak powers exceeding 100 TW and examines research activities in high-energy-density physics within China. Currently, 10 high-intensity lasers with peak powers over 100 TW are operational, and about 10 additional lasers are under construction at various institutes and universities. These facilities operate either independently or in combination, offering substantial support for both Chinese and international research and development efforts in high-energy-density physics.
Knowledge of the critical periods of crop–weed competition is crucial for designing weed management strategies in cropping systems. In the Lower Yangtze Valley, China, field experiments were conducted in 2011 and 2012 to study the effect of interference from mixed natural weed populations on cotton growth and yield and to determine the critical period for weed control (CPWC) in direct-seeded cotton. Two treatments were applied: allowing weeds to infest the crop or keeping plots weed-free for increasing periods (0, 1, 2, 4, 6, 8, 10, 12, 14, and 20 wk) after crop emergence. Mixed natural weed infestations reduced cotton plant height by 35 to 55 cm and stem diameter by 10 to 13 mm throughout the season; these responses were well described by modified Gompertz and logistic models, respectively. Season-long competition with weeds reduced the number of fruit branches per plant by 65% to 82%, boll number per plant by 86% to 96%, and single boll weight by approximately 24%. Weed-free seed cotton yields ranged from 2,900 to 3,130 kg ha−1, while yield loss increased with the duration of weed infestation, reaching 83% to 96% compared with permanent weed-free plots. Modified Gompertz and logistic models were used to relate relative seed cotton yield (percentage of season-long weed-free yield) to increasing duration of weed control and of weed interference, respectively. Based on a 5% yield-loss threshold, the CPWC extended from 145 to 994 growing degree days (GDD), corresponding to 14 to 85 d after emergence (DAE). These findings emphasize the importance of implementing effective weed control measures from 14 to 85 DAE in the Lower Yangtze Valley to keep yield losses below the 5% threshold.
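To illustrate how a CPWC is typically derived, the hypothetical sketch below fits commonly used modified Gompertz and logistic forms to relative yield data and solves for the 5% yield-loss crossings. The functional forms, starting values, and bracketing interval are assumptions for illustration; the study's exact parameterizations may differ.

```python
# Sketch of CPWC estimation from relative yield (% of weed-free yield)
# versus duration in growing degree days (GDD).
import numpy as np
from scipy.optimize import curve_fit, brentq

def gompertz(t, a, b, k):
    """Relative yield vs. length of the weed-free period (increasing curve)."""
    return a * np.exp(-b * np.exp(-k * t))

def logistic(t, a, b, c):
    """Relative yield vs. duration of weed interference (decreasing curve)."""
    return a / (1.0 + np.exp(b * (t - c)))

def cpwc(gdd_free, y_free, gdd_weedy, y_weedy, threshold=95.0):
    g, _ = curve_fit(gompertz, gdd_free, y_free, p0=[100, 2, 0.01])
    l, _ = curve_fit(logistic, gdd_weedy, y_weedy, p0=[100, 0.01, 500])
    # Critical period: where each fitted curve crosses the 95% yield level
    # (brentq assumes the threshold is bracketed within the interval).
    start = brentq(lambda t: logistic(t, *l) - threshold, 1, 2000)
    end = brentq(lambda t: gompertz(t, *g) - threshold, 1, 2000)
    return start, end  # beginning and end of the CPWC, in GDD
```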
Post-traumatic stress disorder (PTSD) is a mental health condition caused by the dysregulation or overgeneralization of memories related to traumatic events. Investigating the interplay between explicit narrative and implicit emotional memory contributes to a better understanding of the mechanisms underlying PTSD.
Methods
This case–control study focused on two groups: unmedicated patients with PTSD and a trauma-exposed control (TEC) group who did not develop PTSD. Experiments included real-time measurement of blood oxygenation changes using functional near-infrared spectroscopy (fNIRS) during trauma narration, together with processing of the emotional and linguistic content of the narratives through natural language processing (NLP).
Results
Real-time fNIRS monitoring showed that PTSD patients (mean [SD] Oxy-Hb activation, 0.153 [0.084], 95% CI 0.124 to 0.182) had significantly higher brain activity in the left anterior medial prefrontal cortex (L-amPFC) within 10 s after expressing negative emotional words compared with the control group (0.047 [0.026], 95% CI 0.038 to 0.056; p < 0.001). In the control group, there was a significant time-series correlation between the use of negative emotional memory words and activation of the L-amPFC (latency 3.82 s, slope = 0.0067, peak value = 0.184, difference = 0.273; Spearman’s r = 0.727, p < 0.001). In contrast, the left anterior cingulate prefrontal cortex of PTSD patients remained in a state of high activation (peak value = 0.153, difference = 0.084) with no apparent latency period.
Conclusions
PTSD patients display overactivity in pathways associated with rapid emotional responses and diminished regulation in cognitive processing areas. Interventions targeting these pathways may alleviate symptoms of PTSD.
The emotion regulation network (ERN) in the brain provides a framework for understanding the neuropathology of affective disorders. Although previous neuroimaging studies have investigated the neurobiological correlates of the ERN in major depressive disorder (MDD), whether patients with MDD exhibit abnormal functional connectivity (FC) patterns in the ERN and whether the abnormal FC in the ERN can serve as a therapeutic response signature remain unclear.
Methods
A large functional magnetic resonance imaging dataset comprising 709 patients with MDD and 725 healthy controls (HCs) recruited across five sites was analyzed. Using a seed-based FC approach, we first investigated the group differences in whole-brain resting-state FC of the 14 ERN seeds between participants with and without MDD. Furthermore, an independent sample (45 MDD patients) was used to evaluate the relationship between the aforementioned abnormal FC in the ERN and symptom improvement after 8 weeks of antidepressant monotherapy.
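As a rough illustration of the seed-based FC approach, the sketch below correlates a seed's mean time series with every other region and applies the Fisher z-transform; the actual pipeline's preprocessing and group statistics are omitted, and array shapes are assumptions.

```python
# Toy sketch of seed-based functional connectivity:
# Pearson correlation between a seed time series and all regions,
# Fisher z-transformed for group-level statistics.
import numpy as np

def seed_fc(seed_ts, data):
    """seed_ts: (T,) mean seed signal; data: (T, n_regions) time series."""
    seed = (seed_ts - seed_ts.mean()) / seed_ts.std()
    d = (data - data.mean(axis=0)) / data.std(axis=0)
    r = (seed @ d) / len(seed)                    # Pearson r per region
    return np.arctanh(np.clip(r, -0.999999, 0.999999))  # Fisher z
```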
Results
Compared with the HCs, patients with MDD exhibited aberrant FC between 7 ERN seeds and several cortical and subcortical areas, including the bilateral middle temporal gyrus, bilateral occipital gyrus, right thalamus, calcarine cortex, middle frontal gyrus, and bilateral superior temporal gyrus. In the independent sample, these aberrant FCs in the ERN were negatively correlated with the reduction rate of the 17-item Hamilton Depression Rating Scale (HAMD-17) score among MDD patients.
Conclusions
These results might extend our understanding of the neurobiological underpinnings of unadaptable or inflexible emotional processing in MDD and help to elucidate the mechanisms of therapeutic response.
Knowledge is growing on the essential role of neural circuits involved in aberrant cognitive control and reward sensitivity for the onset and maintenance of binge eating.
Aims
To investigate how the brain's reward (bottom-up) and inhibitory control (top-down) systems dynamically interact to contribute to subclinical binge eating.
Method
Functional magnetic resonance imaging data were acquired from 30 binge eaters and 29 controls while participants performed a food reward Go/NoGo task. Dynamic causal modelling with the parametric empirical Bayes framework, a novel brain connectivity technique, was used to examine between-group differences in the directional influence between reward and executive control regions. We explored the proximal risk factors for binge eating and its neural basis, and assessed the predictive ability of neural indices on future disordered eating and body weight.
Results
Relative to controls, the binge eating group displayed reduced reward–inhibition undirected and directed synchronisation (i.e. medial orbitofrontal cortex [mOFC]–superior parietal gyrus [SPG] connectivity and mOFC → SPG excitatory connectivity) during the food reward NoGo condition. Trait impulsivity was a key proximal factor that could weaken mOFC–SPG connectivity and exacerbate binge eating. Crucially, this core mOFC–SPG connectivity successfully predicted binge eating frequency 6 months later.
Conclusions
These findings point to a particularly important role of the bottom-up interactions between cortical reward and frontoparietal control circuits in subclinical binge eating. They offer novel insights into the neural hierarchical mechanisms underlying problematic eating and may aid the early identification of individuals in the general population with strong binge eating-associated symptomatology.
Natural infection by Trichinella sp. has been reported in humans and more than 150 species of animals, especially carnivorous and omnivorous mammals. Although the presence of Trichinella sp. infection in wild boars (Sus scrofa) has been documented worldwide, little is known about Trichinella circulation in farmed wild boars in China. This study investigated the prevalence of Trichinella sp. in farmed wild boars in China. Seven hundred and sixty-one (761) muscle samples from farmed wild boars were collected in Jilin Province of China from 2017 to 2020. The diaphragm muscles were examined by the artificial digestion method. The overall prevalence of Trichinella in farmed wild boars was 0.53% [95% confidence interval (CI): 0.51–0.55]. The average parasite load was 0.076 ± 0.025 larvae per gram (lpg), and the highest burden, 0.21 lpg, was found in a wild boar from Fusong. Trichinella spiralis was the only species identified by multiplex polymerase chain reaction. The 5S rDNA intergenic spacer region of Trichinella was amplified and sequenced, and the obtained sequence (GenBank accession number: OQ725583) shared 100% identity with the T. spiralis HLJ isolate (GenBank accession number: MH289505). Since the consumption of farmed wild boars is expected to increase, these findings highlight the need for dedicated guidelines for the processing of slaughtered farmed wild boar meat in China.
In contemporary neuroimaging studies, it has been observed that patients with major depressive disorder (MDD) exhibit aberrant spontaneous neural activity, commonly quantified through the amplitude of low-frequency fluctuations (ALFF). However, the substantial individual heterogeneity among patients poses a challenge to reaching a unified conclusion.
Methods
To address this variability, our study adopted a novel framework to parse individualized ALFF abnormalities. We hypothesized that individualized ALFF abnormalities can be portrayed as a unique linear combination of shared differential factors. Our study involved two large multi-center datasets comprising 2424 patients with MDD and 2183 healthy controls. In patients, individualized ALFF abnormalities were derived through normative modeling and further deconstructed into differential factors using non-negative matrix factorization.
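Conceptually, the factorization step can be sketched as below. Because NMF requires non-negative input, one plausible reading (an assumption on our part, consistent with the two positive and two negative factors reported) is that positive and negative normative deviations are factorized separately; function names and parameters here are illustrative only.

```python
# Conceptual sketch: individualized deviations (patients x regions) from a
# normative model are split into positive and negative parts and each part
# is factorized into shared differential factors.
import numpy as np
from sklearn.decomposition import NMF

def parse_factors(deviation, n_pos=2, n_neg=2):
    """deviation: per-patient z-score maps relative to the normative model."""
    pos = np.clip(deviation, 0, None)    # supra-normal ALFF deviations
    neg = np.clip(-deviation, 0, None)   # infra-normal ALFF deviations
    nmf_pos = NMF(n_components=n_pos, init="nndsvda").fit(pos)
    nmf_neg = NMF(n_components=n_neg, init="nndsvda").fit(neg)
    # components_ hold shared spatial factors; transform() yields the
    # per-patient factor loadings ("compositions").
    return nmf_pos, nmf_neg
```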
Results
Two positive and two negative factors were identified. These factors were closely linked to clinical characteristics and explained group-level ALFF abnormalities in the two datasets. Moreover, these factors exhibited distinct associations with the distribution of neurotransmitter receptors/transporters, transcriptional profiles of inflammation-related genes, and connectome-informed epicenters, underscoring their neurobiological relevance. Additionally, factor compositions facilitated the identification of four distinct depressive subtypes, each characterized by unique abnormal ALFF patterns and clinical features. Importantly, these findings were successfully replicated in another dataset with different acquisition equipment, protocols, preprocessing strategies, and medication statuses, validating their robustness and generalizability.
Conclusions
This research identifies shared differential factors underlying individualized abnormalities of spontaneous neural activity in MDD, contributing novel insights into the heterogeneity of these abnormalities.
Psychiatric diagnosis is based on categorical diagnostic classification, yet similarities in genetics and clinical features across disorders suggest that these classifications share commonalities in neurobiology, particularly regarding neurotransmitters. Glutamate (Glu) and gamma-aminobutyric acid (GABA), the brain's primary excitatory and inhibitory neurotransmitters, play critical roles in brain function and physiological processes.
Methods
We examined the levels of Glu, combined glutamate and glutamine (Glx), and GABA across psychiatric disorders by pooling data from 121 proton magnetic resonance spectroscopy (1H-MRS) studies, and we further divided the sample based on Axis I disorders.
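For reference, the effect size used throughout the results is Hedges' g, the bias-corrected standardized mean difference. A minimal sketch of its computation follows; the meta-analytic weighting model used for pooling is not shown.

```python
# Hedges' g: standardized mean difference with the small-sample
# correction factor J applied (patients vs. controls).
import numpy as np

def hedges_g(m1, s1, n1, m2, s2, n2):
    df = n1 + n2 - 2
    s_pooled = np.sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / df)
    j = 1.0 - 3.0 / (4.0 * df - 1.0)   # small-sample bias correction
    return j * (m1 - m2) / s_pooled
```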
Results
Statistically significant differences in GABA levels were found in the combined psychiatric group compared with healthy controls (Hedges' g = −0.112, p = 0.008). Further analyses by brain region showed that GABA levels differed significantly between Axis I disorders and controls in the parieto-occipital cortex (Hedges' g = 0.277, p = 0.019). Furthermore, GABA levels were reduced in affective disorders in the occipital cortex (Hedges' g = −0.468, p = 0.043). Reductions in Glx levels were found in neurodevelopmental disorders (Hedges' g = −0.287, p = 0.022). Region-focused analysis suggested that Glx levels were decreased in the frontal cortex (Hedges' g = −0.226, p = 0.025), and the reduction in Glu levels in patients with affective disorders in the frontal cortex was marginally significant (Hedges' g = −0.172, p = 0.052). When the anterior cingulate cortex and prefrontal cortex were analyzed separately, a reduction was found only in GABA levels in the former (Hedges' g = −0.191, p = 0.009) across all disorders.
Conclusions
Altered glutamatergic and GABAergic metabolite levels were found across psychiatric disorders, indicating shared dysfunction: GABA levels were reduced across psychiatric disorders, and Glu levels were lower in affective disorders. These results highlight the significance of GABA and Glu in psychiatric etiology and partially support rethinking current diagnostic categories.
Femtosecond oscillators with gigahertz (GHz) repetition rates are appealing sources for spectroscopic applications, benefiting from individually accessible, high-power comb lines. However, the mode mismatch between the powerful pump laser diode (LD) and the very small laser cavity limits the average output power of existing GHz Kerr-lens mode-locked (KLM) oscillators to tens of milliwatts. Here, we present a novel method that overcomes this difficulty and permits high-average-power LD-pumped KLM oscillators at GHz repetition rates. We propose a numerical simulation approach to guide the realization of Kerr-lens mode-locking and to elucidate the dynamics of the mode-locking process. As a proof-of-principle demonstration, an LD-pumped Yb:KGW oscillator with up to 6.17 W average power and 184 fs pulse duration at a 1.6 GHz repetition rate is demonstrated. The simulations agree well with the experimental results. This cost-effective, compact, and powerful laser source opens up new possibilities for research and industrial applications.
This study aimed to evaluate the impact of a low-carbohydrate diet, balanced dietary guidance, and pharmacotherapy on weight loss among individuals with overweight or obesity over a period of 3 months. The study involved 339 individuals with overweight or obesity who received weight loss treatment at the Department of Clinical Nutrition at the Second Affiliated Hospital of Zhejiang University, School of Medicine, between 1 January 2020 and 31 December 2023. The primary outcome was the percentage weight loss. Most patients chose a low-carbohydrate diet as their primary treatment (168 (49·56 %)), followed by balanced dietary guidance (139 (41·00 %)) and pharmacotherapy (32 (9·44 %)). The total percentage weight loss for patients followed up for 1, 2, and 3 months was 4·98 (3·04, 6·29) %, 7·93 (5·42, 7·93) %, and 10·71 (7·74, 13·83) %, respectively. Multivariable logistic regression analysis identified the low-carbohydrate diet as an independent factor associated with a percentage weight loss of ≥ 3 % and ≥ 5 % at 1 month (OR = 0·461, P < 0·05; OR = 0·349, P < 0·001). The results showed that a low-carbohydrate diet was an effective weight-loss strategy in the short term, although its longer-term effects were comparable to those of balanced dietary guidance and pharmacotherapy.
Choline and betaine serve important roles in the body, from cell membrane components to methyl donors. We aimed to investigate trends in the dietary intake and food sources of total choline, individual choline forms, and betaine in Chinese adults using data from the China Health and Nutrition Survey (CHNS) 1991–2011, a prospective cohort with a multistage, random cluster design. Dietary intake was estimated using three consecutive 24-h dietary recalls combined with a household food inventory. Linear mixed-effect models were constructed using R software. A total of 11 188 men and 12 279 women aged 18 years or older were included. Between 1991 and 2011, total choline intake increased from 219·3 (95 % CI 215·1, 223·4) mg/d to 269·0 (95 % CI 265·6, 272·5) mg/d in men and from 195·6 (95 % CI 191·8, 199·4) mg/d to 240·4 (95 % CI 237·4, 243·5) mg/d in women (both P-trends < 0·001). Phosphatidylcholine was the major form of dietary choline, and its contribution to total choline increased from 46·9 % in 1991 to 58·8 % in 2011. Cereals were the primary food source of total choline before 2000, while eggs ranked at the top from 2004 onwards. Dietary betaine intake was relatively steady over time, ranging from 134·0 to 151·5 mg/d in men (P-trend < 0·001) and from 111·7 to 125·3 mg/d in women (P-trend > 0·05). In summary, Chinese adults experienced a significant increase in dietary choline intake, particularly of phosphatidylcholine, during 1991–2011, and animal-derived foods replaced plant-based foods as the main food sources of choline, whereas betaine intake remained relatively stable. Future efforts should address the health effects of these changes.
In the present study, we discovered and identified a new crystalline form of pinaverium bromide, pinaverium bromide dihydrate (C26H41BrNO4⋅Br⋅2H2O), whose single crystals can be obtained by recrystallization from a mixture of water and acetonitrile at room temperature. The crystals were characterized by single-crystal X-ray diffraction, and the crystal structure was solved from the single-crystal diffraction data. The results show that the asymmetric unit of the pinaverium bromide dihydrate model contains one pinaverium bromide (C26H41Br2NO4) molecule and two water molecules that bind the bromide ion through O–H⋯O and O–H⋯Br hydrogen bonds. Adjacent pinaverium bromide dihydrate units are linked by O–H⋯O, O–H⋯Br, and C–H⋯O hydrogen bonds. In addition, the experimental X-ray powder diffraction pattern agrees well with the pattern simulated from the single-crystal data, confirming the correctness of the crystal structure. Hirshfeld surface analysis was employed to understand and visualize the packing patterns, indicating that H⋯H interactions are the main force in the crystal packing of pinaverium bromide dihydrate.
Purple nutsedge (Cyperus rotundus L.) is one of the world's most resilient upland weeds, spreading primarily through its tubers. Its emergence in rice (Oryza sativa L.) fields has been increasing, likely due to changing paddy-farming practices. This study aimed to investigate how C. rotundus, an upland weed, withstands soil flooding and has become a problematic weed in rice fields. A first comparative analysis focused on the survival and recovery characteristics of growing and mature tubers of C. rotundus exposed to soil-flooding conditions; notably, mature tubers exhibited significant survival and recovery abilities in these environments. Based on this observation, the morphological structure, nonstructural carbohydrates, and respiratory mechanisms of mature tubers were further investigated in response to prolonged soil flooding. Over time, the mature tubers did not form aerenchyma but instead gradually accumulated lignified sclerenchymal fibers, with lignin content also increasing; after 90 d, the lignified sclerenchymal fiber and lignin contents were 4.0 and 1.1 times higher than those in the no-soil-flooding treatment, respectively. Concurrently, soluble sugar content decreased while starch content increased, providing energy storage, and alcohol dehydrogenase activity rose to support anaerobic respiration via alcohol fermentation. These results indicate that mature tubers survive soil flooding by adopting a low-oxygen quiescence strategy involving morphological adaptation through lignified sclerenchymal fibers, increased starch reserves for energy storage, and enhanced anaerobic respiration. This mechanism likely underpins the flooding tolerance of mature C. rotundus tubers, allowing them to endure unfavorable conditions and then germinate and grow once flooding subsides. This study provides a preliminary explanation of the mechanism by which mature tubers of upland C. rotundus tolerate flooding, shedding light on this weed's increasing presence in rice fields.