On both global and local levels, one can observe a trend toward the adoption of algorithmic regulation in the public sector, with the Chinese social credit system (SCS) serving as a prominent and controversial example of this phenomenon. Within the SCS framework, cities play a pivotal role in its development and implementation, both as evaluators of individuals and enterprises and as subjects of evaluation themselves. This study engages in a comparative analysis of SCS scoring mechanisms for individuals and enterprises across diverse Chinese cities while also scrutinizing the scoring system applied to cities themselves. We investigate the extent of algorithmic regulation exercised through the SCS, elucidating its operational dynamics at the city level in China and assessing its interventionism, especially concerning the involvement of algorithms. Furthermore, we discuss ethical concerns surrounding the SCS’s implementation, particularly regarding transparency and fairness. By addressing these issues, this article contributes to two research domains: algorithmic regulation and discourse surrounding the SCS, offering valuable insights into the ongoing utilization of algorithmic regulation to tackle governance and societal challenges.
Prehistoric humans seem to have preferred inhabiting small river basins, which lay closer to most settlements than larger rivers did. Holocene landscape evolution is considered to have played a pivotal role in shaping the spatiotemporal patterns of these settlements. In this study, we conducted comprehensive research on the relationship between landscape evolution and settlement distribution within the Huangshui River basin, a representative small river in Central China with numerous early settlements, including a prehistoric city known as the Wangjinglou site (WJL). Using geoarchaeological investigations, optically stimulated luminescence dating, pollen analysis, and grain-size analysis, we analyzed the characteristics of the Holocene environment. The results indicate the presence of two distinct geomorphic systems, namely the red clay hills and the river valley. The red clay hills, formed in the Neogene, represent remnants of the Songshan piedmont alluvial fan that was eroded by rivers. There are three grades of terraces within the river valley. T3 is a strath terrace that formed around 8.0 ka. Both T2 and T1 are fill terraces, which developed around 4.0 ka and during the historical period, respectively. The sedimentary features and pollen analysis indicate the existence of an ancient lake-swamp on the platform during 11.0–9.0 ka. This waterbody gradually shrank during 9.0–8.0 ka and ultimately disappeared after 8.0 ka. Since then, large-scale waterbodies have ceased to develop on the higher geomorphic units, and river floods cannot reach the tops of these high geomorphic units, where numerous prehistoric settlements are located, including the Xia–Shang cities of the WJL site. Our research demonstrates that landscape stability supported the long-term and sustainable development of ancient cultures and facilitated the establishment of the WJL ancient cities in the region.
Our study aimed to develop and validate a nomogram to assess talaromycosis risk in hospitalized HIV-positive patients. Prediction models were built using data from a multicentre retrospective cohort study in China. On the basis of the inclusion and exclusion criteria, we collected data from 1564 hospitalized HIV-positive patients in four hospitals from 2010 to 2019. Inpatients were randomly assigned to the training or validation group at a 7:3 ratio. To identify the potential risk factors for talaromycosis in HIV-infected patients, univariate and multivariate logistic regression analyses were conducted. Through multivariate logistic regression, we determined ten variables that were independent risk factors for talaromycosis in HIV-infected individuals. A nomogram was developed following the findings of the multivariate logistic regression analysis. For user convenience, a web-based nomogram calculator was also created. The nomogram demonstrated excellent discrimination in both the training and validation groups [area under the ROC curve (AUC) = 0.883 vs. 0.889] and good calibration. The results of the clinical impact curve (CIC) analysis and decision curve analysis (DCA) confirmed the clinical utility of the model. Clinicians will benefit from this simple, practical, and quantitative strategy to predict talaromycosis risk in HIV-infected patients and can implement appropriate interventions accordingly.
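As a hedged illustration of the modelling workflow described above (not the authors' actual code or data), the sketch below fits a multivariable logistic regression on a 7:3 training/validation split and reports discrimination as the area under the ROC curve; the feature matrix and outcome are synthetic stand-ins.

```python
# Hypothetical sketch of the nomogram-style workflow: logistic regression on a
# 7:3 split with ROC-AUC evaluation. Features and outcomes are synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 1564                                        # cohort size reported in the abstract
X = rng.normal(size=(n, 10))                    # ten candidate risk factors (synthetic)
logit = X[:, :3].sum(axis=1) - 1.0
y = rng.binomial(1, 1 / (1 + np.exp(-logit)))   # talaromycosis yes/no (synthetic)

# 7:3 training/validation split, mirroring the abstract
X_tr, X_va, y_tr, y_va = train_test_split(X, y, test_size=0.3, random_state=0)

model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)

# Discrimination in each set; a nomogram is essentially a graphical rendering
# of fitted coefficients like these as point scales.
for name, Xs, ys in [("training", X_tr, y_tr), ("validation", X_va, y_va)]:
    auc = roc_auc_score(ys, model.predict_proba(Xs)[:, 1])
    print(f"{name} AUC = {auc:.3f}")
```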
To examine the effectiveness of Self-Help Plus (SH+) as an intervention for alleviating stress levels and mental health problems among healthcare workers.
Methods
This was a prospective, two-arm, unblinded, parallel-design randomised controlled trial. Participants were recruited at all levels of medical facilities within all municipal districts of Guangzhou. Eligible participants were adult healthcare workers experiencing psychological stress (10-item Perceived Stress Scale scores of ≥15) but without serious mental health problems or active suicidal ideation. The intervention was SH+, a self-help psychological intervention developed by the World Health Organization to alleviate psychological stress and prevent the development of mental health problems. The primary outcome was psychological stress, assessed at the 3-month follow-up. Secondary outcomes were depression symptoms, anxiety symptoms, insomnia, positive affect (PA) and self-kindness, assessed at the 3-month follow-up.
Results
Between November 2021 and April 2022, 270 participants were enrolled and randomly assigned to either SH+ (n = 135) or the control group (n = 135). The SH+ group had significantly lower stress at the 3-month follow-up (b = −1.23, 95% CI = −2.36, −0.10, p = 0.033) compared to the control group. The interaction effect indicated that the intervention effect in reducing stress differed over time (b = −0.89, 95% CI = −1.50, −0.27, p = 0.005). Analysis of the secondary outcomes suggested that SH+ led to statistically significant improvements in most of the secondary outcomes, including depression, insomnia, PA and self-kindness.
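The coefficients and the group-by-time interaction reported above are consistent with a linear mixed-effects analysis; the sketch below shows one way such a model could be specified, using synthetic long-format data and hypothetical column names. It is an illustration of the approach, not the trial's actual analysis code.

```python
# Illustrative linear mixed-effects model for a two-arm trial with repeated
# stress measurements; data, sample structure and column names are synthetic.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 270                                       # enrolled participants in the abstract
pid = np.repeat(np.arange(n), 2)
time = np.tile([0, 1], n)                     # 0 = baseline, 1 = 3-month follow-up
group = np.repeat(rng.integers(0, 2, n), 2)   # 0 = control, 1 = SH+
subj = np.repeat(rng.normal(0, 2, n), 2)      # random participant intercepts
stress = 20 - 1.0 * group * time + subj + rng.normal(0, 2, 2 * n)
df = pd.DataFrame({"pid": pid, "group": group, "time": time, "stress": stress})

# Random intercept per participant; the group:time term captures whether the
# intervention effect on stress differs over time, as reported in the abstract.
m = smf.mixedlm("stress ~ group * time", data=df, groups=df["pid"]).fit()
print(m.summary())
```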
Conclusions
This is the first known randomised controlled trial conducted to improve stress and mental health problems among healthcare workers experiencing psychological stress in a low-resource setting. SH+ was found to be an effective strategy for alleviating psychological stress and reducing symptoms of common mental health problems. SH+ has the potential to be scaled up as a public health strategy to reduce the burden of mental health problems in healthcare workers exposed to high levels of stress.
Preterm children with very low birthweight (VLBW) or extremely low birthweight (ELBW) and normal early development have been found to show poorer executive functions (EFs) at preschool age (Ni, Huang & Guo, 2011). A previous study found that the risk of preschool-age EF deficits in preterm children can be attenuated by a more supportive home environment (Taylor & Clark, 2016). However, former studies did not investigate the effects of birthweight and home environment on the cognitive EFs of preterm children simultaneously, especially in those with normal early development. The present study aimed to investigate the predictive effects of birthweight and home environment on the cognitive EFs of VLBW/ELBW preterm children.
Participants and Methods:
Preterm children were recruited from the Premature Baby Foundation of Taiwan. Inclusion criteria were scores higher than 70 on the Bayley Scales of Infant and Toddler Development (second or third edition) at 12 and 24 months and on the Wechsler Preschool and Primary Scale of Intelligence, Revised Edition at 5 years of age. Exclusion criteria were visual impairment, hearing impairment, and cerebral palsy. A total of 287 preterm children aged 6 years were recruited in the present study and divided into a VLBW group (n=202, birthweight 1001-1500 g) and an ELBW group (n=85, birthweight less than 1000 g). The comparison group comprised 89 term-born, healthy, typically developing 6-year-old children recruited from families of comparable social status in the community. Four types of cognitive EFs comprising 22 indicators were assessed: inhibition ability (8 indicators) through the Comprehensive Nonverbal Attention Test Battery (CNAT), cognitive flexibility (6 indicators) through the Wisconsin Card Sorting Test (WCST), working memory (2 indicators) through the Digit Span subtest of the Wechsler Intelligence Scale for Children-IV (WISC-IV) and Knox's Cube Test (KCT), and planning ability (6 indicators) through the Tower of London (ToL). The home environment was assessed with the Home Observation for Measurement of the Environment (HOME), Revised Edition. Data were analyzed with stepwise regression.
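As a rough, hypothetical illustration of the stepwise procedure named above (a forward-selection variant, with invented predictor names and synthetic data, not the study's actual code), the sketch below adds predictors one at a time by p-value using ordinary least squares:

```python
# Minimal forward stepwise regression by p-value; data and names are synthetic.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 287                                       # preterm sample size from the abstract
X = pd.DataFrame({"birthweight": rng.normal(1200, 250, n),
                  "home_total": rng.normal(40, 6, n),
                  "gest_age": rng.normal(30, 2, n)})
y = 0.01 * X["birthweight"] + 0.3 * X["home_total"] + rng.normal(0, 5, n)

selected, remaining = [], list(X.columns)
while remaining:
    # p-value of each candidate when added to the current model
    pvals = {}
    for c in remaining:
        fit = sm.OLS(y, sm.add_constant(X[selected + [c]])).fit()
        pvals[c] = fit.pvalues[c]
    best = min(pvals, key=pvals.get)
    if pvals[best] >= 0.05:                   # entry threshold
        break
    selected.append(best)
    remaining.remove(best)

final = sm.OLS(y, sm.add_constant(X[selected])).fit()
print("selected predictors:", selected)
print("R-squared:", round(final.rsquared, 3))
```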
Results:
The regression model with birthweight significantly predicted 83.3% of planning ability indicators, 83.3% of cognitive flexibility indicators, and 50% of working memory indicators. Among these indicators, birthweight showed the greatest predictive effect on the summation-of-score of the ToL (R²=.04, p<.001). The regression model with HOME significantly predicted 66.7% of planning ability indicators, 16.7% of cognitive flexibility indicators, and 12.5% of inhibition ability indicators. Among these indicators, HOME showed the greatest predictive effect on rule-1 of the ToL (R²=.027, p=.001). The regression model with both birthweight and HOME significantly predicted 50% of planning ability indicators; its greatest predictive effect was on the summation-of-score of the ToL (R²=.061, p<.001).
Conclusions:
Both birthweight and home environment were found to significantly predict different types of cognitive EFs at preschool age in VLBW/ELBW preterm children with normal early development. Although the home environment does not have as great a predictive effect as birthweight, both birthweight and home environment are significant predictors of planning ability.
The existing literature provides conflicting evidence of whether a collectivistic value orientation is associated with ethical or unethical behavior. To address this confusion, we integrate collectivism theory and research with prior work on social identity, moral boundedness, group morality, and moral identity to develop a model of the double-edged effects of collectivism on employee conduct. We argue that collectivism is morally bounded depending on who the other is, and thus it inhibits employees’ motivation to engage in unethical pro-self behavior, yet strengthens their motivation to engage in unethical pro-organization behavior. We further predict that these effects are mediated by the psychological mechanism of organizational goal commitment and moderated by a person’s strength of moral identity. Results of three studies conducted in China and the United States and involving both field and experimental data offer strong support for our hypotheses. Theoretical and practical implications of the research are discussed.
This study aims to explore the association between coffee consumption and the prevalence of hearing loss in American adults based on a national population-based survey.
Design:
Cross-sectional analysis of reported audiometric status and coffee intake from the 2003–2006 National Health and Nutrition Examination Survey (NHANES). Multivariate logistic regression, forest plots and restricted cubic spline (RCS) analyses were used to explore the associations and dose–response relationships between coffee consumption frequency and hearing loss.
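As a hedged sketch of the dose-response modelling named here (not NHANES code; the data and variable names are synthetic), the snippet below fits a logistic model with a natural cubic spline of daily coffee intake via patsy's cr() term, which serves as a stand-in for a restricted cubic spline:

```python
# Illustrative dose-response model: hearing loss vs. coffee intake with a
# natural cubic spline term (a stand-in for a restricted cubic spline).
# Data are synthetic; a continuous "cups" variable is simulated for simplicity.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 1894                                       # analytic sample size in the abstract
cups = rng.gamma(2.0, 1.2, n)                  # synthetic daily coffee consumption
age = rng.uniform(20, 70, n)
logit = -2.0 + 0.15 * cups + 0.03 * (age - 45)
hfhl = rng.binomial(1, 1 / (1 + np.exp(-logit)))
df = pd.DataFrame({"hfhl": hfhl, "cups": cups, "age": age})

# cr(cups, df=4) expands coffee intake into natural cubic spline basis columns,
# allowing a non-linear association while adjusting for age.
m = smf.logit("hfhl ~ cr(cups, df=4) + age", data=df).fit()
print(m.summary())
```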
Setting:
The USA.
Participants:
This study included 1894 individuals aged ≥ 20 from the 2003–2006 NHANES.
Results:
In this study, the prevalence of speech-frequency hearing loss (SFHL) and high-frequency hearing loss (HFHL) among the participants was 35·90 % and 51·54 %, respectively. Compared with those who did not consume coffee, non-Hispanic White participants who consumed ≥ 4 cups/d had a higher prevalence of SFHL (OR: 1·87; 95 % CI: 1·003, 3·47). A positive trend of coffee consumption frequency with the prevalence of HFHL was also found (Ptrend = 0·001). This association with HFHL was similar for participants aged 20–64 (Ptrend = 0·001), non-Hispanic White participants (Ptrend = 0·002), participants without noise exposure (Ptrend = 0·03) and noise-exposed participants (Ptrend = 0·003). The forest plot analysis found that the association between a 1-cup increment of daily coffee consumption and the prevalence of HFHL was statistically significant in males. The RCS model supported a positive linear association of coffee consumption with SFHL (P for overall association = 0·02, P for nonlinearity = 0·48) and a positive non-linear association of coffee consumption with HFHL (P for overall association = 0·001, P for nonlinearity = 0·001).
Conclusion:
Our findings suggested that coffee consumption was associated with a higher prevalence of hearing loss. Further cohort studies in larger populations are needed to confirm these findings.
A novel data-driven modal analysis method, reduced-order variational mode decomposition (RVMD), is proposed, inspired by the Hilbert–Huang transform and variational mode decomposition (VMD), to resolve transient or statistically non-stationary flow dynamics. First, the form of RVMD modes (referred to as an ‘elementary low-order dynamic process’, ELD) is constructed by combining low-order representation and the idea of intrinsic mode function, which enables the computed modes to characterize the non-stationary properties of space–time fluid flows. Then, the RVMD algorithm is designed based on VMD to achieve a low-redundant adaptive extraction of ELDs in flow data, with the modes computed by solving an elaborate optimization problem. Further, a combination of RVMD and Hilbert spectral analysis leads to a modal-based time-frequency analysis framework in the Hilbert view, providing a potentially powerful tool to discover, quantify and analyse the transient and non-stationary dynamics in complex flow problems. To provide a comprehensive evaluation, the computational cost and parameter dependence of RVMD are discussed, as well as the relations between RVMD and some classic modal decomposition methods. Finally, the virtues and utility of RVMD and the modal-based time-frequency analysis framework are well demonstrated via two canonical problems: the transient cylinder wake and the planar supersonic screeching jet.
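To make the Hilbert-view time-frequency step concrete, here is a minimal, generic sketch (not the authors' RVMD implementation): given the time coefficient of one computed mode, the analytic signal yields instantaneous amplitude and frequency, which together form that mode's contribution to the Hilbert spectrum.

```python
# Generic Hilbert spectral analysis of a single (synthetic) mode time
# coefficient; RVMD itself is not implemented here.
import numpy as np
from scipy.signal import hilbert

dt = 1e-3
t = np.arange(0, 2.0, dt)
# synthetic non-stationary mode: decaying amplitude plus a slow frequency drift
mode = np.exp(-0.5 * t) * np.cos(2 * np.pi * (20 * t + 5 * t**2))

analytic = hilbert(mode)                          # analytic signal via Hilbert transform
amplitude = np.abs(analytic)                      # instantaneous amplitude envelope
phase = np.unwrap(np.angle(analytic))
frequency = np.gradient(phase, dt) / (2 * np.pi)  # instantaneous frequency in Hz

# (amplitude, frequency) sampled over t gives this mode's Hilbert spectrum;
# summing over all modes gives the full time-frequency picture.
print(amplitude[:5], frequency[:5])
```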
Prolonged sitting in a fixed or constrained position exposes aircraft passengers to long-term static loading of their bodies, which has deleterious effects on passengers’ comfort throughout the duration of the flight. Previous studies focused primarily on office and driving sitting postures; few, however, examined the sitting postures of aircraft passengers. Consequently, the aim of the present study was to detect and recognize the sitting postures of aircraft passengers in relation to sitting discomfort. A total of 24 subjects were recruited for the experiment, which lasted 2 h. A total of 489 sitting postures were extracted, and the pressure data between the subjects and the seat were collected during the experiment. After the detection of sitting postures, eight types of sitting postures were classified based on key parts (trunk, back, and legs) of the human body. Thereafter, the eight types of sitting postures were recognized from the seat-pan and backrest pressure data using several machine learning methods. The best classification rate of 89.26% was obtained from the support vector machine (SVM) with a radial basis function (RBF) kernel. The detection and recognition of the eight types of sitting postures of aircraft passengers in this study provide insight into aircraft passengers’ discomfort and seat design.
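As a hedged sketch of the classification step (with synthetic features and labels, not the study's pressure data), the snippet below trains an RBF-kernel SVM on pressure-derived features and reports hold-out accuracy:

```python
# Illustrative posture classification with an RBF-kernel SVM; the pressure-map
# features and posture labels below are synthetic stand-ins.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(4)
n = 489                                        # number of extracted postures in the study
X = rng.normal(size=(n, 16))                   # e.g. seat-pan / backrest pressure features
y = rng.integers(0, 8, n)                      # eight posture classes

X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y)

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10.0, gamma="scale"))
clf.fit(X_tr, y_tr)
print("accuracy:", round(accuracy_score(y_te, clf.predict(X_te)), 3))
```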
Researchers have paid much attention to the performance implications of authoritarian leadership. However, less effort has been devoted to exploring its ethical consequences at work. Drawing on the social cognitive theory of morality, this study explores the indirect relationship between authoritarian leadership and subordinates’ unethical pro-organizational behaviors (UPB) via displacement of responsibility. A vignette-based experimental study (Study 1) and a time-lagged field study (Study 2) were conducted to test our hypotheses. Consistent findings were accumulated for the indirect relationship between authoritarian leadership and UPB through displacement of responsibility (both Studies 1 and 2). Furthermore, this indirect relationship was stronger among employees with a low level of moral efficacy (Study 2). We conclude this study by discussing the theoretical and practical implications of these findings.
The epidemic of coronavirus disease 2019 (COVID-19) began in China and spread rapidly to many other countries. This study aimed to identify risk factors associated with delayed negative conversion of SARS-CoV-2 in COVID-19 patients. In this retrospective single-centre study, we included 169 consecutive patients with confirmed COVID-19 in Zhongnan Hospital of Wuhan University from 15th January to 2nd March. The cases were divided into two groups according to the median time of SARS-CoV-2 negative conversion, and the differences between groups were compared. In total, the 169 patients had a median virus negative conversion time of 18 days (interquartile range: 11–25) from symptom onset. Compared with the patients with short-term negative conversion, those with long-term conversion were older; had a higher incidence of comorbidities, chief complaints of cough and chest distress/shortness of breath, and more severe illness on admission; had higher levels of leucocytes, neutrophils, aspartate aminotransferase, creatine kinase and erythrocyte sedimentation rate (ESR) and lower levels of CD3+CD4+ lymphocytes and albumin; and were more likely to receive mechanical ventilation. In multivariate analysis, cough, leucocytes, neutrophils and ESR were positively correlated with delayed virus negative conversion, and CD3+CD4+ lymphocytes were negatively correlated. The integrated indicator of leucocytes, neutrophils and CD3+CD4+ lymphocytes showed good performance in predicting negative conversion within 2 weeks (area under the ROC curve (AUC) = 0.815), 3 weeks (AUC = 0.804), 4 weeks (AUC = 0.812) and 5 weeks (AUC = 0.786). In conclusion, longer quarantine periods might be more justified for COVID-19 patients with cough, higher levels of leucocytes, neutrophils and ESR and lower levels of CD3+CD4+ lymphocytes.
Currently there is no consensus regarding how long antipsychotic medication should be continued following a first/single psychotic episode. Clinically, patients often request discontinuation after a period of remission. This is one of the first double-blind randomized controlled studies designed to address the issue.
Methods:
Patients with DSM-IV schizophrenia and related psychoses (excluding substance induced psychosis) who remitted well following a first/single-episode, and had remained well on maintenance medication for one year, were randomized to receive either maintenance therapy with quetiapine (400 mg/day), or placebo for 12 months. Relapse was defined by the presence of (i) an increase in at least one of the following PANSS psychotic symptom items to a threshold score (delusion, hallucinatory behaviour, conceptual disorganization, unusual thought content, suspiciousness); (ii) CGI Severity of Illness 3 or above; and (iii) CGI Improvement 5 or above.
Results:
178 patients were randomized, and 144 (80.9%) completed the study. The relapse rate was 33.7% (30/89) for the maintenance group and 66.3% (59/89) for the placebo group (log-rank test, chi-square=13.328, p<0.001). Relapse was not related to age or gender. Other significant predictors of relapse included medication status, pre-morbid schizotypal traits, verbal memory and soft neurological signs.
Conclusions:
There is a substantial risk of relapse if medication is discontinued in remitted first-episode psychosis patients following one year of maintenance therapy. Conversely, 33.7% of patients who discontinued medication remained well.
Medication discontinuation in remitted single-episode patients after a period of maintenance therapy is a major clinical decision, and thus the identification of relapse risk factors, controlling for medication status, is important.
Methods:
Following a first/single episode of DSM-IV schizophrenia and related psychoses, remitted patients who had remained well on maintenance medication for at least one year were randomized to receive either maintenance therapy (quetiapine 400 mg/day) or placebo for 12 months.
Results:
178 patients were randomized. Relapse rates were 33.7% (30/89) in the maintenance group and 66.3% (59/89) in the placebo group. Potential predictors were initially identified in univariate Cox regression models (p<0.1) and were subsequently entered into a multivariate Cox regression model of relapse risk. Significant predictors included medication status (hazard ratio, 0.41; CI, 0.25–0.68; p=0.001); having more pre-morbid schizotypal traits (hazard ratio, 2.32; CI, 1.33–4.04; p=0.003); scoring lower on the logical memory test (hazard ratio, 0.94; CI, 0.9–0.99; p=0.028); and having more soft neurological signs (disinhibition) (hazard ratio, 1.33; CI, 1.02–1.74; p=0.039).
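As an illustration of the multivariate Cox step reported above (assumed data layout and invented column names, not the trial's code or data), the sketch below fits a proportional-hazards model with lifelines and prints hazard ratios:

```python
# Illustrative Cox proportional-hazards fit for relapse prediction; the data
# frame and column names are hypothetical stand-ins for the trial dataset.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(5)
n = 178                                        # randomized sample size in the abstract
df = pd.DataFrame({
    "months_to_relapse": rng.exponential(10, n).clip(0.5, 12),
    "relapse": rng.integers(0, 2, n),          # 1 = relapsed, 0 = censored at 12 months
    "placebo": rng.integers(0, 2, n),
    "schizotypal_traits": rng.normal(0, 1, n),
    "logical_memory": rng.normal(10, 3, n),
    "soft_signs_disinhibition": rng.normal(2, 1, n),
})

cph = CoxPHFitter()
cph.fit(df, duration_col="months_to_relapse", event_col="relapse")
cph.print_summary()                            # exp(coef) column gives hazard ratios
```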
Conclusions:
Relapse predictors may help to inform clinical decisions about discontinuation of maintenance therapy specifically for patients with a first/single episode psychosis following at least one year of maintenance therapy.
Acknowledgement:
We are grateful to Dr TJ Yao at the Clinical Trials Center, University of Hong Kong, for statistical advice. The study was supported by investigator initiated trial award from AstraZeneca and the Research Grants Council Hong Kong (Project number: 765505).
Heterogeneous photocatalytic oxidation currently has the potential to address environmental pollution and energy shortages. The key to this technology is to find and design efficient photocatalysts. Here, a series of inorganic coordination polymer quantum sheets (ICPQS)@graphene oxide (GO) composite photocatalysts are synthesized by adding GO to the synthesis process of the ICPQS {[Cu(II)(H₂O)₄][Cu(I)₄(CN)₆]}ₙ. These composite photocatalysts were characterized by X-ray diffraction, X-ray photoelectron spectroscopy, cyclic voltammetry, scanning electron microscopy, transmission electron microscopy, zeta potential, and N₂ adsorption/desorption isotherms. The photocatalytic degradation of methylene blue showed that the activity of the ICPQS@GO composite photocatalysts is better than that of ICPQS. Among the ICPQS@GO composite photocatalysts, the 10.6% ICPQS@GO composite photocatalyst has the best activity, reaching 3.3 mg/(L·min) at pH 3. This approach of loading photocatalysts with low specific surface area onto GO to improve photocatalytic performance points the way toward the synthesis of highly efficient photocatalysts.
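For context on how a degradation-rate figure such as 3.3 mg/(L·min) could be obtained, here is a hypothetical calculation assuming zero-order kinetics: a linear fit of methylene blue concentration against irradiation time, using invented measurements.

```python
# Hypothetical zero-order estimate of a photocatalytic degradation rate from
# concentration-vs-time data; the measurements below are invented.
import numpy as np

time_min = np.array([0, 2, 4, 6, 8, 10], dtype=float)      # irradiation time (min)
conc_mg_L = np.array([40.0, 33.5, 27.2, 20.4, 14.1, 7.6])   # methylene blue conc. (mg/L)

slope, intercept = np.polyfit(time_min, conc_mg_L, 1)
rate = -slope                                  # mg/(L·min), magnitude of the decline
print(f"degradation rate ≈ {rate:.2f} mg/(L·min)")
```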
Population-based colorectal cancer (CRC) screening programs that use a fecal immunochemical test (FIT) often face both a noncompliance issue and, for those FIT-positive subjects who do comply with confirmatory diagnosis, the subsequent waiting time (WT). We aimed to identify factors associated with both of these correlated problems in the same model.
Methods
A total of 294,469 subjects with either positive FIT results or a family history, collected from 2004 to 2013, were enrolled for analysis. We applied a hurdle Poisson regression model to accommodate the hurdle of compliance and the related WT for undergoing colonoscopy, while assessing factors responsible for the mixture of the two outcomes.
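As a rough two-part sketch of the hurdle idea (not the study's code; an exact hurdle model would use a zero-truncated Poisson for the second part, approximated here by a Poisson fit on the positive waiting times only), with synthetic data and invented covariate names:

```python
# Two-part approximation of a hurdle Poisson model: a logistic "hurdle" for
# compliance and a Poisson regression for waiting time among compliers.
# Data and covariates are synthetic.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(6)
n = 5000
urban = rng.integers(0, 2, n)
hospital_unit = rng.integers(0, 2, n)
X = sm.add_constant(np.column_stack([urban, hospital_unit]))

comply = rng.binomial(1, 1 / (1 + np.exp(-(0.5 + 0.4 * urban))))   # hurdle outcome
wt_weeks = rng.poisson(np.exp(1.0 + 0.2 * hospital_unit)) + 1      # WT if complying

hurdle = sm.Logit(comply, X).fit(disp=False)                        # who crosses the hurdle
pos = comply == 1
waiting = sm.GLM(wt_weeks[pos], X[pos], family=sm.families.Poisson()).fit()

print(hurdle.params)     # log-odds of complying with colonoscopy
print(waiting.params)    # log rate of waiting time (weeks) among compliers
```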
Results
The effects on compliance and WT varied with contextual factors, such as geographic area, type of screening unit, and level of urbanization. The hurdle score, representing the risk score associated with noncompliance, and the WT score, reflecting the rate of undergoing colonoscopy, were used to classify subjects into one of three groups representing the degree of compliance and the level of health awareness.
Conclusion
Our model was not only successfully applied to evaluate factors associated with compliance and the WT distribution, but was also developed into a useful assessment model for stratifying risk and predicting whether and when screenees comply with confirmatory diagnosis, given contextual factors and individual characteristics.
We provide a comparative case study of rehabilitation counselling across the U.S., Japan and Taiwan, focusing on the common challenges facing international constituents in the field. Through interviews with students, faculty and administrators from each of the respective countries, we use thematic coding analysis to identify key points of tension. Emergent themes comprise (a) systemic challenges, (b) student and faculty mobility, (c) cultural and linguistic differences and (d) lack of sustainable international leadership. We further discuss mitigation of these recurrent challenges and conclude that collaborative research, student exchange and institutional partnerships may advance the teaching, research and service scholarship of rehabilitation counselling programs and, in turn, enhance the lives of people with chronic illness and disability worldwide.
An effective multiplex real-time polymerase chain reaction (PCR) assay for the simultaneous detection of three major pathogens, Nosema bombycis Nägeli (Microsporidia: Nosematidae), Bombyx mori nucleopolyhedrovirus (Baculoviridae: genus Alphabaculovirus) (NPV), and Bombyx mori densovirus (Parvoviridae: genus Iteravirus) (DNV), in silkworms (Bombyx mori (Linnaeus); Lepidoptera: Bombycidae) was developed in this study. Polymerase chain reaction and real-time PCR tests and basic local alignment search tool searches revealed that the primers and probes used in this study had high specificities for their target species. The ability of each primer/probe set to detect pure pathogen DNA was determined using a plasmid dilution panel; under optimal conditions, the multiplex real-time PCR assay efficiently detected the three mixed target plasmids, with detection limits of 8.5×10³ copies for N. bombycis and Bombyx mori NPV (BmNPV) and 8.5×10⁴ copies for Bombyx mori DNV (BmDNV). When the ability to detect these three pathogens was examined in artificially inoculated silkworms, our method presented a number of advantages over traditional microscopy, including specificity, sensitivity, and high-throughput capabilities. Under the optimal volume ratio for the three primer/probe sets (N. bombycis:BmNPV:BmDNV = 3:2:2), the multiplex real-time PCR assay detected BmNPV and BmDNV as early as day 1 post inoculation using DNA templates of the three pathogens in various combinations from individually infected silkworms; N. bombycis could be detected as early as day 3 post inoculation using DNA isolated from the midgut of N. bombycis-infected silkworms.
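For readers unfamiliar with how copy-number limits like these are read off a plasmid dilution panel, here is a generic, hypothetical standard-curve calculation (not the study's data): quantification cycle (Cq) values from serial dilutions define a line Cq = slope·log10(copies) + intercept, from which amplification efficiency and unknown copy numbers follow.

```python
# Generic qPCR standard-curve quantification; all Cq values are invented.
import numpy as np

copies = np.array([8.5e7, 8.5e6, 8.5e5, 8.5e4, 8.5e3])  # plasmid dilution panel
cq = np.array([15.2, 18.6, 22.0, 25.5, 28.9])           # hypothetical quantification cycles

slope, intercept = np.polyfit(np.log10(copies), cq, 1)
efficiency = 10 ** (-1.0 / slope) - 1                    # ~1.0 means 100% efficiency

def copies_from_cq(cq_value):
    """Back-calculate the copy number of an unknown sample from its Cq."""
    return 10 ** ((cq_value - intercept) / slope)

print(f"slope = {slope:.2f}, efficiency = {efficiency:.2%}")
print(f"sample with Cq 24.0 ≈ {copies_from_cq(24.0):.2e} copies")
```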
To examine the vitamin D status, SNP of the vitamin D receptor gene (VDR) and the effects of vitamin D supplementation on parathyroid hormone and insulin secretion in adult males with obesity or normal weight in a subtropical Chinese city.
Design
An intervention trial.
Setting
Shenzhen City, Guangdong Province, China.
Subjects
From a cross-sectional survey conducted from June to July, eighty-two normal-weight and ninety-nine obese males (18–69 years) were screened to analyse their vitamin D status and for five SNP of VDR. From these individuals, in the same season of a different year, obese and normal-weight male volunteers (twenty-one per group) were included for an intervention trial with oral vitamin D supplementation at 1250 µg/week for 8 weeks.
Results
For the survey, there was no significant difference (P>0·05) in baseline circulating 25-hydroxyvitamin D concentrations or in the percentages of participants in different categories of vitamin D status between the two groups. The VDR SNP, rs3782905, was significantly associated with obesity (P=0·043), but none of the examined SNP were correlated with serum 25-hydroxyvitamin D when adjusted for age, BMI and study group. After vitamin D supplementation, serum 25-hydroxyvitamin D concentration, hypersecretions of parathyroid hormone and insulin, and insulin resistance in the obese were changed beneficially (P<0·05); however, the increase in serum 25-hydroxyvitamin D was less than that of the normal-weight men.
Conclusions
For obese and normal-weight men of subtropical China, the summer baseline vitamin D status was similar. However, oral vitamin D supplementation revealed a decreased bioavailability of vitamin D in obese men and ameliorated their hypersecretion of parathyroid hormone and insulin resistance.
By
Mildred Embree, Columbia University Medical Center,
Chang Hun Lee, Columbia University Medical Center,
Ziming Dong, Zhengzhou University,
Mo Chen, Columbia University Medical Center,
Kimi Kong, Columbia University Medical Center,
Hemin Nie, Columbia University Medical Center,
Avital Mendelson, Columbia University Medical Center,
Bhranti Shah, Columbia University Medical Center,
Shoko Cho, Columbia University Medical Center,
Takahiro Suzuki, Columbia University Medical Center,
Rujing Yang, Columbia University Medical Center,
Nan Jiang, Columbia University Medical Center,
Jeremy J. Mao, Columbia University Medical Center
Edited by
Peter X. Ma, University of Michigan, Ann Arbor
Introduction: stem/progenitor cell recruitment vs. transplantation
The utilization of transplanted stem cells in regenerative medicine has been studied extensively as a potential therapy to repair or replace tissues that are lost due to trauma, congenital deformities, tumor resections, or infectious diseases [1–3]. The current cell transplantation model in regenerative medicine is founded on the principle that the application of transplanted stem cells could repopulate and regenerate damaged or diseased tissues, with restored tissue functions and homeostasis. However, cell transplantation is faced with a multitude of clinical and cell culture complications including the complexity of the multistep surgical procedures, donor-site trauma, immune rejection for allogeneic and xenogeneic cells, cell phenotypic variations due to in-vitro culture techniques, potential tumorigenesis associated with long-term cell expansion, failure of exogenous cell engraftment, and uncertainties and difficulties in the regulatory approval process [4–8]. The difficulties in the clinical application of stem cell transplantation are in strong contrast to the results of multiple experimental studies that demonstrate different levels of efficacy of cell delivery in a number of disease models such as Parkinson’s disease [9, 10], blood cancers and diseases [11, 12], and muscle and spinal disorders/injuries [13, 14].
For a number of regenerative medicine applications, the use of stem cell transplantation might not be competitive with the cost-effectiveness of current clinical treatment modalities in the dental and musculoskeletal fields, including titanium joint replacements, dental implants, and operative dental procedures [15–17]. Alternatively, the concept of endogenous stem/progenitor cell recruitment in regenerative medicine is based on the idea that native stem/progenitor cells that already reside within mature tissue can be stimulated and functionally enhanced to repopulate, repair, and/or regenerate damaged tissues [18]. Relative to stem cell transplantation, the application of endogenous stem cell recruitment in regenerative medicine is still in its infancy. The combination of the use of biological factors, release technology, biomaterials, and bioengineered scaffolds to enhance endogenous stem cell recruitment seems very promising for potential use in translational regenerative medicine. However, further scientific experimentation is warranted, since many scientific questions concerning the mechanistic details remain unresolved and it will be necessary to validate the efficacy of this approach for clinical application.
We review the present status and future prospects of fast ignition (FI) research of the theoretical group at the IAPCM (Institute of Applied Physics and Computational Mathematics, Beijing) as a part of the inertial confinement fusion project. Since the approval of the FI project at the IAPCM, we have devoted our efforts to improving the integrated codes for FI and designing advanced targets together with the experimental group. Recent FI experiments [K. U. Akli et al., Phys. Rev. E 86, 065402 (2012)] showed that the petawatt laser beam energy was not efficiently converted into the compressed core because of the beam divergence of relativistic electron beams. The coupling efficiency can be improved in three ways: (1) using a cone–wire-in-shell advanced target to enhance the transport efficiency, (2) using external magnetic fields to collimate fast electrons, and (3) reducing the prepulse level of the petawatt laser beam. The integrated codes for FI, named ICFI, including a radiation hydrodynamic code, a particle-in-cell (PIC) simulation code, and a hybrid fluid–PIC code, have been developed to design this advanced target at the IAPCM. The Shenguang-II upgraded laser facility has been constructed for FI research; it consists of eight beams (in total $24~ {\rm kJ}/3\omega $, 3 ns) for implosion compression, and a heating laser beam (0.5–1 kJ, 3–5 ps) for generating the relativistic electron beam. A fully integrated FI experiment is scheduled for the 2014 project.