Bipolar disorder (BD) shows heterogeneous illness presentation both cross-sectionally and longitudinally. This phenotypic heterogeneity may reflect underlying genetic heterogeneity. At the same time, BD overlaps with other psychiatric illnesses at the clinical and biomarker levels, suggesting shared biological mechanisms. Addressing both issues in a single design, this study investigated whether phenotypically heterogeneous subtypes of BD have distinct polygenic bases shared with other psychiatric illnesses.
Methods
Six lifetime phenotype dimensions of BD identified in our previous study were used as target phenotypes. Associations between these phenotype dimensions and polygenic risk scores (PRSs) of major psychiatric illnesses from East Asian (EA) and other available populations were analyzed.
Results
Each phenotype dimension showed a different association pattern with PRSs of mental illnesses. PRS for EA schizophrenia showed a significant negative association with the cyclicity dimension (p = 0.044) but a significant positive association with the psychotic/irritable mania dimension (p = 0.001). PRS for EA major depressive disorder demonstrated a significant negative association with the elation dimension (p = 0.003) but a significant positive association with the comorbidity dimension (p = 0.028).
Conclusion
This study demonstrates that well-defined lifetime phenotype dimensions of BD carry distinct genetic risks shared with other major mental illnesses. This finding supports genetic heterogeneity within BD and suggests pleiotropy between BD subtypes and other psychiatric disorders. Further genomic analyses adopting deep phenotyping across mental illnesses in ancestrally diverse populations are warranted to clarify intra-diagnosis heterogeneity and inter-diagnosis commonality in psychiatry.
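In practical terms, the association analysis described in the Methods amounts to regressing each lifetime phenotype dimension on a standardized PRS while adjusting for ancestry and demographic covariates. A minimal sketch of one such test follows, assuming hypothetical file and column names (this is not the authors' code):

```python
# Hypothetical sketch: association between a schizophrenia PRS and the
# "cyclicity" phenotype dimension, adjusting for ancestry principal
# components, age, and sex. File and column names are assumptions.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("bd_phenotypes_prs.csv")  # hypothetical input

model = smf.ols(
    "cyclicity ~ scz_prs + PC1 + PC2 + PC3 + PC4 + age + sex",
    data=df,
).fit()
print(model.summary())  # the scz_prs coefficient is the association of interest
```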
This paper aims to un-suture common-sense assumptions based on Westphalian International Relations (IR) from South Korea’s non-essentialist and situated perspective, in the context of decolonising IR. Towards this end, the paper methodologically investigates a South Korean novel, A Grey Man, published in 1963 during South Korea’s early post-colonial period at the height of the Cold War. Using a non-Western novel to conduct a contrapuntal reading of Westphalian IR, this paper constructs a different type of worlding, conceptualising ‘the international’ through ‘the cultural’. It explores the following questions: How do ‘yellow negroes’ (the subject race) make sense of themselves and their roles and life-modes in a world defined for them by the white West (the master race)? How do yellow negroes understand and respond to the white West, which is hegemonic in world politics and history? In what ways does the protagonist of A Grey Man resist, engage with, and relate to the hegemonic West, which he has already internalised? In addressing these questions, the paper attempts to access different IR words to think with, such as race, white supremacy, intimacy without equality, sarcastic empathy, and disengagement. These provide an arena in which we can think otherwise, while un-suturing dominant Westphalian IR thinking.
The reading the mind in the eyes test (RMET) – which assesses the theory of mind component of social cognition – is often used to compare social cognition between patients with schizophrenia and healthy controls. There is, however, no systematic review integrating the results of these studies. We identified 198 studies published before July 2020 that administered RMET to patients with schizophrenia or healthy controls from three English-language and two Chinese-language databases. These studies included 41 separate samples of patients with schizophrenia (total n = 1836) and 197 separate samples of healthy controls (total n = 23 675). The pooled RMET score was 19.76 (95% CI 18.91–20.60) in patients and 25.53 (95% CI 25.19–25.87) in controls (z = 12.41, p < 0.001). After excluding small-sample outlier studies, this difference in RMET performance was greater in studies using non-English v. English versions of RMET (Chi [Q] = 8.54, p < 0.001). Meta-regression analyses found a negative association of age with RMET score and a positive association of years of schooling with RMET score in both patients and controls. A secondary meta-analysis using a spline construction of 180 healthy control samples identified a non-monotonic relationship between age and RMET score – RMET scores increased with age before 31 and decreased with age after 31. These results indicate that patients with schizophrenia have substantial deficits in theory of mind compared with healthy controls, supporting the construct validity of RMET as a measure of social cognition. The different results for English versus non-English versions of RMET and the non-monotonic relationship between age and RMET score highlight the importance of the language of administration of RMET and the possibility that the relationship of aging with theory of mind is different from the relationship of aging with other types of cognitive functioning.
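For readers unfamiliar with how pooled scores such as those above are obtained, a minimal sketch of inverse-variance pooling of sample means follows; it uses fixed-effect weighting and toy numbers, whereas the review itself may have used a random-effects model:

```python
# Toy sketch of inverse-variance pooling of mean RMET scores across samples.
import numpy as np

def pool_means(means, sds, ns):
    """Fixed-effect inverse-variance pooled mean with a 95% CI."""
    means, sds, ns = map(np.asarray, (means, sds, ns))
    var = sds**2 / ns              # variance of each sample mean
    w = 1.0 / var                  # inverse-variance weights
    pooled = np.sum(w * means) / np.sum(w)
    se = np.sqrt(1.0 / np.sum(w))
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)

# three hypothetical healthy-control samples (mean, SD, n)
print(pool_means([25.1, 26.0, 25.4], [3.9, 4.2, 4.0], [120, 85, 210]))
```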
Cancer is a life-changing experience, and side effects from treatment can make it difficult for survivors to return to their pre-cancer “normal life.” We explored the “new normal” and barriers to achieving it among lung cancer survivors who underwent surgery.
Methods
Semi-structured interviews were conducted with 32 recurrence-free non–small cell lung cancer survivors. We asked survivors how life had changed; how they defined the “new normal”; barriers that prevent them from achieving a “normal” life; and unmet needs or support for normalcy. Thematic analysis was performed.
Results
Survivors defined the “new normal” subjectively, depending on their expectations of recovery: (1) being able to do what they want without pain or discomfort; (2) being able to do the activities they could accomplish before surgery; and (3) being able to work, earn money, and support their family. We found that (1) persistent symptoms, (2) fear of cancer recurrence, (3) high expectations for recovery, and (4) psychosocial stress and feelings of guilt were barriers to achieving a “new normal.” Needs and supports for normalcy included information on expected recovery trajectories, postoperative management, and support from family and society.
Significance of results
Survivors defined the “new normal” differently, depending on their expectations for recovery. Informing survivors about the “new normal” can help them anticipate possible changes and set realistic goals for life after cancer. Health professionals should discuss expectations for “normality” with survivors from the beginning of treatment, and this communication should be part of comprehensive survivorship care.
This study examined changes in physical activity and their impact on exercise capacity and health-related quality of life over a 3-year span in patients with CHD.
Methods:
We evaluated 99 young patients with CHD, aged 13–18 years at baseline. Physical activity and health-related quality of life were assessed via questionnaires, and exercise capacity via peak oxygen uptake, at baseline and after 3 years; changes between the two time points were calculated and categorised into quartiles. Participants were also stratified by whether they achieved (active) or did not achieve (inactive) the recommended level of physical activity (≥150 minutes/week) at each of the two time points.
Results:
Although physical activity, exercise capacity, and health-related quality of life increased over the 3 years, these changes were not statistically significant (all p > 0.05). However, changes in physical activity were positively associated with changes in exercise capacity (ß = 0.250, p = 0.040) and health-related quality of life (ß = 0.380, p < 0.001). Patients with the largest increase in physical activity showed notable gains in exercise capacity (p < 0.001) and health-related quality of life (p < 0.001) compared with those with the largest decline. The active-inactive group showed a notable decline in exercise capacity compared with the active-active group, while the inactive-active group showed improvements in health-related quality of life.
Conclusions:
Over 3 years, increased physical activity was consistently linked to increases in exercise capacity and health-related quality of life in patients with CHD, highlighting the potential of physical activity augmentation as an intervention strategy.
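The change-score quartiles and activity-status groups in the Methods can be constructed in a few lines. A sketch under assumed column names (not the study's code):

```python
# Hypothetical sketch: 3-year change scores, quartile bins, and the
# active/inactive stratification (>=150 minutes/week) described above.
import pandas as pd

df = pd.read_csv("chd_cohort.csv")  # assumed file with baseline and 3-year measures
df["pa_change"] = df["pa_3yr"] - df["pa_baseline"]  # minutes/week
df["pa_change_quartile"] = pd.qcut(df["pa_change"], 4, labels=["Q1", "Q2", "Q3", "Q4"])
df["group"] = (
    df["pa_baseline"].ge(150).map({True: "active", False: "inactive"})
    + "-"
    + df["pa_3yr"].ge(150).map({True: "active", False: "inactive"})
)  # yields e.g. "active-inactive", as in the Results
```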
The “Fast track” protocol is an early extubation strategy intended to reduce ventilator-associated complications and promote early recovery after open-heart surgery. This study compared clinical outcomes between operating room extubation and ICU extubation after open-heart surgery in patients with CHD.
Methods:
We retrospectively reviewed 215 patients who underwent open-heart surgery for CHDs under the scheduled “Fast track” protocol between September 2016 and April 2022. The clinical endpoints were post-operative complications, including bleeding, respiratory and neurological complications, and hospital/ICU stays.
Results:
The patients were divided into operating room extubation (group O, n = 124) and ICU extubation (group I, n = 91) groups. The most frequently performed procedures were patch closures of atrial septal (107/215, 49.8%) and ventricular septal (89/215, 41.4%) defects. There were no significant differences in major post-operative complications or in ICU and hospital stay duration between the two groups; however, patients in group I had longer mechanical ventilatory support (0.0 vs. 59.0 min [interquartile range: 17.0–169.0], p < 0.001). Patients in group O had higher initial lactate levels (3.2 ± 1.7 vs. 2.5 ± 2.0 mg/dL, p = 0.007) and more frequently received additional sedatives and opioid analgesics (33.1% vs. 19.8%, p = 0.031).
Conclusions:
Extubation in the operating room was not beneficial for patients during post-operative ICU or hospital stay. Early extubation in the ICU resulted in more stable hemodynamics in the immediate post-operative period and required less use of sedatives and analgesics.
We conducted a replication of Shafir (1993), who showed that people are inconsistent in their preferences when faced with choosing versus rejecting decision-making scenarios. The effect was demonstrated using an enrichment paradigm: subjects chose between enriched and impoverished alternatives, with the enriched alternative having both more positive and more negative features than the impoverished one. Using eight different decision scenarios, Shafir found support for a compatibility principle: subjects chose and rejected enriched alternatives in the choose and reject decision scenarios, respectively (d = 0.32 [0.23, 0.40]), and indicated greater preference for the enriched alternative in the choice task than in the rejection task (d = 0.38 [0.29, 0.46]). In a preregistered very close replication of the original study (N = 1026), we found no consistent support for the hypotheses across the eight problems: two had similar effects, two had opposite effects, and four showed no effects (overall d = −0.01 [−0.06, 0.03]). Seeking alternative explanations, we tested an extension and found support for the accentuation hypothesis.
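As a reference for the effect sizes quoted above, a standard two-group Cohen's d with an approximate 95% CI can be computed as follows; this is a generic sketch, and the original and replication analyses may have used a different d variant:

```python
import numpy as np

def cohens_d(x, y):
    """Two-group Cohen's d (pooled SD) with an approximate 95% CI."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    nx, ny = len(x), len(y)
    pooled_sd = np.sqrt(
        ((nx - 1) * x.var(ddof=1) + (ny - 1) * y.var(ddof=1)) / (nx + ny - 2)
    )
    d = (x.mean() - y.mean()) / pooled_sd
    se = np.sqrt((nx + ny) / (nx * ny) + d**2 / (2 * (nx + ny)))  # approximate SE
    return d, (d - 1.96 * se, d + 1.96 * se)

# toy example with simulated preference scores for two conditions
rng = np.random.default_rng(0)
print(cohens_d(rng.normal(0.6, 1, 500), rng.normal(0.3, 1, 500)))
```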
In this review, we introduce our recent applications of deep learning to solar and space weather data. We have successfully applied novel deep learning methods to the following applications: (1) generation of solar farside/backside magnetograms and global field extrapolation based on them, (2) generation of solar UV/EUV images from other UV/EUV images and magnetograms, (3) denoising solar magnetograms using supervised learning, (4) generation of UV/EUV images and magnetograms from Galileo sunspot drawings, (5) improvement of global IRI TEC maps using IGS TEC maps, (6) one-day forecasting of global TEC maps through image translation, (7) generation of high-resolution magnetograms from Ca II K images, (8) super-resolution of solar magnetograms, (9) flare classification by CNNs and visual explanation by attribution methods, and (10) forecasting of GOES solar X-ray profiles. We present and discuss the major results. We also present future plans for integrated space weather models based on deep learning.
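Most of the generation tasks above are image-to-image translation problems. As a rough illustration of the model class involved (a pix2pix-style encoder-decoder generator), here is a minimal PyTorch sketch; the architectures in the cited applications are deeper and typically adversarially trained:

```python
# Generic encoder-decoder generator for image-to-image translation,
# e.g. EUV image -> magnetogram. Layer sizes are arbitrary illustrations.
import torch
import torch.nn as nn

class TranslationNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(1, 64, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
            nn.Conv2d(64, 128, 4, stride=2, padding=1),
            nn.BatchNorm2d(128), nn.LeakyReLU(0.2),
        )
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1),
            nn.BatchNorm2d(64), nn.ReLU(),
            nn.ConvTranspose2d(64, 1, 4, stride=2, padding=1), nn.Tanh(),
        )

    def forward(self, x):  # x: (batch, 1, H, W) normalized input image
        return self.decoder(self.encoder(x))

net = TranslationNet()
fake_magnetogram = net(torch.randn(2, 1, 256, 256))  # toy forward pass
```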
We investigated the effects of transcranial alternating current stimulation (tACS) in patients with insomnia. Nine patients with chronic insomnia underwent two in-laboratory polysomnography studies, 2 weeks apart, and were randomized to receive tACS during either the first or the second study. The stimulation was applied simultaneously and bilaterally at the F3/M1 and F4/M2 electrodes (0.75 mA, 0.75 Hz, 5 minutes). Sleep onset latency and wake after sleep onset decreased on the stimulation night, but the changes did not reach statistical significance; however, there were significant improvements in spontaneous and total arousals, sleep quality, quality of life, recall memory, sleep duration, sleep efficiency, and daytime sleepiness.
This study investigates how International Relations (IR) as an academic discipline emerged and evolved in South Korea, focusing on the country's peculiar colonial and postcolonial experiences. In the process, it examines why South Korean IR has been so state-centric and positivist (American-centric), while also disclosing the ways in which international history has shaped the current state of IR in South Korea, institutionally and intellectually. It is argued that IR intellectuals in South Korea have largely reflected the political arrangements of their time, rather than demonstrating academic independence or providing intellectual leadership for government and/or civil society, as they have navigated difficult power structures in world politics. Relatedly, the paper reveals South Korean IR's twisted postcoloniality: the absence – or weakness – of non-Western Japanese colonial legacies in its knowledge production system, alongside its embrace of the West/America as an ideal and better model of modernity for South Korea's security and development. It also shows that South Korean IR's recent quest to build a Korean School of IR and overcome its Western dependency appears to operate within a colonial mentality towards mainstream American IR.
Two aphid-transmitted RNA viruses, broad bean wilt virus 2 (BBWV2) and cucumber mosaic virus (CMV), are the most prevalent viruses in Korean pepper fields and cause chronic damage to pepper production. In this study, we employed a screening system for pathotype-specific resistance of pepper germplasm to BBWV2 and CMV by utilizing infectious cDNA clones of different pathotypes of the viruses (two BBWV2 strains and three CMV strains). We first examined the pathogenic characteristics of the BBWV2 and CMV strains in various plant species and their phylogenetic positions in the virus population structures. We then screened 34 commercial pepper cultivars and seven accessions for resistance. While 21 pepper cultivars were resistant to the CMV Fny strain, only two were resistant to the CMV P1 strain. We also found only one cultivar partially resistant to the BBWV2 RP1 strain. However, all tested commercial pepper cultivars were susceptible to the resistance-breaking CMV strain GTN (CMV-GTN) and the severe BBWV2 strain PAP1 (BBWV2-PAP1), suggesting that breeding new cultivars resistant to these strains is necessary. Fortunately, we identified several pepper accessions that were resistant or partially resistant to CMV-GTN, as well as one accession that remained symptomless despite systemic infection with BBWV2-PAP1. These genetic resources will be useful in pepper breeding programs for deploying resistance to BBWV2 and CMV.
This study was performed to improve the production efficiency of transgenic cloned pigs for xenotransplantation at the level of both the recipient pig and the donor nuclei. To generate transgenic pigs, human endothelial protein C receptor (hEPCR) and human thrombomodulin (hTM) genes were introduced using the F2A expression vector into GalT–/–/hCD55+ porcine neonatal ear fibroblasts used as donor cells, and the cloned embryos were transferred to sows and gilts. Cloned fetal kidney cells were also used as donor cells for recloning to increase production efficiency. Pregnancy and parturition rates after embryo transfer and preimplantation developmental competence were compared between cloned embryos derived from adult and fetal cells. Parturition rates were significantly higher for sows (50.0 vs. 4.1%), natural oestrus (20.8 vs. 0%), and ovulated ovaries (16.7 vs. 5.6%) than for gilts, induced oestrus, and non-ovulated ovaries, respectively (P < 0.05). When gilts were used as recipients, final parturitions occurred only in the fetal cell groups, and significantly higher blastocyst rates were observed (15.1% vs. 21.3%; P < 0.05). Additionally, gene expression levels related to pluripotency were significantly higher in the fetal cell group (P < 0.05). In conclusion, sows can be recommended as recipients owing to their higher efficiency in the generation of transgenic cloned pigs, and cloned fetal cells can also be recommended as donor cells through correct nuclear reprogramming.
No consensus exists on which cost form should be used in calculating return on investment (ROI) for conservation-related decisions: one accounting solely for explicit cost, or one accounting for both explicit and opportunity costs (relative opportunity cost). This research examines how measuring the cost of conservation investment with and without the opportunity cost of the protected area yields different solutions in a multi-objective optimization framework at the county level in the Central and Southern Appalachian Region of the USA. We maximize the rates of ROI for both forest-dependent biodiversity and the economic impact generated by forest-based payments for ecosystem services. We find that the conservation budget is optimally distributed more narrowly, among counties that are more likely to be rural, when the investment cost measure is relative opportunity cost than when it is explicit cost. We also find that the sacrifice in forest-dependent biodiversity per unit increase in economic impact is higher when investment cost is measured by relative opportunity cost than when it is measured by explicit cost. By understanding the consequences of using one cost measure over the other, a conservation agency can decide which cost measure is more appropriate for informing its decision-making process.
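To make the distinction between the two cost measures concrete, a toy calculation follows; the numbers are invented, and treating relative opportunity cost as explicit cost plus forgone land-use returns is one common operationalization (an assumption, not necessarily the paper's exact definition):

```python
# Toy contrast of ROI under the two cost measures discussed above.
explicit_cost = 1_000_000      # acquisition/management cost of a protected area ($)
opportunity_cost = 400_000     # forgone returns from alternative land use ($)
biodiversity_benefit = 120.0   # e.g. species-habitat units gained

roi_explicit = biodiversity_benefit / explicit_cost
roi_relative = biodiversity_benefit / (explicit_cost + opportunity_cost)
print(roi_explicit, roi_relative)  # county rankings can differ between the two
```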
To evaluate the impact of a vancomycin-resistant Enterococcus (VRE) screening policy change on the incidence of healthcare-associated (HA)-VRE bacteremia in an endemic hospital setting.
Design:
A quasi-experimental before-and-after study.
Setting:
A 1,989-bed tertiary-care referral center in Seoul, Republic of Korea.
Methods:
Since May 2010, our hospital has reduced VRE screening for admitted patients transferred from other healthcare facilities. We assessed the impact of this policy change on the incidence of HA-VRE bacteremia using segmented autoregression analysis of an interrupted time series from January 2006 to December 2014, at the hospital and unit levels. In addition, we compared the molecular characteristics of VRE blood isolates collected before and after the screening policy change using multilocus sequence typing and pulsed-field gel electrophoresis.
Results:
After the VRE screening policy change, the incidence of hospital-wide HA-VRE bacteremia increased, although no significant changes in level or slope were observed. In addition, a significant slope change in the incidence of HA-VRE bacteremia (change in slope, 0.007; 95% CI, 0.001–0.013; P = .02) was observed in the hemato-oncology department. Molecular analysis revealed that various VRE sequence types appeared after the policy change and that clonally related strains became more predominant (increasing from 26.1% to 59.3%).
Conclusions:
The incidence of HA-VRE bacteremia increased significantly after the VRE screening policy change, and this increase was driven mainly by high-risk patient populations. When planning VRE control programs in hospitals, approaches that account for patients' risk of severe VRE infection may be required.
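The segmented analysis in the Methods fits level-change and slope-change terms at the policy change date. A minimal sketch with simulated monthly data follows; the study used segmented autoregression, for which this plain-OLS version is only a simplified stand-in:

```python
# Simplified interrupted time series sketch with simulated monthly data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

months = np.arange(108)                  # Jan 2006 - Dec 2014
policy = (months >= 52).astype(int)      # policy change at May 2010 (month 52)
df = pd.DataFrame({
    "rate": np.random.poisson(5, 108) / 10.0,   # placeholder incidence data
    "time": months,
    "policy": policy,
    "time_after": np.where(policy == 1, months - 52, 0),
})
fit = smf.ols("rate ~ time + policy + time_after", data=df).fit()
print(fit.params)  # 'policy' = level change, 'time_after' = slope change
```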
The longitudinal relationship between depression and the risk of non-alcoholic fatty liver disease is uncertain. We examined: (a) the association between depressive symptoms and incident hepatic steatosis (HS), both with and without liver fibrosis; and (b) the influence of obesity on this association.
Methods
A cohort of 142 005 Korean adults with neither HS nor excessive alcohol consumption at baseline was followed for up to 8.9 years. Depressive symptoms were assessed at baseline with the validated Center for Epidemiologic Studies-Depression scale (CES-D), and subjects were categorised as non-depressed (CES-D < 8, reference) or depressed (CES-D ⩾ 16). HS was diagnosed by ultrasonography. Liver fibrosis was assessed by the fibrosis-4 index (FIB-4). Parametric proportional hazards models were used to estimate adjusted hazard ratios (aHRs) and 95% confidence intervals (CIs).
Results
During a median follow-up of 4.0 years, 27 810 people with incident HS and 134 with incident HS plus high FIB-4 were identified. Compared with the non-depressed category, the aHR (95% CIs) for incident HS was 1.24 (1.15–1.34) for CES-D ⩾ 16 among obese individuals, and 1.00 (0.95–1.05) for CES-D ⩾ 16 among non-obese individuals (p for interaction with obesity <0.001). The aHR (95% CIs) for developing HS plus high FIB-4 was 3.41 (1.33–8.74) for CES-D ⩾ 16 among obese individuals, and 1.22 (0.60–2.47) for CES-D ⩾ 16 among non-obese individuals (p for interaction = 0.201).
Conclusions
Depression was associated with an increased risk of incident HS and HS plus high probability of advanced fibrosis, especially among obese individuals.
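The hazard-ratio estimates above come from proportional hazards modelling of time to incident HS. As an illustrative stand-in (the paper used parametric proportional hazards models, whereas this sketch uses the semi-parametric Cox model from lifelines, with hypothetical numeric-coded columns):

```python
# Hypothetical sketch: adjusted hazard ratio for depression and incident HS.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("cohort.csv")  # assumed columns, all numeric-coded

cph = CoxPHFitter()
cph.fit(
    df[["followup_years", "incident_hs", "depressed", "age", "sex", "bmi"]],
    duration_col="followup_years",
    event_col="incident_hs",
)
cph.print_summary()  # exp(coef) for 'depressed' approximates the adjusted HR
```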
Background: We describe and evaluate an outbreak of carbapenem-resistant Klebsiella pneumoniae transmitted by contaminated duodenoscopes during endoscopic retrograde cholangiopancreatography (ERCP) procedures. Methods: An outbreak investigation was performed when Klebsiella pneumoniae carbapenemase-producing K. pneumoniae (KPC-KP) was identified in bile specimens from 4 patients. The investigation included medical record review, practice audits, and surveillance cultures of duodenoscopes and environmental sites. When available, clinical specimens were obtained from patients who had undergone ERCP in the previous 3 months. Carbapenem-resistant Enterobacteriaceae (CRE) screening cultures were performed to identify additional patients until no CRE cases were detected for 2 consecutive weeks. Pulsed-field gel electrophoresis (PFGE) of the KPC-KP isolates was performed. Results: In total, 12 cases with exposure to a duodenoscope were identified from February 2019 through April 2019, including 6 infections and 6 asymptomatic carriers. Case-control analysis indicated that 2 specific duodenoscopes were associated with the KPC-KP outbreak. Duodenoscope reprocessing procedures did not deviate from manufacturer recommendations. The outbreak was terminated after ethylene oxide (EO) gas sterilization. Conclusions: Meticulous cleaning protocols and enhanced surveillance are necessary to prevent outbreaks of CRE. Notably, enhanced measures such as sterilization of duodenoscopes may be required after procedures involving KPC-KP carriers.
Early insertion of a new central venous catheter (CVC) may pose a risk of persistent or recurrent infection in patients with a catheter-related bloodstream infection (CRBSI). We evaluated the clinical impact of early CVC reinsertion after catheter removal in patients with CRBSIs.
Methods:
We conducted a retrospective chart review of adult patients with confirmed CRBSIs in 2 tertiary-care hospitals over a 7-year period.
Results:
To treat their infections, 316 patients with CRBSIs underwent CVC removal. Among them, 130 (41.1%) underwent early CVC reinsertion (≤3 days after CVC removal), 39 (12.4%) underwent delayed reinsertion (>3 days), and 147 (46.5%) did not undergo CVC reinsertion. There were no differences in baseline characteristics among the 3 groups, except for nontunneled CVC, presence of septic shock, and reason for CVC reinsertion. The rate of persistent CRBSI in the early CVC reinsertion group (22.3%) was higher than that in the no CVC reinsertion group (7.5%; P = .002) but was similar to that in the delayed CVC reinsertion group (17.9%; P > .99). The other clinical outcomes did not differ among the 3 groups, including rates of 30-day mortality, complicated infection, and recurrence. After controlling for several confounding factors, early CVC reinsertion was not significantly associated with persistent CRBSI (OR, 1.59; P = .35) or 30-day mortality compared with delayed CVC reinsertion (OR, 0.81; P = .68).
Conclusions:
Early CVC reinsertion in the setting of CRBSI may be safe. Insertion of a new CVC should not be delayed in patients who still require one for ongoing management.
We report our experience with an emergency room (ER) shutdown related to an accidental exposure to a patient with coronavirus disease 2019 (COVID-19) who had not been isolated.
Setting:
A 635-bed, tertiary-care hospital in Daegu, South Korea.
Methods:
To prevent nosocomial transmission of the disease, we subsequently isolated patients with suggestive symptoms, relevant radiographic findings, or relevant epidemiologic history. Severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) reverse-transcriptase polymerase chain reaction (RT-PCR) assays were performed for most patients requiring hospitalization. A universal mask policy and comprehensive use of personal protective equipment (PPE) were implemented. We analyzed the effects of these interventions.
Results:
From the pre-shutdown period (February 10–25, 2020) to the post-shutdown period (February 28 to March 16, 2020), the mean test turnaround time decreased from 23:31 ± 6:43 hours to 9:27 ± 3:41 hours (P < .001). As a result, the proportion of patients tested increased from 5.8% (N = 1,037) to 64.6% (N = 690) (P < .001), and the average number of tests per day increased from 3.8 ± 4.3 to 24.7 ± 5.0 (P < .001). All 23 patients with COVID-19 in the post-shutdown period were isolated in the ER without any problematic accidental exposure or nosocomial transmission. After the shutdown, however, several metrics increased: the median ER stay among hospitalized patients rose from 4:30 hours (interquartile range [IQR], 2:17–9:48) to 14:33 hours (IQR, 6:55–24:50) (P < .001), the rate of intensive care unit admissions rose from 1.4% to 2.9% (P = .023), and mortality rose from 0.9% to 3.0% (P = .001).
Conclusions:
Problematic accidental exposures and nosocomial transmission of COVID-19 can be prevented through active isolation and surveillance policies and comprehensive PPE use, even with longer ER stays and more severely ill patients during a severe COVID-19 outbreak.