During 10 minutes of speaking, N95 respirators significantly decreased SARS-CoV-2 emissions compared with wearing no mask. However, SARS-CoV-2 was detected in the air of patients with high viral loads even when they wore N95 or surgical masks. Therefore, universal masking of both infected and uninfected persons is important for preventing airborne COVID-19 transmission.
Although attempts to apply virtual reality (VR) in mental healthcare are rapidly increasing, it remains unclear whether VR relaxation can reduce stress more than conventional biofeedback.
Methods:
Participants were 83 healthy adult volunteers with high stress, defined as a score of 20 or more on the Perceived Stress Scale-10 (PSS-10). This study used an open, randomized, crossover design with baseline, stress, and relaxation phases. During the stress phase, participants experienced an intentionally generated shaking VR environment and serial-7 subtraction. For the relaxation phase, participants were randomly assigned to either VR relaxation or biofeedback on day 1, and the other type of relaxation session was applied on day 2. We compared the State-Trait Anxiety Inventory-X1 (STAI-X1), STAI-X2, the Numeric Rating Scale (NRS), and physiological parameters, including heart rate variability (HRV) indices, between the stress and relaxation phases.
Results:
A total of 74 participants were included in the analyses. At baseline, the median age of participants was 39 years, STAI-X1 was 47.27 (SD = 9.92), and NRS was 55.51 (SD = 24.48). Both VR and biofeedback significantly decreased STAI-X1 and NRS scores from the stress phase to the relaxation phase, and the difference in effect between VR and biofeedback was not significant. However, there were significant differences in electromyography, the LF/HF ratio, LF total, and NN50 between VR relaxation and biofeedback.
Conclusion:
VR relaxation was effective in reducing subjectively reported stress in individuals with high stress.
The “Fast track” protocol is an early extubation strategy to reduce ventilator-associated complications and induce early recovery after open-heart surgery. This study compared clinical outcomes between operating room extubation and ICU extubation after open-heart surgery in patients with CHD.
Methods:
We retrospectively reviewed 215 patients who underwent open-heart surgery for CHDs under the scheduled “Fast track” protocol between September 2016 and April 2022. The clinical endpoints were post-operative complications, including bleeding, respiratory and neurological complications, and hospital/ICU stays.
Results:
The patients were divided into operating room extubation (group O, n = 124) and ICU extubation (group I, n = 91) groups. The most frequently performed procedures were patch closures of atrial septal (107/215, 49.8%) and ventricular septal (89/215, 41.4%) defects. There were no significant differences in major post-operative complications or in ICU and hospital stay duration between the two groups; however, patients in group I had longer mechanical ventilatory support (0.0 min vs. 59.0 min (interquartile range: 17.0–169.0), p < 0.001). Patients in group O showed higher initial lactate levels (3.2 ± 1.7 mg/dL versus 2.5 ± 2.0 mg/dL, p = 0.007) and more frequently required additional sedatives and opioid analgesics (33.1% versus 19.8%, p = 0.031).
Conclusions:
Extubation in the operating room was not beneficial for patients during post-operative ICU or hospital stay. Early extubation in the ICU resulted in more stable hemodynamics in the immediate post-operative period and required less use of sedatives and analgesics.
It has been suggested that psychosocial factors are related to the survival time of inpatients with cancer. However, few studies have examined the relationship between spiritual well-being (SWB) and survival time across countries. This study investigated the relationship between SWB and survival time in three East Asian countries.
Methods
This international multicenter cohort study is a secondary analysis involving newly admitted inpatients with advanced cancer in palliative care units in Japan, South Korea, and Taiwan. SWB was measured using the Integrated Palliative Outcome Scale (IPOS) at admission. We performed multivariate analysis using the Cox proportional hazards model to identify independent prognostic factors.
Results
A total of 2,638 patients treated at 37 palliative care units from January 2017 to September 2018 were analyzed. The median survival time was 18.0 days (95% confidence interval [CI] 16.5–19.5) in Japan, 23.0 days (95% CI 19.9–26.1) in Korea, and 15.0 days (95% CI 13.0–17.0) in Taiwan. SWB was a significant factor correlated with survival in Taiwan (hazard ratio [HR] 1.27; 95% CI 1.01–1.59; p = 0.04), but not in Japan (HR 1.10; 95% CI 1.00–1.22; p = 0.06) or Korea (HR 1.02; 95% CI 0.77–1.35; p = 0.89).
Significance of results
SWB on admission was associated with survival in patients with advanced cancer in Taiwan but not in Japan or Korea. The findings suggest a possible positive relationship between spiritual care and survival time in patients with far-advanced cancer.
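The median survival times reported above come from standard survival analysis. As a minimal illustration (using made-up follow-up data, not the study's), a Kaplan-Meier estimator and median-survival lookup can be sketched as:

```python
# Minimal Kaplan-Meier estimator and median survival time.
# The data below are hypothetical and not from the study.

def kaplan_meier(times, events):
    """times: follow-up in days; events: 1 = death observed, 0 = censored.
    Returns a list of (time, survival probability) steps at each death time."""
    pairs = sorted(zip(times, events))
    at_risk = len(pairs)
    surv = 1.0
    curve = []
    i = 0
    while i < len(pairs):
        t = pairs[i][0]
        deaths = 0
        n_at_t = 0
        # group all subjects sharing this follow-up time
        while i < len(pairs) and pairs[i][0] == t:
            deaths += pairs[i][1]
            n_at_t += 1
            i += 1
        if deaths:
            surv *= 1.0 - deaths / at_risk
            curve.append((t, surv))
        at_risk -= n_at_t
    return curve

def median_survival(curve):
    """First time at which the survival curve drops to 0.5 or below."""
    for t, s in curve:
        if s <= 0.5:
            return t
    return None  # median not reached within follow-up

times  = [5, 8, 12, 15, 18, 20, 23, 30, 41, 60]
events = [1, 1,  0,  1,  1,  1,  0,  1,  1,  0]
curve = kaplan_meier(times, events)
print(median_survival(curve))
```

The study itself additionally fitted a Cox proportional hazards model to obtain the hazard ratios for SWB; the sketch above covers only the descriptive median-survival step.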
Sarcopenic obesity is defined as the presence of high fat mass and low muscle mass combined with low physical function, and it is closely related to the onset of cardiovascular diseases (CVD). The existing anthropometric indices used in clinical practice as predictors of CVD may also be used to screen for sarcopenic obesity, but their feasibility remains unknown. Using cross-sectional data of 2031 participants aged 70–84 years (mean age, 75·9 ± 3·9 years; 49·2 % women) from the Korean Frailty and Aging Cohort Study, we analysed the association of anthropometric indices, including body mass index (BMI), waist circumference (WC), waist-to-height ratio (WHtR) and weight-adjusted waist index (WWI), with sarcopenic obesity. Body composition was measured using dual-energy X-ray absorptiometry. Higher WWI, WHtR and WC quartiles were associated with a higher risk of sarcopenic obesity; the odds ratio (OR) of sarcopenic obesity was highest in the fourth quartile of the WWI (OR: 10·99, 95 % CI: 4·92–24·85, P for trend < 0·001). WWI provided the best diagnostic power for sarcopenic obesity in men (area under the receiver operating characteristic curve: 0·781, 95 % CI: 0·751–0·837). No anthropometric indices were significantly associated with sarcopenic obesity in women. WWI was the only index negatively correlated with physical function in both men and women. WWI showed the strongest association with sarcopenic obesity, defined as high fat mass and low muscle mass combined with low physical function, but only in older men. No anthropometric indices were associated with sarcopenic obesity in older women.
There are growing concerns about the impact of the COVID-19 pandemic on the mental health of older adults. We examined the effect of the pandemic on the risk of depression in older adults.
Methods
We analyzed data from a prospective cohort study of Korean older adults, which has been followed up every 2 years. Among the 2308 participants who completed both the third and fourth follow-up assessments, 58.4% completed their fourth follow-up before the COVID-19 outbreak, and the rest completed it during the pandemic. We conducted face-to-face diagnostic interviews using the Mini International Neuropsychiatric Interview and administered the Geriatric Depression Scale. We performed generalized estimating equations and logistic regression analyses.
Results
The COVID-19 pandemic was associated with increased depressive symptoms in older adults [b (standard error) = 0.42 (0.20), p = 0.040] and a doubling of the risk of incident depressive disorder, even in euthymic older adults without a history of depression (odds ratio = 2.44, 95% confidence interval 1.18–5.02, p = 0.016). Reduced social activity, which was associated with the risk of depressive disorder before the pandemic, was not associated with that risk during the pandemic. However, fewer family gatherings, which were not associated with the risk of depressive disorder before the pandemic, were associated with a doubled risk of depressive disorder during the pandemic.
Conclusions
The COVID-19 pandemic significantly influenced the risk of late-life depression in the community. Older adults lacking family gatherings may be particularly vulnerable.
This study aimed to identify factors associated with divorce following breast cancer diagnosis and to measure the impact of divorce on patients' quality of life (QoL).
Methods
We used cross-sectional survey data collected at breast cancer outpatient clinics in South Korea from November 2018 to April 2019. Adult breast cancer survivors who had completed active treatment without any cancer recurrence at the time of the survey (N = 4,366) were included. The participants were classified into two groups, "maintaining marriage" and "being divorced," according to their marital status at cancer diagnosis and at the time of the survey. We performed logistic regression and linear regression analyses to identify factors associated with divorce after cancer diagnosis and to compare the QoL of divorced and non-divorced survivors.
Results
Approximately 11.1 per 1,000 married breast cancer survivors experienced divorce after their cancer diagnosis. Younger age, lower education, and being employed at diagnosis were associated with divorce. Divorced survivors had significantly lower QoL (coefficient [Coef] = −7.50; 95% CI = −13.63, −1.36), social functioning (Coef = −9.47; 95% CI = −16.36, −2.57), and body image (Coef = −8.34; 95% CI = −6.29, −0.39) than survivors who remained married. They also experienced more symptoms, including pain, insomnia, financial difficulties, and distress due to hair loss.
Conclusion
Identifying risk factors of divorce will ultimately help ascertain the resources necessary for early intervention.
The network approach has been applied to a wide variety of psychiatric disorders. The aim of the present study was to identify the network structures of remitters and non-remitters among patients with first-episode psychosis (FEP) at baseline and at the 6-month follow-up.
Methods
Participants (n = 252) from the Korean Early Psychosis Study (KEPS) were enrolled. They were classified as remitters or non-remitters using Andreasen's criteria. We estimated network structure with 10 symptoms (three symptoms from the Positive and Negative Syndrome Scale, one depressive symptom, and six symptoms related to schema and rumination) as nodes using a Gaussian graphical model. Global and local network metrics were compared within and between the networks over time.
Results
Global network metrics did not differ between the remitters and non-remitters at baseline or 6 months. However, the network structure and nodal strengths associated with positive-self and positive-others scores changed significantly in the remitters over time. Unique central symptoms for remitters and non-remitters were cognitive brooding and negative-self, respectively. The correlation stability coefficients for nodal strength were within the acceptable range.
Conclusion
Our findings indicate that network structure and some nodal strengths were more flexible in remitters. Negative-self could be an important target for therapeutic intervention.
Background: After the Middle East respiratory syndrome coronavirus outbreak in Korea in 2015, the government established an additional reimbursement for infection prevention to encourage infection control activities in hospitals. The new policy was announced in December 2015 and implemented in September 2016. We evaluated how infection control activities improved in hospitals after this change in government policy in Korea. Methods: Three cross-sectional surveys using the WHO Hand Hygiene Self-Assessment Framework (HHSAF) were conducted in 2013, 2015, and 2017. Using a multivariable linear regression model including hospital characteristics, we analyzed the changes in total HHSAF scores according to survey time. Results: In total, 32 hospitals participated in the survey in 2013, 52 in 2015, and 101 in 2017. The number of inpatient beds per infection control professional decreased from 324 in 2013 to 303 in 2015 and 179 in 2017. Most hospitals were at intermediate or advanced levels of progress (90.6% in 2013, 86.6% in 2015, and 94.1% in 2017). In the multivariable linear regression model, total HHSAF scores were significantly associated with hospital teaching status (β coefficient of major teaching hospital, 52.6; 95% CI, 8.9–96.4; P = .018), bed size (β coefficient of 100-bed increase, 5.1; 95% CI, 0.3–9.8; P = .038), and survey time (β coefficient of 2017 survey, 45.1; 95% CI, 19.3–70.9; P = .001). Conclusions: After the national policy was implemented, the number of infection control professionals increased, and the promotion of hand hygiene activities was strengthened in Korean hospitals.
Early replacement of a new central venous catheter (CVC) may pose a risk of persistent or recurrent infection in patients with a catheter-related bloodstream infection (CRBSI). We evaluated the clinical impact of early CVC reinsertion after catheter removal in patients with CRBSIs.
Methods:
We conducted a retrospective chart review of adult patients with confirmed CRBSIs in 2 tertiary-care hospitals over a 7-year period.
Results:
To treat their infections, 316 patients with CRBSIs underwent CVC removal. Among them, 130 (41.1%) underwent early CVC reinsertion (≤3 days after CVC removal), 39 (12.4%) underwent delayed reinsertion (>3 days), and 147 (46.5%) did not undergo CVC reinsertion. There were no differences in baseline characteristics among the 3 groups, except for nontunneled CVC, presence of septic shock, and reason for CVC reinsertion. The rate of persistent CRBSI in the early CVC reinsertion group (22.3%) was higher than that in the no CVC reinsertion group (7.5%; P = .002) but was similar to that in the delayed CVC reinsertion group (17.9%; P > .99). The other clinical outcomes did not differ among the 3 groups, including rates of 30-day mortality, complicated infection, and recurrence. After controlling for several confounding factors, early CVC reinsertion was not significantly associated with persistent CRBSI (OR, 1.59; P = .35) or 30-day mortality compared with delayed CVC reinsertion (OR, 0.81; P = .68).
Conclusions:
Early CVC reinsertion in the setting of CRBSI may be safe. Replacement with a new CVC should not be delayed in patients who still require a CVC for ongoing management.
Gosan-ri-type pottery (GTP) is a unique plant-fiber-tempered pottery from Korea that has only been found at Early Neolithic sites on Jeju Island. In this study, we conducted radiocarbon (14C) dating of one GTP sample and 10 charcoal samples collected in 2012 from archaeological structures in which GTP was found. The measurement conditions, the internal quality assurance test, and the reliability test indicate that each 14C date is highly reliable. However, the 14C dates of the charcoal samples were more accurate than that of the GTP sample, which was contaminated by younger humic acids. From the summary of all charcoal 14C dates using the KDE model, we conclude that GTP was manufactured and used throughout the period 9610–9490 cal BP (7670–7550 BC) at the 95.4% confidence level. This age corroborates the inference that GTP is the oldest known Korean Neolithic pottery.
Paraquat was the most successful nonselective herbicide in Korea due to its rapid herbicidal activity. However, its high mammalian toxicity, frequent self-poisoning incidents, and the lack of effective antidotes led to a paraquat ban in Korea in 2012. This review was therefore conducted to revisit the toxicological profile of paraquat and to investigate the impacts of the paraquat ban on human health and agriculture in Korea. A review of toxicological information reconfirmed that paraquat is highly acutely toxic to humans, and that ingestion, inhalation, or dermal administration of the herbicide can cause severe clinical signs and inevitably lead to death by respiratory failure. In Korea, the paraquat ban immediately decreased the rate of suicide by pesticides (mainly paraquat) by 46.1%, resulting in a 10% decrease in the total suicide rate. However, it also led to an increase in suicide attempts with other poisons such as carbon monoxide, suggesting that suicide attempts and rates of suicide by poisoning depend not only on the toxicity of the poison but also on the accessibility of the poisoning agents. In agriculture, paraquat was quickly replaced by other nonselective herbicides such as glufosinate and glyphosate. Thus, the paraquat ban did not have a significant impact on agricultural practices but did influence the nonselective herbicide market; the use of glufosinate was higher than that of glyphosate due to glufosinate's rapid herbicidal activity, which is similar to that of paraquat. Although the paraquat ban can be considered a national strategy to lower suicide rates, the increase in suicide attempts with other poisons suggests that multilateral efforts are required not only to keep suicidal agents away from people but also to minimize motives for suicide.
A lack of understanding of the effects of single- and multiple-weed interference on soybean yield has led to inadequate weed management in Primorsky Krai, resulting in much lower average yields than in neighboring regions. A 2 yr field experiment was conducted in a soybean field located in Bogatyrka (43.82°N, 131.6°E), Primorsky Krai, Russia, in 2013 and 2014 to investigate the effects of single- and multiple-weed interference by naturally established weeds on soybean yield and to model these effects. Aboveground dry weight was the trait most negatively affected by weed interference, followed by the number of pods and seeds. Soybean yield under single-weed interference was best described by a rectangular hyperbolic model, showing that common ragweed and barnyardgrass were the most competitive weed species, followed by annual sowthistle, American sloughgrass, and common lambsquarters. In the case of multiple-weed interference, soybean yield loss was accurately described by a multivariate rectangular hyperbolic model with total density equivalent as the independent variable. Parameter estimates indicated that weed-free soybean yields were similar in 2013 and 2014, estimated at 1.72 and 1.75 t ha−1, respectively, and that the competitiveness of each weed species did not differ significantly between the two years. Economic thresholds for single-weed interference were 0.74, 0.66, 1.15, 1.23, and 1.45 plants m−2 for common ragweed, barnyardgrass, annual sowthistle, American sloughgrass, and common lambsquarters, respectively. The economic threshold for multiple-weed interference was 0.70 density equivalent m−2. These results, including the model, can thus be applied to a decision support system for weed management in soybean cultivation under single- and multiple-weed interference in Primorsky Krai and neighboring regions of Russia.
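The rectangular hyperbolic yield-loss model referred to above, and the economic-threshold calculation it supports, can be sketched as follows. All parameter values here (I, A, crop price, control cost, efficacy) are illustrative assumptions, not the estimates fitted in the study; only the weed-free yield of 1.72 t ha−1 is taken from the abstract.

```python
# Sketch of a rectangular hyperbolic (Cousens-type) yield-loss model and a
# simple economic-threshold search. Parameter values are illustrative.

def yield_loss(density, I, A):
    """Percent yield loss at a given weed density (plants per m^2).
    I: initial slope (% loss per plant m^-2 at low density)
    A: asymptotic maximum yield loss (%)."""
    return I * density / (1.0 + I * density / A)

def economic_threshold(I, A, weed_free_yield, crop_price, control_cost, efficacy):
    """Smallest weed density at which the value of the yield saved by control
    equals the control cost, found by bisection on the net gain."""
    def net_gain(d):
        # yield saved (t/ha) if control removes `efficacy` of the loss
        saved = weed_free_yield * yield_loss(d, I, A) / 100.0 * efficacy
        return saved * crop_price - control_cost
    lo, hi = 0.0, 100.0
    for _ in range(60):  # net_gain is increasing in density
        mid = 0.5 * (lo + hi)
        if net_gain(mid) < 0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Hypothetical example: weed-free yield 1.72 t/ha (the 2013 estimate), with
# assumed I = 30 %, A = 60 %, price 400 USD/t, cost 25 USD/ha, 90% efficacy.
d_et = economic_threshold(I=30.0, A=60.0, weed_free_yield=1.72,
                          crop_price=400.0, control_cost=25.0, efficacy=0.9)
print(round(d_et, 2))
```

For multiple-weed interference, the study replaces the single-species density with a total density equivalent that weights each species by its competitiveness; the same hyperbolic form then applies to the combined variable.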
Personality may predispose family caregivers to experience caregiving differently in similar situations and may influence caregiving outcomes. Only a limited body of research has examined the role of personality traits in the health-related quality of life (HRQoL) of family caregivers of persons with dementia (PWD) in relation to burden and depression.
Methods:
Data from a large clinic-based national study in South Korea, the Caregivers of Alzheimer's Disease Research (CARE), were analyzed (N = 476). Path analysis was performed to explore the association between family caregivers’ personality traits and HRQoL. With depression and burden as mediating factors, direct and indirect associations between five personality traits and HRQoL of family caregivers were examined.
Results:
The results demonstrated the mediating role of caregiver burden and depression in linking two personality traits (neuroticism and extraversion) with HRQoL. Neuroticism and extraversion influenced the mental HRQoL of caregivers both directly and indirectly, but influenced their physical HRQoL only indirectly. Neuroticism increased caregiver depression, whereas extraversion decreased it. Only neuroticism was mediated by burden in its influence on depression and on mental and physical HRQoL.
Conclusions:
Personality traits can influence caregiving outcomes and can be viewed as an individual resource of the caregiver. A family caregiver's personality characteristics should be assessed to tailor support programs for optimal benefit from caregiver interventions.
Some clinical studies have reported reduced peripheral glial cell line-derived neurotrophic factor (GDNF) levels in elderly patients with major depressive disorder (MDD). We examined whether a reduction in plasma GDNF level was associated with MDD.
Method
Plasma GDNF level was measured in 23 healthy control subjects and 23 MDD patients before and after 6 weeks of treatment.
Results
Plasma GDNF levels in MDD patients at baseline did not differ from those in healthy controls, and did not change significantly from baseline to the end of treatment. GDNF levels were significantly lower in recurrent-episode MDD patients than in first-episode patients both before and after treatment.
Conclusions
Our findings revealed significantly lower plasma GDNF levels in recurrent-episode MDD patients, although plasma GDNF levels did not differ significantly between MDD patients and healthy controls overall. The discrepancy between our study and previous studies may arise from differences in the recurrence of depression or in the ages of the MDD patients.
Decreased hemoglobin levels increase the risk of developing dementia among the elderly. However, the mechanisms linking decreased hemoglobin levels to incident dementia remain unclear, possibly because few studies have reported on the relationship between low hemoglobin levels and neuroimaging markers. We therefore investigated the relationships between decreased hemoglobin levels, cerebral small-vessel disease (CSVD), and cortical atrophy in cognitively healthy women and men.
Methods:
Cognitively normal women (n = 1,022) and men (n = 1,018) who underwent medical check-ups and magnetic resonance imaging (MRI) were enrolled at a health promotion center. We measured hemoglobin levels, white matter hyperintensity (WMH) scales, lacunes, and microbleeds. Cortical thickness was measured automatically using surface-based methods. Multivariate regression analyses were performed after controlling for possible confounders.
Results:
Decreased hemoglobin levels were not associated with the presence of WMH, lacunes, or microbleeds in women or men. Among women, decreased hemoglobin levels were associated with decreased cortical thickness in the frontal (estimate −0.007; 95% confidence interval −0.013, −0.001), temporal (−0.010; −0.018, −0.002), parietal (−0.009; −0.015, −0.003), and occipital regions (−0.011; −0.019, −0.003). Among men, however, no associations were observed between hemoglobin levels and cortical thickness.
Conclusion:
Our findings suggest that decreased hemoglobin levels were associated with cortical atrophy, but not with increased CSVD, among women, although the association was modest. Given the paucity of modifiable risk factors for age-related cognitive decline, our results have important public health implications.
There is increasing evidence of a relationship between underweight or obesity and dementia risk. Several studies have investigated the relationship between body weight and brain atrophy, a pathological change preceding dementia, but their results are inconsistent. Therefore, we aimed to evaluate the relationship between body mass index (BMI) and cortical atrophy among cognitively normal participants.
Methods:
We recruited cognitively normal participants (n = 1,111) who underwent medical check-ups and detailed neurologic screening, including magnetic resonance imaging (MRI), at health screening visits between September 2008 and December 2011. The main outcome was cortical thickness measured using MRI. The numbers of men/women in the five BMI groups (underweight, normal, overweight, mild obesity, and moderate to severe obesity) were 9/9, 148/258, 185/128, 149/111, and 64/50, respectively. Linear and non-linear relationships between BMI and cortical thickness were examined using multiple linear regression analysis and generalized additive models after adjustment for potential confounders.
Results:
Among men, underweight participants showed significant cortical thinning in the frontal and temporal regions compared to normal-weight participants, while overweight and mildly obese participants had greater cortical thickness in the frontal region and in the frontal, temporal, and occipital regions, respectively. However, cortical thickness in each brain region did not differ significantly between the normal-weight and moderate to severe obesity groups. Among women, the association between BMI and cortical thickness was not statistically significant.
Conclusions:
Our findings suggest that underweight may be an important risk factor for pathological changes in the brain, whereas overweight or mild obesity may be inversely associated with cortical atrophy in cognitively normal elderly men.
Epidemiological studies have reported that higher education (HE) is associated with a reduced risk of incident Alzheimer's disease (AD). However, after the clinical onset of AD, patients with HE levels show more rapid cognitive decline than patients with lower education (LE) levels. Although education level and cognition have been linked, there have been few longitudinal studies investigating the relationship between education level and cortical decline in patients with AD. The aim of this study was to compare the topography of cortical atrophy longitudinally between AD patients with HE (HE-AD) and AD patients with LE (LE-AD).
Methods:
We prospectively recruited 36 patients with early-stage AD and 14 normal controls. The patients were classified into two groups according to educational level: 23 HE-AD (>9 years) and 13 LE-AD (≤9 years).
Results:
As AD progressed over the 5-year longitudinal follow-up, the HE-AD group showed a significant group-by-time interaction in the right dorsolateral frontal and precuneus regions and the left parahippocampal region compared to the LE-AD group.
Conclusion:
Our preliminary longitudinal findings reveal that HE accelerates cortical atrophy in AD patients over time, underlining the importance of education level in predicting prognosis.
Background: Holt–Oram syndrome is characterised by CHD and limb anomalies. Mutations in the TBX5 gene, encoding a T-box transcription factor, are responsible for the development of Holt–Oram syndrome, but such mutations are variably detected in 30–75% of patients. Methods: Eight clinically diagnosed Holt–Oram syndrome patients from six families were evaluated for clinical characteristics, focusing in particular on cardiac manifestations, and for molecular aetiologies. In addition to investigating TBX5 mutations, we also analyzed the SALL4, NKX2.5, and GATA4 genes, which are known to regulate cardiac development by physically and functionally interacting with TBX5. Multiple ligation-dependent probe amplification analysis was performed to detect exonic deletion and duplication mutations in these genes. Results: All included patients showed cardiac septal defects and upper-limb anomalies. Of the eight patients, seven underwent cardiac surgery, and four suffered from conduction abnormalities such as severe sinus bradycardia and complete atrioventricular block. Although our patients showed typical clinical findings of Holt–Oram syndrome, only three distinct TBX5 mutations were detected in three families: one nonsense, one splicing, and one missense mutation. No mutations were identified in the SALL4, NKX2.5, and GATA4 genes. Conclusions: All Holt–Oram syndrome patients in this study showed cardiac septal anomalies, and half of them showed TBX5 gene mutations. Understanding the genetic causes of inherited CHD such as Holt–Oram syndrome helps in caring for patients and their families. Further large-scale genomic research is required to identify the genes responsible for cardiac manifestations and genotype–phenotype relationships in Holt–Oram syndrome.