Climate change poses a major threat to marine ecosystems, and its effects are felt worldwide. One major effect of climate change on marine ecosystems is the rise in water temperature, which drives a northward expansion of the habitats of marine organisms. Herdmania momus, an ascidian (sea squirt) originally found in tropical and subtropical regions, has been introduced to the Korean Peninsula. In this study, we examined the habitat of H. momus along the southeastern coast of the Korean Peninsula between 2016 and 2022. H. momus settlements were observed across the entire survey area, with habitation confirmed in Busan in 2016, Ulsan in 2021, and Gyeongju (the northernmost location) in 2022. The observed habitation trend indicates a rapid geographical expansion, occurring approximately 79 years earlier than previously predicted, and demonstrates that marine organisms are expanding their ranges more rapidly than previously projected. These unexpected findings should inform government policies on proactive measures and strategies for managing the impact of climate change on marine ecosystems.
This study aimed to identify the roles of community pharmacists (CPs) during the coronavirus disease 2019 (COVID-19) pandemic, the differences in their role performance compared with their perceived importance, and limiting factors.
Methods:
A cross-sectional online survey of CPs was conducted. The CPs self-measured the importance and performance of each role during the pandemic using a 5-point Likert scale. A paired t-test was used to compare each role’s importance and performance scores. A logistic regression analysis of the roles with low performance scores, despite their level of importance, was conducted to determine the factors affecting performance. The limiting factors were also surveyed.
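The central comparison in this design, paired importance and performance ratings from the same respondents, can be sketched in a few lines of Python. The Likert values below are invented for illustration and are not the study's data:

```python
from scipy import stats

# Hypothetical 5-point Likert ratings; each position is one role rated
# by the same respondent, as (importance, performance) pairs.
importance = [5, 4, 5, 4, 3, 5, 4, 4]
performance = [3, 3, 4, 2, 3, 4, 3, 2]

# Paired t-test: tests whether the mean within-respondent difference
# (importance minus performance) differs from zero.
t_stat, p_value = stats.ttest_rel(importance, performance)
print(f"t = {t_stat:.3f}, p = {p_value:.4f}")
```

A significantly positive t statistic on such pairs corresponds to the study's finding that perceived importance exceeded performance.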
Results:
A total of 436 questionnaire responses were analyzed. The performance scores were significantly lower than the perceived importance scores for 15 of the 17 roles. The source and update frequency of COVID-19 information and participation in outreach pharmaceutical services were associated with low performance scores. Insufficient economic compensation, a lack of communication channels, and legal limitations were the factors limiting the performance of the CPs’ roles.
Conclusions:
Participation in outreach pharmaceutical services, economic compensation, and communication channels should be improved to motivate CPs to perform their roles.
In this observational study conducted in 2022, 12.3% of patients who shared a room with a patient positive for severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) also had a positive polymerase chain reaction (PCR) test, either at initial screening or during a 5-day quarantine. Therefore, screening and quarantine are still necessary within hospitals for close-contact inpatients during the SARS-CoV-2 omicron-variant-dominant period.
Age is a risk factor for numerous diseases. Although the development of modern medicine has greatly extended the human lifespan, the duration of relatively healthy old age, or ‘healthspan’, has not increased correspondingly. Targeting the detrimental processes that occur before the onset of age-related diseases can greatly improve health and lifespan. Healthspan is significantly affected by what, when and how much one eats. Dietary restriction, including calorie restriction, fasting and fasting-mimicking diets, has recently attracted much attention as a means to extend both lifespan and healthspan. However, direct scientific evidence that consuming specific foods extends lifespan and healthspan is lacking. Here, we synthesize the results of recent studies on the lifespan- and healthspan-extending properties of foods and their phytochemicals in various organisms to assess how far scientific research on the effects of food on lifespan has progressed.
Mood disorders require consistent management of symptoms to prevent recurrences of mood episodes. Circadian rhythm (CR) disruption is a key symptom of mood disorders that must be proactively managed to prevent mood episode recurrences. This study aims to predict impending mood episode recurrences using digital phenotypes related to CR obtained from wearable devices and smartphones.
Methods
This multicenter, nationwide, prospective, observational study enrolled patients with major depressive disorder, bipolar I disorder, and bipolar II disorder. A total of 495 patients were recruited from eight hospitals in South Korea. Patients were followed up for an average of 279.7 days (a total sample of 75 506 days) with wearable devices and smartphones, with clinical interviews conducted every 3 months. Algorithms predicting impending mood episodes were developed with machine learning. Algorithm-predicted mood episodes were then compared with those identified through face-to-face clinical interviews incorporating ecological momentary assessments of daily mood and energy.
Results
A total of 270 mood episodes recurred in 135 subjects during the follow-up period. The prediction accuracies for impending major depressive, manic, and hypomanic episodes over the next 3 days were 90.1%, 92.6%, and 93.0%, with area under the curve (AUC) values of 0.937, 0.957, and 0.963, respectively.
Conclusions
We predicted the onset of mood episode recurrences exclusively using digital phenotypes. Specifically, phenotypes indicating CR misalignment contributed the most to the prediction of episode recurrences. Our findings suggest that monitoring CR with digital devices can be useful in preventing and treating mood disorders.
The two key mechanisms affected by internet gaming disorder (IGD) are cognitive and reward processing. Despite their significance, little is known about the neurophysiological features of IGD as determined using resting-state electroencephalography (EEG) source functional connectivity (FC).
Methods
We compared resting-state EEG source FC within the default mode network (DMN) and reward/salience network (RSN) between patients with IGD and healthy controls (HCs) to identify neurophysiological markers associated with cognitive and reward processing. A total of 158 young male adults (79 patients with IGD and 79 HCs) were included, and the source FC of the DMN and RSN in five spectral bands (delta, theta, alpha, beta, and gamma) was assessed.
Results
Patients with IGD showed increased theta, alpha, and beta connectivity within the DMN between the orbitofrontal cortex and parietal regions compared with HCs. In terms of RSN, patients with IGD exhibited elevated alpha and beta connectivity between the anterior cingulate gyrus and temporal regions compared with HCs. Furthermore, patients with IGD showed negative correlations between the severity of IGD symptoms and/or weekly gaming time and theta and alpha connectivity within the DMN and theta, alpha, and beta connectivity within the RSN. However, the duration of IGD was not associated with EEG source FC.
Conclusions
Hyper-connectivities within the DMN and RSN may be considered potential state markers associated with symptom severity and gaming time in IGD.
Accurate prognostication is important for patients and their families to prepare for the end of life. Objective Prognostic Score (OPS) is an easy-to-use tool that does not require the clinicians’ prediction of survival (CPS), whereas Palliative Prognostic Score (PaP) needs CPS. Thus, inexperienced clinicians may hesitate to use PaP. We aimed to evaluate the accuracy of OPS compared with PaP in inpatients in palliative care units (PCUs) in three East Asian countries.
Method
This study was a secondary analysis of a cross-cultural, multicenter cohort study. We enrolled inpatients with far-advanced cancer in PCUs in Japan, Korea, and Taiwan from 2017 to 2018. We calculated the area under the receiver operating characteristics (AUROC) curve to compare the accuracy of OPS and PaP.
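The AUROC used here to compare OPS and PaP has a simple rank interpretation: it is the probability that a randomly chosen patient who experienced the outcome received a higher prognostic score than one who did not. A pure-Python sketch, with invented scores and outcomes rather than the study's data:

```python
def auroc(scores, outcomes):
    """AUROC as the probability that a randomly chosen positive case
    (e.g. death within 3 weeks) has a higher risk score than a randomly
    chosen negative case; ties count as 0.5."""
    pos = [s for s, y in zip(scores, outcomes) if y == 1]
    neg = [s for s, y in zip(scores, outcomes) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical prognostic scores and 3-week mortality labels.
scores = [8.5, 6.0, 7.5, 3.0, 5.5, 2.0, 4.0, 6.5]
deaths = [1,   1,   1,   0,   1,   0,   0,   0]
print(auroc(scores, deaths))  # -> 0.875
```

A value of 0.5 indicates no discrimination and 1.0 perfect discrimination, which is the scale on which the OPS and PaP results below should be read.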
Results
A total of 1,628 inpatients in 33 PCUs in Japan and Korea were analyzed. OPS and PaP were calculated in 71.7% of the Japanese patients and 80.0% of the Korean patients. In Taiwan, PaP was calculated for 81.6% of the patients. The AUROC for 3-week survival was 0.74 for OPS in Japan, 0.68 for OPS in Korea, 0.80 for PaP in Japan, and 0.73 for PaP in Korea. The AUROC for 30-day survival was 0.70 for OPS in Japan, 0.71 for OPS in Korea, 0.79 for PaP in Japan, and 0.74 for PaP in Korea.
Significance of results
Both OPS and PaP showed good performance in Japan and Korea. Compared with PaP, OPS could be more useful for inexperienced physicians who hesitate to estimate CPS.
Several studies supported the usefulness of “the surprise question” in terms of 1-year mortality of patients. “The surprise question” requires a “Yes” or “No” answer to the question “Would I be surprised if this patient died in [specific time frame].” However, the 1-year time frame is often too long for advanced cancer patients seen by palliative care personnel. “The surprise question” with shorter time frames is needed for decision making. We examined the accuracy of “the surprise question” for 7-day, 21-day, and 42-day survival in hospitalized patients admitted to palliative care units (PCUs).
Method
This was a prospective multicenter cohort study of 130 adult patients with advanced cancer admitted to 7 hospital-based PCUs in South Korea. The accuracy of “the surprise question” was compared with that of the temporal question for clinician's prediction of survival.
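The accuracy measures reported below derive from a two-by-two confusion table. A minimal sketch, using invented predictions and outcomes (not the study's data), where 1 means the clinician predicted death within the time frame:

```python
def diagnostic_accuracy(predicted, actual):
    """Sensitivity, specificity, and overall accuracy of a binary
    prediction (1 = death predicted, e.g. a 'No, I would not be
    surprised' answer) against the observed outcome (1 = died)."""
    tp = sum(1 for p, a in zip(predicted, actual) if p == 1 and a == 1)
    tn = sum(1 for p, a in zip(predicted, actual) if p == 0 and a == 0)
    fp = sum(1 for p, a in zip(predicted, actual) if p == 1 and a == 0)
    fn = sum(1 for p, a in zip(predicted, actual) if p == 0 and a == 1)
    sensitivity = tp / (tp + fn)   # deaths correctly predicted
    specificity = tn / (tn + fp)   # survivors correctly predicted
    accuracy = (tp + tn) / len(actual)
    return sensitivity, specificity, accuracy

# Invented 7-day predictions and outcomes for 10 patients.
predicted = [1, 0, 0, 1, 0, 0, 0, 0, 1, 0]
actual    = [1, 1, 0, 1, 0, 0, 0, 0, 0, 0]
sens, spec, acc = diagnostic_accuracy(predicted, actual)
```

High specificity with low sensitivity, the pattern reported below for the 7-day questions, means a positive answer is informative but many deaths go unpredicted.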
Results
We analyzed 130 inpatients who died in PCUs during the study period. The median survival was 21.0 days. The sensitivity, specificity, and overall accuracy for the 7-day “the surprise question” were 46.7, 88.7, and 83.9%, respectively. The sensitivity, specificity, and overall accuracy for the 7-day temporal question were 6.7, 98.3, and 87.7%, respectively. The c-indices of the 7-day “the surprise question” and 7-day temporal question were 0.662 (95% CI: 0.539–0.785) and 0.521 (95% CI: 0.464–0.579), respectively. The c-indices of the 42-day “the surprise question” and 42-day temporal question were 0.554 (95% CI: 0.509–0.599) and 0.616 (95% CI: 0.569–0.663), respectively.
Significance of results
Surprisingly, “the surprise questions” and temporal questions had similar accuracies. The high specificities for the 7-day “the surprise question” and 7- and 21-day temporal question suggest they may be useful to rule in death if positive.
We calculated the human resources required for an antimicrobial stewardship program (ASP) in Korean hospitals.
Design:
Multicenter retrospective study.
Setting:
Eight Korean hospitals ranging in size from 295 to 1,337 beds.
Methods:
The time required for performing ASP activities for all hospitalized patients under antibiotic therapy was estimated and converted into hours per week. The actual time spent on patient reviews of each ASP activity was measured with a small number of cases, then the total time was estimated by applying the determined times to a larger number of cases. Full-time equivalents (FTEs) were measured according to labor laws in Korea (52 hours per week).
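The conversion from observed review times to FTEs is straightforward arithmetic. In this sketch, only the 52-hour statutory work week and the 10-16 minutes-per-review range come from the study; the weekly review count and bed count are assumed for illustration:

```python
# Assumed illustrative figures (not the study's measurements):
minutes_per_review = 13   # within the study's observed 10-16 min range
reviews_per_week = 600    # assumed weekly number of ASP patient reviews
beds = 1000               # assumed hospital size

hours_per_week = minutes_per_review * reviews_per_week / 60
ftes = hours_per_week / 52          # 1 FTE = 52 working hours/week (Korean labor law)
ftes_per_100_beds = ftes / beds * 100
print(f"{ftes:.2f} FTEs total, {ftes_per_100_beds:.2f} FTEs per 100 beds")
```

The study's actual estimates, derived this way from its surveillance-day counts, are given in the Results below.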
Results:
In total, 225 cases were reviewed to measure time spent on patient reviews. The median time spent per patient review for ASP activities ranged from 10 to 16 minutes. The total time spent on the review for all hospitalized patients was estimated using the observed number of ASP activities for 1,534 patients who underwent antibiotic therapy on surveillance days. The most commonly observed ASP activity was ‘review of surgical prophylactic antibiotics’ (32.7%), followed by ‘appropriate antibiotics recommendations for patients with suspected infection without a proven site of infection but without causative pathogens’ (28.6%). The personnel requirement was calculated as 1.20 FTEs (interquartile range [IQR], 1.02–1.38) per 100 beds and 2.28 FTEs (IQR, 1.93–2.62) per 100 patients who underwent antibiotic therapy, respectively.
Conclusion:
The estimated personnel requirement for performing extensive ASP activities for all hospitalized patients undergoing antibiotic therapy in Korean hospitals was approximately 1.20 FTEs (IQR, 1.02–1.38) per 100 beds.
To date, there have been few studies on dietary supplement (DS) use among Korean children and adolescents using nationally representative data. This study aimed to investigate the current status of DS use and its related factors among Korean children and adolescents, using data from the Korean National Health and Nutrition Examination Survey (KNHANES).
Design:
A cross-sectional study.
Setting:
Data from the KNHANES 2015–2017. Participants completed 24-h dietary recall interviews, which included the DS products they consumed.
Participants:
The study population was 4380 children and adolescents aged 1–18 years.
Results:
More than 20 % of children and adolescents were using DS; use was highest among children aged 1–3 years and lowest among adolescents aged 16–18 years. The most frequently used DS was prebiotics/probiotics, followed by multivitamin/mineral supplements. Factors associated with DS use were lower birth weight in children aged <4 years; younger age, higher household income, regular breakfast intake and lower BMI in children aged 4–9 years; and regular breakfast intake and use of nutrition facts labels in adolescents aged 10–18 years. Feeding patterns in infancy and having chronic diseases were not associated with DS use.
Conclusions:
We report that over 20 % of children and adolescents use DS. Nutritional education for parents and children about proper DS consumption is needed.
Spirituality is what gives people meaning and purpose in life, and it has been recognized as a critical factor in patients’ well-being, particularly at the end of life. Studies have demonstrated relationships between spirituality and patient-reported outcomes such as quality of life and mental health. Although a number of studies have suggested that spiritual belief may be associated with mortality, the results are inconsistent. We aimed to determine whether spirituality was related to survival in advanced cancer inpatients in Korea.
Method
For this multicenter study, we recruited adult advanced cancer inpatients who had been admitted to seven palliative care units with estimated survival of <3 months. We measured spirituality at admission using the Korean version of the Functional Assessment of Chronic Illness Therapy-Spiritual Well-Being (FACIT-sp), which comprises two subscales: meaning/peace and faith. We calculated a Kaplan-Meier curve for spirituality, dichotomized at the predefined cutoffs and medians for the total scale and each of the two subscales, and performed univariate regression with a Cox proportional hazard model.
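The Kaplan-Meier estimator used here has a compact form: the survival probability drops at each observed death time by the fraction of at-risk patients who died, while censored patients leave the risk set without causing a drop. A pure-Python sketch with invented survival data, not the study's:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier estimate S(t) = product over event times t_i <= t of
    (1 - d_i / n_i), where d_i = deaths at t_i and n_i = number at risk.
    `events`: 1 = death observed, 0 = censored.
    Returns (time, survival) pairs at each death time."""
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    s, curve = 1.0, []
    for t in sorted(set(times)):
        deaths = sum(1 for tt, e in data if tt == t and e == 1)
        leaving = sum(1 for tt, _ in data if tt == t)
        if deaths:
            s *= 1 - deaths / n_at_risk
            curve.append((t, s))
        n_at_risk -= leaving
    return curve

# Invented follow-up times in days; 0 marks a censored patient.
times = [5, 8, 8, 12, 15, 20]
events = [1, 1, 0, 1, 0, 1]
curve = kaplan_meier(times, events)
```

Dichotomizing patients at a spirituality cutoff, as in the study, would yield one such curve per group for comparison.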
Result
We enrolled a total of 204 adults (mean age: 64.5 ± 13.0; 48.5% female) in the study. The most common primary cancer diagnoses were lung (21.6%), colorectal (18.6%), and liver/biliary tract (13.0%). Median survival was 19.5 days (95% confidence interval [CI95%]: 23.5, 30.6). Total FACIT-sp score was not related to survival time (hazard ratio [HR] = 0.981, CI95% = 0.957, 1.007), and neither were the scores for its two subscales, meaning/peace (HR = 0.969, CI95% = 0.932, 1.008) and faith (HR = 0.981, CI95% = 0.938, 1.026).
Significance of results
Spirituality was not related to survival in advanced cancer inpatients in Korea. Plausible mechanisms merit further investigation.
The Saemangeum tidal flat, an important staging site for migratory shorebirds that travel the East Asian-Australasian (EAA) Flyway, was isolated from the eastern Yellow Sea in 2006 as part of a large-scale reclamation project. To gain a better understanding of the impacts that this reclamation has had on the long-distance migratory shorebirds that use the EAA Flyway, we examined the number of shorebirds visiting Saemangeum and three adjacent sites in the Geum Estuary (Yubu Island, the Janghang coastline, and the Geum River Channel) during the spring and fall prior to, and after, completion of the reclamation (2004–2013). A total of 48 shorebird species, including one Critically Endangered, three Endangered, and nine Near Threatened species, were observed over this period. Peak numbers of shorebirds recorded at sites in Saemangeum and the Geum Estuary following completion of the project were 74% below those recorded in 2004 and 2005, the years prior to reclamation activity. In Saemangeum, shorebird abundance declined by approximately 95% and 97.3% during the northward and southward migrations, respectively, as a result of reclamation. Although shorebird populations in the Geum Estuary increased by 5% and 20% during the northward and southward migrations, respectively, these increases failed to offset the reduction in shorebird abundance in Saemangeum; overall, shorebird abundance at Saemangeum and the three adjacent sites in the Geum Estuary markedly declined over the reclamation period. Given the more favourable conditions of adjacent areas, sites in Saemangeum and the Geum Estuary no longer provide the habitat conditions necessary for long-distance migratory shorebirds.
In order to improve habitat for staging migratory birds, we suggest that measures such as the conversion of an abandoned salt farm for use as roosting sites, the construction of artificial barriers to prevent human disturbance, and re-opening of the river-banks to facilitate water flow be implemented.
Echinochloa species are among the most troublesome weeds in rice cultivation and grow in a broad range of habitats in Korea. Although various ecotypes of Echinochloa have been collected as germplasm for future studies, they have been difficult to classify because of their high level of morphological similarity. This study was therefore conducted to investigate the phylogenetic relationships among 77 Echinochloa accessions using 23 simple sequence repeat (SSR) markers and 24 morphological traits. Among the 77 Echinochloa accessions, which included 57 accessions from Korea and 5 reference species, late watergrass clearly clustered as a group distinct from barnyardgrass and other Echinochloa species. In this analysis, we also identified core genetic and morphological markers that can be used for the future identification and classification of Echinochloa species. Five of the 23 SSR markers produced distinctive bands that discriminate late watergrass from barnyardgrass and other Echinochloa species. Four morphological traits of the reproductive organs were the most influential contributors to classifying Echinochloa species. Although this study found no clear consensus between the SSR marker and morphological trait analyses, our results support the potential use of the selected SSR markers and morphological traits in future studies of Echinochloa.
Grammaticality judgment tests (GJTs) have been used to elicit data reflecting second language (L2) speakers’ knowledge of L2 grammar. However, the exact constructs measured by GJTs, whether primarily implicit or explicit knowledge, are disputed and have been argued to differ depending on test-related variables (i.e., time pressure and item grammaticality).
Using eye-tracking, this study replicates the GJT results in R. Ellis (2005). Twenty native and 40 nonnative English speakers judged sentences with and without time pressure. Analyses revealed that time pressure suppressed regressions (right-to-left eye movements) in nonnative speakers only. Conversely, both groups regressed more on untimed, grammatical items. These findings suggest that timed and untimed GJTs measure different constructs, which could correspond to implicit and explicit knowledge, respectively. In particular, they point to a difference in the levels of automatic and controlled processing involved in responding to the timed and untimed tests. Furthermore, untimed grammatical items may induce GJT-specific task effects.
Epidemiological studies have reported that higher education (HE) is associated with a reduced risk of incident Alzheimer's disease (AD). However, after the clinical onset of AD, patients with HE levels show more rapid cognitive decline than patients with lower education (LE) levels. Although education level and cognition have been linked, there have been few longitudinal studies investigating the relationship between education level and cortical decline in patients with AD. The aim of this study was to compare the topography of cortical atrophy longitudinally between AD patients with HE (HE-AD) and AD patients with LE (LE-AD).
Methods:
We prospectively recruited 36 patients with early-stage AD and 14 normal controls. The patients were classified into two groups according to educational level: 23 HE-AD (>9 years) and 13 LE-AD (≤9 years).
Results:
As AD progressed over the 5-year longitudinal follow-up, the HE-AD group showed a significant group-by-time interaction in the right dorsolateral frontal and precuneus regions and the left parahippocampal region compared with the LE-AD group.
Conclusion:
Our study provides preliminary longitudinal evidence that HE accelerates cortical atrophy in AD patients over time, which underlines the importance of education level for predicting prognosis.
This study examined changes in health-related quality of life (HRQoL) and quality of care (QoC) as perceived by terminally ill cancer patients and a stratified set of HRQoL or QoC factors that are most likely to influence survival at the end of life (EoL).
Method:
We administered questionnaires to 619 consecutive patients immediately after they were diagnosed with terminal cancer by physicians at 11 university hospitals and at the National Cancer Center in Korea. Subjects were followed up over 161.2 person-years until their deaths. We measured HRQoL using the core 30-item European Organization for Research and Treatment of Cancer Quality of Life Questionnaire, and QoC using the Quality Care Questionnaire–End of Life (QCQ–EoL). We evaluated changes in HRQoL and QoC issues during the first three months after enrollment, performing sensitivity analysis by using data generated via four methods (complete case analysis, available case analysis, the last observation carried forward, and multiple imputation).
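One of the sensitivity-analysis methods named above, last observation carried forward (LOCF), is simple to express in code: each missed assessment is filled with the patient's most recent observed value. A minimal sketch with invented scores, not the study's data:

```python
def locf(series):
    """Last observation carried forward: replace each missing value
    (None) with the most recent observed value; leading gaps stay None."""
    last, out = None, []
    for v in series:
        if v is not None:
            last = v
        out.append(last)
    return out

# Hypothetical serial HRQoL scores with missed assessments (None).
scores = [62, None, None, 55, None, 48]
print(locf(scores))  # -> [62, 62, 62, 55, 55, 48]
```

LOCF assumes scores stay flat after dropout, which tends to understate decline in a deteriorating population; that is why the study triangulates it against complete-case, available-case, and multiple-imputation analyses.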
Results:
Emotional and cognitive functioning decreased significantly over time, while dyspnea, constipation, and pain increased significantly. Dignity-conserving care, care by healthcare professionals, family relationships, and QCQ–EoL total score decreased significantly. Global QoL, appetite loss, and Eastern Cooperative Oncology Group Performance Status (ECOG–PS) scores were significantly associated with survival.
Significance of results:
Future standardization of palliative care should focus on the assessment of these deteriorating aspects of quality. Accurate estimates of the length of life remaining for terminally ill cancer patients, based on such EoL-influencing factors as global QoL, appetite loss, and ECOG–PS, are needed to help patients experience a dignified and comfortable death.