Depression is a complex mental health disorder with highly heterogeneous symptoms that vary across individuals and are shaped by factors such as sex and regional context. Network analysis provides a robust framework for evaluating the heterogeneity of depressive symptoms and identifying their potential clinical implications.
Objective:
To investigate sex-specific differences in the network structures of depressive symptoms in Asian patients diagnosed with depressive disorders, using data from the Research on Asian Psychotropic Prescription Patterns for Antidepressants, Phase 3, which was conducted in 2023.
Methods:
A network analysis of 10 depressive symptoms defined according to the National Institute for Health and Care Excellence guidelines was performed. The sex-specific differences in the network structures of the depressive symptoms were examined using the Network Comparison Test. Subgroup analysis of the sex-specific differences in the network structures was performed according to geographical region classifications, including East Asia, Southeast Asia, and South or West Asia.
Results:
A total of 998 men and 1,915 women with depression were analysed in this study. The analyses showed that all 10 depressive symptoms were grouped into a single cluster. Low self-confidence and loss of interest emerged as the most central nodes for men and women, respectively. The global strength invariance test also revealed a significant difference between the men's and women's networks. In the regional subgroup analysis, only East Asian men showed two distinct clustering patterns, and significant differences in global strength and network structure were observed only between East Asian men and women.
Conclusion:
The study highlights the sex-specific differences in depressive symptom networks across Asian countries. The results revealed that low self-confidence and loss of interest are the main symptoms of depression in Asian men and women, respectively. The network connections were more localised in men, whereas women showed a more diverse network. Among the Asian subgroups analysed, only East Asians exhibited significant differences in network structure. The considerable effects of neurovegetative symptoms in men may indicate potential neurobiological underpinnings of depression in the East Asian population.
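The Network Comparison Test used in this study is usually run in R (the NetworkComparisonTest package). As a rough illustration of the idea only, the sketch below estimates a partial-correlation symptom network per group and permutation-tests the difference in global strength; all names and data are illustrative, not from the study.

```python
import numpy as np

def partial_corr_network(data):
    """Estimate a partial-correlation network from an n_obs x n_vars matrix."""
    precision = np.linalg.inv(np.cov(data, rowvar=False))
    d = np.sqrt(np.diag(precision))
    pcor = -precision / np.outer(d, d)  # standardize the precision matrix
    np.fill_diagonal(pcor, 0.0)
    return pcor

def global_strength(network):
    """Sum of absolute edge weights, each edge counted once."""
    return np.abs(np.triu(network, k=1)).sum()

def strength_invariance_test(group_a, group_b, n_perm=1000, seed=0):
    """Permutation test of the global strength difference between networks
    estimated from two groups (the idea behind the NCT's global strength
    invariance test)."""
    rng = np.random.default_rng(seed)
    observed = abs(global_strength(partial_corr_network(group_a))
                   - global_strength(partial_corr_network(group_b)))
    pooled = np.vstack([group_a, group_b])
    n_a = len(group_a)
    exceed = 0
    for _ in range(n_perm):
        idx = rng.permutation(len(pooled))
        diff = abs(global_strength(partial_corr_network(pooled[idx[:n_a]]))
                   - global_strength(partial_corr_network(pooled[idx[n_a:]])))
        exceed += diff >= observed
    return observed, (exceed + 1) / (n_perm + 1)
```

In the actual NCT, networks are regularized (graphical lasso) and edge-level invariance is also tested; the sketch keeps only the permutation logic.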
Shape deformation during fossilization can prevent accurate reconstruction of an organism's form during life, hampering areas of paleontology ranging from functional morphology to systematics. Retrodeformation attempts to restore the original shape of deformed fossil specimens and requires an adequate knowledge of the deformation process. Although tectonic processes and retrodeformation are relatively well understood, research on quantifying the effect of compressive deformation on fossil morphology is scant. Here we investigate the factors that can cause changes in the shape of fossil specimens during compressive deformation. Three-dimensional (3D) models of trilobite cranidia/cephala are subjected to simulated deposition and compaction using rigid body simulation and scaling features of the open-source 3D software Blender. The variation in pitch and roll angle is lowest on flat surfaces, intermediate on tilted surfaces, and highest on irregular surfaces. These trends are reflected in the morphological differences captured by principal component scores in geometric morphometric analyses using landmarks. In addition, the different shapes of trilobite cranidia/cephala according to their systematic affinity influence the degree of angular variation, which in turn affects their posture—normal or inverted. Inverted cranidia/cephala show greater morphological variability than those with normal postures.
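The geometric morphometric step described above (landmark superimposition followed by principal component analysis) can be sketched as follows. This is a minimal illustration under simplifying assumptions (two-dimensional landmarks, a two-pass Procrustes superimposition), not the authors' actual pipeline.

```python
import numpy as np

def procrustes_align(shape, ref):
    """Remove translation, scale, and rotation from one landmark
    configuration so it best fits a reference (ordinary Procrustes)."""
    a = shape - shape.mean(axis=0)
    b = ref - ref.mean(axis=0)
    a = a / np.linalg.norm(a)
    b = b / np.linalg.norm(b)
    # optimal rotation is U @ Vt from the SVD of a.T @ b
    u, _, vt = np.linalg.svd(a.T @ b)
    return a @ (u @ vt)

def tangent_pca(shapes):
    """Two-pass generalized Procrustes superimposition followed by PCA of
    the aligned landmark coordinates. shapes: (n, landmarks, dims)."""
    aligned = np.array([procrustes_align(s, shapes[0]) for s in shapes])
    mean_shape = aligned.mean(axis=0)
    aligned = np.array([procrustes_align(s, mean_shape) for s in aligned])
    flat = aligned.reshape(len(shapes), -1)
    flat = flat - flat.mean(axis=0)
    _, _, vt = np.linalg.svd(flat, full_matrices=False)
    return flat @ vt.T  # principal component scores
```

Morphological variability of the simulated specimens can then be compared as the spread of these principal component scores between, say, normal and inverted postures.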
We investigated gender differences in psychosocial determinants that affect hand hygiene (HH) performance among physicians.
Design:
The survey included a structured questionnaire with 7 parts: self-assessment of HH execution rate; knowledge, attitude, and behavior regarding HH; internal and emotional motivation for better HH; barriers to HH; need for external reminders; preference for alcohol gel; and embarrassment due to supervision.
Setting:
The study was conducted across 4 academic referral hospitals in Korea.
Participants:
Physicians who worked at these hospitals were surveyed.
Methods:
The survey questionnaire was sent to 994 physicians of the hospitals in July 2018 via email or paper. Differences in psychosocial determinants of HH among physicians were analyzed by gender using an independent t test or the Fisher exact test.
Results:
Of the 994 physicians, 201 (20.2%) responded to the survey. Among them, 129 (63.5%) were men. Male physicians identified 4 barriers as significant: time wasted on HH (P = .034); HH is not a habit (P = .004); often forgetting about HH situations (P = .002); and no disadvantage when I do not perform HH (P = .005). Female physicians identified pain and dryness of the hands as a significant obstacle (P = .010), and they had a higher tendency to feel uncomfortable when a fellow employee performed inadequate HH (P = .098). Among the respondents, 26.6% identified diversifying the types of hand sanitizers as their first choice for overcoming barriers to improving HH, followed by providing reminders (15.6%) and soap and paper towels in each hospital room (13.0%).
Conclusion:
Significant differences existed between male and female physicians in the perceived barriers to HH. HH promotion activities that address these gender-specific barriers could help increase HH compliance.
Background: Evaluation of the adequacy of prophylactic antibiotics in surgery has been implemented as a national policy in Korea since August 2007, and the appropriate use of prophylactic antibiotics has improved. However, avoiding antibiotic prescriptions that are not recommended and discontinuing prophylactic antibiotics within 24 hours after surgery are still not well achieved. This study introduced a program to improve the adequacy of prophylactic antibiotics for surgery and analyzed its effects. Methods: We retrospectively analyzed the effectiveness of a program for appropriate prophylactic antibiotic use in surgery conducted at a university hospital in Seoul. The participants were patients aged ≥18 years who underwent any of 18 types of surgery. The program was implemented in June 2020. First, a computer system was used to confirm that the antibiotic recommended for each surgery was prescribed. The system also checked, in 4 steps, whether the permitted number of days of administration was exceeded, whether antibiotics were prescribed in combination, and whether antibiotics were prescribed as discharge medication. A pop-up window appeared in each patient record to enter the reason for the prescription; if the reason was appropriate, the prescription was allowed, but if not, the prescription was restricted. In addition, infectious diseases physicians and an insurance review team visited each department to conduct an education session. To analyze the effect of the program, we compared the 3 months before its implementation (January–March 2020) with the 3 months after (October–December 2020) with respect to the rate of first antibiotic administration within 1 hour before skin incision, the rate of recommended prophylactic antibiotic administration by surgery type, the rate of discontinuation of prophylactic antibiotics within 24 hours after surgery, and the rate of prophylactic antibiotic prescription at discharge.
Results: In total, 1,339 surgeries during the study period were included in the analysis: 695 cases before the introduction of the program and 644 cases after. The rate of first antibiotic administration within 1 hour before skin incision improved from 93.1% to 99.5% (P < .001), the rate of recommended prophylactic antibiotic administration from 85.0% to 99.2% (P < .001), and the rate of discontinuation of antibiotic administration within 24 hours after surgery from 51.8% to 98.3% (P < .001). The prescription rate of antibiotics at discharge decreased from 20.7% to 0.8% (P < .001) (Table 1). Conclusions: A computerized program to improve the adequacy of prophylactic antibiotic use in surgery, combined with education of medical staff, was very effective.
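Before/after rate comparisons like those above are typically tested with a Pearson chi-square test on a 2x2 table. A minimal sketch of that arithmetic (any counts in the usage comment would have to be reconstructed from the reported percentages and are illustrative only):

```python
def chi_square_2x2(a_events, a_total, b_events, b_total):
    """Pearson chi-square statistic for comparing two proportions,
    e.g. a compliance rate before vs. after an intervention."""
    table = [
        [a_events, a_total - a_events],
        [b_events, b_total - b_events],
    ]
    n = a_total + b_total
    chi2 = 0.0
    for i in range(2):
        for j in range(2):
            row = sum(table[i])
            col = table[0][j] + table[1][j]
            expected = row * col / n  # expected count under independence
            chi2 += (table[i][j] - expected) ** 2 / expected
    return chi2

# Usage (illustrative): compare 360/695 vs. 633/644 compliant cases;
# chi2 above 3.84 corresponds to P < .05 at 1 degree of freedom.
```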
Cognitive and reward processing are the two key mechanisms affected by internet gaming disorder (IGD). Despite their significance, little is known about the neurophysiological features of these mechanisms as determined using resting-state electroencephalography (EEG) source functional connectivity (FC).
Methods
We compared resting-state EEG source FC within the default mode network (DMN) and reward/salience network (RSN) between patients with IGD and healthy controls (HCs) to identify neurophysiological markers associated with cognitive and reward processing. A total of 158 young male adults (79 patients with IGD and 79 HCs) were included, and the source FC of the DMN and RSN in five spectral bands (delta, theta, alpha, beta, and gamma) was assessed.
Results
Patients with IGD showed increased theta, alpha, and beta connectivity within the DMN between the orbitofrontal cortex and parietal regions compared with HCs. Within the RSN, patients with IGD exhibited elevated alpha and beta connectivity between the anterior cingulate gyrus and temporal regions compared with HCs. Furthermore, in patients with IGD, the severity of IGD symptoms and/or weekly gaming time correlated negatively with theta and alpha connectivity within the DMN and with theta, alpha, and beta connectivity within the RSN. However, the duration of IGD was not associated with EEG source FC.
Conclusions
Hyper-connectivities within the DMN and RSN may be considered potential state markers associated with symptom severity and gaming time in IGD.
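Band-specific EEG connectivity of the kind analysed above is often quantified with phase-based measures. The sketch below computes a phase-locking value (PLV) between two source time series in a given band; it is a generic illustration, not the study's connectivity pipeline, and the sampling rate and band edges are assumptions.

```python
import numpy as np

def analytic_signal(x):
    """Analytic signal via an FFT-based Hilbert transform."""
    n = len(x)
    spectrum = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1
    if n % 2 == 0:
        h[n // 2] = 1
        h[1:n // 2] = 2
    else:
        h[1:(n + 1) // 2] = 2
    return np.fft.ifft(spectrum * h)

def bandpass(x, fs, lo, hi):
    """Crude zero-phase band-pass by zeroing FFT bins outside [lo, hi] Hz."""
    spectrum = np.fft.rfft(x)
    freqs = np.fft.rfftfreq(len(x), 1 / fs)
    spectrum[(freqs < lo) | (freqs > hi)] = 0
    return np.fft.irfft(spectrum, n=len(x))

def plv(x, y, fs, band):
    """Phase-locking value between two signals in a spectral band (0..1)."""
    phase_x = np.angle(analytic_signal(bandpass(x, fs, *band)))
    phase_y = np.angle(analytic_signal(bandpass(y, fs, *band)))
    return np.abs(np.mean(np.exp(1j * (phase_x - phase_y))))
```

A PLV near 1 indicates consistent phase coupling between the two sources in that band; connectivity matrices are built by evaluating it over all source pairs.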
Nosocomial transmission of COVID-19 among immunocompromised hosts can have a serious impact on COVID-19 severity, underlying disease progression, and SARS-CoV-2 transmission to other patients and healthcare workers within hospitals. We experienced a nosocomial outbreak of COVID-19 in a daycare unit for paediatric and young adult cancer patients. Between 9 and 18 November 2020, 473 individuals (181 patients, 247 caregivers/siblings, and 45 staff members) were exposed to the index case, a member of the nursing staff. Among them, three patients and four caregivers were infected. Two 5-year-old cancer patients with COVID-19 were not severely ill, but a 25-year-old cancer patient showed prolonged shedding of SARS-CoV-2 RNA for at least 12 weeks and probably infected his mother at home approximately 7–8 weeks after the initial diagnosis. Except for this case, no secondary transmission was observed from the confirmed cases in either the hospital or the community. In conclusion, in this daycare setting for immunocompromised children and young adults, the rate of in-hospital transmission of SARS-CoV-2 was 1.6% under a stringent infection prevention and control policy, including universal mask use and rapid, extensive contact investigation. Severely immunocompromised children and young adults with COVID-19 should be managed carefully after the mandatory isolation period, keeping in mind the possibility of prolonged shedding of live virus.
Accurate prognostication is important for patients and their families to prepare for the end of life. Objective Prognostic Score (OPS) is an easy-to-use tool that does not require the clinicians’ prediction of survival (CPS), whereas Palliative Prognostic Score (PaP) needs CPS. Thus, inexperienced clinicians may hesitate to use PaP. We aimed to evaluate the accuracy of OPS compared with PaP in inpatients in palliative care units (PCUs) in three East Asian countries.
Method
This study was a secondary analysis of a cross-cultural, multicenter cohort study. We enrolled inpatients with far-advanced cancer in PCUs in Japan, Korea, and Taiwan from 2017 to 2018. We calculated the area under the receiver operating characteristics (AUROC) curve to compare the accuracy of OPS and PaP.
Results
A total of 1,628 inpatients in 33 PCUs in Japan and Korea were analyzed. OPS and PaP were calculated in 71.7% of the Japanese patients and 80.0% of the Korean patients. In Taiwan, PaP was calculated for 81.6% of the patients. The AUROC for 3-week survival was 0.74 for OPS in Japan, 0.68 for OPS in Korea, 0.80 for PaP in Japan, and 0.73 for PaP in Korea. The AUROC for 30-day survival was 0.70 for OPS in Japan, 0.71 for OPS in Korea, 0.79 for PaP in Japan, and 0.74 for PaP in Korea.
Significance of results
Both OPS and PaP showed good performance in Japan and Korea. Compared with PaP, OPS could be more useful for inexperienced physicians who hesitate to estimate CPS.
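AUROC values like those reported above can be computed directly from prognostic scores and observed survival outcomes via the Mann-Whitney formulation, without a modeling library. A minimal sketch (variable names are illustrative; in practice higher OPS/PaP scores indicate worse prognosis):

```python
import numpy as np

def auroc(scores, outcomes):
    """AUROC via the Mann-Whitney U statistic: the probability that a
    randomly chosen positive case (outcome=1) scores higher than a
    randomly chosen negative case, with ties counting one half."""
    scores = np.asarray(scores, dtype=float)
    outcomes = np.asarray(outcomes, dtype=bool)
    pos = scores[outcomes]
    neg = scores[~outcomes]
    greater = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()
    return (greater + 0.5 * ties) / (len(pos) * len(neg))
```

An AUROC of 0.5 is chance-level discrimination; the 0.68–0.80 range reported above indicates moderate-to-good discrimination of 3-week survival.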
Several studies have supported the usefulness of “the surprise question” in predicting 1-year mortality. “The surprise question” requires a “Yes” or “No” answer to the question “Would I be surprised if this patient died in [specific time frame]?” However, the 1-year time frame is often too long for advanced cancer patients seen by palliative care personnel. “The surprise question” with shorter time frames is needed for decision making. We examined the accuracy of “the surprise question” for 7-day, 21-day, and 42-day survival in hospitalized patients admitted to palliative care units (PCUs).
Method
This was a prospective multicenter cohort study of 130 adult patients with advanced cancer admitted to 7 hospital-based PCUs in South Korea. The accuracy of “the surprise question” was compared with that of the temporal question for clinicians' prediction of survival.
Results
We analyzed 130 inpatients who died in PCUs during the study period. The median survival was 21.0 days. The sensitivity, specificity, and overall accuracy for the 7-day “the surprise question” were 46.7, 88.7, and 83.9%, respectively. The sensitivity, specificity, and overall accuracy for the 7-day temporal question were 6.7, 98.3, and 87.7%, respectively. The c-indices of the 7-day “the surprise question” and 7-day temporal question were 0.662 (95% CI: 0.539–0.785) and 0.521 (95% CI: 0.464–0.579), respectively. The c-indices of the 42-day “the surprise question” and 42-day temporal question were 0.554 (95% CI: 0.509–0.599) and 0.616 (95% CI: 0.569–0.663), respectively.
Significance of results
Surprisingly, “the surprise questions” and temporal questions had similar accuracies. The high specificities for the 7-day “the surprise question” and 7- and 21-day temporal question suggest they may be useful to rule in death if positive.
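The sensitivity, specificity, and overall accuracy figures reported above derive from a 2x2 table of predictions against observed deaths within the time frame. A minimal sketch of that arithmetic:

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, and overall accuracy from a 2x2 table:
    tp/fp = predicted death, died/survived; fn/tn = predicted survival,
    died/survived."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    return sensitivity, specificity, accuracy
```

A highly specific test produces few false positives, which is why a positive answer on such a question can be used to rule in death ("SpPIn" reasoning).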
We calculated the human resources required for an antimicrobial stewardship program (ASP) in Korean hospitals.
Design:
Multicenter retrospective study.
Setting:
Eight Korean hospitals ranging in size from 295 to 1,337 beds.
Methods:
The time required to perform ASP activities for all hospitalized patients under antibiotic therapy was estimated and converted into hours per week. The actual time spent on patient reviews for each ASP activity was measured in a small number of cases, and the total time was then estimated by applying these measured times to a larger number of cases. Full-time equivalents (FTEs) were calculated according to Korean labor law (52 hours per week).
Results:
In total, 225 cases were reviewed to measure time spent on patient reviews. The median time spent per patient review for ASP activities ranged from 10 to 16 minutes. The total time spent on reviews for all hospitalized patients was estimated using the observed number of ASP activities for 1,534 patients who underwent antibiotic therapy on surveillance days. The most commonly observed ASP activity was ‘review of surgical prophylactic antibiotics’ (32.7%), followed by ‘appropriate antibiotic recommendations for patients with suspected infection without a proven site of infection or causative pathogens’ (28.6%). The personnel requirement was calculated as 1.20 FTEs (interquartile range [IQR], 1.02–1.38) per 100 beds and 2.28 FTEs (IQR, 1.93–2.62) per 100 patients who underwent antibiotic therapy.
Conclusion:
The estimated personnel requirement for performing extensive ASP activities for all hospitalized patients undergoing antibiotic therapy in Korean hospitals was ~1.20 FTEs (IQR, 1.02–1.38) per 100 beds.
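The FTE arithmetic described above (minutes per review, scaled to the weekly number of reviews, divided by the 52-hour statutory work week) can be sketched as follows; the numbers in the usage comment are illustrative, not the study's data.

```python
def fte_required(minutes_per_review, reviews_per_week, hours_per_fte=52):
    """Convert review workload into full-time equivalents (FTEs),
    using the 52-hour statutory work week applied in the study."""
    hours_per_week = minutes_per_review * reviews_per_week / 60
    return hours_per_week / hours_per_fte

# Usage (illustrative): 13-minute reviews, 240 reviews per week
# -> 52 hours per week -> exactly 1.0 FTE under a 52-hour work week.
```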
Spirituality is what gives people meaning and purpose in life, and it has been recognized as a critical factor in patients’ well-being, particularly at the end of life. Studies have demonstrated relationships between spirituality and patient-reported outcomes such as quality of life and mental health. Although a number of studies have suggested that spiritual belief may be associated with mortality, the results are inconsistent. We aimed to determine whether spirituality was related to survival in advanced cancer inpatients in Korea.
Method
For this multicenter study, we recruited adult advanced cancer inpatients who had been admitted to seven palliative care units with estimated survival of <3 months. We measured spirituality at admission using the Korean version of the Functional Assessment of Chronic Illness Therapy-Spiritual Well-Being (FACIT-sp), which comprises two subscales: meaning/peace and faith. We calculated a Kaplan-Meier curve for spirituality, dichotomized at the predefined cutoffs and medians for the total scale and each of the two subscales, and performed univariate regression with a Cox proportional hazard model.
Result
We enrolled a total of 204 adults (mean age: 64.5 ± 13.0; 48.5% female) in the study. The most common primary cancer diagnoses were lung (21.6%), colorectal (18.6%), and liver/biliary tract (13.0%). Median survival was 19.5 days (95% confidence interval [CI95%]: 23.5, 30.6). Total FACIT-sp score was not related to survival time (hazard ratio [HR] = 0.981, CI95% = 0.957, 1.007), and neither were the scores for its two subscales, meaning/peace (HR = 0.969, CI95% = 0.932, 1.008) and faith (HR = 0.981, CI95% = 0.938, 1.026).
Significance of results
Spirituality was not related to survival in advanced cancer inpatients in Korea. Plausible mechanisms merit further investigation.
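The survival analysis above rests on the Kaplan-Meier estimator (with Cox regression supplying the hazard ratios). A minimal sketch of the Kaplan-Meier product-limit computation, with illustrative data:

```python
import numpy as np

def kaplan_meier(times, events):
    """Kaplan-Meier product-limit survival estimator.
    times: follow-up duration (e.g. days); events: 1 = death observed,
    0 = censored. Returns a list of (time, survival probability)."""
    times = np.asarray(times, dtype=float)
    events = np.asarray(events, dtype=int)
    order = np.argsort(times)
    times, events = times[order], events[order]
    at_risk = len(times)
    surv, curve = 1.0, []
    for t in np.unique(times):
        mask = times == t
        deaths = events[mask].sum()
        if deaths:
            surv *= 1 - deaths / at_risk  # multiply in this time's factor
        curve.append((float(t), surv))
        at_risk -= mask.sum()  # deaths and censorings leave the risk set
    return curve
```

Median survival is then the earliest time at which the curve drops to 0.5 or below; censored patients contribute to the risk set without forcing the curve down.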
New ionic conjugated polyelectrolyte complex films based on poly(3,4-ethylenedioxythiophene):sulfonated poly(diphenylacetylene) (PEDOT:SPDPA) are electrochemically formed on indium tin oxide (ITO) substrates using a potentiostatic method, and their physical properties are evaluated using various analytical tools. The surface morphology and electrochemically doped state of the films vary with the constant applied voltage, owing to the conformational structure associated with the oxidation state during PEDOT growth and the concomitant SPDPA doping state. To assess their use as a hole injection layer in organic light-emitting diodes, a well-known device configuration (ITO/PEDOT:SPDPA/TPD/Alq3/LiF/Al) is adopted to investigate the optoelectronic properties.
There is increasing evidence of a relationship between underweight or obesity and dementia risk. Several studies have investigated the relationship between body weight and brain atrophy, a pathological change preceding dementia, but their results are inconsistent. Therefore, we aimed to evaluate the relationship between body mass index (BMI) and cortical atrophy among cognitively normal participants.
Methods:
We recruited cognitively normal participants (n = 1,111) who underwent medical checkups and detailed neurologic screening, including magnetic resonance imaging (MRI), during health screening visits between September 2008 and December 2011. The main outcome was cortical thickness measured using MRI. The numbers of men/women in the five BMI groups (underweight, normal, overweight, mild obesity, and moderate-to-severe obesity) were 9/9, 148/258, 185/128, 149/111, and 64/50, respectively. Linear and non-linear relationships between BMI and cortical thickness were examined using multiple linear regression analysis and generalized additive models after adjustment for potential confounders.
Results:
Among men, underweight participants showed significant cortical thinning in the frontal and temporal regions compared with normal-weight participants, while overweight and mildly obese participants had greater cortical thickness in the frontal region and in the frontal, temporal, and occipital regions, respectively. However, cortical thickness in each brain region did not differ significantly between the normal-weight and moderate-to-severe obesity groups. Among women, the association between BMI and cortical thickness was not statistically significant.
Conclusions:
Our findings suggest that underweight may be an important risk factor for pathological changes in the brain, while overweight or mild obesity may be inversely associated with cortical atrophy in cognitively normal elderly men.
A 44-year-old man developed sudden non-fluent aphasia and right hemiplegia due to left striatocapsular infarction (Figure). Neurologic examination revealed gaze deviation to the right with eyes closed, but not with eyes open (Video). There was no spontaneous or gaze-evoked nystagmus, even after elimination of visual fixation. Leftward pursuit was impaired in a craniotopic frame of reference, and horizontal saccades were hypometric in both directions. Head impulse test was normal in the horizontal plane and there were no visual field defects. The contralesional gaze deviation with eye closure persisted for ten days.
Epidemiological studies have reported that higher education (HE) is associated with a reduced risk of incident Alzheimer's disease (AD). However, after the clinical onset of AD, patients with HE levels show more rapid cognitive decline than patients with lower education (LE) levels. Although education level and cognition have been linked, there have been few longitudinal studies investigating the relationship between education level and cortical decline in patients with AD. The aim of this study was to compare the topography of cortical atrophy longitudinally between AD patients with HE (HE-AD) and AD patients with LE (LE-AD).
Methods:
We prospectively recruited 36 patients with early-stage AD and 14 normal controls. The patients were classified into two groups according to educational level: 23 HE-AD (>9 years) and 13 LE-AD (≤9 years).
Results:
As AD progressed over the 5-year longitudinal follow-up, the HE-AD group showed a significant group-by-time interaction, relative to the LE-AD group, in the right dorsolateral frontal and precuneus regions and the left parahippocampal region.
Conclusion:
Our preliminary longitudinal study reveals that HE is associated with accelerated cortical atrophy in AD patients over time, underlining the importance of education level for predicting prognosis.
This study examined changes in health-related quality of life (HRQoL) and quality of care (QoC) as perceived by terminally ill cancer patients and a stratified set of HRQoL or QoC factors that are most likely to influence survival at the end of life (EoL).
Method:
We administered questionnaires to 619 consecutive patients immediately after they were diagnosed with terminal cancer by physicians at 11 university hospitals and at the National Cancer Center in Korea. Subjects were followed up over 161.2 person-years until their deaths. We measured HRQoL using the core 30-item European Organization for Research and Treatment of Cancer Quality of Life Questionnaire, and QoC using the Quality Care Questionnaire–End of Life (QCQ–EoL). We evaluated changes in HRQoL and QoC issues during the first three months after enrollment, performing sensitivity analysis by using data generated via four methods (complete case analysis, available case analysis, the last observation carried forward, and multiple imputation).
Results:
Emotional and cognitive functioning decreased significantly over time, while dyspnea, constipation, and pain increased significantly. Dignity-conserving care, care by healthcare professionals, family relationships, and QCQ–EoL total score decreased significantly. Global QoL, appetite loss, and Eastern Cooperative Oncology Group Performance Status (ECOG–PS) scores were significantly associated with survival.
Significance of results:
Future standardization of palliative care should focus on assessing the types of quality that deteriorate at the EoL. Accurate estimates of the length of life remaining for terminally ill cancer patients, based on such factors as global QoL, appetite loss, and ECOG–PS, are needed to help patients experience a dignified and comfortable death.
Social support programs for dementia caregivers are widely used to reduce care burden. We investigated which types of social support can reduce the psychological and non-psychological burdens of dementia caregivers, and explored the mechanisms of those social supports.
Methods:
We evaluated 731 community-dwelling dementia patients and their caregivers from the National Survey of Dementia Care in South Korea. We investigated the five types of social supports (emotional support, informational support, tangible support, positive social interaction, affectionate support) using the Medical Outcomes Study Social Support Survey in each caregiver. The mechanisms of specific types of social support on psychological/non-psychological burden were examined using path analysis.
Results:
Positive social interaction and affectionate support reduced psychological burden via direct and indirect paths. Tangible support reduced non-psychological burden via direct and indirect paths. Informational support and emotional support were not helpful in reducing psychological or non-psychological burden. Positive social interaction relieved a maximum of 20% of psychological burden, and affectionate support relieved 10.3%. Tangible support was associated with a maximal improvement of 15.1% in non-psychological burden.
Conclusions:
To reduce caregiver burden in dementia effectively, psychosocial interventions should be tailored to the type of caregiver burden being targeted.
The present study aimed to assess the adequacy of Ca intake and major food sources of Ca in Korean children and adolescents.
Design
A cross-sectional study.
Setting
Data from the Korean National Health and Nutrition Examination Survey (KNHANES) 2007–2010. We analysed the daily Ca intake, major food sources of Ca and the prevalence of inadequate Ca intake in the study population. Ca intake was categorized as inadequate when the participant's daily Ca intake was less than the Estimated Average Requirement.
Subjects
The study population consisted of 7233 children and adolescents (3973 boys, 3260 girls; aged 1–18 years).
Results
Mean Ca intake was 510·2 mg/d in boys and 431·7 mg/d in girls. Overall, 75·0 % of adolescents (boys 71·6 %, girls 79·1 %) had inadequate Ca intake. The prevalence of inadequate Ca intake increased significantly from toddlers (45–55 %) to adolescents (78–86 %) in both genders. The highest ranked food sources for Ca were dairy products (35·0 %), followed by vegetables (17·3 %), grains (11·3 %) and seafood (9·9 %). Ca intake from dairy products decreased significantly from 57 % in toddlers to 30 % in adolescents, while Ca intakes from other foods increased with age.
Conclusions
Inadequate Ca intake is highly prevalent and increases with age among Korean children and adolescents. Children and adolescents should be encouraged to eat more Ca-rich foods to meet their Ca needs.
The surgical approaches previously reported for facial nerve decompression have focussed on achieving good exposure of the lateral or superior aspects of the geniculate ganglion. This report aims to describe a unique case of facial nerve decompression beneath the geniculate ganglion.
Patient:
A 30-year-old woman with right-sided facial palsy due to a temporal bone fracture.
Intervention:
Bony fragments at the base of the geniculate ganglion were removed via a trans-tensor tympani approach with extended posterior tympanotomy.
Results:
The patient's facial movement recovered successfully, without complications such as sensorineural or conductive hearing loss.
Conclusion:
In rare cases requiring decompression of the facial nerve inferior to the perigeniculate area, the trans-tensor tympani approach should be considered a valuable alternative when surgical intervention is indicated.