Background: Data on antimicrobial use at the national level are crucial for establishing domestic antimicrobial stewardship policies and enabling medical institutions to benchmark against each other. This study aimed to analyze antimicrobial use in Korean hospitals. Methods: We investigated the antimicrobials prescribed in Korean hospitals between 2018 and 2021, using data from the Health Insurance Review and Assessment. Primary care hospitals (PCHs), secondary care hospitals (SCHs), and tertiary care hospitals (TCHs) were included in this analysis. Antimicrobials were categorized according to the Korea National Antimicrobial Use Analysis System (KONAS) classification, which is suitable for measuring antimicrobial use in Korean hospitals. Results: Of more than 1,900 hospitals, PCHs and TCHs represented the largest and smallest proportions of hospitals, respectively. The most frequently prescribed antimicrobial in 2021 was piperacillin/β-lactamase inhibitor (9.3%) in TCHs, ceftriaxone (11.0%) in SCHs, and cefazedone (18.9%) in PCHs. Between 2018 and 2021, the most used antimicrobial class according to the KONAS classification was ‘broad-spectrum antibacterial agents predominantly used for community-acquired infections’ in TCHs and SCHs, and ‘narrow-spectrum beta-lactam agents’ in PCHs. Total consumption of antimicrobials decreased from 951.7 to 929.9 days of therapy (DOT)/1,000 patient-days in TCHs and from 817.8 to 752.2 DOT/1,000 patient-days in SCHs during the study period, but not in PCHs (from 504.3 to 527.2 DOT/1,000 patient-days). Moreover, by 2021, while use of reserve antimicrobials had decreased from 13.6 to 10.7 DOT/1,000 patient-days in TCHs and from 4.6 to 3.3 DOT/1,000 patient-days in SCHs, it had increased from 0.7 to 0.8 DOT/1,000 patient-days in PCHs. Conclusion: This study confirms that antimicrobial use differs by hospital type in Korea.
Recent increases in the use of antimicrobials, including reserve antimicrobials, in PCHs reflect challenges that must be addressed.
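The consumption metric used throughout the abstract, days of therapy (DOT) per 1,000 patient-days, is a simple normalization. A minimal sketch of the arithmetic (the counts below are hypothetical, chosen only to match the scale of the reported figures):

```python
def dot_per_1000_patient_days(days_of_therapy: float, patient_days: float) -> float:
    """Normalize antimicrobial consumption per 1,000 patient-days."""
    return days_of_therapy / patient_days * 1000

# Hypothetical counts: 9,299 antimicrobial-days over 10,000 patient-days,
# giving a rate on the scale of the 2021 TCH figure (929.9)
rate = dot_per_1000_patient_days(9299, 10000)
```

Normalizing by patient-days rather than admissions is what makes figures comparable across hospitals of different sizes and lengths of stay.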
This study aimed to assess the actual burden of antibiotic use among end-of-life (EOL) patients in South Korea and to compare trends between cancer and non-cancer decedents.
Design:
Population-based mortality follow-back study.
Setting:
Data from the Korean National Health Insurance Database, covering the period from January 1, 2006, to December 31, 2018, provided for research by the National Health Insurance Service (NHIS), were used.
Participants:
All decedents from 2006 to 2018 were included and categorized as cancer decedents or non-cancer decedents.
Methods:
Annual antibiotic consumption rates and prescription rates were calculated, and Poisson regression was used to estimate their trends.
Results:
Overall antibiotic consumption rates decreased slightly among decedents in their final month, with a less pronounced annual decrease among cancer decedents than among non-cancer decedents (0.4% vs 2.3% per year, P < .001). Over the study period, although narrow-spectrum antibiotics were used less, utilization and prescription of broad-spectrum antibiotics steadily increased, and prescription rates were higher in cancer decedents than in non-cancer controls. Specifically, carbapenem prescription rates increased from 5.6% to 18.5% (RR 1.087, 95% CI 1.085–1.088, P < .001) in cancer decedents and from 2.9% to 13.2% (RR 1.115, 95% CI 1.113–1.116, P < .001) in non-cancer decedents.
Conclusions:
Our findings show that patients at the EOL, especially those with cancer, are increasingly and highly exposed to broad-spectrum antibiotics. Measures of antibiotic stewardship are required among this population.
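The per-year rate ratios above come from Poisson regression over all study years. As a rough sanity check only, a two-point geometric annualization of the carbapenem prescription rates gives a figure of the same order; this crude shortcut is not the study's method:

```python
import math

def annualized_rate_ratio(rate_start: float, rate_end: float, years: int) -> float:
    """Crude per-year rate ratio assuming constant geometric growth between two points."""
    return math.exp(math.log(rate_end / rate_start) / years)

# Cancer decedents: carbapenem prescription rate 5.6% (2006) -> 18.5% (2018)
rr = annualized_rate_ratio(0.056, 0.185, 12)
# rr is roughly 1.10 per year, the same order as the reported Poisson RR of 1.087
```

The regression estimate differs slightly because it uses every yearly data point, not only the endpoints.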
Changes in lifestyle factors are known to affect mood. However, there is insufficient evidence on the association of smoking, alcohol consumption, and physical activity with depression in middle-aged women, who are likely to experience rapid hormonal changes.
Methods:
We used a nationwide database of medical records in South Korea. In total, 901,721 premenopausal and 943,710 postmenopausal women aged 40 years or older were included in this study. Information on smoking, alcohol consumption, and physical activity was obtained from health examination data, and participants were followed up for the occurrence of depression using claims data.
Results:
Compared with never-smokers, ex-smokers and current smokers among both premenopausal and postmenopausal women showed an increased risk of depression in a dose-dependent manner (aHR 1.13 for ex-smokers; aHR 1.23 for current smokers). Compared with non-drinkers, mild drinkers showed a decreased risk of depression (aHR 0.98 for premenopausal women; aHR 0.95 for postmenopausal women), whereas heavy drinkers showed an increased risk among both premenopausal (aHR 1.20) and postmenopausal women (aHR 1.05). The risk of depression associated with smoking and heavy alcohol consumption was higher in premenopausal than in postmenopausal women. Those who engaged in regular physical activity showed a decreased risk of depression compared with those who did not, among both premenopausal (aHR 0.96) and postmenopausal women (aHR 0.95).
Conclusions:
Smoking and heavy alcohol consumption increased the risk of depression, and the increase was more prominent in premenopausal than in postmenopausal women. Regular physical activity decreased the risk of depression in both premenopausal and postmenopausal women.
Vascular abnormalities have frequently been reported in elderly adults as a potential risk factor for late-life depression. However, it remains unclear whether stenosis of the cerebral arteries increases the risk of depression in the elderly.
Methods:
Study participants were 365 patients aged 65 years or older with depressive disorder who had undergone brain MRI and magnetic resonance angiography (MRA), assessed by trained radiologists, as well as the 15-item Geriatric Depression Scale (GDS-15), the Mini-Mental State Examination (MMSE), and blood glucose and lipid profiling.
Results:
Of the 365 subjects, 108 (29.59%) had stenosis in at least one cerebral artery. Stenosis was associated with age, marital status, infarction, and atherosclerosis. In multivariable linear regression analysis of different locations of stenosis in the whole sample, only bilateral middle cerebral artery (MCA) stenosis was significantly associated with a higher GDS-15 score (p = 0.0138) and with a GDS-15 score above 8 (p = 0.0045); no significant associations were found for the anterior cerebral artery (ACA), posterior cerebral artery (PCA), or internal carotid artery (ICA). In multivariable logistic regression analysis of different locations among patients with at least one cerebral artery stenosis, left MCA stenosis was significantly related to higher GDS-15 scores (p = 0.0202), whereas right MCA stenosis was not.
Conclusion:
MCA stenosis is significantly associated with severity of depression in elderly adults with cerebral artery stenosis, especially in those with left MCA stenosis.
Key points
Cerebral artery stenosis detected on brain magnetic resonance angiography (MRA) was associated with higher depression severity.
Stenosis of both left and right middle cerebral artery (MCA) was associated with greater depression severity, with left MCA stenosis having a greater influence on depression severity than right MCA stenosis.
Higher depression severity in patients with MCA stenosis suggests that depression in elderly patients is mediated, at least in part, by vascular pathology in MCA-supplied regions. Careful investigation and management of cerebral artery stenosis and its risk factors may help reduce the severity of depression in elderly patients who visit psychiatrists.
Although people who have attempted suicide tend to repeat suicide attempts, there is a lack of evidence on the association between psychiatric service factors and suicide reattempt in this population.
Methods:
We used a nationwide, population-based medical record database of South Korea to investigate the use of psychiatric services before and after the index suicide attempt, and the association of psychiatric service factors after the index suicide attempt with the risk of suicide reattempt.
Results:
Among 5,874 people who had attempted suicide, all-cause mortality within 3 months after the attempt was 11.6%. Of all subjects who attempted suicide, 30.6% had used psychiatric services within 6 months before the attempt, and 43.7% had used psychiatric services within 3 months after it. Among individuals who had visited clinics following the attempted suicide, the cumulative incidence of suicide reattempt over a mean follow-up of 5.1 years was 3.4%. About half of suicide reattempts occurred within 1 year after the index attempt. Referral to psychiatric services within 7 days was associated with a decreased risk of suicide reattempt (adjusted hazard ratio, 0.51; 95% confidence interval, 0.29–0.89).
Conclusion:
An early psychiatric referral within 1 week after a suicide attempt was associated with a decreased risk of suicide reattempt.
South Korea has introduced conditionality into coverage decisions for certain difficult or high-risk procedures. Transcatheter aortic valve implantation (TAVI) was included in coverage with evidence development (CED) in 2014. This study reviewed the results of the reassessment of TAVI using real-world data (RWD) and discusses its implications.
Methods
Healthcare providers authorized to use promising technologies are required to collect RWD for suitability evaluation, safety monitoring, and cost-effectiveness assessment, a process that differs from general reassessment. In 2021, 45 healthcare providers collected clinical information on TAVI patients. Their registries were linked with national health insurance claims, which provided data on 19 items for assessing safety and effectiveness, such as overall mortality, reoperation rates, hospital readmission rates, and degree of functional improvement.
Results
According to the Society of Thoracic Surgeons’ predicted risk of mortality (STS), 988 TAVI patients were classified into three groups: high (STS >8 percent, n=347), intermediate (STS 4-8 percent, n=272), and low (STS <4 percent, n=369). We compared main outcomes and estimated survival probabilities between subgroups. Within 30 days, overall mortality rates were 4.9 percent (high), 2.6 percent (intermediate), and 1.4 percent (low); major bleeding rates were 7.6 percent (high), 6.2 percent (intermediate), and 1.4 percent (low); and the incidence of new atrial fibrillation was 6.8 percent (high), 4.2 percent (intermediate), and 3.2 percent (low). Based on the quantitative results using RWD and a systematic review of safety and effectiveness, TAVI was reported to have essential benefit for the high-risk group and elderly patients (>80 years), whereas the intermediate- and low-risk groups have out-of-pocket payment rates of 50 percent and 80 percent, respectively.
Conclusions
The reassessment system based on RWD accumulation enabled evidence-based evaluation of TAVI. Building on the transition to CED for essential benefits, a systematic framework such as RWD collection from treatment commencement should be introduced to broaden RWD use in managing the benefits of medical technologies with uncertain levels of evidence, thereby helping to ensure overall quality of care and effective health coverage.
Background: Although small- and medium-sized hospitals comprise most healthcare providers in South Korea, data on antibiotic usage in these facilities are limited. We evaluated the pattern of antibiotic usage and its appropriateness in hospitals with <400 beds in South Korea. Methods: A multicenter retrospective study was conducted in 10 hospitals (6 long-term care hospitals, 3 acute-care hospitals, and 1 orthopedic hospital) with <400 beds in South Korea. We analyzed patterns of antibiotic prescription and their appropriateness in the participating hospitals. Data on monthly antibiotic prescriptions and patient days for hospitalized patients were collected from each hospital's electronic database. To avoid the effect of the COVID-19 pandemic, data were collected from January to December 2019. To evaluate the appropriateness of prescriptions, 25 patients under antibiotic therapy were randomly selected at each hospital over 2 separate periods. Because of the heterogeneity of its characteristics, the orthopedic hospital was excluded from this analysis. The collected data were reviewed, and the appropriateness of antibiotic prescriptions was evaluated by 5 specialists in infectious diseases (adult and pediatric), with data from 2 hospitals assigned to each specialist. The appropriateness of antibiotic prescriptions was evaluated in 3 aspects: route of administration, dose, and class. If all 3 aspects were ‘optimal,’ the prescription was considered ‘optimal.’ If only the route was ‘optimal,’ and the dose and/or class was ‘suboptimal’ but not ‘inappropriate,’ it was considered ‘suboptimal.’ If even 1 aspect was ‘inappropriate,’ it was classified as ‘inappropriate.’ Results: The most commonly prescribed antibiotic class in long-term care hospitals was fluoroquinolones, followed by antipseudomonal β-lactam/β-lactamase inhibitors.
In acute-care hospitals, these were third-generation cephalosporins, followed by first- and second-generation cephalosporins. The major antibiotic class prescribed in the orthopedic hospital was first-generation cephalosporins. Only 2.3% of the antibiotics were administered inappropriately, whereas 15.3% of patients were prescribed an inappropriate dose. Inappropriate antibiotic prescriptions accounted for 30.6% of total antibiotic prescriptions. Conclusions: Antibiotic usage patterns vary among small- and medium-sized hospitals in South Korea. The proportion of inappropriate prescriptions exceeded 30% of total antibiotic prescriptions.
Background: The δ (delta) variant has spread rapidly worldwide and has become the predominant strain of SARS-CoV-2. We analyzed an outbreak caused by vaccine breakthrough infection in a hospital with an active infection control program where 91.9% of healthcare workers were vaccinated. Methods: We investigated a SARS-CoV-2 outbreak between September 9 and October 2, 2021, in a referral teaching hospital in Korea. We retrospectively collected data on demographics, vaccination history, transmission, and clinical features of confirmed COVID-19 in patients, healthcare workers, and caregivers. Results: During the outbreak, 94 individuals tested positive for SARS-CoV-2 by reverse transcription-polymerase chain reaction (RT-PCR) testing. Testing identified infections in 61 healthcare workers, 18 patients, and 15 caregivers, and 70 (74.5%) of the 94 cases were vaccine breakthrough infections. We detected 3 superspreading events: in the hospital staff cafeteria and offices (n = 47 cases, 50%), on the 8th floor of the main building (n = 22 cases, 23.4%), and on the 7th floor of the maternal and child healthcare center (n = 12 cases, 12.8%). These superspreading events accounted for 81 (86.2%) of the 94 transmissions (Fig. 1, 2). The median interval between completion of vaccination and COVID-19 infection was 117 days (range, 18–187). There was no significant difference in the mean Ct value of the RdRp/ORF1ab gene between fully vaccinated individuals (mean 20.87, SD ±6.28) and unvaccinated individuals (mean 19.94, SD ±5.37; P = .52) at the time of diagnosis. Among healthcare workers and caregivers, only 1 required oxygen supplementation. In contrast, among the 18 patients, there were 4 fatal cases (22.2%), 3 of whom were unvaccinated (Table 1). Conclusions: Superspreading infection among fully vaccinated individuals occurred in an acute-care hospital while the δ (delta) variant was dominant.
Given the potential for severe complications, as this outbreak demonstrated, preventive measures including adequate ventilation should be emphasized to minimize transmission in hospitals.
This study aimed to determine the effect of donor-transmitted atherosclerosis on the late aggravation of cardiac allograft vasculopathy in paediatric heart recipients aged ≥7 years.
Methods:
In total, 48 patients were included and 23 had donor-transmitted atherosclerosis (baseline maximal intimal thickness of >0.5 mm on intravascular ultrasonography). Logistic regression analyses were performed to identify risk factors for donor-transmitted atherosclerosis. Rates of survival free from the late aggravation of cardiac allograft vasculopathy (new or worsening cardiac allograft vasculopathy on following angiograms, starting 1 year after transplantation) in each patient group were estimated using the Kaplan–Meier method and compared using the log-rank test. The effect of the results of intravascular ultrasonography at 1 year after transplantation on the late aggravation of cardiac allograft vasculopathy, correcting for possible covariates including donor-transmitted atherosclerosis, was examined using the Cox proportional hazards model.
Results:
The mean follow-up duration after transplantation was 5.97 ± 3.58 years. The log-rank test showed that patients with donor-transmitted atherosclerosis had worse survival outcomes than those without (p = 0.008). In the multivariable model, which considered the change in maximal intimal thickness between baseline and 1 year after transplantation (hazard ratio, 22.985; 95% confidence interval, 1.948–271.250; p = 0.013), donor-transmitted atherosclerosis remained a significant covariate (hazard ratio, 4.013; 95% confidence interval, 1.047–15.376; p = 0.043).
Conclusion:
Paediatric heart transplant recipients aged ≥7 years with donor-transmitted atherosclerosis had worse survival free from late aggravation of cardiac allograft vasculopathy.
The accurate estimation of expected survival in terminal cancer patients is important. The Palliative Performance Scale (PPS) is an important factor in predicting the survival of hospice patients. The purpose of this study was to examine how the initial PPS and subsequent changes in PPS affect the survival of hospice patients in Korea.
Method
We retrospectively examined 315 patients who were admitted to our hospice unit between January 2017 and December 2018. Patients were divided into those with a PPS of ≥50% (group A) and those with a PPS of ≤40% (group B). We performed survival analysis for factors associated with the length of survival (LOS) in group A. Based on the hospice team's weekly evaluation of PPS, we examined the effect of initial PPS levels and subsequent changes on the prognosis of group A patients who survived for 2 weeks or more.
Results
At the time of admission to hospice, 265 patients (84.1%) had a PPS of ≥50% and 50 (15.9%) had a PPS of ≤40%. The median LOS was 15 days (range, 2–158) for PPS ≥50% and 9 days (range, 2–43) for PPS ≤40%. Male sex, gastrointestinal cancer, and lower initial PPS all predicted poor prognosis in group A. Male sex, gastrointestinal cancer, and a PPS change of 10% or greater from initial status at 1 week and 2 weeks of hospitalization were all predictors of poor prognosis in group A patients who survived for 2 weeks or longer.
Significance of results
Our research demonstrates the significance of PPS change at 1 week and 2 weeks, suggesting the importance of evaluating not only the initial PPS but also changes in PPS.
We calculated the human resources required for an antimicrobial stewardship program (ASP) in Korean hospitals.
Design:
Multicenter retrospective study.
Setting:
Eight Korean hospitals ranging in size from 295 to 1,337 beds.
Methods:
The time required for performing ASP activities for all hospitalized patients under antibiotic therapy was estimated and converted into hours per week. The actual time spent on patient reviews of each ASP activity was measured with a small number of cases, then the total time was estimated by applying the determined times to a larger number of cases. Full-time equivalents (FTEs) were measured according to labor laws in Korea (52 hours per week).
Results:
In total, 225 cases were reviewed to measure time spent on patient reviews. The median time spent per patient review for ASP activities ranged from 10 to 16 minutes. The total time spent on reviews for all hospitalized patients was estimated using the observed number of ASP activities for 1,534 patients who underwent antibiotic therapy on surveillance days. The most commonly observed ASP activity was ‘review of surgical prophylactic antibiotics’ (32.7%), followed by ‘appropriate antibiotics recommendations for patients with suspected infection without a proven site of infection but without causative pathogens’ (28.6%). The personnel requirement was calculated as 1.20 FTEs (interquartile range [IQR], 1.02–1.38) per 100 beds and 2.28 FTEs (IQR, 1.93–2.62) per 100 patients who underwent antibiotic therapy.
Conclusion:
The estimated time required for human resources performing extensive ASP activities on all hospitalized patients undergoing antibiotic therapy in Korean hospitals was ~1.20 FTEs (IQR, 1.02–1.38) per 100 beds.
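The FTE arithmetic in this abstract reduces to total review hours divided by the 52-hour statutory workweek. A minimal sketch with hypothetical workload numbers (the 13-minute and 300-review figures are illustrative, not the study's data; 13 minutes sits within the observed 10–16 minute median range):

```python
def required_ftes(minutes_per_review: float, reviews_per_week: float,
                  workweek_hours: float = 52.0) -> float:
    """FTEs needed for ASP patient reviews, given Korea's 52-hour statutory workweek."""
    hours_needed = minutes_per_review * reviews_per_week / 60
    return hours_needed / workweek_hours

# Hypothetical workload: 13 min per review, 300 reviews per week
ftes = required_ftes(13, 300)  # 65 hours/week -> 1.25 FTEs
```

Dividing the result by bed count (or by patients on antibiotic therapy) and scaling by 100 yields the per-100-beds figures the abstract reports.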
Early reinsertion of a new central venous catheter (CVC) may pose a risk of persistent or recurrent infection in patients with a catheter-related bloodstream infection (CRBSI). We evaluated the clinical impact of early CVC reinsertion after catheter removal in patients with CRBSIs.
Methods:
We conducted a retrospective chart review of adult patients with confirmed CRBSIs in 2 tertiary-care hospitals over a 7-year period.
Results:
To treat their infections, 316 patients with CRBSIs underwent CVC removal. Among them, 130 (41.1%) underwent early CVC reinsertion (≤3 days after CVC removal), 39 (12.4%) underwent delayed reinsertion (>3 days), and 147 (46.5%) did not undergo CVC reinsertion. There were no differences in baseline characteristics among the 3 groups, except for nontunneled CVC, presence of septic shock, and reason for CVC reinsertion. The rate of persistent CRBSI in the early CVC reinsertion group (22.3%) was higher than that in the no CVC reinsertion group (7.5%; P = .002) but was similar to that in the delayed CVC reinsertion group (17.9%; P > .99). The other clinical outcomes did not differ among the 3 groups, including rates of 30-day mortality, complicated infection, and recurrence. After controlling for several confounding factors, early CVC reinsertion was not significantly associated with persistent CRBSI (OR, 1.59; P = .35) or 30-day mortality compared with delayed CVC reinsertion (OR, 0.81; P = .68).
Conclusions:
Early CVC reinsertion in the setting of CRBSI may be safe. Replacement with a new CVC should not be delayed in patients who still require a CVC for ongoing management.
Recent hospital fire incidents in South Korea have heightened the importance of patient evacuation. Moving patients from an intensive care unit (ICU) or emergency department (ED) is challenging because of the complexity of moving acutely unwell patients who rely on invasive monitoring and organ support. Despite the importance of patient evacuation, the readiness of ICUs and EDs for urgent evacuation has not been assessed.
Aim:
To enhance the readiness and competencies of workers from ICU and ED in the evacuation of patients during a simulated tabletop fire exercise.
Methods:
A tabletop simulation exercise was developed by the Center for Disaster Relief, Training, and Research referencing the fire evacuation manual developed by the hospital’s ICU and ED. The scenario consisted of evacuating patients horizontally and vertically from each department. The participants’ actions were assessed using a checklist. A debriefing was completed after the exercise to discuss the gaps observed. A post-survey questionnaire was used to evaluate the exercise and assess the perception changes of the participants. All pre-to-post differences within subjects were analyzed with paired t-tests.
Results:
A total of 22 and 29 people participated in the exercise from the ICU and ED, respectively. Knowledge and confidence improved post-exercise for both the ICU and ED scenarios (p<0.05). Course satisfaction scores were 7.9 for the ICU exercise and 8.7 for the ED exercise. Correct performance rates for the ICU and ED were 59% and 58%, respectively. Common gaps noted for both the ICU and ED were wearing protective masks, patient handover communication, and preparation of resources.
Discussion:
Exercises are needed to recognize gaps in the systems in place for hospital fire evacuation preparedness. Tabletop simulation exercises are ideal tools for this purpose. Although this was a short, 90-minute exercise, it increased familiarity with the evacuation plan, tested the plan, and allowed identification of gaps.
South Korea experienced a Middle East Respiratory Syndrome (MERS) outbreak in 2015. To mitigate the threat posed by MERS, the Ministry of Health and Center for Disease Control designated hospitals to be responsible for managing any suspected or confirmed infectious patient. These hospitals receive mandatory training in managing infectious patients, but many of the trainings lack practical skills practice and pandemic preparedness exercises.
Aim:
To develop and evaluate a training course designed to train healthcare providers from designated hospitals to enhance their competencies in managing emerging infectious diseases and potential outbreaks.
Methods:
A two-day course was developed by the Center for Disaster Relief, Training, and Research in collaboration with the Korea Health Promotion Institute using Kern’s 6-step approach. The course consisted of didactic lectures, technical skills training, tabletop simulation, and scenario-based simulation. Table-top simulation exercises consisted of cases involving a single infectious patient detected in the outpatient clinic and outbreak in the emergency department. Scenario-based simulation exercises involved managing a critically ill infectious patient in an isolated ward. A post-survey questionnaire was used to evaluate the course and assess the perception changes of the participants. All pre-to-post differences within subjects were analyzed with paired t-tests.
Results:
A total of 121 healthcare providers participated in three separate courses. Competencies in pandemic preparedness knowledge, skills, and attitudes improved from pre- to post-course, and the differences were all statistically significant (p<0.05). Average overall course satisfaction scores for expectations, time, delivery method, and content were 9.5, 9.2, 9.4, and 9.2, respectively.
Discussion:
Tests and exercises are needed to recognize gaps in the systems in place for pandemic preparedness. Simulation exercises are ideal tools for this purpose. Although this was only a two-day intensive course, it increased familiarity with workflows, tested the coordination of workflows between different disciplines, and allowed the identification of gaps.
Somatization is known to be more prevalent in Asian than in Western populations. Using a South Korean adolescent and young adult twin sample (N = 1754; 367 monozygotic male, 173 dizygotic male, 681 monozygotic female, 274 dizygotic female and 259 opposite-sex dizygotic twins), the present study aimed to estimate heritability of somatization and to determine common genetic and environmental influences on somatization and hwabyung (HB: anger syndrome). Twins completed self-report questionnaires of the HB symptoms scale and the somatization scale via a telephone interview. The results of the general sex-limitation model showed that 43% (95% CI [36, 50]) of the total variance of somatization was attributable to additive genetic factors, with the remaining variance, 57% (95% CI [50, 64]), being due to individual-specific environmental influences, including measurement error. These estimates were not significantly different between the two sexes. The phenotypic correlation between HB and somatization was .53 (p < .001). The bivariate model-fitting analyses revealed that the genetic correlation between the two symptoms was .68 (95% CI [.59, .77]), while the individual-specific environmental correlation, including correlated measurement error, was .41 (95% CI [.34, .48]). Of the additive genetic factors of 43% that influence somatization, approximately half (20%) were associated with those related to HB, with the remainder being due to genes unique to somatization. A substantial part (48%) of individual environmental variance in somatization was unrelated to HB; only 9% of the environmental variance was shared with HB. Our findings suggest that HB and somatization have shared genetic etiology, but environmental factors that precipitate the development of HB and somatization may be largely independent from each other.
The present study aimed to estimate heritability of Hwabyung (HB) symptoms in adolescent and young adult twins in South Korea. The sample included 1,601 twins consisting of 143 pairs of monozygotic male (MZM), 67 pairs of dizygotic male (DZM), 295 pairs of monozygotic female (MZF), 114 pairs of dizygotic female (DZF), and 117 pairs of opposite-sex dizygotic (OSDZ) twins and 129 twins with non-participating co-twins (mean age = 19.1 ± 3.1 years; range: 12–29 years). An HB symptom questionnaire was given to twins via a telephone interview. Consistent with the literature of HB, the mean level of HB was significantly higher in females than in males. Maximum likelihood twin correlations for HB were 0.31 (95% CI [0.16, 0.45]) for MZM, 0.19 (95% CI [-0.05, 0.41]) for DZM, 0.50 (95% CI [0.41, 0.58]) for MZF, 0.28 (95% CI [0.11, 0.44]) for DZF, and 0.23 (95% CI [0.05, 0.40]) for OSDZ twins. These patterns of twin correlations suggested the presence of additive genetic influences on HB. Model-fitting analysis showed that additive genetic and individual-specific environmental influences on HB were 44% (95% CI [37, 51]) and 56% (95% CI [49, 63]), respectively. Shared environmental influences were not significant. These parameter estimates were not significantly different between two sexes, and did not change significantly with age in the present sample, suggesting that genetic and environmental influences on HB in both sexes are stable across adolescence and young adulthood.
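The heritability estimates above come from maximum-likelihood model fitting, but Falconer's classic approximation, h² ≈ 2(rMZ − rDZ), gives a quick back-of-envelope check from the reported twin correlations (a rough heuristic, not the study's method):

```python
def falconer_h2(r_mz: float, r_dz: float) -> float:
    """Falconer's approximation of narrow-sense heritability from twin correlations."""
    return 2 * (r_mz - r_dz)

# Female twin correlations reported above: rMZF = 0.50, rDZF = 0.28
h2 = falconer_h2(0.50, 0.28)  # about 0.44, close to the model-fitted 44%
```

Likewise, 1 − rMZ approximates the individual-specific environmental share (1 − 0.50 = 0.50, near the fitted 56%); the formal twin model refines these crude contrasts using all zygosity groups simultaneously.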
The National Institute of Neurological Disorders and Stroke-Canadian Stroke Network (NINDS-CSN) 5-minute neuropsychology protocol consists only of verbal tasks and has been proposed as a brief screening method for vascular cognitive impairment. We evaluated its feasibility within two weeks after stroke and its ability to predict the development of post-stroke dementia (PSD) at 3 months after stroke.
Method:
We prospectively enrolled subjects with ischemic stroke within seven days of symptom onset who were consecutively admitted to 12 university hospitals. Neuropsychological assessments using the NINDS-CSN 5-minute and 60-minute neuropsychology protocols were administered within two weeks and at 3 months after stroke onset, respectively. PSD was diagnosed with reference to the American Heart Association/American Stroke Association statement, requiring deficits in at least two cognitive domains.
Results:
Of 620 patients, 512 (82.6%) were able to complete the NINDS-CSN 5-minute protocol within two weeks after stroke. The incidence of PSD was 16.2% among the 308 subjects who completed follow-up at 3 months after stroke onset. The total score of the NINDS-CSN 5-minute protocol differed significantly between those with and without PSD (4.0 ± 2.7 vs. 7.4 ± 2.7; p < 0.01). A cut-off value of 6/7 showed reasonable discriminative power (sensitivity 0.82, specificity 0.67, AUC 0.74). The NINDS-CSN 5-minute protocol score was a significant predictor of PSD (adjusted odds ratio 6.32, 95% CI 2.65–15.05).
Discussion:
The NINDS-CSN 5-minute protocol is feasible for evaluating cognitive function in patients with acute ischemic stroke and may be a useful screening method for early identification of groups at high risk of PSD.
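To make the screening statistics above concrete: a cut-off of 6/7 means that scores of 6 or below are treated as screen-positive for PSD. As a minimal sketch with invented scores (not the study's data, and not the actual protocol scoring), sensitivity and specificity at a given cut-off can be computed as:

```python
# Hypothetical illustration of how a screening cut-off yields sensitivity
# and specificity. All numbers below are invented, not from the study.

def screening_metrics(scores_psd, scores_no_psd, cutoff):
    """Treat scores <= cutoff as screen-positive for PSD."""
    tp = sum(s <= cutoff for s in scores_psd)      # true positives
    fn = len(scores_psd) - tp                      # false negatives
    tn = sum(s > cutoff for s in scores_no_psd)    # true negatives
    fp = len(scores_no_psd) - tn                   # false positives
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return sensitivity, specificity

# Invented example: 4 patients who developed PSD, 4 who did not.
print(screening_metrics([3, 5, 6, 8], [7, 8, 9, 5], 6))  # → (0.75, 0.75)
```

Lowering the cut-off raises specificity at the cost of sensitivity; the study's reported trade-off (0.82/0.67) reflects the authors' chosen balance at 6/7.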
The HoDoo English game was developed to exploit the benefits attributed to on-line games while teaching English to native Korean speakers. We expected that improvements in the subjects’ English language abilities after playing the HoDoo English game would be associated with increased functional connectivity in the brain areas involved in the language production (Broca’s area) and comprehension (Wernicke’s area) networks. Twelve children, aged nine and ten, were asked to play the on-line English education game for 50 minutes per day, five days per week, for twelve weeks. At baseline, and again at the end of twelve weeks of game play, each child’s English language ability was assessed and a functional magnetic resonance imaging (fMRI) scan was conducted. The on-line English education game play effectively improved English language skills, especially non-verbal pragmatic skills. Following twelve weeks of on-line English education game play, the children showed positive connectivity between Broca’s area and the left frontal cortex, as well as between Wernicke’s area and the left parahippocampal gyrus and the right medial frontal gyrus. Changes in pragmatic scores were positively correlated with average peak brain activity in the left parahippocampal gyrus. To the best of our knowledge, this is the first study to report an improvement in English ability and changes in brain activity within language areas after on-line language education game play.
To determine the influence of early pain relief for patients with suspected appendicitis on the diagnostic performance of surgical residents.
Methods
A prospective randomized, double-blind, placebo-controlled trial was conducted for patients with suspected appendicitis. The patients were randomized to receive placebo (normal saline intravenous [IV]) infusions over 5 minutes or the study drug (morphine 5 mg IV). All of the clinical evaluations by surgical residents were performed 30 minutes after administration of the study drug or placebo. After obtaining the clinical probability of appendicitis, as determined by the surgical residents, abdominal computed tomography was performed. The primary objective was to compare the influence of IV morphine on the ability of surgical residents to diagnose appendicitis.
Results
A total of 213 patients with suspected appendicitis were enrolled. Of these, 107 patients received morphine and 106 received placebo saline. The negative appendectomy rates were similar in the two groups (3.8% in the placebo group vs. 3.2% in the pain control group, p=0.62), as were the perforation rates (18.9% in the placebo group vs. 14.3% in the pain control group, p=0.75). Receiver operating characteristic analysis showed similar overall diagnostic accuracy in the two groups (area under the curve 0.63 in the placebo group vs. 0.61 in the pain control group, p=0.81).
Conclusions
Early pain control in patients with suspected appendicitis does not affect the diagnostic performance of surgical residents.