Blood carotenoid concentration measurement is considered the gold standard for fruit and vegetable (F&V) intake estimation; however, this method is invasive and expensive. Recently, skin carotenoid status (SCS) measured by optical sensors has been evaluated as a promising parameter for F&V intake estimation. In this cross-sectional study, we aimed to validate the utility of resonance Raman spectroscopy (RRS)-assessed SCS as a biomarker of F&V intake in Korean adults. We used data from 108 participants aged 20–69 years who completed SCS measurements, blood collection and 3-d dietary recordings. Serum carotenoid concentrations were quantified using HPLC, and dietary carotenoid and F&V intakes were estimated via 3-d dietary records using a carotenoid database for common Korean foods. The correlations of the SCS with serum carotenoid concentrations, dietary carotenoid intake and F&V intake were examined to assess SCS validity. SCS was positively correlated with total serum carotenoid concentration (r = 0·52, 95 % CI = 0·36, 0·64, P < 0·001), serum β-carotene concentration (r = 0·60, 95 % CI = 0·47, 0·71, P < 0·001), total carotenoid intake (r = 0·20, 95 % CI = 0·01, 0·37, P = 0·04), β-carotene intake (r = 0·30, 95 % CI = 0·11, 0·46, P = 0·002) and F&V intake (r = 0·40, 95 % CI = 0·23, 0·55, P < 0·001). These results suggest that SCS can be a valid biomarker of F&V intake in Korean adults.
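The validation rests on Pearson correlations reported with 95% confidence intervals. As a minimal illustration of how such intervals are conventionally derived (the abstract does not name the software used; the variable names below are hypothetical), a Fisher z-transform computation in Python might look like this:

```python
import numpy as np
from scipy import stats

def pearson_with_ci(x, y, alpha=0.05):
    """Pearson r with a Fisher z-transform confidence interval."""
    r, p = stats.pearsonr(x, y)
    n = len(x)
    z = np.arctanh(r)                        # Fisher z-transform of r
    se = 1.0 / np.sqrt(n - 3)                # approximate SE of z
    crit = stats.norm.ppf(1 - alpha / 2)
    lo, hi = np.tanh(z - crit * se), np.tanh(z + crit * se)
    return r, (lo, hi), p

# Hypothetical usage with the n = 108 participants:
# r, ci, p = pearson_with_ci(scs, serum_total_carotenoid)
```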
Birds in flight are prone to collide with various transparent or reflective structures. While bird–window collision has been recognised as a critical conservation issue, collisions with other transparent structures are less well understood. Noise barriers made of transparent materials are considered critical hazards for birds; however, little is known about the bird mortality they cause. We conducted the first nationwide-scale estimate of bird-collision mortality caused by transparent noise barriers (TNBs) along roads in the Republic of Korea. The total length of existing roadside TNBs was estimated at 1,416 km nationwide (as of 2018) and had been increasing exponentially. Based on carcass surveys at 25 sites, daily mortality at the observed barriers averaged 0.335 ± 1.132 birds/km, and no difference in observed mortality was detected between the two sides of a single barrier or between road types (i.e. local roads and motorways). From these data, we estimated that approximately 186,000 birds (95% confidence interval: 162,465–204,812) are killed annually by collisions with roadside TNBs. As privately installed barriers were not considered in this study, the actual mortality is likely to be higher than our estimate. Collision with TNBs could therefore become an emerging threat to avian conservation, especially in developing and urbanising regions around the world. As such structures are not yet formally recognised as a conservation issue, more systematic surveys of both the status of TNBs and the bird-collision mortality they cause, aided by citizen science, are needed, along with management and mitigation policies.
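The nationwide figure follows from scaling the per-kilometre carcass counts to the total barrier length; a naive version of that extrapolation (ignoring whatever detection and carcass-persistence corrections the full study applies to reach ~186,000) is shown below.

```python
daily_mortality = 0.335      # mean observed birds per km per day
barrier_km = 1416            # nationwide roadside TNB length, km (2018)

naive_annual = daily_mortality * barrier_km * 365
print(f"{naive_annual:,.0f} birds/year")   # ~173,000 before any corrections
```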
The COVID-19 pandemic greatly impacted the social lives of older adults across several areas, leading to concern about an increase in loneliness. This study examines the associations of structural, functional, and quality aspects of social connection with increased loneliness during COVID-19 and how these associations vary by sociodemographic factors.
Design:
Secondary data analyses on a nationally representative survey of older US adults.
Setting:
The 2020 Health and Retirement Study (HRS) COVID-19 module.
Participants:
The study sample includes 3,804 adults aged 54 or older.
Measurements:
Increased loneliness was based on respondents’ self-report on whether they felt lonelier than before the COVID-19 outbreak.
Results:
Overall, 29% of respondents felt lonelier after the COVID-19 outbreak; middle-aged adults, women, non-Hispanic Whites, and the most educated were more likely to report increased loneliness. Not having enough in-person contact with people outside the household was associated with increased loneliness (OR = 10.07, p < .001). Receiving emotional support less frequently (OR = 2.28, p < .05) or more frequently (OR = 2.00, p < .001) than before was also associated with increased loneliness. Worse quality of family relationships (OR = 1.85, p < .05) and worse friend/neighbor relationships (OR = 1.77, p < .01) were related to feeling lonelier. Significant interactions indicated stronger effects on loneliness of poor-quality family relationships for women, and of insufficient in-person contact with non-household people for the middle-aged group and non-Hispanic Whites.
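The odds ratios above are consistent with a standard logistic regression of self-reported increased loneliness on the social-connection measures. A sketch of such a model in Python, with hypothetical column names standing in for the HRS COVID-19 module variables, might be:

```python
import numpy as np
import statsmodels.api as sm

# df: one row per respondent; 'lonelier' is 0/1; predictor names are hypothetical
predictors = ["less_inperson_contact", "support_less_often", "support_more_often",
              "worse_family_quality", "worse_friend_quality"]
X = sm.add_constant(df[predictors])
fit = sm.Logit(df["lonelier"], X).fit()

odds_ratios = np.exp(fit.params)     # exponentiated coefficients = ORs
or_ci = np.exp(fit.conf_int())       # 95% CIs on the odds-ratio scale
```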
Conclusions:
Our findings show an increase in loneliness during COVID-19 that was partly due to social mitigation efforts, and also uncover how sociodemographic groups were impacted differently, providing implications for recovery and support.
Identification of geographical areas with a high burden of severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) transmission in schools using spatial analyses has become an important tool to guide targeted interventions in educational settings. In this study, we aimed to explore the spatial distribution and determinants of coronavirus disease 2019 (COVID-19) among students aged 3–18 years in South Korea. We analysed nationwide epidemiological data on laboratory-confirmed COVID-19 cases in schools and in the communities between January 2020 and October 2021 in South Korea. To explore the spatial distribution, the global Moran's I and Getis-Ord's G statistics were calculated using district-level incidence rates among those aged 3–18 years and 30–59 years. Spatial regression analysis was performed to find sociodemographic predictors of the COVID-19 attack rate in schools and in the communities. The global spatial correlation estimated by Moran's I was 0.647 for the community population and 0.350 for the student population, suggesting that cases among students were spatially less correlated than the community-level outbreak of SARS-CoV-2. In schools, the attack rate among adults aged 30–59 years in the community was associated with an increased risk of transmission (P < 0.0001). The number of students per class (in kindergartens, primary schools, middle schools and high schools) showed no significant association with school transmission of SARS-CoV-2. In South Korea, COVID-19 among students showed spatial variation across the country. Statistically significant hotspots of SARS-CoV-2 transmission among students were found in the capital area, which has a dense population and a high COVID-19 burden among adults aged 30–59 years. Our findings suggest that controlling the community-level burden of COVID-19 can help prevent SARS-CoV-2 infection in school-aged children.
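Global Moran's I is a cross-product statistic over a spatial weights matrix. A minimal sketch of how it is commonly computed in Python with the PySAL stack follows; the file name and column names are hypothetical, and the study does not state which software it used. The companion Getis-Ord statistics for hotspot detection live in `esda.getisord`.

```python
import geopandas as gpd
from libpysal.weights import Queen
from esda.moran import Moran

gdf = gpd.read_file("districts.shp")     # district polygons with incidence rates
w = Queen.from_dataframe(gdf)            # contiguity-based spatial weights
w.transform = "r"                        # row-standardise the weights

mi = Moran(gdf["incidence_3_18"], w)     # e.g. rates among those aged 3-18
print(mi.I, mi.p_sim)                    # Moran's I and permutation p-value
```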
The Korea Disaster Relief Team (KDRT) was established in 2008 to systemize Korea’s overseas medical emergency response. Following multiple international deployments since 2008, KDRT embarked on its journey to achieve WHO Emergency Medical Team (EMT) Global Classification in 2017.
Objectives:
To outline the key success factors in KDRT’s work to reach classification as a Type 1 Fixed EMT.
Method/Description:
As the Korean government dispatches KDRT, multi-agency collaboration is essential to respond to overseas disasters. To this end, KDRT leveraged a formal collaborative approach, assigning specific roles for EMT development and deployment to several national agencies: the Republic of Korea Ministry of Foreign Affairs, the Ministry of Health and Welfare, the Ministry of National Defense, the Korea International Cooperation Agency, Korea's National Medical Center, and the Korea Foundation for International Healthcare. This network prepared KDRT for WHO EMT verification and developed Standard Operating Procedures (SOPs) for the EMT Type 1. Based on these SOPs, KDRT ran repeated simulations of each element to strengthen capabilities and enable deployment with strong coordination among national and international partners.
Results/Outcomes:
After initiating its journey towards EMT classification in 2017, KDRT formalized cooperation with multiple agencies and codified their roles and responsibilities in formally published SOPs. The KDRT was verified by WHO in June 2022.
Conclusion:
This study describes a process specific to the operating system of the Republic of Korea, the KDRT's home country. However, it can also serve as a collaborative reference case for the EMT development and verification process.
We compared the pregnancy and live birth rates following transfer of early-stage embryos or blastocysts produced by somatic cell nuclear transfer using in vitro-matured oocytes. In total, 102 ovaries were collected from dromedary camels at a local abattoir; from these, 1048 cumulus–oocyte complexes (COCs) were aspirated and cultured for 42 h in a commercial maturation medium. Metaphase II oocytes were subjected to nuclear transfer. Somatic cell nuclear transfer-derived embryos were cultured in a commercial embryo medium for 2 or 7 days. Next, 71 early-stage embryos were surgically transferred to the left fallopian tube of 28 recipients, and 47 blastocysts were transferred to the left uterine horn of 26 recipients. Early pregnancy was detected by serum progesterone (P4), and pregnancy was confirmed using ultrasonography on days 30 and 90 after embryo transfer. The pregnancy rate based on P4 level was 17.86% (5/28) for early-stage embryo transfer and 11.54% (3/26) for blastocyst transfer. In the early-stage embryo group, of the five pregnant recipients, one had lost the pregnancy by the first ultrasonography on day 30, two aborted at 14 and 24 weeks, and two gave live births. In the blastocyst group, of the three pregnant recipients, one lost the pregnancy at an early stage and two gave live births. Therefore, for dromedary camels, we recommend transvaginal blastocyst transfer from the standpoint of pregnancy and live birth rates, ease of the transfer procedure, and the comfort and safety of the recipients.
This study aimed to determine the effect of donor-transmitted atherosclerosis on the late aggravation of cardiac allograft vasculopathy in paediatric heart recipients aged ≥7 years.
Methods:
In total, 48 patients were included, of whom 23 had donor-transmitted atherosclerosis (baseline maximal intimal thickness of >0.5 mm on intravascular ultrasonography). Logistic regression analyses were performed to identify risk factors for donor-transmitted atherosclerosis. Rates of survival free from late aggravation of cardiac allograft vasculopathy (new or worsening cardiac allograft vasculopathy on subsequent angiograms, starting 1 year after transplantation) in each patient group were estimated using the Kaplan–Meier method and compared using the log-rank test. The effect of the intravascular ultrasonography results at 1 year after transplantation on the late aggravation of cardiac allograft vasculopathy, adjusting for possible covariates including donor-transmitted atherosclerosis, was examined using the Cox proportional hazards model.
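As a sketch of the survival workflow described here (Kaplan–Meier curves, log-rank comparison, and a Cox model), the following Python code uses the lifelines package with hypothetical column names; it is illustrative only, not the authors' implementation:

```python
from lifelines import KaplanMeierFitter, CoxPHFitter
from lifelines.statistics import logrank_test

# df: one row per patient; 'time' = years to late CAV aggravation or censoring,
# 'event' = 1 if aggravation occurred, 'dta' = donor-transmitted atherosclerosis (0/1),
# 'mit_change' = change in maximal intimal thickness, baseline to 1 year
kmf = KaplanMeierFitter()
for name, grp in df.groupby("dta"):
    kmf.fit(grp["time"], grp["event"], label=f"DTA={name}")
    kmf.plot_survival_function()

lr = logrank_test(df.loc[df.dta == 1, "time"], df.loc[df.dta == 0, "time"],
                  event_observed_A=df.loc[df.dta == 1, "event"],
                  event_observed_B=df.loc[df.dta == 0, "event"])

cph = CoxPHFitter()
cph.fit(df[["time", "event", "dta", "mit_change"]],
        duration_col="time", event_col="event")
cph.print_summary()   # hazard ratios are exp(coef)
```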
Results:
The mean follow-up duration after transplantation was 5.97 ± 3.58 years. The log-rank test showed that patients with donor-transmitted atherosclerosis had worse survival outcomes than those without (p = 0.008). In the multivariable model, both the change in maximal intimal thickness between baseline and 1 year after transplantation (hazard ratio, 22.985; 95% confidence interval, 1.948–271.250; p = 0.013) and donor-transmitted atherosclerosis (hazard ratio, 4.013; 95% confidence interval, 1.047–15.376; p = 0.043) were significant covariates.
Conclusion:
Paediatric heart transplant recipients aged ≥7 years with donor-transmitted atherosclerosis had worse survival free from late aggravation of cardiac allograft vasculopathy.
Two aphid-transmitted RNA viruses, broad bean wilt virus 2 (BBWV2) and cucumber mosaic virus (CMV), are the most prevalent viruses in Korean pepper fields and cause chronic damage in pepper production. In this study, we employed a screening system for pathotype-specific resistance of pepper germplasm to BBWV2 and CMV by utilizing infectious cDNA clones of different pathotypes of the viruses (two BBWV2 strains and three CMV strains). We first examined pathogenic characteristics of the BBWV2 and CMV strains in various plant species and their phylogenetic positions in the virus population structures. We then screened 34 commercial pepper cultivars and seven accessions for resistance. While 21 pepper cultivars were resistant to CMV Fny strain, only two cultivars were resistant to CMV P1 strain. We also found only one cultivar partially resistant to BBWV2 RP1 strain. However, all tested commercial pepper cultivars were susceptible to the resistance-breaking CMV strain GTN (CMV-GTN) and BBWV2 severe strain PAP1 (BBWV2-PAP1), suggesting that breeding new cultivars resistant to these virus strains is necessary. Fortunately, we identified several pepper accessions that were resistant or partially resistant to CMV-GTN and one symptomless accession despite systemic infection with BBWV2-PAP1. These genetic resources will be useful in pepper breeding programs to deploy resistance to BBWV2 and CMV.
Background: After the Middle East respiratory syndrome coronavirus outbreak in Korea in 2015, the government established additional reimbursement for infection prevention to encourage infection control activities in hospitals. The new policy was announced in December 2015 and implemented in September 2016. We evaluated how infection control activities improved in hospitals after this change of government policy in Korea. Methods: Three cross-sectional surveys using the WHO Hand Hygiene Self-Assessment Framework (HHSAF) were conducted in 2013, 2015, and 2017. Using a multivariable linear regression model including hospital characteristics, we analyzed the changes in total HHSAF scores according to survey time. Results: In total, 32 hospitals participated in the survey in 2013, 52 in 2015, and 101 in 2017. The number of inpatient beds per infection control professional decreased from 324 in 2013 to 303 in 2015 and 179 in 2017. Most hospitals were at intermediate or advanced levels of progress (90.6% in 2013, 86.6% in 2015, and 94.1% in 2017). In the multivariable linear regression model, total HHSAF scores were significantly associated with hospital teaching status (β coefficient of major teaching hospital, 52.6; 95% CI, 8.9–96.4; P = .018), bed size (β coefficient of 100-bed increase, 5.1; 95% CI, 0.3–9.8; P = .038), and survey time (β coefficient of 2017 survey, 45.1; 95% CI, 19.3–70.9; P = .001). Conclusions: After the national policy implementation, the number of infection control professionals increased, and the promotion of hand hygiene activities was strengthened in Korean hospitals.
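The reported β coefficients correspond to an ordinary least-squares model of total HHSAF score on hospital characteristics and survey year. A hedged sketch in Python (column names are hypothetical stand-ins):

```python
import statsmodels.formula.api as smf

# df: one row per hospital-survey observation; columns are hypothetical
fit = smf.ols(
    "hhsaf_total ~ C(teaching_status) + beds_per_100 + C(survey_year)",
    data=df,
).fit()
print(fit.summary())   # β coefficients with 95% CIs, as reported above
```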
Background: Recently, healthcare-associated infections (HAIs) in long-term care hospitals (LTCHs) have markedly increased, but no infection control policy has been established in South Korea. We investigated the current HAI surveillance system and executed a point-prevalence pilot study in LTCHs. Methods: HAIs were defined using a newly established surveillance manual based on the McGeer criteria as revised in 2012. Three LTCHs in Seoul and Gyeonggi province were voluntarily recruited, and data were collected from up to 50 patients per hospital who were hospitalized on August 1. The medical records from September to November 2018 were retrospectively reviewed by the charge nurse for infection control at each hospital after 1 day of training specific to LTCH surveillance. All data were reviewed by a senior researcher visiting onsite. Results: The participating hospitals had 272.33 ± 111.01 beds on average. Only 1 hospital had an onsite microbiological laboratory. In total, 156 patients were enrolled and 5 HAIs were detected, for a prevalence rate of 3.2%. The average patient age was 79.04 ± 9.92 years. The HAIs comprised 2 urinary tract infections, 1 skin and soft-tissue infection, 1 lower respiratory tract infection, and 1 conjunctivitis. Conclusions: This is the first survey of HAIs in LTCHs in South Korea. The 3.2% prevalence rate is lower than those in previous reports from the European Union or the United States. This study supports the development of a national HAI surveillance and infection control system for LTCHs, although implementation may be limited by the lack of laboratory support and infection control infrastructure in Korea.
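The 3.2% point prevalence is simply 5 HAIs among 156 enrolled patients; a confidence interval for such a proportion can be computed as follows (a sketch, not the authors' calculation):

```python
from statsmodels.stats.proportion import proportion_confint

prevalence = 5 / 156                                  # ~0.032, i.e. 3.2%
lo, hi = proportion_confint(5, 156, method="wilson")  # Wilson 95% CI
```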
We report our experience with an emergency room (ER) shutdown related to an accidental exposure to a patient with coronavirus disease 2019 (COVID-19) who had not been isolated.
Setting:
A 635-bed, tertiary-care hospital in Daegu, South Korea.
Methods:
To prevent nosocomial transmission of the disease, we subsequently isolated patients with suspicious symptoms, relevant radiographic findings, or epidemiological risk factors. Severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) reverse-transcriptase polymerase chain reaction (RT-PCR) assays were performed for most patients requiring hospitalization. A universal mask policy and comprehensive use of personal protective equipment (PPE) were implemented. We analyzed the effects of these interventions.
Results:
From the pre-shutdown period (February 10–25, 2020) to the post-shutdown period (February 28–March 16, 2020), the mean test turnaround time decreased from 23:31 ± 6:43 hours to 9:27 ± 3:41 hours (P < .001). As a result, the proportion of patients tested increased from 5.8% (N = 1,037) to 64.6% (N = 690) (P < .001), and the average number of tests per day increased from 3.8 ± 4.3 to 24.7 ± 5.0 (P < .001). All 23 patients with COVID-19 in the post-shutdown period were isolated in the ER without any problematic accidental exposure or nosocomial transmission. After the shutdown, several metrics increased. The median duration of stay in the ER among hospitalized patients increased from 4:30 hours (interquartile range [IQR], 2:17–9:48) to 14:33 hours (IQR, 6:55–24:50) (P < .001). Rates of intensive care unit admission increased from 1.4% to 2.9% (P = .023), and mortality increased from 0.9% to 3.0% (P = .001).
Conclusions:
Problematic accidental exposure and nosocomial transmission of COVID-19 can be successfully prevented through active isolation and surveillance policies and comprehensive PPE use despite longer ER stays and the presence of more severely ill patients during a severe COVID-19 outbreak.
Objectives: Rotator cuff tear is a leading cause of decline in quality of life among older adults, but comparative evidence on treatment effectiveness is lacking. This study systematically reviewed the effects of various rotator cuff tear treatments through a Bayesian meta-analysis of related randomized clinical trials (RCTs).
Methods: We searched nine electronic databases for RCTs evaluating rotator cuff tear treatments from their inception through June 2017. A systematic review was performed following the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) and the National Institute for Health and Care Excellence Decision Support Unit guidelines (Supplementary Table 1). Outcomes included functional improvement, pain one year after treatment, and tendon structural integrity. A Bayesian network meta-analysis was applied for functional improvement and pain, based on assumptions of consistency and similarity. Tendon integrity was reported descriptively.
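The network meta-analysis itself follows NICE-DSU guidance and is not specified at code level in the abstract; as a rough sketch, the random-effects core that underlies such models can be expressed in PyMC as below, with purely illustrative effect sizes:

```python
import numpy as np
import pymc as pm

# Illustrative pairwise data only: mean differences and their standard errors
effects = np.array([9.1, 4.3, 7.5])
se = np.array([4.2, 3.1, 3.8])

with pm.Model() as re_meta:
    mu = pm.Normal("mu", mu=0, sigma=20)        # pooled treatment effect
    tau = pm.HalfNormal("tau", sigma=10)        # between-study heterogeneity
    theta = pm.Normal("theta", mu=mu, sigma=tau, shape=len(effects))
    pm.Normal("y", mu=theta, sigma=se, observed=effects)
    idata = pm.sample(2000, tune=1000)

# The 95% credible interval for mu is read off the posterior,
# e.g. arviz.hdi(idata, var_names=["mu"]).
```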
Results: Fifteen RCTs were selected. Patients undergoing physiotherapy after open surgery showed statistically significant functional improvement compared with those undergoing physiotherapy alone (mean difference, 9.1 [credible interval, 0.9–17.4]). Open surgery with physiotherapy was also associated with less pain 1 year after treatment than physiotherapy combined with arthroscopic rotator cuff surgery, mini-open surgery, or platelet-rich plasma therapy, and than physiotherapy alone (absolute mean differences, 1.2 to 1.4). The tendon integrity results were inconsistent.
Conclusions: Some surgical treatments were associated with significant improvements in function and pain, but evidence regarding their comparative effectiveness is still lacking. Well-designed RCTs assessing both functional and structural treatment outcomes are needed.
Our objective was to evaluate long-term altered appearance, distress, and body image in posttreatment breast cancer patients and compare them with those of patients undergoing active treatment and with general population controls.
Method:
We conducted a cross-sectional survey between May and December of 2010. We studied 138 breast cancer patients undergoing active treatment and 128 posttreatment patients from 23 Korean hospitals and 315 age- and area-matched subjects drawn from the general population. Breast, hair, and skin changes, distress, and body image were assessed using visual analogue scales and the EORTC BR–23. Average levels of distress were compared across groups, and linear regression was utilized to identify the factors associated with body image.
Results:
Compared to active-treatment patients, posttreatment patients reported similar breast changes (6.6 vs. 6.2), hair loss (7.7 vs. 6.7), and skin changes (5.8 vs. 5.4), and both groups had significantly more severe changes than the general population controls (p < 0.01). For a similar level of altered appearance, however, breast cancer patients experienced significantly higher levels of distress than the general population. In multivariate analysis, patients with high altered-appearance distress reported significantly poorer body image (–20.7, 95% CI = –28.3 to –13.1) than patients with low distress.
Significance of results:
Posttreatment breast cancer patients experienced similar levels of altered appearance, distress, and body-image disturbance relative to patients undergoing active treatment but significantly higher distress and poorer body image than members of the general population. Healthcare professionals should acknowledge the possible long-term effects of altered appearance among breast cancer survivors and help them to manage the associated distress and psychological consequences.
This study aimed to investigate the relationship between the experience of unemployment and depressive symptoms among mid-aged (ages 45–59) and elderly (ages 60 or above) persons, and to further examine the effects of unemployment insurance, industrial accident compensation insurance (IACI) and the national pension on this relationship. Data were drawn from the Korean Longitudinal Study of Aging (KLoSA) between 2006 and 2012. A total of 1,536 individuals employed at the 2006 baseline were followed. The association between employment status change during 2006–2008, 2008–2010 or 2010–2012 and depressive symptoms in 2008, 2010 or 2012 was analysed using a generalised estimating equation model. Depressive symptoms were measured with the 10-item Center for Epidemiological Studies Depression Scale (CES-D 10). The results showed that the ‘employed to unemployed’ group had statistically significant increases in depression scores in the mid-aged (β = 0.4884, p = 0.0038) and elderly (β = 0.8275, p ⩽ 0.0001) categories, compared to the ‘employed to employed’ group. These findings held in groups without a social safety net. In contrast, the ‘employed to unemployed’ groups with unemployment insurance and IACI did not show statistically significant increases in depression scores, and ‘employed to unemployed’ individuals enrolled in the national pension system exhibited a smaller increase in depression scores. Therefore, an enhanced focus on the mental health of unemployed individuals is required, in addition to the provision of a reliable social safety net.
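A GEE accounts for the within-person correlation across survey waves. A minimal sketch of such a model in Python with statsmodels follows, using long-format data and hypothetical column names; swapping in `sm.families.Binomial()` gives the logit-link variant used elsewhere for binary depression outcomes:

```python
import statsmodels.api as sm
import statsmodels.formula.api as smf

# df: one row per person-wave; 'cesd' = CES-D 10 score,
# 'emp_change' = employment transition category (hypothetical names)
model = smf.gee(
    "cesd ~ C(emp_change) + age + C(sex) + C(wave)",
    groups="person_id",
    data=df,
    family=sm.families.Gaussian(),
    cov_struct=sm.cov_struct.Exchangeable(),
)
print(model.fit().summary())
```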
Personality may predispose family caregivers to experience caregiving differently in similar situations and may influence the outcomes of caregiving. Only a limited body of research has examined the role of personality traits in the health-related quality of life (HRQoL) of family caregivers of persons with dementia (PWD) in relation to burden and depression.
Methods:
Data from a large clinic-based national study in South Korea, the Caregivers of Alzheimer's Disease Research (CARE), were analyzed (N = 476). Path analysis was performed to explore the association between family caregivers’ personality traits and HRQoL. With depression and burden as mediating factors, direct and indirect associations between five personality traits and HRQoL of family caregivers were examined.
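Path analysis of this kind can be written as a small structural equation model. A hedged sketch using the Python package semopy, following the mediation structure described above (variable names are hypothetical):

```python
from semopy import Model

# Burden and depression mediate personality -> HRQoL (illustrative specification)
desc = """
burden ~ neuroticism + extraversion
depression ~ burden + neuroticism + extraversion
mental_hrqol ~ depression + burden + neuroticism + extraversion
"""
model = Model(desc)
model.fit(df)              # df: one row per caregiver
print(model.inspect())     # path coefficients for direct and mediated effects
```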
Results:
Results demonstrated the mediating role of caregiver burden and depression in linking two personality traits (neuroticism and extraversion) with HRQoL. Neuroticism and extraversion influenced the mental HRQoL of caregivers both directly and indirectly, but influenced their physical HRQoL only indirectly. Neuroticism increased the caregiver's depression, whereas extraversion decreased it. Only neuroticism acted through burden to influence depression and both mental and physical HRQoL.
Conclusions:
Personality traits can influence caregiving outcomes and can be viewed as an individual resource of the caregiver. A family caregiver's personality characteristics should be assessed so that support programs can be tailored to obtain optimal benefit from caregiver interventions.
It is unclear how brain reserve interacts with gender and apolipoprotein E4 (APOE4) genotype, and how this influences the progression of Alzheimer's disease (AD). The association between intracranial volume (ICV) and progression to AD in subjects with mild cognitive impairment (MCI), and its differences according to gender and APOE4 genotype, were investigated.
Methods:
Data from subjects initially diagnosed with MCI who had at least two visits were downloaded from the ADNI database. Those who progressed to AD were defined as converters. The longitudinal influence of ICV was determined by survival analysis. The time of conversion from MCI to AD was set as a fiducial point, as all converters would be at a similar disease stage at that time, and longitudinal trajectories of brain atrophy and cognitive decline around that point were compared using linear mixed models.
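The trajectory comparison maps naturally onto a linear mixed model with time centred at conversion. A sketch with statsmodels (hypothetical column names; the abstract does not give the original implementation details):

```python
import statsmodels.formula.api as smf

# Long format: repeated cognitive scores per subject,
# 'time_from_conv' centred at the MCI-to-AD conversion date,
# 'large_icv' = 1 for large intracranial volume (hypothetical coding)
md = smf.mixedlm(
    "cog_score ~ time_from_conv * large_icv",
    data=df,
    groups=df["subject_id"],
    re_formula="~time_from_conv",      # random intercept and slope
)
print(md.fit().summary())
```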
Results:
Large ICV increased the risk of conversion to AD in males (HR: 4.24, 95% confidence interval (CI): 1.17–15.40) and APOE4 non-carriers (HR: 10.00, 95% CI: 1.34–74.53), but not in females or APOE4 carriers. Cognitive decline and brain atrophy progressed at a faster rate in males with large ICV than in those with small ICV during the two years before and after the time of conversion.
Conclusions:
Large ICV increased the risk of conversion to AD in males and APOE4 non-carriers with MCI. This may be due to its influence on disease trajectory, which shortens the duration of the MCI stage. A longitudinal model of progression trajectory is proposed.
The aim of the present study was to identify the genes differentially expressed in the visceral adipose tissue in a well-characterised mouse model of high-fat diet (HFD)-induced obesity. Male C57BL/6J mice (n 20) were fed either HFD (189 % of energy from fat) or low-fat diet (LFD, 42 % of energy from fat) for 16 weeks. HFD-fed mice exhibited obesity, insulin resistance, dyslipidaemia and adipose collagen accumulation, along with higher levels of plasma leptin, resistin and plasminogen activator inhibitor type 1, although there were no significant differences in plasma cytokine levels. Energy intake was similar in the two diet groups owing to the lower food intake of the HFD group; however, energy expenditure was also lower in the HFD group than in the LFD group. Microarray analysis revealed that genes related to lipolysis, fatty acid metabolism, mitochondrial energy transduction, oxidation–reduction, insulin sensitivity and skeletal system development were down-regulated in HFD-fed mice, and genes associated with extracellular matrix (ECM) components, ECM remodelling and inflammation were up-regulated. The top ten up- or down-regulated genes included Acsm3, mt-Nd6, Fam13a, Cyp2e1, Rgs1 and Gpnmb, whose roles in the deterioration of obesity-associated adipose tissue are poorly understood. In conclusion, the genes identified here provide new therapeutic opportunities for the prevention and treatment of diet-induced obesity.
Conus medullaris syndrome (CMS) is a clinical neurologic syndrome caused by a conus medullaris lesion. CMS is a heterogeneous entity with various etiologies such as trauma or a space-occupying lesion. Multiple cases of CMS following spinal anesthesia have been reported, but CMS after radioisotope (RI) cisternography has not yet been reported.
Methods:
We present four patients who developed CMS after RI cisternography.
Results:
All experienced neurological deficits such as paraparesis, sensory loss, and urinary incontinence three to four days after RI cisternography. Two showed abnormalities on lumbar magnetic resonance imaging, and three had complete symptom resolution within ten weeks.
Conclusions:
The pathomechanism of CMS in these cases is unclear, but we hypothesize that RI neurotoxicity might be responsible. The use of low-dose 99mTc-DTPA or an alternative diagnostic tool such as magnetic resonance cisternography could help prevent this complication.
The aim of this study was to examine the relationship between changes in social activity and depression among Koreans aged 45 years or more.
Methods:
Data came from the Korean Longitudinal Study of Aging (KLoSA) (2006–2010), with 5,327 participants aged 45 years or more. A generalized estimating equation (GEE) model with the logit link was used to investigate the association between a change in social activity during 2006–2008 (or 2008–2010) and depression in 2008 (or 2010). Depression was measured with the Center for Epidemiological Studies Depression scale (CES-D10), and change in social activity was classified into four categories: “consistent participation”, “consistent non-participation”, “participation to non-participation”, and “non-participation to participation”. Social activity was also divided into its various elements, and the same analysis was conducted for each element.
Results:
Compared with consistent participation (OR 1.00, reference), the odds of depression were higher for consistent non-participation (OR 1.44 [95% CI 1.22–1.71]), for a change from participation to non-participation (OR 1.35 [95% CI 1.15–1.58]) and, to a lesser degree, for a change from non-participation to participation (OR 1.27 [95% CI 1.09–1.48]). In addition, the strength of the negative association between consistent or new participation in social activity and depression varied across the elements of social activity: it was particularly strong for leisure, culture or sports clubs, and for family or school reunions.
Conclusion:
To improve the mental health of the population aged 45 years or more in South Korea, promoting their continued or new participation in leisure/culture clubs and family or school reunions might be needed.