The study aimed to investigate the incidence and risk factors associated with myocarditis and pericarditis following SARS-CoV-2 (COVID-19) vaccination, addressing a notable gap in understanding the safety profile of the vaccines. Using data selected from the National Health Insurance System (NHIS) database of Korea, the researchers employed both a case-crossover study and a nested case-control design to analyze temporal patterns and risk factors related to carditis occurrences post-immunization. Key findings revealed a significant association between SARS-CoV-2 vaccination and the occurrence of carditis, with a strong temporal correlation observed within 10 days post-vaccination. Noteworthy factors contributing to carditis risk included the interval between vaccination and carditis onset, specific comorbidities, and medication use. The study concluded by recommending an extended post-vaccination surveillance duration of at least 10 days and underscored the importance of considering individual medical histories and concurrent medication use when assessing the risk of vaccine-induced carditis. This study might contribute to understanding vaccine safety profiles and emphasizes the significance of comprehensive post-vaccination monitoring protocols.
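The case-crossover component described above compares each patient's vaccine exposure in a risk window shortly before carditis onset with exposure in an earlier control window. The sketch below is illustrative only, not the authors' analysis: the 10-day windows, the 30-day washout, the variable names, and the discordant-pair odds ratio are assumptions.

```python
# Illustrative case-crossover sketch (not the authors' code). Exposure in a
# 10-day hazard window before carditis onset is contrasted with exposure in
# an earlier control window of equal length; window lengths and the washout
# gap are assumptions for illustration only.
from datetime import timedelta
import math

def case_crossover_or(cases, window_days=10, washout_days=30):
    """cases: iterable of dicts with 'onset' (datetime) and 'vaccinations'
    (list of datetimes). Returns the matched odds ratio and its 95% CI,
    computed from discordant cases only."""
    hazard_only = 0   # vaccinated in the hazard window but not the control window
    control_only = 0  # vaccinated in the control window but not the hazard window
    for c in cases:
        hazard_start = c["onset"] - timedelta(days=window_days)
        control_end = hazard_start - timedelta(days=washout_days)
        control_start = control_end - timedelta(days=window_days)
        in_hazard = any(hazard_start <= d <= c["onset"] for d in c["vaccinations"])
        in_control = any(control_start <= d <= control_end for d in c["vaccinations"])
        if in_hazard and not in_control:
            hazard_only += 1
        elif in_control and not in_hazard:
            control_only += 1
    odds_ratio = hazard_only / control_only
    se = math.sqrt(1 / hazard_only + 1 / control_only)
    ci = (math.exp(math.log(odds_ratio) - 1.96 * se),
          math.exp(math.log(odds_ratio) + 1.96 * se))
    return odds_ratio, ci
```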
Although attempts to apply virtual reality (VR) in mental healthcare are rapidly increasing, it is still unclear whether VR relaxation can reduce stress more than conventional biofeedback.
Methods:
Participants consisted of 83 healthy adult volunteers with high stress, defined as a score of 20 or more on the Perceived Stress Scale-10 (PSS-10). This study used an open, randomized, crossover design with baseline, stress, and relaxation phases. During the stress phase, participants experienced an intentionally generated shaking VR environment and serial-7 subtraction. For the relaxation phase, participants underwent one of the two relaxation sessions (VR relaxation or biofeedback), randomly assigned on day 1, and the other type of relaxation session on day 2. We compared the State-Trait Anxiety Inventory-X1 (STAI-X1), STAI-X2, the Numeric Rating Scale (NRS), and physiological parameters, including heart rate variability (HRV) indices, between the stress and relaxation phases.
Results:
A total of 74 participants were included in the analyses. At baseline, the median age of participants was 39 years, the mean STAI-X1 score was 47.27 (SD = 9.92), and the mean NRS score was 55.51 (SD = 24.48). Both VR and biofeedback significantly decreased STAI-X1 and NRS from the stress phase to the relaxation phase, while the difference in effect between VR and biofeedback was not significant. However, there was a significant difference in electromyography, LF/HF ratio, LF total, and NN50 between VR relaxation and biofeedback.
Conclusion:
VR relaxation was effective in reducing subjectively reported stress in individuals with high stress.
Critical congenital heart disease (CCHD) refers to a group of heart defects that cause serious, life-threatening symptoms in the neonatal period and require timely surgical or catheter interventions. We explored the current status of the CCHD burden and the effect of early diagnosis of CCHD on mortality using the Korean national health insurance (NHI) data.
Methods
We analyzed the national health insurance (NHI) data from 2014 to 2018. We identified CCHD patients using diagnosis codes and intervention codes from the claims data and analyzed the prevalence, mortality, and medical expenditure of CCHD. We linked neonatal data with the mothers' medical claims data and developed a retrospective cohort data set to analyze the effect of early diagnosis on mortality and related outcomes of CCHD treatment.
Results
The annual prevalence of neonatal CCHD in Korea was 0.144%. Of a total of 2,241 neonates with CCHD, 1,546 (69.0%) underwent cardiac ultrasound within three days after birth, and the mothers of 419 neonates (18.7%) had a record of prenatal fetal ultrasound. When comparing neonates diagnosed with CCHD within three days of birth with those diagnosed on or after day 4, the probability of early diagnosis was higher for preterm infants and infants with low birth weight. Regarding mortality, most types of CCHD showed a significantly higher mortality rate in the early diagnosis group.
Conclusions
The high mortality rate despite a high early diagnosis rate likely reflects the high percentage of patients with severe conditions that cause serious symptoms within three days of birth. More than half of the neonates with CCHD were found not to have undergone a prenatal fetal ultrasound, rendering this an important policy target.
Critical congenital heart disease (CCHD) refers to a group of heart defects that cause serious, life-threatening symptoms in the neonatal period and require timely surgical or catheter interventions. We reviewed the evidence for incorporating a mandatory neonatal CCHD screening test as a national public health project for all neonates born in Korea by analyzing the validity and cost-effectiveness of neonatal CCHD screening using pulse oximetry in Korea.
Methods
We performed a rapid literature review to establish models for the diagnostic accuracy and economic evaluation of pulse oximetry. We also analyzed the prevalence, mortality, and medical expenditure for different types of CCHD using the national health insurance (NHI) data. We analyzed the cost-effectiveness of pulse oximetry by comparing the group of neonates who received a combination of a physical examination and pulse oximetry with the group of neonates who received only a physical examination. For the cost-effectiveness analysis of the CCHD screening test in this study, we used a duration of one year, diagnostic accuracy as the clinical endpoint, and Life Year Gain (LYG) as the effectiveness indicator.
Results
Based on a recent systematic review, the pooled sensitivity can be enhanced from 76.5 percent (pulse oximetry alone) to 92 percent (combined with physical examination). Data from a total of 2,334 neonates with CCHD were used for the economic model. Our analysis revealed that adding pulse oximetry to the routine neonatal physical examination leads to 2.34 LYG at a cost difference of USD 1,080,602, yielding an ICER of KRW 610,063,240 (USD 461,857) per LYG.
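As a quick arithmetic check (illustrative only; the published cost and LYG figures are rounded, so the ratio is close to, but not exactly, the reported ICER):

```python
# ICER = incremental cost / incremental effectiveness, using the rounded
# figures reported above.
incremental_cost_usd = 1_080_602   # cost difference: pulse oximetry + exam vs. exam alone
incremental_lyg = 2.34             # life years gained

icer = incremental_cost_usd / incremental_lyg
print(f"ICER ~ USD {icer:,.0f} per LYG")  # ~461,796; the abstract reports USD 461,857
```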
Conclusions
Considering the life years gained and the reduced cost of complications and after-effects among newborns with CCHD who survive thanks to early diagnosis, a mandatory screening test is considered worthwhile in Korea.
Nosocomial transmission of COVID-19 among immunocompromised hosts can have a serious impact on COVID-19 severity, underlying disease progression and SARS-CoV-2 transmission to other patients and healthcare workers within hospitals. We experienced a nosocomial outbreak of COVID-19 in a daycare unit for paediatric and young adult cancer patients. Between 9 and 18 November 2020, 473 individuals (181 patients, 247 caregivers/siblings and 45 staff members) were exposed to the index case, a member of the nursing staff. Among them, three patients and four caregivers were infected. Two 5-year-old cancer patients with COVID-19 were not severely ill, but a 25-year-old cancer patient showed prolonged shedding of SARS-CoV-2 RNA for at least 12 weeks and probably infected his mother at home approximately 7–8 weeks after the initial diagnosis. Except for this case, no secondary transmission was observed from the confirmed cases in either the hospital or the community. To conclude, in the daycare setting for immunocompromised children and young adults, the rate of in-hospital transmission of SARS-CoV-2 was 1.6% under a stringent policy of infection prevention and control, including universal mask use and rapid, extensive contact investigation. Severely immunocompromised children and young adults with COVID-19 should be carefully managed after the mandatory isolation period, keeping in mind the possibility of prolonged shedding of live virus.
The study aims to examine whether cognitive deficits differ between patients with early-stage Alzheimer's disease (AD) and patients with early-stage vascular dementia (VaD) using the Korean version of the CERAD neuropsychological battery (CERAD-K-N).
Methods
Patients with early-stage dementia (global Clinical Dementia Rating [CDR] 0.5 or 1) were consecutively recruited among first visitors to a dementia clinic; 257 AD patients and 90 VaD patients completed the protocol of the Korean version of the CERAD clinical assessment battery. The CERAD-K-N was administered for a comprehensive evaluation of neuropsychological function.
Results
Of the total 347 participants, 257 (69.1%) were in the AD group (CDR 0.5 = 66.9%) and 90 (21.9%) were in the VaD group (CDR 0.5 = 40.0%). Patients with very mild AD showed poorer performance on the Boston Naming Test (BNT) (P = 0.028), word list memory test (P < 0.001), word list recall test (P < 0.001) and word list recognition test (WLRcT) (P = 0.006) than those with very mild VaD after adjustment for the MMSE-KC T score. However, performance on Trail Making A (TMA) was more impaired in the VaD group than in the AD group. Performance on the WLRcT (P < 0.001) was the worst among the neuropsychological tests within the AD group, whereas TMA was performed worst within the VaD group.
Conclusions
Patients with early-stage AD have more cognitive deficits in memory and language, while patients with early-stage VaD show worse cognitive function in attention/processing speed. In addition, memory dysfunction appears to be the first cognitive deficit in AD, and a deficit in attention/processing speed the first in VaD.
Disclosure of interest
The authors have not supplied their declaration of competing interest.
Serotonergic dysfunction may play an important role in motor and nonmotor symptoms of Parkinson’s disease (PD). The loudness dependence of auditory evoked potentials (LDAEP) has been used to evaluate serotonergic activity. Therefore, this study aimed to determine central serotonergic activity using LDAEP in de novo PD according to the age at onset and changes in serotonergic activity after dopaminergic treatment.
Methods:
A total of 30 patients with unmedicated PD, 16 in the early-onset and 14 in the late-onset groups, were enrolled. All subjects underwent a comprehensive neurological examination, laboratory tests, the Unified Parkinson’s Disease Rating Scale, and LDAEP. The LDAEP was calculated as the slope of the two N1/P2 peaks measured at the Cz electrode, first under baseline conditions (pretreatment) and a second time after 12 weeks of dopaminergic medication (post-treatment).
Results:
The absolute values of the pretreatment N1/P2 LDAEP (early-onset vs. late-onset: 0.99 ± 0.68 vs. 1.62 ± 0.88, p = 0.035) and the post-treatment N1 LDAEP (early-onset vs. late-onset: −0.61 ± 0.61 vs. −1.26 ± 0.91, p = 0.03) were significantly lower in the early-onset group than in the late-onset group. In addition, a higher value of the pretreatment N1/P2 LDAEP was significantly associated with the late-onset group (coefficient = 1.204, p = 0.044). The absolute value of the N1 LDAEP decreased after 12 weeks of taking dopaminergic medication (pretreatment vs. post-treatment: −1.457 ± 1.078 vs. −0.904 ± 0.812, p = 0.0018).
Conclusions:
Based on the results of this study, LDAEP could be a marker for serotonergic neurotransmission in PD. Central serotonergic activity assessed by LDAEP may be more preserved in early-onset PD patients and can be altered with dopaminergic medication.
Scholars often assume that reference groups are industry-wide, homogeneous, and stable. We examine this assumption and develop hypotheses based on managers’ motivations such as self-enhancement and self-improvement, social identity, and affiliation-based impression management. We test hypotheses on failure-induced changes in reference groups and their direction in terms of upward and downward comparisons. An empirical examination of changes in reference groups for firms listed on the Dow Jones Industrial Average Index between 1993 and 2008 shows that performance below social aspirations induces changes in reference groups, and that these changes tend toward upward comparisons. The results indicate that managers can choose to change the reference group – a cognition-centered response – as an alternative to such action-centered responses as organizational search and risk-taking when performance falls below social aspirations, and that upward comparisons following social performance shortfalls may serve both to give a better impression and to improve firm performance.
The National Institute of Neurological Disorders and Stroke-Canadian Stroke Network (NINDS-CSN) 5-minute neuropsychology protocol consists only of verbal tasks and has been proposed as a brief screening method for vascular cognitive impairment. We evaluated its feasibility within two weeks after stroke and its ability to predict the development of post-stroke dementia (PSD) at 3 months after stroke.
Method:
We prospectively enrolled subjects with ischemic stroke within seven days of symptom onset who were consecutively admitted to 12 university hospitals. Neuropsychological assessments using the NINDS-CSN 5-minute and 60-minute neuropsychology protocols were administered within two weeks and at 3 months after stroke onset, respectively. PSD was diagnosed with reference to the American Heart Association/American Stroke Association statement, requiring deficits in at least two cognitive domains.
Results:
Of 620 patients, 512 (82.6%) were able to complete the NINDS-CSN 5-minute protocol within two weeks after stroke. The incidence of PSD was 16.2% among the 308 subjects who completed follow-up at 3 months after stroke onset. The total score of the NINDS-CSN 5-minute protocol differed significantly between those with and without PSD (4.0 ± 2.7 and 7.4 ± 2.7, respectively; p < 0.01). A cut-off value of 6/7 showed reasonable discriminative power (sensitivity 0.82, specificity 0.67, AUC 0.74). The NINDS-CSN 5-minute protocol score was a significant predictor of PSD (adjusted odds ratio 6.32, 95% CI 2.65–15.05).
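As an illustration of how such a cut-off is characterized (not the study's analysis code; the variable names and data handling below are assumptions), sensitivity, specificity, and AUC for a 6/7 cut-off could be computed as follows:

```python
# Illustrative sketch: evaluating a 6/7 cut-off on the 5-minute protocol
# total score against 3-month PSD status. Lower scores indicate higher risk,
# so the test is "positive" when the score is <= 6 and the score is negated
# for the AUC. Variable names are assumptions.
import numpy as np
from sklearn.metrics import roc_auc_score

def screening_performance(total_scores, psd, cutoff=6):
    scores = np.asarray(total_scores, dtype=float)
    psd = np.asarray(psd, dtype=bool)            # True = developed PSD at 3 months
    positive = scores <= cutoff
    sensitivity = (positive & psd).sum() / psd.sum()
    specificity = (~positive & ~psd).sum() / (~psd).sum()
    auc = roc_auc_score(psd, -scores)            # higher (negated) score = higher risk
    return sensitivity, specificity, auc
```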
Discussion:
The NINDS-CSN 5-minute protocol is feasible for evaluating cognitive function in patients with acute ischemic stroke. It might be a useful screening method for early identification of groups at high risk of PSD.
During the past decade, carbapenemase-producing Enterobacteriaceae (CPE) has emerged and spread across the world.1 The major carbapenemase enzymes currently being reported are KPC, NDM-1, VIM, IMP, and OXA.2 Because carbapenemase can be effectively transmitted via mobile genetic elements, and current therapeutic options for CPE infections are extremely limited, CPE may be one of the most serious contemporary threats to public health. However, very little is known about the characteristics of CPE carriage during hospitalization. The aims of this study were to investigate the clearance rate of CPE carriage and determine the number of consecutive negative cultures required to confirm CPE clearance. We also examined CPE transmission among hospitalized patients.
Infect. Control Hosp. Epidemiol. 2015;36(11):1361–1362
Decreased hemoglobin levels increase the risk of developing dementia among the elderly. However, the underlying mechanisms that link decreased hemoglobin levels to incident dementia remain unclear, possibly because few studies have reported on the relationship between low hemoglobin levels and neuroimaging markers. We therefore investigated the relationships between decreased hemoglobin levels, cerebral small-vessel disease (CSVD), and cortical atrophy in cognitively healthy women and men.
Methods:
Cognitively normal women (n = 1,022) and men (n = 1,018) who underwent medical check-ups and magnetic resonance imaging (MRI) were enrolled at a health promotion center. We measured hemoglobin levels, white matter hyperintensity (WMH) scales, lacunes, and microbleeds. Cortical thickness was automatically measured using surface-based methods. Multivariate regression analyses were performed after controlling for possible confounders.
Results:
Decreased hemoglobin levels were not associated with the presence of WMH, lacunes, or microbleeds in women and men. Among women, decreased hemoglobin levels were associated with decreased cortical thickness in the frontal (estimate −0.007, 95% CI −0.013 to −0.001), temporal (−0.010, 95% CI −0.018 to −0.002), parietal (−0.009, 95% CI −0.015 to −0.003), and occipital regions (−0.011, 95% CI −0.019 to −0.003). Among men, however, no associations were observed between hemoglobin levels and cortical thickness.
Conclusion:
Our findings suggested that decreased hemoglobin levels contributed to cortical atrophy, but not to increased CSVD, among women, although the association is modest. Given the paucity of modifiable risk factors for age-related cognitive decline, our results have important public health implications.
During the past decades, a rapid nutritional transition has been observed along with economic growth in the Republic of Korea. Since this dramatic change in diet has been frequently associated with cancer and other non-communicable diseases, dietary monitoring is essential to understand the association. Benefiting from pre-existing standardised dietary methodologies, the present study aimed to evaluate the feasibility and describe the development of a Korean version of the international computerised 24 h dietary recall method (GloboDiet software) and its complementary tools, developed at the International Agency for Research on Cancer (IARC), WHO. Following established international Standard Operating Procedures and guidelines, about seventy common and country-specific databases on foods, recipes, dietary supplements, quantification methods and coefficients were customised and translated. The main results of the present study highlight the specific adaptations made to adapt the GloboDiet software for research and dietary surveillance in Korea. New (sub-)subgroups were added to the existing common food classification, and new descriptors were added to the facets to classify and describe specific Korean foods. Quantification methods were critically evaluated and adapted considering the foods and food packages available in the Korean market. Furthermore, a picture book of foods/dishes was prepared, including new pictures and food portion sizes relevant to the Korean diet. The development of the Korean version of GloboDiet demonstrated that it was possible to adapt the IARC-WHO international dietary tool to an Asian context without compromising its concept of standardisation and software structure. It thus confirms that this international dietary methodology, used so far only in Europe, is flexible and robust enough to be customised for other regions worldwide.
There is increasing evidence of a relationship between underweight or obesity and dementia risk. Several studies have investigated the relationship between body weight and brain atrophy, a pathological change preceding dementia, but their results are inconsistent. Therefore, we aimed to evaluate the relationship between body mass index (BMI) and cortical atrophy among cognitively normal participants.
Methods:
We recruited cognitively normal participants (n = 1,111) who underwent medical check-ups and detailed neurologic screening, including magnetic resonance imaging (MRI), at health screening visits between September 2008 and December 2011. The main outcome was cortical thickness measured using MRI. The numbers of men/women in the five BMI groups (underweight, normal, overweight, mild obesity, and moderate to severe obesity) were 9/9, 148/258, 185/128, 149/111, and 64/50, respectively. Linear and non-linear relationships between BMI and cortical thickness were examined using multiple linear regression analysis and generalized additive models after adjustment for potential confounders.
Results:
Among men, underweight participants showed significant cortical thinning in the frontal and temporal regions compared to normal-weight participants, while overweight and mildly obese participants had greater cortical thickness in the frontal region and in the frontal, temporal, and occipital regions, respectively. However, cortical thickness in each brain region did not differ significantly between the normal-weight and moderate to severe obesity groups. Among women, the association between BMI and cortical thickness was not statistically significant.
Conclusions:
Our findings suggested that underweight might be an important risk factor for pathological changes in the brain, while overweight or mild obesity may be inversely associated with cortical atrophy in cognitively normal elderly males.
Epidemiological studies have reported that higher education (HE) is associated with a reduced risk of incident Alzheimer's disease (AD). However, after the clinical onset of AD, patients with HE levels show more rapid cognitive decline than patients with lower education (LE) levels. Although education level and cognition have been linked, there have been few longitudinal studies investigating the relationship between education level and cortical decline in patients with AD. The aim of this study was to compare the topography of cortical atrophy longitudinally between AD patients with HE (HE-AD) and AD patients with LE (LE-AD).
Methods:
We prospectively recruited 36 patients with early-stage AD and 14 normal controls. The patients were classified into two groups according to educational level: 23 HE-AD (>9 years) and 13 LE-AD (≤9 years).
Results:
As AD progressed over the 5-year longitudinal follow-up, the HE-AD group showed a significant group-by-time interaction in the right dorsolateral frontal and precuneus regions and the left parahippocampal region compared to the LE-AD group.
Conclusion:
Our preliminary longitudinal findings reveal that HE accelerates cortical atrophy in AD patients over time, which underlines the importance of education level for predicting prognosis.
This study aimed to investigate the influences of age, education, and gender on the two total scores (TS-I and TS-II) of the Consortium to Establish a Registry for Alzheimer's Disease Neuropsychological assessment battery (CERAD-NP) and to provide normative information based on an analysis for a large number of elderly persons with a wide range of educational levels.
Methods:
In the study, 1,987 community-dwelling healthy volunteers (620 males and 1,367 females; 50–90 years of age; and zero to 25 years of education) were included. People with serious neurological, medical, and psychiatric disorders (including dementia) were excluded. All participants underwent the CERAD-NP assessment. TS-I was generated by summing raw scores from the CERAD-NP subtests, excluding Mini-Mental State Examination and Constructional Praxis (CP) recall subtests. TS-II was calculated by adding CP recall score to TS-I.
Results:
Both TS-I and TS-II were significantly influenced by demographic variables. Education accounted for the greatest proportion of score variance. An interaction effect between age and gender was also found. Based on these results, normative data for the CERAD-NP total scores were stratified by age (six overlapping tables), education (four strata), and gender.
Conclusions:
The normative information will be very useful for better interpretation of the CERAD-NP total scores in various clinical and research settings and for comparing individuals’ performance on the battery across countries.
Genetic variation in wild soybean (Glycine soja Sieb. and Zucc.) is a valuable resource for crop improvement efforts. Soybean is believed to have originated in China, Korea, and Japan, but little is known about the diversity or evolution of Korean wild soybean. Therefore, in this study, we evaluated the genetic diversity and population structure of 733 G. soja accessions collected in Korea using 21 simple sequence repeat (SSR) markers. The SSR loci produced 539 alleles (25.7 per locus) with a mean genetic diversity of 0.882 in these accessions. Rare alleles, those with a frequency of less than 5%, represented 75% of the total number. The collection was divided into two populations based on principal coordinate analysis. Accessions from population 1 were distributed throughout the country, whereas most of the accessions from population 2 were distributed on the western side of the Taebaek and Sobaek mountains. The Korean G. soja collection evaluated in this study should provide useful background information for allele-mining approaches and for breeding programmes to introgress alleles from wild soybean into the cultivated soybean (G. max (L.) Merr.).
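For context, the per-locus "genetic diversity" reported for SSR data is usually the expected heterozygosity, H = 1 − Σp_i², averaged across loci; that the authors used exactly this estimator is our assumption. A minimal sketch:

```python
# Gene diversity (expected heterozygosity) per SSR locus, H = 1 - sum(p_i^2),
# and its mean across loci. This is the usual estimator behind figures like
# the 0.882 reported above; assuming it here is our own inference.
from collections import Counter

def gene_diversity(allele_calls):
    """allele_calls: list of allele identifiers observed at one locus."""
    n = len(allele_calls)
    freqs = [count / n for count in Counter(allele_calls).values()]
    return 1.0 - sum(p ** 2 for p in freqs)

def mean_gene_diversity(loci):
    """loci: list of allele-call lists, one per SSR locus."""
    return sum(gene_diversity(calls) for calls in loci) / len(loci)
```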
Although there are rapidly growing concerns about the high rates of cognitive dysfunction in Korea, knowledge of the risk factors for Alzheimer’s disease (AD) among the general public in Korea remains to be elucidated.
Methods:
A total of 2767 randomly selected subjects from the Ansan Geriatric Study were questioned on their knowledge of putative risk factors for AD. Their answers were compared with their sociodemographic data and other variables.
Results:
The most commonly stated risk factor was older age (59.6%), followed by head trauma (33.6%) and cerebrovascular disease (30.4%). However, a low level of education, which is a known risk factor, was considered significant by only 9.5% of the subjects. Predictors of poorer knowledge of the risk factors for AD were older age, a lower level of education, lower economic status and the attitude that dementia is not curable.
Conclusion:
This study revealed that misunderstanding about AD is more prevalent in older subjects and those with a lower level of education, and so public health education on the basic concepts of AD should be targeted at this population.
Sources of variation in nutrient intake have been examined for Western diets, but little is known about the sources of variation and their differences by age and sex among Koreans. We examined sources of variation in nutrient intake and calculated the number of days needed to estimate usual intake using 12 d of dietary records (DR). To this end, four 3 d DR including two weekdays and one weekend day were collected throughout four seasons of 1 year from 178 male and 236 female adults aged 20–65 years residing in Seoul, Korea. The sources of variation were estimated using the random-effects model, and the variation ratio (within-individual:between-individual) was calculated to determine a desirable number of days. Variations attributable to the day of the week, recording sequence and seasonality were generally small, although the degree of variation differed by sex and age (20–45 years and 46–65 years). The correlation coefficient between the true intake and the observed intake (r) increased with additional DR days, reaching 0·7 at 3–4 d and 0·8 at 6–7 d. However, the degree of increase became attenuated with additional days: r increased by 13·0–26·9 % from 2 to 4 d, by 6·5–16·4 % from 4 to 7 d and by 4·0–11·6 % from 7 to 12 d for energy and fifteen nutrients. In conclusion, the present study suggests that the day of the week, recording sequence and seasonality minimally contribute to the variation in nutrient intake. To measure Korean usual dietary intake using open-ended dietary instruments, 3–4 d may be needed to achieve modest precision (r>0·7) and 6–7 d for high precision (r>0·8).
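The relation between the within:between-individual variance ratio and the number of recording days is commonly expressed as r = [1 + VR/d]^(-1/2), which rearranges to d = VR·r²/(1 − r²); whether the authors used exactly this formulation is an assumption. The sketch below shows that a variance ratio of about 3·6 (assumed for illustration) roughly reproduces the reported thresholds of 3–4 d for r > 0·7 and 6–7 d for r > 0·8.

```python
# Standard relations linking the within:between-individual variance ratio
# (VR = s_w^2 / s_b^2) to the expected correlation r between observed and
# true intake for d days of records, and the days needed for a target r.
# The specific formulation and the example VR of 3.6 are assumptions.
import math

def expected_r(variance_ratio, days):
    return 1.0 / math.sqrt(1.0 + variance_ratio / days)

def days_needed(variance_ratio, target_r):
    return variance_ratio * target_r ** 2 / (1.0 - target_r ** 2)

# With VR = 3.6: days_needed(3.6, 0.7) ~ 3.5 and days_needed(3.6, 0.8) ~ 6.4,
# consistent with the 3-4 d and 6-7 d figures quoted above.
```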
Vitamin D insufficiency is known to be related to cardiometabolic disorders; however, the associations between serum 25-hydroxyvitamin D (25(OH)D) concentration, metabolic syndrome and cardiometabolic risk factors in children and adolescents have not yet been clearly delineated. For this reason, we investigated the relationship between serum 25(OH)D concentration, metabolic syndrome and cardiometabolic risk factors among Korean adolescents.
Design
We performed a cross-sectional analysis and used hierarchical multivariate logistic regression analysis models to adjust for confounding variables.
Setting
We used the data gathered during the 2008–2009 Korea National Health and Nutrition Examination Survey (KNHANES).
Subjects
Our subjects included 1504 Korean adolescents aged 12–18 years who participated in the KNHANES.
Results
Vitamin D insufficiency, defined as 25(OH)D concentration <50 nmol/l, was found in 75·3 % of Korean adolescents and was associated with an increased prevalence of metabolic syndrome. Waist circumference and BMI were the cardiometabolic components of metabolic syndrome most closely correlated with serum 25(OH)D status, but no significant relationship was found between serum 25(OH)D concentration and insulin resistance or the risks of high blood pressure, hyperglycaemia, reduced HDL-cholesterol or hypertriacylglycerolaemia, with or without adjustment for confounding variables.
Conclusions
Low serum 25(OH)D concentration appears to be associated with several cardiometabolic risk factors and an increased prevalence of metabolic syndrome in Korean adolescents.
We aimed to assess the prevalence and associated factors of vitamin D deficiency in healthy adolescents and to determine parent–adolescent association in vitamin D status.
Design
A cross-sectional study.
Setting
Data from the Korean National Health and Nutrition Examination Survey (KNHANES) 2008–2009. Serum 25-hydroxyvitamin D (25(OH)D) levels were measured using 125I-labelled RIA kits. Vitamin D deficiency in adolescents was defined as 25(OH)D level <27·5 nmol/l, and 25(OH)D levels between 27·5 and <50 nmol/l were considered insufficient. For the parents, vitamin D deficiency was defined as 25(OH)D level <50 nmol/l.
Subjects
The study population consisted of 2062 adolescents (1095 boys, 967 girls; aged 10–18 years) and their parents (1005 fathers, 1341 mothers).
Results
Overall, 13·4 % of adolescents (boys 11·7 %, girls 15·4 %) were 25(OH)D deficient and 54·7 % were 25(OH)D insufficient. The prevalence of vitamin D deficiency increased with age (P < 0·0001). Parental vitamin D deficiency was more prevalent among vitamin D-deficient adolescents than among non-deficient adolescents (all P < 0·0001). In multivariate logistic regression analyses, predictors of vitamin D deficiency were being a senior high school student (OR = 3·45–4·33), the winter/spring season (OR = 3·18–5·11/5·35–7·36) and parental vitamin D deficiency (OR = 1·78–4·88; all P < 0·05).
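For illustration, adjusted odds ratios like those above come from exponentiating logistic regression coefficients (OR = e^β, with a 95% CI of e^(β ± 1.96·SE)). The sketch below is not the survey's analysis code: the variable names are assumptions, and the published analysis would additionally account for the complex KNHANES sampling design.

```python
# Illustrative sketch of a multivariate logistic regression yielding adjusted
# odds ratios; predictors (school level, season, parental deficiency, etc.)
# and the outcome coding are assumptions, and survey weights are ignored here.
import numpy as np
import statsmodels.api as sm

def adjusted_odds_ratios(X, y):
    """X: DataFrame of predictors; y: 1 = vitamin D deficient, 0 = not."""
    model = sm.Logit(y, sm.add_constant(X)).fit(disp=0)
    odds_ratios = np.exp(model.params)        # OR = exp(coefficient)
    conf_int = np.exp(model.conf_int())       # 95% CI = exp(coefficient +/- 1.96*SE)
    return odds_ratios, conf_int
```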
Conclusions
Vitamin D insufficiency is prevalent among healthy Korean adolescents and the parent–offspring association warrants vitamin D screening for family members of deficient individuals.