Emerging evidence indicates that gene–environment interactions (GEIs) are important underlying mechanisms for the development of schizophrenia (SZ). We investigated the associations of polygenic risk score for SZ (PRS-SZ), environmental measures, and their interactions with case–control status and clinical phenotypes among patients with schizophrenia spectrum disorders (SSDs).
Methods
PRS-SZs for 717 SSD patients and 356 healthy controls (HCs) were calculated using the LDpred model. The Korea-Polyenvironmental Risk Score-I (K-PERS-I) and the Early Trauma Inventory-Self Report (ETI-SR) were used as environmental measures. Logistic and linear regression analyses were performed to identify associations of the PRS-SZ and the two environmental measures with case–control status and clinical phenotypes.
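As a purely illustrative sketch (not the authors' analysis code), the snippet below fits a logistic regression of case–control status on a standardized polygenic score and an environmental score plus their multiplicative interaction; the file name and column names (prs_sz, kpers_total, case, age, sex) are assumptions.

```python
# Illustrative gene-environment interaction (GEI) test via logistic regression.
# Assumed columns: case (0/1), prs_sz, kpers_total, age, sex.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("ssd_cohort.csv")  # hypothetical data file

# Standardize the genetic and environmental scores so the interaction term is interpretable.
for col in ["prs_sz", "kpers_total"]:
    df[col] = (df[col] - df[col].mean()) / df[col].std()

# 'prs_sz * kpers_total' expands to both main effects plus their product (the multiplicative GEI term).
model = smf.logit("case ~ prs_sz * kpers_total + age + C(sex)", data=df).fit()
print(model.summary())
```

An additive interaction would additionally be assessed on the risk-difference scale (for example via the relative excess risk due to interaction), which is not shown here.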
Results
The PRS-SZ explained 8.7% of SZ risk. The PRS-SZ and the K-PERS-I total score showed stronger associations with case–control status than the ETI-SR total score, and a significant additive interaction was found between the PRS-SZ and the K-PERS-I. Using the subdomains of the K-PERS-I and ETI-SR, we identified significant multiplicative or additive interactions of the PRS-SZ with parental socioeconomic status (pSES), childhood adversity, and recent life events in association with case–control status. For clinical phenotypes, significant interactions were observed between the PRS-SZ and the ETI-SR total score for negative-self, and between the PRS-SZ and obstetric complications within the K-PERS-I for negative-others.
Conclusions
Our findings suggest that aggregate genetic and environmental measures, the PRS-SZ and the K-PERS-I, can predict case–control status more accurately, and that specific environmental measures may be more suitable for exploring GEIs.
This study examines the presence of bacterial contamination on surgical gloves and suggests appropriate measures for maintaining an aseptic surgical environment. To prevent glove contamination during surgery, surgeons and assistants should change gloves periodically, and scrub nurses should take care when opening packages and handing over implants.
To evaluate the impact of a vancomycin-resistant Enterococcus (VRE) screening policy change on the incidence of healthcare-associated (HA)-VRE bacteremia in an endemic hospital setting.
Design:
A quasi-experimental before-and-after study.
Setting:
A 1,989-bed tertiary-care referral center in Seoul, Republic of Korea.
Methods:
Since May 2010, our hospital has reduced VRE screening for admitted patients transferred from other healthcare facilities. We assessed the impact of this policy change on the incidence of HA-VRE bacteremia using segmented autoregression analysis of an interrupted time series from January 2006 to December 2014, at the hospital and unit levels. In addition, we compared the molecular characteristics of VRE blood isolates collected before and after the screening policy change using multilocus sequence typing and pulsed-field gel electrophoresis.
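For illustration only, a segmented interrupted-time-series regression with autoregressive errors along the lines described above might look like the sketch below; the file, column names, and the intervention month index are assumptions.

```python
# Segmented regression of a monthly incidence series with AR(1) errors (illustrative).
import numpy as np
import pandas as pd
import statsmodels.api as sm

ts = pd.read_csv("vre_monthly.csv")            # assumed columns: month, rate
intervention = 52                              # assumed index of the policy-change month

ts["time"] = np.arange(len(ts))                                  # underlying trend
ts["level"] = (ts["time"] >= intervention).astype(int)           # step (level) change
ts["trend"] = np.maximum(0, ts["time"] - intervention)           # post-change slope change

X = sm.add_constant(ts[["time", "level", "trend"]])
res = sm.GLSAR(ts["rate"], X, rho=1).iterative_fit(maxiter=10)   # AR(1) error structure
print(res.summary())                           # 'trend' coefficient estimates the change in slope
```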
Results:
After the VRE screening policy change, the incidence of hospital-wide HA-VRE bacteremia increased, although no significant changes in level or slope were observed. At the unit level, however, a significant slope change in the incidence of HA-VRE bacteremia (change in slope, 0.007; 95% CI, 0.001–0.013; P = .02) was observed in the hemato-oncology department. Molecular analysis revealed that various VRE sequence types appeared after the policy change and that clonally related strains became more predominant (increasing from 26.1% to 59.3%).
Conclusions:
The incidence of HA-VRE bacteremia increased significantly after the VRE screening policy change, and this increase was driven mainly by high-risk patient populations. When planning VRE control programs in hospitals, approaches that take into account patients' risk of severe VRE infection may be required.
The network approach has been applied to a wide variety of psychiatric disorders. The aim of the present study was to identify the network structures of remitters and non-remitters among patients with first-episode psychosis (FEP) at baseline and at the 6-month follow-up.
Methods
Participants (n = 252) from the Korean Early Psychosis Study (KEPS) were enrolled and classified as remitters or non-remitters using Andreasen's criteria. We estimated the network structure using a Gaussian graphical model with 10 symptoms as nodes: three symptoms from the Positive and Negative Syndrome Scale, one depressive symptom, and six symptoms related to schema and rumination. Global and local network metrics were compared within and between the networks over time.
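The following is a minimal sketch of estimating such a symptom network with a Gaussian graphical model via the graphical lasso and computing node strength; the data file and the ten column names are placeholders rather than the study's variables.

```python
# Gaussian graphical model (graphical lasso) over 10 symptom scores, plus strength centrality.
import numpy as np
import pandas as pd
from sklearn.covariance import GraphicalLassoCV

symptoms = pd.read_csv("keps_symptoms.csv")        # assumed: 10 symptom columns
X = (symptoms - symptoms.mean()) / symptoms.std()  # standardize before estimation

gl = GraphicalLassoCV().fit(X.values)
precision = gl.precision_
d = np.sqrt(np.diag(precision))
partial_corr = -precision / np.outer(d, d)         # precision matrix -> partial correlations
np.fill_diagonal(partial_corr, 0.0)

strength = np.abs(partial_corr).sum(axis=0)        # node strength = sum of absolute edge weights
print(dict(zip(symptoms.columns, np.round(strength, 2))))
```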
Results
Global network metrics did not differ between the remitters and non-remitters at baseline or 6 months. However, the network structure and nodal strengths associated with positive-self and positive-others scores changed significantly in the remitters over time. Unique central symptoms for remitters and non-remitters were cognitive brooding and negative-self, respectively. The correlation stability coefficients for nodal strength were within the acceptable range.
Conclusion
Our findings indicate that network structure and some nodal strengths were more flexible in remitters. Negative-self could be an important target for therapeutic intervention.
High-quality diets have been found to be beneficial in preventing long-term weight gain. However, concurrent changes in diet quality and body weight over time have rarely been reported. We examined the association between 10-year changes in diet quality and body weight in the Multiethnic Cohort Study. Analyses included 53 977 African Americans, Native Hawaiians, Japanese Americans, Latinos and Whites, who completed both baseline (1993–1996, 45–69 years) and 10-year follow-up (2003–2008) surveys including a FFQ and had no history of heart disease or cancer. Using multivariable regression, weight changes were regressed on changes in four diet quality indexes, Healthy Eating Index-2015, Alternative Healthy Eating Index-2010, alternate Mediterranean Diet and Dietary Approaches to Stop Hypertension scores. Mean weight change over 10 years was 1·2 (sd 6·8) kg in men and 1·5 (sd 7·2) kg in women. Compared with stable diet quality (< 0·5 sd change), the greatest increase (≥ 1 sd increase) in the diet scores was associated with less weight gain (by 0·55–1·17 kg in men and 0·62–1·31 kg in women). Smaller weight gain with improvement in diet quality was found in most subgroups by race/ethnicity, baseline age and baseline BMI. The inverse association was stronger in younger age and higher BMI groups. Ten-year improvement in diet quality was associated with a smaller weight gain, which varied by race/ethnicity and baseline age and BMI. Our findings suggest that maintaining a high-quality diet and improving diet quality over time may prevent excessive weight gain.
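A rough sketch of the change-on-change analysis described above is given below: 10-year weight change is regressed on categorized change in one diet quality score relative to a stable-diet reference. The file and variable names are assumptions, not the Multiethnic Cohort's actual variables.

```python
# Illustrative regression of weight change on categorized diet-quality change.
import pandas as pd
import statsmodels.formula.api as smf

mec = pd.read_csv("mec_diet_change.csv")  # assumed columns below
# hei_change_cat: e.g. 'large_decrease', 'decrease', 'stable', 'increase', 'large_increase'
fit = smf.ols(
    "weight_change ~ C(hei_change_cat, Treatment('stable')) + baseline_age "
    "+ baseline_bmi + C(sex) + C(race_ethnicity)",
    data=mec,
).fit()
print(fit.params)  # category coefficients = adjusted differences in 10-year weight change
```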
To investigate the impacts of depression screening, diagnosis and treatment on major adverse cardiac events (MACEs) in acute coronary syndrome (ACS).
Methods
A prospective cohort study, including a nested 24-week randomised clinical trial of depression treatment, was performed with follow-up of 5–12 years after the index ACS. A total of 1152 patients recently hospitalised with ACS were recruited from 2006 to 2012 and were divided, by depression screening and diagnosis at baseline and by 24-week treatment allocation, into five groups: 651 screening negative (N), 55 screening positive but with no depressive disorder (S), 149 with depressive disorder randomised to escitalopram (E), 151 with depressive disorder randomised to placebo (P) and 146 with depressive disorder receiving medical treatment only (M).
Results
Cumulative MACE incidences over a median 8.4-year follow-up period were 29.6% in N, 43.6% in S, 40.9% in E, 53.6% in P and 59.6% in M. Compared to N, screening positive was associated with a higher incidence of MACE [adjusted hazard ratio 2.15 (95% confidence interval 1.63–2.83)]. No differences were found between screening-positive patients with and without a formal depressive disorder diagnosis. Of those screening positive, E was associated with a lower incidence of MACE than P and M. M had the worst outcomes even compared with P, despite significantly milder depressive symptoms at baseline.
Conclusions
Routine depression screening in patients with recent ACS and subsequent appropriate treatment of depression could improve long-term cardiac outcomes.
Firefighters are routinely exposed to various traumatic events and often experience a range of trauma-related symptoms. Although these repeated traumatic exposures rarely progress to the development of post-traumatic stress disorder, firefighters are still considered to be a vulnerable population with regard to trauma.
Aims
To investigate how the human brain responds to or compensates for the repeated experience of traumatic stress.
Method
We included 98 healthy firefighters with repeated traumatic experiences but without any diagnosis of mental illness and 98 non-firefighter healthy individuals without any history of trauma. Functional connectivity within the fear circuitry, which consists of the dorsal anterior cingulate cortex, insula, amygdala, hippocampus and ventromedial prefrontal cortex (vmPFC), was examined using resting-state functional magnetic resonance imaging. Trauma-related symptoms were evaluated using the Impact of Event Scale – Revised.
Results
The firefighter group had greater functional connectivity between the insula and several regions of the fear circuitry including the bilateral amygdalae, bilateral hippocampi and vmPFC as compared with healthy individuals. In the firefighter group, stronger insula–amygdala connectivity was associated with greater severity of trauma-related symptoms (β = 0.36, P = 0.005), whereas higher insula–vmPFC connectivity was related to milder symptoms in response to repeated trauma (β = −0.28, P = 0.01).
Conclusions
The current findings suggest an active involvement of insular functional connectivity in response to repeated traumatic stress. Functional connectivity of the insula in relation to the amygdala and vmPFC may be potential pathways that underlie the risk for and resilience to repeated traumatic stress, respectively.
This study aimed to investigate the associations among spirituality, coping strategies, and quality of life (QOL) in cancer patients, and the effects of depression and anxiety on these associations.
Method
In total, 237 cancer patients referred to a psycho-oncology clinic at a university hospital in Korea were enrolled. After identifying predictors of patient QOL in a stepwise regression model, we developed a hypothetical path model wherein interpersonal coping was considered as a mediating variable between spirituality (meaning/peace) and QOL and wherein depression and anxiety affected each of these three variables.
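As a hedged illustration of the hypothesized path, the sketch below runs a simple mediation analysis (spirituality to interpersonal coping to QOL, with depression and anxiety as covariates) using statsmodels; the file and variable names are assumed, and this is not the authors' path model.

```python
# Simple mediation analysis consistent with the described path model (illustrative).
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.mediation import Mediation

d = pd.read_csv("psycho_oncology.csv")  # assumed columns: qol, coping, meaning_peace, depression, anxiety

outcome_model = smf.ols("qol ~ coping + meaning_peace + depression + anxiety", data=d)
mediator_model = smf.ols("coping ~ meaning_peace + depression + anxiety", data=d)

med = Mediation(outcome_model, mediator_model, "meaning_peace", "coping")
print(med.fit(n_rep=500).summary())  # direct, indirect (mediated) and total effects
```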
Result
The direct effect of spirituality (meaning/peace) on QOL was 36.7%. In an indirect model, interpersonal coping significantly mediated the relationship between spirituality (meaning/peace) and QOL. Depression exerted the largest negative effect on spirituality (meaning/peace), interpersonal coping, and QOL. Anxiety had negative effects on spirituality (meaning/peace) and QOL, but a positive effect on interpersonal coping.
Significance of results
Interpersonal coping strategies act as a partial mediator of the relationship between the meaning/peace subscale of spirituality and QOL. Effective management of depression may help achieve better outcomes in these domains. Greater attention and effort to improve social connectedness and meaning in life, as aspects of spiritual well-being, may improve the QOL of cancer patients.
Our objective was to evaluate long-term altered appearance, distress, and body image in posttreatment breast cancer patients and compare them with those of patients undergoing active treatment and with general population controls.
Method:
We conducted a cross-sectional survey between May and December of 2010. We studied 138 breast cancer patients undergoing active treatment and 128 posttreatment patients from 23 Korean hospitals and 315 age- and area-matched subjects drawn from the general population. Breast, hair, and skin changes, distress, and body image were assessed using visual analogue scales and the EORTC BR–23. Average levels of distress were compared across groups, and linear regression was utilized to identify the factors associated with body image.
Results:
Compared to active-treatment patients, posttreatment patients reported similar breast changes (6.6 vs. 6.2), hair loss (7.7 vs. 6.7), and skin changes (5.8 vs. 5.4), and both groups had significantly more severe changes than the general population controls (p < 0.01). For a similar level of altered appearance, however, breast cancer patients experienced significantly higher levels of distress than the general population. In multivariate analysis, patients with high altered-appearance distress reported significantly poorer body image (–20.7; 95% CI = –28.3 to –13.1) than patients with low distress.
Significance of results:
Posttreatment breast cancer patients experienced similar levels of altered appearance, distress, and body-image disturbance relative to patients undergoing active treatment but significantly higher distress and poorer body image than members of the general population. Healthcare professionals should acknowledge the possible long-term effects of altered appearance among breast cancer survivors and help them to manage the associated distress and psychological consequences.
Background:
Patients clinically diagnosed with Parkinson's disease (PD) who subsequently turn out to have normal dopamine transporter images have been referred to as patients with scans without evidence of dopaminergic deficits (SWEDDs). Cardiovascular autonomic dysfunction has frequently been reported in PD. In this study, we determined the similarities and differences in cardiac autonomic dysfunction between SWEDDs and PD patients and investigated whether 24-hour ambulatory blood pressure monitoring (24-hour ABPM) can help identify possible SWEDDs cases.
Methods:
We enrolled 28 SWEDDs patients, 46 patients with PD, and 30 healthy controls. To evaluate cardiac autonomic function, 24-hour ABPM was performed on all subjects. Cardiac metaiodobenzylguanidine (MIBG) scintigraphy was performed on the SWEDDs and PD subjects.
Results:
The percentage nocturnal decline in blood pressure differed significantly among SWEDDs patients, PD patients, and controls (p < 0.05). In addition, abnormal nocturnal blood pressure regulation (nondipping and reverse dipping) was significantly more frequent in SWEDDs and PD subjects than in control subjects (p < 0.05). There was no significant correlation between the percentage nocturnal blood pressure reduction and parameters of the cardiac MIBG uptake ratio. However, orthostatic hypotension was significantly correlated with the nocturnal blood pressure dip (%), nocturnal blood pressure patterns, and the cardiac MIBG uptake ratio (early and late) in the combined SWEDDs and PD subjects.
Conclusions:
Pathologic nocturnal blood pressure regulation and nocturnal hypertension, known characteristics of PD, are also present in SWEDDs. Moreover, cardiac autonomic dysfunction in SWEDDs patients should not be attributed to cardiac sympathetic denervation. As with PD patients, the SWEDDs patients studied here tended to have cardiac autonomic dysfunction.
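For readers unfamiliar with ABPM-derived dipping measures, the helper below computes the nocturnal dip percentage from daytime and night-time systolic averages and classifies the pattern using commonly cited cut-offs (at least 10% dip = dipper, 0–10% = non-dipper, below 0% = reverse dipper); it is an illustration, not the paper's analysis code.

```python
# Nocturnal blood pressure dip (%) and dipping-pattern classification (illustrative).
def dipping_pattern(day_sbp_mean, night_sbp_mean):
    """Return the nocturnal dip percentage and a dipping category."""
    dip_pct = (day_sbp_mean - night_sbp_mean) / day_sbp_mean * 100.0
    if dip_pct >= 10.0:
        category = "dipper"
    elif dip_pct >= 0.0:
        category = "non-dipper"
    else:
        category = "reverse dipper"    # nocturnal pressure exceeds daytime pressure
    return dip_pct, category

print(dipping_pattern(135.0, 131.0))   # roughly a 3% dip -> 'non-dipper'
```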
FFQs comprising food items, intake frequency categories and portion sizes have been used in large-scale observational studies to assess long-term dietary exposure. Although gender is an important influence on food choice and portion size, gender differences are not often analysed during FFQ development. This study investigated whether gender differences were considered sufficiently when developing FFQs and how this affects the results of validation studies. A PubMed search using combinations of ‘FFQ’, ‘Food Frequency Questionnaire’, ‘Validation’ and ‘Validity’ identified 246 validation studies available in English, published between January 1983 and May 2014, which included healthy male and female adults. The development process of the 196 FFQs used in these 246 validation studies was examined. Of these, twenty-one FFQs (10·7 %) considered gender during item selection or portion size determination and were therefore classified as gender specific (GS); the remaining 175 (89·3 %) did not consider gender and were classified as not gender specific (NGS). When the ratios between intake levels obtained using the FFQ and a reference method for energy and seven nutrients were compared between the GS and NGS groups, more significant differences were observed in women than in men (four v. one nutrient). Intakes of three nutrients were significantly underestimated in both sexes in the GS group. In the NGS group, nutrient intakes were significantly overestimated more often in women than in men (four v. one). These results indicate that not considering gender in FFQ development causes greater inaccuracy in dietary intake assessment in women than in men. The validity of results from nutritional epidemiological studies should be re-evaluated, especially for studies that used NGS FFQs.
Many transgenic domestic animals have been developed to produce therapeutic proteins in the mammary gland, and this approach is one of the most important methods for agricultural and biomedical applications. However, expression and secretion of a protein vary because transgenes are integrated at random sites in the genome. In addition, distal enhancers are very important for transcriptional gene regulation and tissue-specific gene expression. A vector system that can be accurately regulated within the genome is therefore needed to improve production of therapeutic proteins. The objective of this study was to develop a knock-in system for expression of human fibroblast growth factor 2 (FGF2) in the bovine β-casein gene locus. The F2A sequence was fused to the human FGF2 gene and inserted into exon 3 of the β-casein gene. When the knock-in vector was introduced into HC11 mouse mammary epithelial cells, we detected expression of human FGF2 mRNA by RT-PCR and human FGF2 protein in the culture media by western blot analysis. We then transfected the knock-in vector into bovine ear fibroblasts and produced knock-in fibroblasts using the clustered regularly interspaced short palindromic repeats (CRISPR)/Cas9 system, which was more efficient than conventional methods. In addition, we produced knock-in blastocysts by somatic cell nuclear transfer using the knock-in fibroblasts. These knock-in fibroblasts may help to create cloned embryos for the development of transgenic dairy cattle expressing human FGF2 protein in the mammary gland via the expression system of the bovine β-casein gene.
Decreased hemoglobin levels increase the risk of developing dementia among the elderly. However, the mechanisms that link decreased hemoglobin levels to incident dementia remain unclear, possibly because few studies have reported on the relationship between low hemoglobin levels and neuroimaging markers. We therefore investigated the relationships between decreased hemoglobin levels, cerebral small-vessel disease (CSVD), and cortical atrophy in cognitively healthy women and men.
Methods:
Cognitively normal women (n = 1,022) and men (n = 1,018) who underwent medical check-ups and magnetic resonance imaging (MRI) were enrolled at a health promotion center. We measured hemoglobin levels, white matter hyperintensity (WMH) scales, lacunes, and microbleeds. Cortical thickness was measured automatically using surface-based methods. Multivariate regression analyses were performed after controlling for possible confounders.
Results:
Decreased hemoglobin levels were not associated with the presence of WMH, lacunes, or microbleeds in either women or men. Among women, decreased hemoglobin levels were associated with decreased cortical thickness in the frontal (estimate −0.007; 95% confidence interval −0.013 to −0.001), temporal (−0.010; −0.018 to −0.002), parietal (−0.009; −0.015 to −0.003), and occipital (−0.011; −0.019 to −0.003) regions. Among men, however, no associations were observed between hemoglobin levels and cortical thickness.
Conclusion:
Our findings suggest that decreased hemoglobin levels were associated with cortical atrophy, but not with increased CSVD, among women, although the association was modest. Given the paucity of modifiable risk factors for age-related cognitive decline, our results have important public health implications.
To determine the influence of early pain relief for patients with suspected appendicitis on the diagnostic performance of surgical residents.
Methods
A prospective randomized, double-blind, placebo-controlled trial was conducted in patients with suspected appendicitis. The patients were randomized to receive a placebo (intravenous [IV] normal saline) infusion over 5 minutes or the study drug (morphine 5 mg IV). All clinical evaluations by surgical residents were performed 30 minutes after administration of the study drug or placebo. After the surgical residents recorded the clinical probability of appendicitis, abdominal computed tomography was performed. The primary objective was to assess the influence of IV morphine on the ability of surgical residents to diagnose appendicitis.
Results
A total of 213 patients with suspected appendicitis were enrolled; 107 received morphine and 106 received placebo saline. The negative appendectomy rates were similar in the two groups (3.8% in the placebo group and 3.2% in the pain control group, p=0.62), as were the perforation rates (18.9% in the placebo group and 14.3% in the pain control group, p=0.75). Receiver operating characteristic analysis revealed that overall diagnostic accuracy was also similar (area under the curve 0.63 in the placebo group v. 0.61 in the pain control group, p=0.81).
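A minimal sketch of the per-arm ROC analysis is shown below; the data file, column names, and group labels are assumptions, and a formal AUC comparison (for example DeLong's test) is not included.

```python
# Per-arm diagnostic accuracy of residents' clinical probability estimates (illustrative).
import pandas as pd
from sklearn.metrics import roc_auc_score

trial = pd.read_csv("appendicitis_trial.csv")  # assumed columns: arm, clinical_prob, appendicitis (0/1)

for arm, grp in trial.groupby("arm"):          # e.g. 'placebo' vs 'morphine'
    auc = roc_auc_score(grp["appendicitis"], grp["clinical_prob"])
    print(f"{arm}: AUC = {auc:.2f}")
```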
Conclusions
Early pain control in patients with suspected appendicitis does not affect the diagnostic performance of surgical residents.
There is increasing evidence of a relationship between underweight or obesity and dementia risk. Several studies have investigated the relationship between body weight and brain atrophy, a pathological change preceding dementia, but their results are inconsistent. Therefore, we aimed to evaluate the relationship between body mass index (BMI) and cortical atrophy among cognitively normal participants.
Methods:
We recruited cognitively normal participants (n = 1,111) who underwent medical checkups and detailed neurologic screening, including magnetic resonance imaging (MRI), during health screening visits between September 2008 and December 2011. The main outcome was cortical thickness measured using MRI. The numbers of men/women in the five BMI groups (underweight, normal, overweight, mild obesity, and moderate to severe obesity) were 9/9, 148/258, 185/128, 149/111, and 64/50, respectively. Linear and non-linear relationships between BMI and cortical thickness were examined using multiple linear regression analysis and generalized additive models after adjustment for potential confounders.
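As an illustration of examining both linear and non-linear BMI–thickness relationships, the sketch below fits an ordinary least-squares model and a generalized additive model with a spline term for BMI using statsmodels; the file and column names are assumptions.

```python
# Linear and non-linear (spline) models of cortical thickness versus BMI (illustrative).
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.gam.api import GLMGam, BSplines

brain = pd.read_csv("cortical_thickness.csv")  # assumed columns: thickness, bmi, age, sex

linear = smf.ols("thickness ~ bmi + age + C(sex)", data=brain).fit()

bs = BSplines(brain[["bmi"]], df=[6], degree=[3])  # smooth term for BMI
gam = GLMGam.from_formula("thickness ~ age + C(sex)", data=brain, smoother=bs).fit()

print(linear.params["bmi"])  # linear slope estimate
print(gam.summary())         # spline term captures any non-linear relationship
```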
Results:
Among men, underweight participants showed significant cortical thinning in the frontal and temporal regions compared to normal-weight participants, while overweight and mildly obese participants had greater cortical thickness in the frontal region and in the frontal, temporal, and occipital regions, respectively. However, cortical thickness in each brain region did not differ significantly between the normal-weight and moderate to severe obesity groups. Among women, the association between BMI and cortical thickness was not statistically significant.
Conclusions:
Our findings suggested that underweight might be an important risk factor for pathological changes in the brain, while overweight or mild obesity may be inversely associated with cortical atrophy in cognitively normal elderly males.
Epidemiological studies have reported that higher education (HE) is associated with a reduced risk of incident Alzheimer's disease (AD). However, after the clinical onset of AD, patients with HE show more rapid cognitive decline than patients with lower education (LE). Although education level and cognition have been linked, few longitudinal studies have investigated the relationship between education level and cortical decline in patients with AD. The aim of this study was to compare the topography of cortical atrophy longitudinally between AD patients with HE (HE-AD) and AD patients with LE (LE-AD).
Methods:
We prospectively recruited 36 patients with early-stage AD and 14 normal controls. The patients were classified into two groups according to educational level: 23 HE-AD (>9 years) and 13 LE-AD (≤9 years).
Results:
As AD progressed over the 5-year longitudinal follow-up, the HE-AD group showed a significant group-by-time interaction in the right dorsolateral frontal and precuneus regions and in the left parahippocampal region compared to the LE-AD group.
Conclusion:
This preliminary longitudinal study suggests that HE is associated with accelerated cortical atrophy in AD patients over time, underlining the importance of education level for predicting prognosis.
Cancer is a leading cause of death, and the dietary pattern in Korea is changing rapidly from a traditional Korean diet to a Westernised diet. In the present study, we investigated the effects of dietary factors on cancer risk in a prospective cohort study. Among 26 815 individuals who participated in cancer screening examinations from September 2004 to December 2008, 8024 subjects who completed a self-administered questionnaire concerning demographic and lifestyle factors and a 3 d food record were selected. As of September 2013, 387 cancer cases were identified from the National Cancer Registry System, and the remaining individuals were included in the control group. The hazard ratio (HR) of cancer for subjects aged 50 years or older was higher (HR 1·80, 95 % CI 1·41, 2·31; P < 0·0001) than that for younger subjects. Red meat consumption, Na intake and obesity (BMI ≥ 25 kg/m²) were positively associated with overall cancer incidence in men (HR 1·41, 95 % CI 1·02, 1·94; P = 0·0382), gastric cancer (HR 2·34, 95 % CI 1·06, 5·19; P = 0·0365) and thyroid cancer (HR 1·56, 95 % CI 1·05, 2·31; P = 0·0270), respectively. Participants who at baseline had at least three of the dietary risk factors highlighted by the World Cancer Research Fund/American Institute for Cancer Research (high intakes of red meat and Na, low intakes of vegetables and fruits, and obesity) tended to have a higher risk of cancer than the others (HR 1·26, 95 % CI 0·99, 1·60; P = 0·0653). In summary, high intakes of red meat and Na were significant risk factors for cancer among Koreans.
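As a hedged sketch only, a Cox proportional hazards model of the kind used to obtain the hazard ratios above could be fitted as follows; the file, column names, and covariate coding are assumptions, not the cohort's actual variables.

```python
# Cox proportional hazards model for cancer incidence (illustrative).
import pandas as pd
from lifelines import CoxPHFitter

cohort = pd.read_csv("screening_cohort.csv")  # assumed columns below; sex assumed coded 0/1
cols = ["followup_years", "cancer", "red_meat_gday", "sodium_mgday", "bmi", "age", "sex"]

cph = CoxPHFitter()
cph.fit(cohort[cols], duration_col="followup_years", event_col="cancer")
cph.print_summary()  # exp(coef) column gives the adjusted hazard ratios
```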