Depression is the largest global contributor to non-fatal disease burden(1). A growing body of evidence suggests that dietary behaviours, such as higher fruit and vegetable intake, may be protective against the risk of depression(2). However, this evidence comes primarily from high-income countries, despite over 80% of the burden of depression being experienced in low- and middle-income countries(1). To date, few studies have focused on older adults. The aim of this study was to prospectively examine the associations between baseline fruit and vegetable intake and incidence of depression in adults aged 45 years and older from 10 cohorts across six continents, including four cohorts from low- and middle-income countries. The association between baseline fruit and vegetable intake and incident depression over a 3–6-year follow-up period was examined using Cox proportional hazards regression after controlling for a range of potential confounders. Participants were 7771 community-based adults aged 45+ years from 10 diverse cohorts. All cohorts were members of the Cohort Studies of Memory in an International Consortium collaboration(3). Fruit intake (excluding juice) and vegetable intake were collected using either a comprehensive food frequency questionnaire, a short food questionnaire or a diet history. Depressive symptoms were assessed using validated depression measures, and depression was defined as a score greater than or equal to a validated cut-off. Prior to analysis, all data were harmonised. Analysis was performed by cohort, and cohort results were then combined using meta-analysis. Subgroup analyses were performed by sex, age (45–64 versus 65+ years) and country income level (high-income versus low- and middle-income countries). There were 1537 incident cases of depression over 32,420 person-years of follow-up. Mean daily intakes were 1.7 ± 1.5 serves of fruit and 1.9 ± 1.4 serves of vegetables.
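The two-stage approach described (per-cohort Cox models, then meta-analysis) can be sketched as inverse-variance fixed-effect pooling of per-cohort log hazard ratios. The estimates and standard errors below are hypothetical, not the study's results:

```python
import math

def meta_analysis_fixed(log_hrs, ses):
    """Inverse-variance fixed-effect pooling of per-cohort log hazard ratios."""
    weights = [1.0 / se**2 for se in ses]                     # precision weights
    pooled = sum(w * b for w, b in zip(weights, log_hrs)) / sum(weights)
    pooled_se = math.sqrt(1.0 / sum(weights))
    return pooled, pooled_se

# Hypothetical per-cohort log(HR) estimates and standard errors
log_hrs = [-0.05, 0.02, -0.10, 0.04]
ses = [0.10, 0.08, 0.15, 0.12]
b, se = meta_analysis_fixed(log_hrs, ses)
hr = math.exp(b)                                              # pooled hazard ratio
ci = (math.exp(b - 1.96 * se), math.exp(b + 1.96 * se))       # 95% CI
```

The pooled standard error is always smaller than the smallest per-cohort one, which is why combining diverse cohorts can detect effects no single cohort can.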
We found no association between fruit and vegetable intakes and risk of incident depression in any of the analyses, and this was consistent across the subgroup analyses. The low intake of fruit and vegetables of participants, diverse measures used across the different cohorts, and modest sample size of our study compared with prior studies in the literature, may have prevented an association being detected. Further investigation using standardised measures in larger cohorts of older adults from low- to middle-income countries is needed. Future research should consider the potential relationship between different types of fruits and vegetables and depression.
Persons newly diagnosed with dementia and their family members often experience uncertainty and inadequate support. This study aimed to evaluate a post-diagnostic support programme, guided by the 5 Pillars Model proposed by Alzheimer Scotland, on the self-efficacy of persons with early dementia and their family members.
Methods:
A prospective cohort study was conducted between 2019 and 2022. Subjects were recruited from four non-government organizations. A multi-domain empowerment programme was developed, covering dementia knowledge, management skills, peer support, future decision-making and community resources. The programme was provided to people newly diagnosed with early dementia in a small-group format over 2 months, and to family members individually through an eLearning platform over 9 months. Self-efficacy in dementia management of people with dementia and their family members was measured using the Chronic Disease Self-efficacy Scale and the Caregiver Self-efficacy Scale (CSES), respectively, whereas caregiving burden was measured using the Zarit Burden Interview (ZBI). Study outcomes were measured at baseline, immediately post-intervention and 6 months post-intervention. Paired t-tests were performed to detect within-subject changes over time.
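The within-subject comparisons used paired t-tests; a minimal sketch on hypothetical self-efficacy scores (not study data) shows the computation:

```python
import math
import statistics

def paired_t(pre, post):
    """Paired t-test: t statistic and degrees of freedom for within-subject change."""
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    mean_d = statistics.mean(diffs)
    sd_d = statistics.stdev(diffs)            # sample SD of the differences
    t = mean_d / (sd_d / math.sqrt(n))        # t with n - 1 degrees of freedom
    return t, n - 1

# Hypothetical self-efficacy scores at baseline and 6-month follow-up
baseline  = [5.2, 6.1, 4.8, 5.5, 6.0, 4.9, 5.7, 5.3]
follow_up = [5.8, 6.4, 5.1, 5.9, 6.3, 5.2, 6.1, 5.6]
t_stat, df = paired_t(baseline, follow_up)
```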
Results:
A total of 151 persons with early dementia and 294 family caregivers completed assessments at baseline and follow-up. Self-efficacy in dementia management reported by persons with dementia at 6 months post-intervention was significantly higher than that reported at baseline (p = .021) and immediately post-intervention (i.e. 2-month follow-up) (p = .006). Family members reported a significantly higher CSES score (p < .001) and subscale scores in thoughts (p = .001) and disruptive behaviour management (p = .001) at 9-month follow-up, but a significant reduction in caregiving burden (p < .001) was noted only among those who perceived higher burden than the local norms at baseline (ZBI score ≥ 25, n = 110).
Discussion:
This study provides empirical evidence that post-diagnostic support can empower persons with early dementia and their family members to adapt to the impacts of dementia. Further study examining the longer-term effects on care outcomes and health service utilisation would be valuable.
People with dementia are prone to premature nursing home placement after hospitalization due to physical and mental deconditioning, which makes care at home more difficult. This study aimed to evaluate the effect of a post-hospital-discharge transitional care program on reducing nursing home placement in people with dementia.
Methods:
A matched case-control study was conducted between 2018 and 2021. A transitional care program using a case management approach was developed. Participants enrolled in the program through self-enrolment or referral from hospitals or NGOs. Community-dwelling people with dementia discharged from hospitals received four weeks of residential care at a dementia care centre, with intensive nursing care, physiotherapy and group activities promoting social engagement, followed by eight weeks of day care rehabilitation activities to improve their mobility and cognitive functioning. They were matched on a 1:5 ratio by age and sex to people with dementia discharged from a convalescent hospital who did not participate in this program. The study outcome was nursing home admission, measured three months (i.e. post-intervention), six months, and nine months after hospital discharge. Multinomial logistic regression was conducted to investigate factors associated with nursing home placement at each measurement time-point.
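The 1:5 matching by age and sex can be sketched as below. The age tolerance, field names, and records are illustrative assumptions, not the study's actual matching protocol:

```python
import random

def match_controls(case, pool, ratio=5, age_tolerance=2):
    """Select up to `ratio` controls matching the case on sex and on age
    within `age_tolerance` years (hypothetical caliper)."""
    eligible = [c for c in pool
                if c["sex"] == case["sex"]
                and abs(c["age"] - case["age"]) <= age_tolerance]
    random.shuffle(eligible)          # break ties at random among eligible controls
    return eligible[:ratio]

random.seed(0)
case = {"id": 1, "age": 82, "sex": "F"}
pool = [{"id": i, "age": random.randint(70, 95), "sex": random.choice("MF")}
        for i in range(2, 300)]
controls = match_controls(case, pool)
```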
Results:
361 hospital admission episodes (n=67 intervention, n=294 control) were examined. The regression results showed that participants in the intervention group were significantly less likely than controls to be admitted to a nursing home three months (OR = 0.023, 95% CI: 0.003-0.201, p = .001) and six months (OR = 0.094, 95% CI: 0.025-0.353, p = .001) after hospital discharge, but the intervention effect was not sustained nine months after discharge. Longer hospital length of stay, and hospital admission due to dementia, mental disturbances such as delirium, or mental disorders such as schizophrenia, significantly predicted nursing home admission three and six months after hospital discharge.
Conclusion:
The transitional care program could help reduce nursing home placement in people with dementia after hospital discharge. To sustain the intervention effect, more continual support after the intervention as well as family caregiver training would be required.
During the COVID-19 pandemic, students were among those most affected by the changes that arose because of the pandemic. Of particular interest is whether students' well-being is associated with their field of study as well as their coping strategies.
Objectives
In this study, we aimed to assess 1) the mental health of students from nine countries with a particular focus on depression, anxiety, and stress levels and their fields of study, 2) the major coping strategies of students after one year of the COVID-19 pandemic.
Methods
We conducted an anonymous online cross-sectional survey between 12 April and 1 June 2021, distributed among students from Poland, Mexico, Egypt, India, Pakistan, China, Vietnam, the Philippines, and Bangladesh. To measure emotional distress, we used the Depression, Anxiety, and Stress Scale-21 (DASS-21), and to identify the major coping strategies of students, the Brief-COPE.
Results
We gathered 7219 responses from students in five major fields of study: medical studies (N=2821), social sciences (N=1471), technical sciences (N=891), artistic/humanistic studies (N=1094), and sciences (N=942). The greatest intensity of depression (M=18.29±13.83; moderate intensity), anxiety (M=13.13±11.37; moderate intensity), and stress (M=17.86±12.94; mild intensity) was observed among sciences students. Medical students presented the lowest intensity of all three components: depression (M=13.31±12.45; mild intensity), anxiety (M=10.37±10.57; moderate intensity), and stress (M=13.65±11.94; mild intensity). Students of all fields primarily used acceptance and self-distraction as their coping mechanisms, while the least commonly used were self-blame, denial, and substance use. The most frequently used group of coping mechanisms was 'emotional focus'. Medical students used avoidant coping strategies statistically less often than students of other fields. Substance use was the only coping mechanism that did not statistically differ between students of different fields of study. Behavioral disengagement showed the highest correlation with depression (r=0.54), anxiety (r=0.48), and stress (r=0.47), while religion showed the lowest positive correlation with depression (r=0.07), anxiety (r=0.14), and stress (r=0.11).
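The reported correlations between coping strategies and DASS-21 components are Pearson coefficients; a self-contained sketch on hypothetical per-student scores:

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length score lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical per-student scores: behavioural disengagement vs. DASS-21 depression
disengagement = [1, 3, 2, 5, 4, 2, 5, 1, 3, 4]
depression    = [8, 14, 10, 24, 18, 12, 26, 6, 15, 20]
r = pearson_r(disengagement, depression)
```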
Conclusions
1) The greatest intensity of depression, anxiety, and stress was observed among sciences students, while the lowest intensity of those components was found among students studying medicine.
2) Not using avoidant coping strategies might be associated with lower intensity of all DASS components among students.
3) Behavioral disengagement might be strongly associated with greater intensity of depression, anxiety, and stress among students.
4) No coping mechanism alleviated emotional distress across all fields of study.
There is evidence that child maltreatment is associated with shorter telomere length in early life.
Aims
This study aims to examine if child maltreatment is associated with telomere length in middle- and older-age adults.
Method
This was a retrospective cohort study of 141 748 UK Biobank participants aged 37–73 years at recruitment. Leukocyte telomere length was measured with quantitative polymerase chain reaction, and log-transformed and scaled to have unit standard deviation. Child maltreatment was recalled by participants. Linear regression was used to analyse the association.
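The preprocessing described for telomere length (log-transform, then centre and scale to unit standard deviation) can be sketched as follows; the raw T/S ratio values are hypothetical:

```python
import math
import statistics

def log_scale(values):
    """Log-transform, then centre and scale to unit standard deviation (z-scores)."""
    logs = [math.log(v) for v in values]
    mu = statistics.mean(logs)
    sd = statistics.stdev(logs)
    return [(v - mu) / sd for v in logs]

# Hypothetical raw leukocyte telomere length T/S ratios
raw = [0.72, 0.85, 0.91, 1.05, 0.66, 0.98, 0.79, 1.12]
z = log_scale(raw)
```

On this scale, a regression coefficient such as β = −0.05 reads directly as a shift of 0.05 standard deviations in log telomere length.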
Results
After adjusting for sociodemographic characteristics, participants with three or more types of maltreatment presented with the shortest telomere lengths (β = −0.05, 95% CI −0.07 to −0.03; P < 0.0001), followed by those with two types of maltreatment (β = −0.02, 95% CI −0.04 to 0.00; P = 0.02), referent to those who had none. When adjusted for depression and post-traumatic stress disorder, the telomere lengths of participants with three or more types of maltreatment were still shorter (β = −0.04, 95% CI −0.07 to −0.02; P = 0.0008). The telomere lengths of those with one type of maltreatment were not significantly different from those who had none. When mutually adjusted, physical abuse (β = −0.05, 95% CI −0.07 to −0.03; P < 0.0001) and sexual abuse (β = −0.02, 95% CI −0.04 to 0.00; P = 0.02) were independently associated with shorter telomere length.
Conclusions
Our findings showed that child maltreatment is associated with shorter telomere length in middle- and older-aged adults, independent of sociodemographic and mental health factors.
Patients with bipolar disorder (BPD) are prone to engaging in risk-taking behaviours and self-harm, contributing to a higher risk of traumatic injuries requiring medical attention at the emergency room (ER). We hypothesize that pharmacological treatment of BPD could reduce the risk of traumatic injuries by alleviating symptoms, but evidence remains unclear. This study aimed to examine the association between pharmacological treatment and the risk of ER admissions due to traumatic injuries.
Methods
Individuals with BPD who received mood stabilizers and/or antipsychotics were identified using a population-based electronic healthcare records database in Hong Kong (2001–2019). A self-controlled case series design was applied to control for time-invariant confounders.
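In a self-controlled case series, exposure effects are expressed as incidence rate ratios comparing the event rate in a defined risk window with the individual's own reference time. A minimal sketch with hypothetical counts (not the study's data):

```python
def incidence_rate_ratio(events_window, persontime_window, events_ref, persontime_ref):
    """Incidence rate ratio of a risk window relative to the reference period."""
    rate_window = events_window / persontime_window   # events per unit person-time
    rate_ref = events_ref / persontime_ref
    return rate_window / rate_ref

# Hypothetical counts: injuries in the 30 days before treatment initiation
# versus the remaining (reference) observation time
irr = incidence_rate_ratio(events_window=40, persontime_window=100.0,
                           events_ref=90, persontime_ref=1000.0)
```

Because each person serves as their own control, time-invariant confounders (sex, genetics, stable personality traits) cancel out of the ratio.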
Results
A total of 5040 out of 14 021 adults with BPD who received pharmacological treatment and had incident ER admissions due to traumatic injuries from 2001 to 2019 were included. An increased risk of traumatic injuries was found 30 days before treatment [incidence rate ratio (IRR) 4.44 (3.71–5.31), p < 0.0001]. After treatment initiation, the risk remained increased with a smaller magnitude, before returning to baseline [IRR 0.97 (0.88–1.06), p = 0.50] during maintenance treatment. The direct comparison of the risk during treatment to that before and after treatment showed a significant decrease. After treatment cessation, the risk was increased [IRR 1.34 (1.09–1.66), p = 0.006].
Conclusions
This study supports the hypothesis that pharmacological treatment of BPD is associated with a lower risk of ER admissions due to traumatic injuries, with an increased risk after treatment cessation. Close monitoring of symptom relapse is recommended for clinicians and patients if treatment cessation is warranted.
With high-sensitivity kiloparsec-scale radio polarimetry, we can examine jet-medium interactions and better understand the blazar divide in radio-loud (RL) AGN. We are analyzing radio polarimetric observations, taken with the EVLA and GMRT, of 24 quasars and BL Lacs belonging to the Palomar-Green (PG) sample. The RL quasars show extensive polarisation structures in their cores, jets, lobes, and hotspots, whereas preliminary results suggest that BL Lacs exhibit polarisation primarily in their cores and inner jet regions. These findings imply that both intrinsic (central engine-related) and extrinsic (environment-related) factors are important in the formation of the blazar subclasses. The Fanaroff-Riley (FR) dichotomy can also be studied assuming RL unification and looking through the lens of blazars. Owing to the radio-unbiased nature of the optically/UV-selected PG sample, we find that a large fraction of the PG quasars are restarted, distorted (S- or X-shaped), or have a hybrid FR morphology.
Predictors of new-onset bipolar disorder (BD) or psychotic disorder (PD) have been proposed on the basis of retrospective or prospective studies of ‘at-risk’ cohorts. Few studies have compared concurrently or longitudinally factors associated with the onset of BD or PDs in youth presenting to early intervention services. We aimed to identify clinical predictors of the onset of full-threshold (FT) BD or PD in this population.
Method
Multi-state Markov modelling was used to assess the relationships between baseline characteristics and the likelihood of the onset of FT BD or PD in youth (aged 12–30) presenting to mental health services.
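Multi-state Markov modelling estimates transitions between clinical states over follow-up. A drastically simplified discrete-time sketch (hypothetical transition probabilities and states, not the study's fitted continuous-time model) shows how a baseline cohort distributes across states:

```python
def step(dist, P):
    """One discrete-time step: multiply the state distribution by transition matrix P."""
    return [sum(dist[i] * P[i][j] for i in range(len(dist)))
            for j in range(len(P[0]))]

# Hypothetical 3-state model: 0 = sub-threshold, 1 = FT bipolar disorder, 2 = FT psychosis
P = [[0.95, 0.035, 0.015],   # per-interval transitions out of the sub-threshold state
     [0.00, 1.000, 0.000],   # FT BD treated as absorbing in this sketch
     [0.00, 0.000, 1.000]]   # FT PD treated as absorbing in this sketch
dist = [1.0, 0.0, 0.0]       # everyone starts sub-threshold
for _ in range(3):           # three follow-up intervals
    dist = step(dist, P)
```

Baseline covariates (age, functioning, MLE/PLE) enter the real model by modifying the transition intensities, which is how the predictors reported below were identified.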
Results
Of 2330 individuals assessed longitudinally, 4.3% (n = 100) met criteria for new-onset FT BD and 2.2% (n = 51) met criteria for a new-onset FT PD. The emergence of FT BD was associated with older age, lower social and occupational functioning, mania-like experiences (MLE), suicide attempts, reduced incidence of physical illness, childhood-onset depression, and childhood-onset anxiety. The emergence of a PD was associated with older age, male sex, psychosis-like experiences (PLE), suicide attempts, stimulant use, and childhood-onset depression.
Conclusions
Identifying risk factors for the onset of either BD or PDs in young people presenting to early intervention services is assisted not only by the increased focus on MLE and PLE, but also by recognising the predictive significance of poorer social function, childhood-onset anxiety and mood disorders, and suicide attempts prior to the time of entry to services. Secondary prevention may be enhanced by greater attention to those risk factors that are modifiable or shared by both illness trajectories.
While studies suggest that nutritional supplementation may reduce aggressive behavior in children, few have examined their effects on specific forms of aggression. This study tests the primary hypothesis that omega-3 (ω-3), both alone and in conjunction with social skills training, will have particular post-treatment efficacy for reducing childhood reactive aggression relative to baseline.
Methods
In this randomized, double-blind, stratified, placebo-controlled, factorial trial, a clinical sample of 282 children with externalizing behavior aged 7–16 years was randomized into ω-3 only, social skills only, ω-3 + social skills, and placebo control groups. Treatment duration was 6 months. The primary outcome measure was reactive aggression collected at 0, 3, 6, 9, and 12 months, with antisocial behavior as a secondary outcome.
Results
Children in the ω-3-only group showed a short-term reduction (at 3 and 6 months) in self-report reactive aggression, and also a short-term reduction in overall antisocial behavior. Sensitivity analyses and a robustness check replicated significant interaction effects. Effect sizes (d) were small, ranging from 0.17 to 0.31.
Conclusions
Findings provide some initial support for the efficacy of ω-3 in reducing reactive aggression over and above standard care (medication and parent training), but yield only preliminary and limited support for the efficacy of ω-3 in reducing overall externalizing behavior in children. Future studies could test further whether ω-3 shows promise in reducing more reactive, impulsive forms of aggression.
Early detection of karyotype abnormalities, including aneuploidy, could aid producers in identifying animals which, for example, would not be suitable candidate parents. Genome-wide genetic marker data in the form of single nucleotide polymorphisms (SNPs) are now being routinely generated on animals. The objective of the present study was to describe the statistics that could be generated from the allele intensity values from such SNP data to diagnose karyotype abnormalities; of particular interest was whether detection of aneuploidy was possible with both commonly used genotyping platforms in agricultural species, namely the Applied Biosystems™ Axiom™ and the Illumina platform. The hypothesis was tested using a case study of a set of dizygotic X-chromosome monosomy 53,X sheep twins. Genome-wide SNP data were available from the Illumina platform (11 082 autosomal and 191 X-chromosome SNPs) on 1848 male and 8954 female sheep and available from the Axiom™ platform (11 128 autosomal and 68 X-chromosome SNPs) on 383 female sheep. Genotype allele intensity values, either as their original raw values or transformed to logarithm intensity ratio (LRR), were used to accurately diagnose two dizygotic (i.e. fraternal) twin 53,X sheep, both of which received their single X chromosome from their sire. This is the first reported case of 53,X dizygotic twins in any species. Relative to the X-chromosome SNP genotype mean allele intensity values of normal females, the mean allele intensity value of SNP genotypes on the X chromosome of the two females monosomic for the X chromosome was 7.45 to 12.4 standard deviations less, and were easily detectable using either the Axiom™ or Illumina genotype platform; the next lowest mean allele intensity value of a female was 4.71 or 3.3 standard deviations less than the population mean depending on the platform used.
Both 53,X females could also be detected based on the genotype LRR although this was more easily detectable when comparing the mean LRR of the X chromosome of each female to the mean LRR of their respective autosomes. On autopsy, the ovaries of the two sheep were small for their age and evidence of prior ovulation was not appreciated. In both sheep, the density of primordial follicles in the ovarian cortex was lower than normally found in ovine ovaries and primary follicle development was not observed. Mammary gland development was very limited. Results substantiate previous studies in other species that aneuploidy can be readily detected using SNP genotype allele intensity values generally already available, and the approach proposed in the present study was agnostic to genotype platform.
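The detection logic, comparing each female's mean X-chromosome allele intensity against the mean and standard deviation of normal females, can be sketched as follows. The intensity values, sample names, and 5-SD threshold are illustrative assumptions, not the study's data:

```python
import statistics

def flag_low_x_intensity(samples, reference, threshold_sd=5.0):
    """Flag samples whose mean X-chromosome allele intensity lies at least
    `threshold_sd` standard deviations below the mean of normal females."""
    mu = statistics.mean(reference)
    sd = statistics.stdev(reference)
    return [sid for sid, m in samples.items() if (mu - m) / sd >= threshold_sd]

# Hypothetical mean X-chromosome allele intensities (arbitrary units)
reference = [1.00 + 0.01 * (i % 5 - 2) for i in range(50)]   # known-normal ewes
samples = {"ewe_a": 0.99, "ewe_twin_a": 0.55, "ewe_twin_b": 0.56}
suspects = flag_low_x_intensity(samples, reference)
```

Because a 53,X female carries one X instead of two, her X-chromosome probes hybridise at roughly half intensity, which is why the z-score gap from normal females is so large on either platform.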
Central nervous system infections (CNSI) are a leading cause of death and long-term disability in children. Using ICD-10 data from 2005 to 2015 from three central hospitals in Ho Chi Minh City (HCMC), Vietnam, we used generalized additive mixed models (GAMM) to examine the spatial-temporal distribution and spatial and climatic risk factors of paediatric CNSI, excluding tuberculous meningitis, in this setting. From 2005 to 2015, there were 9469 cases of paediatric CNSI; 33% were ⩽1 year old at admission and were mainly diagnosed with presumed bacterial CNSI (BI) (79%); the remainder were >1 year old and mainly diagnosed with presumed non-bacterial CNSI (non-BI) (59%). The urban districts of HCMC in proximity to the hospitals as well as some outer districts had the highest incidences of BI and non-BI; BI incidence was higher in the dry season. Monthly BI incidence exhibited a significant decreasing trend over the study. Both BI and non-BI were significantly associated with lags in monthly average temperature, rainfall, and river water level. Our findings add new insights into this important group of infections in Vietnam, and highlight where resources for the prevention and control of paediatric CNSI should be allocated.
Calcium-based renal calculi demonstrated significant heterogeneity in the structure, density, mineral composition, and material hardness not elucidated by routine clinical testing. Mineral density distributions within calcium oxalate stones revealed differential areas of low (590±80 mg/cc), medium (840±140 mg/cc), and high (1100±200 mg/cc) densities. Apatite stones also contained regions of low (700±200 mg/cc), medium (1100±200 mg/cc), and high (1400±140 mg/cc) densities within layers extending from single or multiple nucleation sites. Despite having lower average mineral density, calcium oxalate (CaOx) stones demonstrated higher material hardness compared to apatite stones, suggesting other chemical components might be involved in determining stone hardness properties. Carbon concentrated sites were identified between morphologic layers in CaOx stones and in stratified layers of apatite stones. Elemental analyses revealed numerous additional trace elements in both stone types. Despite the widespread assumption that stone mineral density is an indicator of susceptibility to lithotripsy, calcium stone mineral density estimates do not directly correlate with actual ex vivo stone hardness. Underlying stone heterogeneity in both structure and mineral density could explain why historical approaches have failed in accurately predicting response of stones to lithotripsy.
We report the temperature dependence of Er optical centers in GaN epilayers prepared by metal-organic chemical vapor deposition under resonant excitation (4I15/2 → 4I9/2) using a Ti:Sapphire laser (λexc = 809 nm). High-resolution infrared spectroscopy and temperature-dependent measurements of photoluminescence intensity from Er ions in GaN were performed to identify the crystal-field splitting of the first excited state, 4I13/2. We employed a simple approach to determine activation energies related to the thermal population of electrons from the lowest level to higher levels of the crystal-field-split first excited state.
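One simple way to extract such an activation energy, assuming the standard single-barrier thermal-quenching model I(T) = I0 / (1 + A·exp(−Ea/kBT)), is to solve for Ea from intensity measurements at two temperatures. The temperatures and intensities below are hypothetical illustration, not the paper's data:

```python
import math

KB = 8.617e-5  # Boltzmann constant in eV/K

def activation_energy(T1, I1, T2, I2, I0):
    """Two-point estimate of Ea from I(T) = I0 / (1 + A*exp(-Ea/(kB*T))).
    Rearranging: ln(I0/I - 1) = ln(A) - Ea/(kB*T), so Ea follows from the
    slope between the two points in ln(I0/I - 1) vs 1/(kB*T)."""
    y1 = math.log(I0 / I1 - 1.0)
    y2 = math.log(I0 / I2 - 1.0)
    return (y2 - y1) / (1.0 / (KB * T1) - 1.0 / (KB * T2))

# Hypothetical normalized PL intensities at 100 K and 300 K
ea = activation_energy(T1=100.0, I1=0.90, T2=300.0, I2=0.40, I0=1.0)   # Ea in eV
```

With several temperature points, the same rearrangement becomes a straight-line fit whose slope gives Ea directly.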
Experiments on the National Ignition Facility show that multi-dimensional effects currently dominate implosion performance. Low-mode implosion asymmetry and hydrodynamic instabilities seeded by capsule mounting features appear to be two key limiting factors for implosion performance. One reason these factors have a large impact on the performance of inertial confinement fusion implosions is the high convergence required to achieve high fusion gains. To tackle these problems, a predictable implosion platform is needed, meaning experiments must trade off high gain for performance. LANL has adopted three main approaches to develop a one-dimensional (1D) implosion platform, where 1D means measured yield over the 1D clean calculation. A high-adiabat, low-convergence platform is being developed using beryllium capsules, enabling larger case-to-capsule ratios to improve symmetry. The second approach is liquid fuel layers using wetted foam targets. With liquid fuel layers, the implosion convergence can be controlled via the initial vapor pressure set by the target fielding temperature. The last method is double shell targets. For double shells, the smaller inner shell houses the DT fuel, and the convergence of this cavity is relatively small compared to hot spot ignition. However, double shell targets have a different set of trade-offs and advantages. Details for each of these approaches are described.
A Bayesian Belief Network (BBN) for assessing the potential risk of dengue virus emergence and distribution in Western Australia (WA) is presented and used to identify possible hotspots of dengue outbreaks in summer and winter. The model assesses the probabilities of two kinds of events which must take place before an outbreak can occur: (1) introduction of the virus and mosquito vectors to places where human population densities are high; and (2) vector population growth rates as influenced by climatic factors. The results showed that if either Aedes aegypti or Ae. albopictus were to become established in WA, three centres in the northern part of the State (Kununurra, Fitzroy Crossing, Broome) would be at particular risk of experiencing an outbreak. The model can also be readily extended to predict the risk of introduction of other viruses carried by Aedes mosquitoes, such as yellow fever, chikungunya and Zika viruses.
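At its core, the network combines the probabilities of the precursor events that must co-occur before an outbreak. A drastically simplified sketch with hypothetical node probabilities (the published BBN has many more nodes and full conditional probability tables):

```python
def outbreak_probability(p_virus_intro, p_vector_present, p_growth_given_climate):
    """Probability that all precursor events for an outbreak co-occur,
    assuming the three parent nodes are independent (a simplification)."""
    return p_virus_intro * p_vector_present * p_growth_given_climate

# Hypothetical node probabilities for a northern WA town in the wet season
p = outbreak_probability(p_virus_intro=0.30,
                         p_vector_present=0.60,
                         p_growth_given_climate=0.80)
```

Re-evaluating the climate-dependent node per season is what lets the model separate summer from winter hotspots.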
Infections following cardiovascular implantable electronic device (CIED) procedures, including pacemaker and implantable cardioverter–defibrillators, are devastating and costly. Preimplantation prophylactic antimicrobials are effective for reducing postprocedural infections. However, routine postprocedural antimicrobials are not associated with improved outcomes, and they may be harmful. Thus, we sought to characterize antimicrobial use patterns following CIED procedures.
DESIGN
All patients who underwent CIED procedures from October 1, 2007 to September 30, 2013 and had procedural information entered into the VA Clinical Assessment Reporting and Tracking (CART) software program were included in this study. All antibiotic prescriptions lasting more than 24 hours following device implantation or revision were identified using pharmacy databases, and postprocedural antibiotic use lasting more than 24 hours was characterized.
RESULTS
In total, 3,712 CIED procedures were performed at 34 VA facilities on 3,570 patients with a mean age of 71.7 years (standard deviation [SD], 11.1 years), 98.4% of whom were male. Postprocedural antibiotics >24 hours were prescribed following 1,579 of 3,712 CIED procedures (42.5%). The median duration of therapy was 5 days (interquartile range [IQR], 3–7 days). The most commonly prescribed antibiotic was cephalexin (1,152 of 1,579; 72.9%), followed by doxycycline (118 of 1,579; 7.5%) and ciprofloxacin (93 of 1,579; 5.9%). Vancomycin was used in 73 of 1,579 prescriptions (4.6%). Among facilities in the highest quartile of procedural volume, prescribing rates varied considerably, ranging from 3.2% to 77.6%.
CONCLUSIONS
Nearly 1 in 2 patients received prolonged postprocedural antimicrobial therapy following CIED procedures, and the rate of postprocedural antimicrobial therapy use varied considerably by facility. Given the lack of demonstrated benefit of routine prolonged antimicrobial therapy following CIED procedures, antimicrobial use following cardiac device interventions may be a potential target for quality improvement programs and antimicrobial stewardship.
Objectives: Clinical neuroscience is increasingly turning to imaging the human brain for answers to a range of questions and challenges. To date, the majority of studies have focused on the neural basis of current psychiatric symptoms, which can facilitate the identification of neurobiological markers for diagnosis. However, the increasing availability and feasibility of using imaging modalities, such as diffusion imaging and resting-state fMRI, enable longitudinal mapping of brain development. This shift in the field is opening the possibility of identifying predictive markers of risk or prognosis, and also represents a critical missing element for efforts to promote personalized or individualized medicine in psychiatry (i.e., stratified psychiatry). Methods: The present work provides a selective review of potentially high-yield populations for longitudinal examination with MRI, based upon our understanding of risk from epidemiologic studies and initial MRI findings. Results: Our discussion is organized into three topic areas: (1) practical considerations for establishing temporal precedence in psychiatric research; (2) readiness of the field for conducting longitudinal MRI, particularly for neurodevelopmental questions; and (3) illustrations of high-yield populations and time windows for examination that can be used to rapidly generate meaningful and useful data. Particular emphasis is placed on the implementation of time-appropriate, developmentally informed longitudinal designs, capable of facilitating the identification of biomarkers predictive of risk and prognosis. Conclusions: Strategic longitudinal examination of the brain at-risk has the potential to bring the concepts of early intervention and prevention to psychiatry. (JINS, 2016, 22, 164–179)
Objectives: One of the most prominent features of schizophrenia is relatively lower general cognitive ability (GCA). An emerging approach to understanding the roots of variation in GCA relies on network properties of the brain. In this multi-center study, we determined global characteristics of brain networks using graph theory and related these to GCA in healthy controls and individuals with schizophrenia. Methods: Participants (N=116 controls, 80 patients with schizophrenia) were recruited from four sites. GCA was represented by the first principal component of a large battery of neurocognitive tests. Graph metrics were derived from diffusion-weighted imaging. Results: The global metrics of longer characteristic path length and reduced overall connectivity predicted lower GCA across groups, and group differences were noted for both variables. Measures of clustering, efficiency, and modularity did not differ across groups or predict GCA. Follow-up analyses investigated three topological types of connectivity—connections among high degree “rich club” nodes, “feeder” connections to these rich club nodes, and “local” connections not involving the rich club. Rich club and local connectivity predicted performance across groups. In a subsample (N=101 controls, 56 patients), a genetic measure reflecting mutation load, based on rare copy number deletions, was associated with longer characteristic path length. Conclusions: Results highlight the importance of characteristic path lengths and rich club connectivity for GCA and provide no evidence for group differences in the relationships between graph metrics and GCA. (JINS, 2016, 22, 240–249)
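Characteristic path length, one of the global graph metrics used in the study above, is the mean shortest-path length over all connected node pairs; a minimal unweighted sketch on a hypothetical 4-node graph:

```python
from collections import deque

def characteristic_path_length(adj):
    """Mean shortest-path length over all connected node pairs (unweighted graph),
    computed by breadth-first search from every node."""
    total, pairs = 0, 0
    for src in adj:
        dist = {src: 0}
        q = deque([src])
        while q:
            u = q.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    q.append(v)
        for node, d in dist.items():
            if node != src:
                total += d
                pairs += 1
    return total / pairs

# Hypothetical 4-node path graph: a - b - c - d
adj = {"a": ["b"], "b": ["a", "c"], "c": ["b", "d"], "d": ["c"]}
cpl = characteristic_path_length(adj)
```

Longer characteristic path lengths indicate less efficiently integrated networks, which is the direction of the association with lower GCA reported above.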
To assess the effectiveness of infection control preparedness for human infection with influenza A H7N9 in Hong Kong.
DESIGN
A descriptive study of responses to the emergence of influenza A H7N9.
SETTING
A university-affiliated teaching hospital.
PARTICIPANTS
Healthcare workers (HCWs) with unprotected exposure (not wearing N95 respirator during aerosol-generating procedure) to a patient with influenza A H7N9.
METHODS
A bundle approach including active and enhanced surveillance, early airborne infection isolation, rapid molecular diagnostic testing, and extensive contact tracing for HCWs with unprotected exposure was implemented. Seventy HCWs with unprotected exposure to an index case were interviewed especially regarding their patient care activities.
RESULTS
From April 1, 2013, through May 31, 2014, a total of 126 (0.08%) of 163,456 admitted patients were tested for the H7 gene by reverse transcription-polymerase chain reaction per protocol. Two confirmed cases were identified. Seventy (53.8%) of 130 HCWs had unprotected exposure to an index case, whereas 41 (58.6%) and 58 (82.9%) of 70 HCWs wore surgical masks and practiced hand hygiene after patient care, respectively. Sixteen (22.9%) of 70 HCWs were involved in high-risk patient contacts. More HCWs with high-risk patient contacts received oseltamivir prophylaxis (P=0.088) and significantly more had paired sera collected for H7 antibody testing (P<0.001). Ten (14.3%) of 70 HCWs developed influenza-like illness during medical surveillance, but none had positive results by reverse transcription-polymerase chain reaction. Paired sera were available from 33 of 70 HCWs with unprotected exposure, and none showed seroconversion against H7N9.
CONCLUSIONS
Despite the delay in airborne precautions implementation, no patient-to-HCW transmission of influenza A H7N9 was demonstrated.