Early life stress (ELS) and a Western diet (WD) promote mood and cardiovascular disorders; however, how these risks interact in disease pathogenesis is unclear. We assessed effects of ELS with or without a subsequent WD on behaviour, cardiometabolic risk factors, and cardiac function/ischaemic tolerance in male mice. Fifty-six new-born male C57BL/6J mice were randomly allocated to a control group (CON) undisturbed before weaning, or to maternal separation (3 h/day) and early (postnatal day 17) weaning (MSEW). Mice consumed standard rodent chow (CON, n = 14; MSEW, n = 15) or WD chow (WD, n = 19; MSEW + WD, n = 19) from week 8 to 24. Fasted blood was sampled, and open field and elevated plus maze (EPM) tests were undertaken, at 7, 15, and 23 weeks of age, with hearts excised at 24 weeks for Langendorff perfusion (evaluating pre- and post-ischaemic function). MSEW alone transiently increased open field activity at 7 weeks; body weight and serum triglycerides at 4 and 7 weeks, respectively; and final blood glucose levels and insulin resistance at 23 weeks. WD increased insulin resistance and body weight gain, the latter potentiated by MSEW. MSEW + WD was anxiogenic, reducing EPM open arm activity vs. WD alone. Although MSEW had modest metabolic effects and did not influence cardiac function or ischaemic tolerance in lean mice, it exacerbated weight gain and anxiogenesis, and improved ischaemic tolerance, in WD-fed animals. MSEW-induced increases in body weight (obesity) in WD-fed animals in the absence of changes in insulin resistance may have protected the hearts of these mice.
Perceived cognitive dysfunction is a common feature of late-life depression (LLD) that is associated with diminished quality of life and greater disability. Similar associations have been demonstrated in individuals with Hoarding Disorder. The degree to which hoarding behaviors (HB) are associated with greater perceived cognitive dysfunction and disability in individuals with concurrent LLD is not known.
Participants and Methods:
Participants with LLD (N=83) completed measures of hoarding symptom severity (Saving Inventory-Revised; SI-R) and were classified into two groups based on HB severity: LLD+HB who exhibited significant HB (SI-R ≥ 41, n = 25) and LLD with low HB (SI-R < 41, n = 58). Additional measures assessed depression severity (Hamilton Depression Rating Scale; HDRS), perceived cognitive difficulties (Everyday Cognition Scale; ECOG), and disability (World Health Organization Disability Assessment Scale [WHODAS]-II-Short). Given a non-normal distribution of ECOG and WHODAS-II scores, non-parametric Wilcoxon-Mann-Whitney tests were used to assess group differences in perceived cognitive dysfunction and disability. A regression model assessed the extent to which perceived cognitive dysfunction was associated with hoarding symptom severity measured continuously, covarying for age, education, gender, and depression severity. A separate regression model assessed the extent to which disability scores were associated with perceived cognitive dysfunction and HB severity, covarying for demographics and depression severity.
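The group comparisons above rest on the rank-based Wilcoxon-Mann-Whitney test. As an illustrative sketch only (not the study's analysis code; the function and toy data are ours), the U statistic behind that test can be computed from pooled ranks as follows:

```python
import numpy as np

def mann_whitney_u(x, y):
    """U statistic for two independent samples.

    A teaching sketch assuming no tied values; for real analyses use a
    library routine (e.g. scipy.stats.mannwhitneyu), which handles ties
    and returns a p-value."""
    combined = np.concatenate([x, y])
    # Rank the pooled data from 1..n (no tie handling in this sketch).
    ranks = np.empty(len(combined))
    ranks[np.argsort(combined)] = np.arange(1, len(combined) + 1)
    r1 = ranks[: len(x)].sum()             # rank sum of the first sample
    return r1 - len(x) * (len(x) + 1) / 2  # U1 = R1 - n1(n1+1)/2

# Toy data: complete separation gives the extreme U values 0 and n1*n2.
u_low = mann_whitney_u(np.array([1.0, 2.0, 3.0]), np.array([4.0, 5.0, 6.0]))
u_high = mann_whitney_u(np.array([4.0, 5.0, 6.0]), np.array([1.0, 2.0, 3.0]))
```

Large-sample p-values are then obtained by comparing U against its null distribution, which statistical packages do automatically.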
Results:
LLD+HB endorsed significantly greater perceived cognitive dysfunction (W = 1023, p = 0.003) and greater disability (W = 1006, p < 0.001) compared to LLD. Regression models accounting for demographic characteristics and depression severity revealed that greater HB severity was associated with greater perceived cognitive dysfunction (β = 0.009, t = 2.765, p = 0.007). Increased disability was associated with greater perceived cognitive dysfunction (β = 4.792, t(71) = 3.551, p = 0.0007), and its association with HB severity (β = 0.080, t(71) = 1.944, p = 0.056) approached significance, after accounting for variance explained by depression severity and demographic covariates.
Conclusions:
Our results suggest that hoarding behaviors are associated with increased perceived cognitive dysfunction and greater disability in individuals with LLD. Screening for HB in individuals with LLD may help identify those at greater risk for poor cognitive and functional outcomes. Interventions that target HB and perceived cognitive difficulties may decrease risk for disability in LLD. However, longitudinal studies would be required to further evaluate these relationships.
Dreaming has always aroused our curiosity. Theories as to the cause and function of dreams have been described since the beginning of recorded history (George 2020). In the late 19th century, experimental psychologists and psychologically-minded researchers from other disciplines made important methodological contributions, empirical observations, and conceptual developments to the study of dreams (e.g., Jastrow, 1888; Manacéïne, 1897; De Sanctis, 1899; Vold, 1897). At the end of the 19th century, Mary Whiton Calkins and her female students made pioneering advancements in the psychological science of dreams (Calkins 1893; Weed et al. 1896). Freud’s psychoanalytic theory soon overshadowed these groundbreaking empirical works as the interpretation of dream content and its presumed reflections of the unconscious mind became the focus. The detection of rapid eye movements during sleep in 1953 and the suggestion that dreams occurred exclusively during this newly defined sleep state electrified the field of dream research (Aserinsky and Kleitman 1953; Dement and Kleitman 1957). Although eye movements (Ladd, 1892), increased brain pulsations (Mosso, 1881), and electroencephalographic patterns (Loomis et al., 1937; Davis et al., 1938) had previously been argued to correspond empirically to dreaming, this discovery catalyzed the first “Meeting of Researchers in the Field of EEG and Dreams” at the University of Chicago in 1961, organized by psychologist Allan Rechtschaffen (Association for the Psychophysiological Study of Sleep Records). Renamed the Annual Meeting of the Association for the Psychophysiological Study of Sleep in subsequent years, these early meetings consisted principally of psychiatrists and psychologists, most of whom had interests in dream research. Among them, John Antrobus, Rosalind Cartwright, G. William Domhoff, David Foulkes, Donald R. Goodenough, Calvin S.
Hall, Ernest Hartmann, and Joe Kamiya made valuable contributions to our understanding of dreaming through decades of psychological research (Antrobus, 1992; Domhoff and Kamiya, 1964; Ellman and Antrobus, 1991; Foulkes, 1966, 1985; Goodenough et al., 1965; Hall and Van de Castle, 1966; Hartmann, 2010). While David Foulkes tirelessly advocated for his vision of a descriptive and explanatory dream psychology, Rosalind Cartwright developed an applied vision for the field, outlining over 100 dream-related questions that remain pertinent to sleep psychology (Cartwright 1977, 1978, 2010). With the rise of sleep medicine and the vicissitudes of funding, dream research drifted to the fringe of sleep research by the end of the 1980s (Foulkes 1996). Nevertheless, dreaming remains a central topic of sleep psychology, and many questions remain to be answered.
Johnson, Bilovich, and Tuckett set out a helpful framework for thinking about how humans make decisions under radical uncertainty and contrast this with classical decision theory. We show that classical theories assume so little about psychology that they are not necessarily in conflict with this approach, broadening its appeal.
OBJECTIVES/GOALS: Glioblastomas (GBMs) are heterogeneous, treatment-resistant tumors that are driven by populations of cancer stem cells (CSCs). In this study, we perform an epigenetics-focused functional genomics screen in GBM organoids and identify WDR5 as an essential epigenetic regulator in the SOX2-enriched, therapy-resistant cancer stem cell niche. METHODS/STUDY POPULATION: Despite their importance for tumor growth, few molecular mechanisms critical for CSC population maintenance have been exploited for therapeutic development. We developed a spatially resolved loss-of-function screen in GBM patient-derived organoids to identify essential epigenetic regulators in the SOX2-enriched, therapy-resistant niche. Our niche-specific screens identified WDR5, an H3K4 histone methyltransferase responsible for activating specific gene expression, as indispensable for GBM CSC growth and survival. RESULTS/ANTICIPATED RESULTS: In GBM CSC models, WDR5 inhibitors blocked WRAD complex assembly and reduced H3K4 trimethylation and expression of genes involved in CSC-relevant oncogenic pathways. H3K4me3 peaks lost with WDR5 inhibitor treatment occurred disproportionately on POU transcription factor motifs, required for stem cell maintenance and including the POU5F1(OCT4)::SOX2 motif. We incorporated a SOX2/OCT4 motif-driven GFP reporter system into our CSC cell models and found that WDR5 inhibitor treatment resulted in dose-dependent silencing of stem cell reporter activity. Further, WDR5 inhibitor treatment altered the stem cell state, disrupting CSC in vitro growth and self-renewal as well as in vivo tumor growth. DISCUSSION/SIGNIFICANCE: Our results unveiled the role of WDR5 in maintaining the CSC state in GBM and provide a rationale for therapeutic development of WDR5 inhibitors for GBM and other advanced cancers.
This conceptual and experimental framework can be applied to many cancers, and can unmask unique microenvironmental biology and rationally designed combination therapies.
The inclusion of kinetic effects into fluid models has been a long-standing problem in magnetic reconnection and plasma physics. Generally, the pressure tensor is reduced to a scalar, an approximation used to aid in the modelling of large-scale global systems such as the Earth's magnetosphere. This unfortunately omits important kinetic physics that has been shown to play a crucial role in collisionless regimes. The multi-fluid ten-moment model, however, retains the full symmetric pressure tensor. The ten-moment model is constructed by taking moments of the Vlasov equation up to second order, and includes the scalar density, the bulk-flow vector, and the symmetric pressure tensor, for a total of ten separate components. Use of the multi-fluid ten-moment model requires a closure which truncates the cascading system of equations. Here we look to leverage data-driven methodologies to seek a closure which may improve the physical fidelity of the ten-moment multi-fluid model in collisionless regimes. Specifically, we use the sparse identification of nonlinear dynamics (SINDy) method for symbolic equation discovery to seek the truncating closure from fully kinetic particle-in-cell (PIC) simulation data, which inherently retains the relevant kinetic physics. We verify our method by reproducing the ten-moment model from the PIC data, and use the method to generate a closure truncating the ten-moment model, which is analysed through the nonlinear phase of reconnection.
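The core of the SINDy method referenced above is a sparse regression over a library of candidate terms. As a hedged sketch of the idea (a generic sequential thresholded least-squares step on toy data, not the authors' closure pipeline), consider:

```python
import numpy as np

def stlsq(theta, dxdt, threshold=0.1, n_iter=10):
    """Sequential thresholded least squares, the sparse regression at the
    heart of SINDy: solve dxdt ≈ theta @ xi, repeatedly zeroing small
    coefficients and refitting the surviving ones."""
    xi = np.linalg.lstsq(theta, dxdt, rcond=None)[0]
    for _ in range(n_iter):
        small = np.abs(xi) < threshold
        xi[small] = 0.0
        for k in range(dxdt.shape[1]):
            big = ~small[:, k]
            if big.any():
                xi[big, k] = np.linalg.lstsq(theta[:, big], dxdt[:, k],
                                             rcond=None)[0]
    return xi

# Toy demo: recover dx/dt = -2x from a candidate library [1, x, x^2].
x = np.linspace(-1.0, 1.0, 50).reshape(-1, 1)
dxdt = -2.0 * x
theta = np.hstack([np.ones_like(x), x, x**2])
xi = stlsq(theta, dxdt)  # expect coefficients close to [0, -2, 0]
```

In the closure problem described above, the library columns would instead be built from fluid moments and their gradients measured in the PIC data, with the unclosed term as the regression target.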
This book presents a novel interpretation of the nature, causes and consequences of sex inequality in the modern labour market. Employing a sophisticated new theoretical framework, and drawing on original fieldwork, the book develops a subtle account of the phenomenon of sex segregation and offers a major challenge to existing approaches.
Studying phenotypic and genetic characteristics of age at onset (AAO) and polarity at onset (PAO) in bipolar disorder can provide new insights into disease pathology and facilitate the development of screening tools.
Aims
To examine the genetic architecture of AAO and PAO and their association with bipolar disorder disease characteristics.
Method
Genome-wide association studies (GWASs) and polygenic score (PGS) analyses of AAO (n = 12 977) and PAO (n = 6773) were conducted in patients with bipolar disorder from 34 cohorts and a replication sample (n = 2237). The association of onset with disease characteristics was investigated in two of these cohorts.
Results
Earlier AAO was associated with a higher probability of psychotic symptoms, suicidality, lower educational attainment, not living together and fewer episodes. Depressive onset correlated with suicidality and manic onset correlated with delusions and manic episodes. Systematic differences in AAO between cohorts and continents of origin were observed. This was also reflected in single-nucleotide variant-based heritability estimates, with higher heritabilities for stricter onset definitions. Increased PGS for autism spectrum disorder (β = −0.34 years, s.e. = 0.08), major depression (β = −0.34 years, s.e. = 0.08), schizophrenia (β = −0.39 years, s.e. = 0.08), and educational attainment (β = −0.31 years, s.e. = 0.08) were associated with an earlier AAO. The AAO GWAS identified one significant locus, but this finding did not replicate. Neither GWAS nor PGS analyses yielded significant associations with PAO.
Conclusions
AAO and PAO are associated with indicators of bipolar disorder severity. Individuals with an earlier onset show an increased polygenic liability for a broad spectrum of psychiatric traits. Systematic differences in AAO across cohorts, continents and phenotype definitions introduce significant heterogeneity, affecting analyses.
There is growing interest globally in using real-world data (RWD) and real-world evidence (RWE) for health technology assessment (HTA). Optimal collection, analysis, and use of RWD/RWE to inform HTA requires a conceptual framework to standardize processes and ensure consistency. However, such a framework is currently lacking in Asia, a region that is likely to benefit from RWD/RWE for at least two reasons. First, there is often limited Asian representation in clinical trials unless specifically conducted in Asian populations, and RWD may help to fill the evidence gap. Second, in a few Asian health systems, reimbursement decisions are not made at market entry, thus allowing RWD/RWE to be collected to give more certainty about the effectiveness of technologies in the local setting and inform their appropriate use. Furthermore, an alignment of RWD/RWE policies across Asia would equip decision makers with context-relevant evidence, and improve timely patient access to new technologies. Using data collected from eleven health systems in Asia, this paper provides a review of the current landscape of RWD/RWE in Asia to inform HTA and explores a way forward to align policies within the region. This paper concludes with a proposal to establish an international collaboration among academics and HTA agencies in the region: the REAL World Data In ASia for HEalth Technology Assessment in Reimbursement (REALISE) working group, which seeks to develop a non-binding guidance document on the use of RWD/RWE to inform HTA for decision making in Asia.
The incidence of dementia in Black, Asian and minority ethnic (BAME) groups is increasing in the UK, with concern about underdiagnosis and late presentation.
Aims
By reviewing referrals to memory clinics from Leicester City we examined whether the following differed by ethnicity: the proportion with a diagnosis of dementia, type of dementia and severity at presentation.
Method
We examined referrals between 2010 and 2017: all those whose ethnicity was recorded as Black (n = 131) and a random sample of 260 Asian and 259 White British referrals. Severity of dementia was assessed by record review. Odds ratios (ORs) were adjusted for general practice, age, gender and year of referral.
Results
A diagnosis of dementia was recorded in 193 (74.5%) White British, 96 (73.3%) Black and 160 (61.5%) Asian referrals. Compared with Asians, White British referrals had twice the adjusted odds of a dementia diagnosis (OR = 1.99, 1.23–3.22). Of those with dementia, Alzheimer's disease was more common in White British (57.0%) than in Asian (43.8%) and Black referrals (51.0%): adjusted OR for White British versus Asian 1.76 (1.11–2.77). Of those with dementia, the proportion with moderate/severe disease was highest in White British referrals (66.8%), compared with 61.9% in Asian and 45.8% in Black groups. The adjusted OR for the White British versus Black groups was 2.03 (1.10–3.72), with no significant difference between Asian and White British groups.
Conclusions
Differences in confirmed dementia suggest general practitioners have a lower threshold for referral for possible dementia in some BAME groups. Unlike other centres, we found no evidence of greater severity at presentation in Asian and Black groups.
This study explored factors that influence academic achievement and, hence, future career prospects. The relationships between academic trait boredom, approach to learning, and academic achievement were examined using data collected from university students at a small English university and from their student records. The initial statistical analysis revealed significant effects of gender on learning approach and on two of the three academic trait boredom subscales. Female students proved to be less prone to academic trait boredom than their male counterparts. A model was then developed that showed how a student’s choice of learning approach was influenced by academic trait boredom and impinged on academic achievement. This modelling also confirmed that students who are more prone to academic trait boredom are more likely to adopt a surface approach to learning rather than a deep or strategic one. The results of this investigation have implications for students, lecturers, course designers and learning support staff, both at this one institution and across the higher education sector.
The initial classic Fontan utilising a direct right atrial appendage to pulmonary artery anastomosis led to numerous complications. Adults with such complications may benefit from conversion to a total cavo-pulmonary connection, the current standard palliation for children with univentricular hearts.
Methods:
A single institution, retrospective chart review was conducted for all Fontan conversion procedures performed from July, 1999 through January, 2017. Variables analysed included age, sex, reason for Fontan conversion, age at Fontan conversion, and early mortality or heart transplant within 1 year after Fontan conversion.
Results:
A total of 41 Fontan conversion patients were identified. Average age at Fontan conversion was 24.5 ± 9.2 years. Dominant left ventricular physiology was present in 37/41 (90.2%) patients. Right-sided heart failure occurred in 39/41 (95.1%) patients and right atrial dilation was present in 33/41 (80.5%) patients. The most common causes for Fontan conversion included atrial arrhythmia in 37/41 (90.2%), NYHA class II HF or greater in 31/41 (75.6%), ventricular dysfunction in 23/41 (56.1%), and cirrhosis or fibrosis in 7/41 (17.1%) patients. Median post-surgical follow-up was 6.2 ± 4.9 years. Survival rates at 30 days, 1 year, and greater than 1-year post-Fontan conversion were 95.1, 92.7, and 87.8%, respectively. Two patients underwent heart transplant: the first within 1 year of Fontan conversion for heart failure and the second at 5.3 years for liver failure.
Conclusions:
Fontan conversion should be considered early when atrial arrhythmias become common rather than waiting for severe heart failure to ensue, and Fontan conversion can be accomplished with an acceptable risk profile.
The distance-of-influence of an individual devil's-claw plant to cotton leaf, stem, boll (reproductive parts), and combined aboveground plant parts was determined in two field experiments. The distance-of-influence could not be detected for the first 6 weeks after emergence; by 9 to 12 weeks, it extended up to 25 cm or more; and by the end of the season, up to 50 cm on each side of the weed. Cotton leaf and stem weights were less sensitive than cotton boll weights for measuring distance-of-influence from devil's-claw. At maturity, cotton boll weight was reduced 62 and 51% in 1983 and 45 and 29% in 1984 for sampling intervals of 0 to 25 and 25 to 50 cm, respectively. Interference from cotton reduced devil's-claw stem weight, seed capsule weight, and whole plant biomass by 6 weeks after emergence and reduced leaf biomass by 9 weeks in 1983. All except stem biomass were affected in 1984. Distance-of-influence and weed-density interference studies predicted lint yield loss similarly.
This paper reports on a project conducted between 2008 and 2011 that was established to allow eight Australian Indigenous men who had been in prison to tell their stories of incarceration.
Background
The Shed in Western Sydney, NSW, Australia, was set up in response to the high male suicide rate in that area, its objective being to support men at risk. Aboriginal men were the most at risk, and they are presently imprisoned at a rate 13 times that of non-Indigenous men. This small project sought to give voice to the men behind the statistics and to point to a significant problem in Australian society.
Methods
Interviews were conducted by an Indigenous male, questions covering age at first entering the penal system, number of prison stays, support, and health. This paper is framed around responses to these questions.
Results
All but one of the men were recidivist offenders, and over half were under 15 years of age when they first offended. All talked about a lack of support both in prison and after leaving it, and alcohol and depression figured strongly in the accounts. Disadvantage and social exclusion, and a lack of support such as access to housing and health services, figure significantly in the men’s stories. It is only when these social issues are addressed that gains will be achieved and the cycle of recidivism broken.
An efficient and robust method to measure vitamin D (25-hydroxy vitamin D3 (25(OH)D3) and 25-hydroxy vitamin D2 in dried blood spots (DBS) has been developed and applied in the pan-European multi-centre, internet-based, personalised nutrition intervention study Food4Me. The method includes calibration with blood containing endogenous 25(OH)D3, spotted as DBS and corrected for haematocrit content. The methodology was validated following international standards. The performance characteristics did not reach those of the current gold standard liquid chromatography-MS/MS in plasma for all parameters, but were found to be very suitable for status-level determination under field conditions. DBS sample quality was very high, and 3778 measurements of 25(OH)D3 were obtained from 1465 participants. The study centre and the season within the study centre were very good predictors of 25(OH)D3 levels (P<0·001 for each case). Seasonal effects were modelled by fitting a sine function with a minimum 25(OH)D3 level on 20 January and a maximum on 21 July. The seasonal amplitude varied from centre to centre. The largest difference between winter and summer levels was found in Germany and the smallest in Poland. The model was cross-validated to determine the consistency of the predictions and the performance of the DBS method. The Pearson’s correlation between the measured values and the predicted values was r 0·65, and the sd of their differences was 21·2 nmol/l. This includes the analytical variation and the biological variation within subjects. Overall, DBS obtained by unsupervised sampling of the participants at home was a viable methodology for obtaining vitamin D status information in a large nutritional study.
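The seasonal model described above, a sine with its trough on 20 January and peak around 21 July, can be sketched as below; the mean level and amplitude used here are illustrative placeholders, not the study's fitted per-centre parameters:

```python
import numpy as np

def seasonal_25ohd(day_of_year, mean_level, amplitude):
    """Predicted 25(OH)D3 (nmol/l) for a given day of the year, following
    the abstract's sine model: minimum on day 20 (20 January), maximum
    half a year later (~21 July). mean_level and amplitude would be fitted
    per centre; the values passed below are made up for illustration."""
    # Phase shifted so sin(...) = -1 at day 20 (the winter trough).
    return mean_level + amplitude * np.sin(2 * np.pi * (day_of_year - 111.25) / 365)

winter = seasonal_25ohd(20, 50.0, 15.0)   # trough: mean - amplitude = 35
summer = seasonal_25ohd(202, 50.0, 15.0)  # near the peak: just under 65
```

The centre-to-centre variation reported above (largest winter-summer difference in Germany, smallest in Poland) corresponds to differing fitted amplitudes in this parameterisation.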
Electroencephalography (EEG) is playing an increasingly important role in the management of comatose patients in the intensive care unit.
Methods:
The techniques of EEG monitoring are reviewed. Initially, standard, discontinuous recordings were performed in intensive care units (ICUs). Later, continuous displays of “raw EEG” (CEEG) were used. More recently, the addition of quantitative techniques allowed for more effective reading.
Results and Conclusions:
Applications of continuous EEG to clinical problems are discussed. The most useful role of CEEG appears to be the detection and management of nonconvulsive seizures. There is a need for controlled studies to assess the role for CEEG in neuro-ICUs and general ICUs.