When archaeologists discuss ‘ancestor cults’ or ‘ancestor veneration’, what this might entail in practice usually remains vague, leading to charges that the concept of ‘ancestors’ is often applied generically. In this article, the authors combine bioarchaeological, taphonomic, radiocarbon, and isotopic studies to explore the ritual practice of the selective retention, curation, and deposition of a group of human crania and mandibles. Between about 5500 and 5400 BC, Neolithic people at Masseria Candelaro (Puglia, Italy) deposited broken crania and mandibles from about fifteen individuals in a heap in the centre of the village. These individuals, mostly probable males, were collected over the course of two centuries and actively used, with their deposition marking the final disposal of a ritual collection. The motivations for the curation of cranial bone are investigated through comparison with archaeological and ethnographic examples, advancing an interpretation of ritual practice directed towards ancestors.
Fear learning is a core component of conceptual models of how adverse experiences may influence psychopathology. Specifically, existing theories posit that childhood experiences involving trauma are associated with altered fear learning processes, while experiences involving deprivation are not. Several studies have found altered fear acquisition in youth exposed to trauma, but not deprivation, although the specific patterns have varied across studies. The present study utilizes a longitudinal sample of children with variability in adversity experiences to examine associations among childhood trauma, fear learning, and psychopathology in youth.
Methods
The sample includes 170 youths aged 10–13 years (M = 11.56, s.d. = 0.47, 48.24% female). Children completed a fear conditioning task, comprising both acquisition and extinction phases, while skin conductance responses (SCR) were recorded. Childhood trauma and deprivation severity were measured using both parent and youth report. Symptoms of anxiety, externalizing problems, and post-traumatic stress disorder (PTSD) were assessed at baseline and again two years later.
Results
Greater trauma-related experiences were associated with greater SCR to the threat cue (CS+) relative to the safety cue (CS−) in early fear acquisition, controlling for deprivation, age, and sex. Deprivation was unrelated to fear learning. Greater SCR to the threat cue during early acquisition was associated with increased PTSD symptoms over time, controlling for baseline symptoms, and mediated the relationship between trauma and prospective changes in PTSD symptoms.
Conclusions
Childhood trauma is associated with altered fear learning in youth, which may be one mechanism linking exposure to violence with the emergence of PTSD symptoms in adolescence.
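The key measure in designs like this is a differential SCR score: responding to the threat cue (CS+) minus responding to the safety cue (CS−), averaged over early-acquisition trials. As a purely illustrative sketch (the function name and trial values are invented, not taken from the study), the scoring step can be computed like this:

```python
def differential_scr(cs_plus, cs_minus):
    """Mean differential skin conductance response: CS+ minus CS-,
    averaged over paired trials. Positive values indicate stronger
    responding to the threat cue than to the safety cue."""
    if len(cs_plus) != len(cs_minus):
        raise ValueError("CS+ and CS- trial lists must be the same length")
    diffs = [p - m for p, m in zip(cs_plus, cs_minus)]
    return sum(diffs) / len(diffs)

# Hypothetical early-acquisition SCRs (microsiemens) for one participant
early_cs_plus = [0.42, 0.55, 0.48, 0.51]
early_cs_minus = [0.30, 0.28, 0.33, 0.31]
print(differential_scr(early_cs_plus, early_cs_minus))  # positive: threat > safety
```

In the study, scores of this kind were then entered into regression models predicting prospective PTSD symptoms; the sketch covers only the scoring step, not the modelling.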
An average of 1300 adults develop First Episode Psychosis (FEP) in Ireland each year. Early Intervention in Psychosis (EIP) is now widely accepted as best practice in the treatment of conditions such as schizophrenia. A local EIP programme was established in the Dublin South Central Mental Health Service in 2012.
Methods:
This is a cross-sectional study of service users presenting to the Dublin South Central Mental Health Service with FEP from 2016 to 2022 following the introduction of the EIP programme. We compared this to a previously published retrospective study of treatment as usual from 2002 to 2012.
Results:
Across both time periods, most service users in this study were male, single, unemployed and living with their partner or spouse. Cognitive Behavioural Therapy for psychosis was provided to 12% (n = 8) of service users pre-EIP, compared with 52% (n = 30) after the programme's introduction (p < 0.001), and 3% (n = 2) of service users engaged with behavioural family therapy pre-EIP, as opposed to 15% (n = 9) after (p < 0.01). Rates of composite baseline physical healthcare monitoring improved significantly (p < 0.001).
Conclusion:
Exclusive allocation of multidisciplinary team staff to EIP leads to improved compliance with recommended guidelines, particularly CBT-p, formal family therapy and physical health monitoring.
Paradigmatic of a cultural shift in Irish education in the 1960s, St Brendan’s Community School in Birr, County Offaly was designed by Peter and Mary Doyle as a flexible and extendable mat-building articulated by generous social spaces including exterior courtyards and an interior ‘street’. Owned by the Department of Education and Skills, administered by a Board of Management and occupied by approximately one thousand staff and pupils daily, St Brendan’s has been in continuous use since its opening in 1980. Generations of students have benefited from the intimate relationship between the cultural and social life of the school and the architectural form, fabric, and technology that facilitates it. But by the beginning of the twenty-first century, due to the lack of consideration given to such aspects at the time it was conceived and constructed, the building was suffering from ongoing material degradation and issues with environmental performance.
This article reflects on a research project undertaken on the school, which aimed to provide the means by which its learning environment and energy use could be improved and optimised in a manner consistent with the integrity of the architects’ conceptual thinking and built design: the opportunity for St Brendan’s to continue its course as a successful paradigm, this time for twenty-first-century education, through the reconciliation of its future use with its social and educational heritage. Guided by the ‘three dimensions of modernity’ (social, technical, and aesthetic), this process involved the development of new ‘ways of seeing’ and ‘methods of action’ applied to the school, realised through the production of a series of representations that collectively identified, mapped, and re-presented the significances and values of the school, element by element. The relationships between these phenomena were complex and necessitated an innovative interdisciplinary approach.
St Brendan’s may have embodied a radical new social agenda for education and indeed society in twentieth-century Ireland in its architecture, but the building remains unlisted and, until recently, its significance (nationally and internationally) has been much overlooked. Part of this project’s agenda, therefore, involved raising awareness of the value of the building among existing and potential future stakeholders. The creation of accessible forms of communication that would both synthesise and make clear the complex data generated and the relationships between them was of central importance to the team’s approach. The paper ultimately argues that while attuned to a specific site, these techniques contain the possibility of a wider application, a new visual literacy for the conservation of twentieth-century buildings.
Northern Ireland has had the highest suicide and self-harm rates in the UK since 2012, according to the National Statistics Office, with 12.5 deaths per 100,000 population compared with 10.5 in the rest of the country. Evidence shows that the risk of suicide increases greatly following self-harm, and that the risk is greatest immediately after the self-harm episode. Better access to health care, especially primary care, during this period can actively reduce the risk to this vulnerable patient group. Patients assessed for self-harm in the emergency department are often followed up by the mental health/crisis team; owing to lack of resources and staff shortages, this is often not possible in a timely fashion. NICE suggests that patients should be offered a follow-up appointment in primary care within 48 hours of discharge. We aimed to ensure that 70% of patients discharged from secondary care following an episode of suicidal ideation or self-harm were contacted proactively by a mental health practitioner (MHP) or GP within 48 hours of communication from secondary care.
Methods
The project underwent two PDSA cycles. An electronic workflow was created to support straightforward patient identification, assessment and follow-up. Process mapping was carried out after discussion with the GPs, administrative team, practice nurses and MHP. Outcomes were measured after each cycle as the percentage of patients who were: 1) contacted within 48 hours of communication following an episode of self-harm; 2) appropriately coded; 3) comprehensively assessed; and 4) risk stratified, with risks minimised.
Results
Over a period of three months, following two PDSA cycles, the frequency of these contacts increased from 0 to 80% (median), with an average of 3.8 patients (83%) reviewed per week. Patient experience and satisfaction also improved significantly.
Conclusion
General practice has long been regarded as the closest point of support for patients in the health care system. As it is usually the first point of contact for patients, general practice can contribute significantly to easing the rising pressure on mental health teams. A small number of weekly contacts from each practice can also make a substantial difference to patient safety and experience nationwide. We hope this intervention will significantly improve patient safety and reduce further self-harm presentations to the emergency department in the long run.
Vitamin D, Ca and dairy products are negatively associated with colorectal cancer (CRC) incidence, but little is known of their influence on CRC survival. To investigate prediagnostic intakes of vitamin D, Ca and dairy products for their relevance to CRC prognosis, we analysed 504 CRC patients enrolled in the Newfoundland Colorectal Cancer Registry Cohort Study who were diagnosed for the first time with CRC between 1999 and 2003. Follow-up for mortality and cancer recurrence was through April 2010. Data on diet and lifestyle factors were gathered via a validated, semi-quantitative FFQ and a Personal History Questionnaire. Multivariate Cox models estimated hazard ratios (HR) and 95 % CI for the relationship of prediagnostic intakes of vitamin D, Ca and dairy products with all-cause mortality (overall survival, OS) and disease-free survival (DFS) among CRC patients. We found that prediagnostic Ca intake from foods, but not total Ca intake, was negatively associated with all-cause mortality (HR for Q2 v. Q1, 0·44; 95 % CI, 0·26, 0·75). An inverse relationship was also seen in a dose–response fashion for prediagnostic cheese intake (HR for Q4 v. Q1, 0·57; 95 % CI, 0·34, 0·95; Ptrend = 0·029). No evidence of modification by sex, physical activity, alcohol drinking or cigarette smoking was observed. In summary, high prediagnostic intakes of cheese and of Ca from foods may be associated with increased survival among CRC patients. These findings on diet may contribute to the development of novel therapies that add to the armamentarium against CRC, although replication studies are required before any nutritional interventions are made available.
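Overall survival in cohorts like this is conventionally summarised with time-to-event methods. The study fitted multivariate Cox models; as a simpler, purely illustrative sketch with invented data (not the registry data), the underlying Kaplan–Meier overall-survival curve can be estimated as follows:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier estimate of overall survival.
    times: follow-up time per patient; events: 1 = death observed, 0 = censored.
    Returns (time, survival probability) pairs at each observed death time."""
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    surv, curve, i = 1.0, [], 0
    while i < len(data):
        t = data[i][0]
        deaths = at_this_time = 0
        # Group all patients who share this follow-up time
        while i < len(data) and data[i][0] == t:
            at_this_time += 1
            deaths += data[i][1]
            i += 1
        if deaths:
            surv *= 1 - deaths / n_at_risk
            curve.append((t, surv))
        n_at_risk -= at_this_time
    return curve

# Invented follow-up data: five patients, three deaths, two censored
curve = kaplan_meier([2, 3, 3, 5, 8], [1, 0, 1, 1, 0])
print(curve)  # survival probability drops at each observed death time
```

A Cox model additionally relates such curves to covariates (here, quartiles of intake) via hazard ratios; that regression step is beyond this sketch.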
Sociology is a highly reflexive subject. All scholarly disciplines examine themselves reflexively in terms of theory and practice as they apply what the sociologist of science Robert Merton once called ‘organised scepticism’. Sociology adds to this constant internal academic debate a forceful, almost obsessive, concern about its very purpose and rationale. This attentiveness to founding principles shows itself in significant intellectual interest in the ‘canon’ of great thinkers and its history as a discipline, in vigorous debate about the boundaries of the discipline, and in considerable inventiveness in developing new areas and subfields of sociology. This fascination with the purpose and social organization of the discipline also reflects in the debate about sociology's civic engagements and commitments, its level of activism, and its moral and political purposes.
This echoes the contemporary discussion about the idea of public sociology. Public sociology is a new phrase for a long-standing debate about the purpose of sociology that began with the discipline's origins. It is therefore no coincidence that students in the twenty-first century, when being introduced to sociology for the first time, wrestle with ideas formulated centuries before, for while social change has rendered some of these ideas redundant, particularly the Social Darwinism of the nineteenth century and functionalism in the 1950s, familiarity with these earlier debates and frameworks is the lens into understanding the purpose, value and prospect of sociology as key thinkers conceived it in the past. The ideas may have changed but the moral purpose has not.
A contentious discipline is destined to argue continually about its past. Some see the roots of sociology grounded in medieval scholasticism, in eighteenth-century Scotland, with the Scottish Enlightenment's engagement with the social changes wrought by commercialism, in conservative reactions to the Enlightenment, or in nineteenth-century encounters with the negative effects of industrialization and modernization. Contentious disciplines, however, are condemned to always live in their past if they do not also develop a vision for their future; a sense of purpose and a rationale that takes the discipline forward. Sociology has always been forward looking, offering an analysis and diagnosis of what C. Wright Mills liked to call the human condition. Interest in the social condition, and in its improvement and betterment for the majority of ordinary men and women, has always been sociology's ultimate objective.
Exposure to childhood adversity is a powerful risk factor for psychopathology. Despite extensive efforts, we have not yet identified effective or scalable interventions that prevent the emergence of mental health problems in children who have experienced adversity. In this modified Delphi study, we identified intervention strategies for effectively targeting both the neurodevelopmental mechanisms linking childhood adversity and psychopathology – including heightened emotional reactivity, difficulties with emotion regulation, blunted reward processing, and social information processing biases, as well as a range of psychopathology symptoms. We iteratively synthesized information from experts in the field and relevant meta-analyses through three surveys, first with experts in intervention development, prevention, and childhood adversity (n = 32), and then within our study team (n = 8). The results produced increasing stability and good consensus on intervention strategy recommendations for specific neurodevelopmental mechanisms and symptom presentations and on strength of evidence ratings of intervention strategies targeting youth and parents. More broadly, our findings highlight how intervention decision making can be informed by meta-analyses, enhanced by aggregate group feedback, saturated before consensus, and persistently subjective or even contradictory. Ultimately, the results converged on several promising intervention strategies for prevention programming with adversity-exposed youth, which will be tested in an upcoming clinical trial.
The enteroendocrine system is located in the gastrointestinal (GI) tract, and makes up the largest endocrine system in the human body. Despite that, its roles and functions remain incompletely understood. Gut regulatory peptides are the main products of enteroendocrine cells, and play an integral role in the digestion and absorption of nutrients through their effect on intestinal secretions and gut motility. Several peptides, such as cholecystokinin, polypeptide YY and glucagon-like peptide-1, have traditionally been reported to suppress appetite following food intake, so-called satiety hormones. In this review, we propose that, in the healthy individual, this system to regulate appetite does not play a dominant role in normal food intake regulation, and that there is insufficient evidence to wholly link postprandial endogenous gut peptides with appetite-related behaviours. Instead, or additionally, top-down, hedonic drive and neurocognitive factors may have more of an impact on food intake. In GI disease however, supraphysiological levels of these hormones may have more of an impact on appetite regulation as well as contributing to other unpleasant abdominal symptoms, potentially as part of an innate response to injury. Further work is required to better understand the mechanisms involved in appetite control and unlock the therapeutic potential offered by the enteroendocrine system in GI disease and obesity.
OBJECTIVES/SPECIFIC AIMS: The aims of this study are twofold: (1) to determine whether maternal schistosomiasis affects maternal immunity to tetanus and/or transplacental transfer of anti-tetanus toxoid (TT) immunoglobulin G (IgG) from mother to infant; and (2) to determine the influence of maternal schistosomiasis on infant BCG vaccine immunogenicity. METHODS/STUDY POPULATION: The study will utilize blood samples from a historic cohort of 100 mother-infant pairs from Kisumu, Kenya, a schistosomiasis-endemic area. For the first aim, we will evaluate maternal schistosomal circulating anodic antigen, which has improved sensitivity and specificity for detecting active schistosomiasis from serum, and anti-soluble egg antigen IgG positivity, compared with quantitative maternal anti-TT IgG at delivery and the anti-TT IgG cord blood to maternal blood ratio (cord:maternal ratio). For the second aim, we will evaluate the association between maternal schistosomiasis, as detected by circulating anodic antigen and anti-soluble egg antigen IgG at delivery, and infant BCG-specific Th1-cytokine-positive CD4+ cells at 10 weeks following BCG vaccination at birth. RESULTS/ANTICIPATED RESULTS: We hypothesize that active maternal schistosomiasis will be associated with decreased maternal anti-TT IgG and reduced efficiency of transplacental transfer, as measured by the infant cord blood to maternal blood ratio of anti-TT IgG. We also expect that maternal schistosomiasis will be associated with decreased infant immunogenicity to BCG vaccine. DISCUSSION/SIGNIFICANCE OF IMPACT: This is a formative study of infant vaccine immunity using laboratory methodology not previously applied. Understanding infant immunity in the setting of maternal schistosomiasis will inform vaccination strategies and tailor vaccine development in schistosome-endemic areas such as Kenya, where neither TB nor neonatal tetanus has been eradicated.
Additionally, our results will inform public health policies to consider integration of antischistosomal agents in antenatal care.
The deep subsurface of other planetary bodies is of special interest for robotic and human exploration. The subsurface provides access to planetary interior processes, thus yielding insights into planetary formation and evolution. On Mars, the subsurface might harbour the most habitable conditions. In the context of human exploration, the subsurface can provide refugia for habitation from extreme surface conditions. We describe the fifth Mine Analogue Research (MINAR 5) programme at 1 km depth in the Boulby Mine, UK in collaboration with Spaceward Bound NASA and the Kalam Centre, India, to test instruments and methods for the robotic and human exploration of deep environments on the Moon and Mars. The geological context in Permian evaporites provides an analogue to evaporitic materials on other planetary bodies such as Mars. A wide range of sample acquisition instruments (NASA drills, Small Planetary Impulse Tool (SPLIT) robotic hammer, universal sampling bags), analytical instruments (Raman spectroscopy, Close-Up Imager, Minion DNA sequencing technology, methane stable isotope analysis, biomolecule and metabolic life detection instruments) and environmental monitoring equipment (passive air particle sampler, particle detectors and environmental monitoring equipment) was deployed in an integrated campaign. Investigations included studying the geochemical signatures of chloride and sulphate evaporitic minerals, testing methods for life detection and planetary protection around human-tended operations, and investigations on the radiation environment of the deep subsurface. The MINAR analogue activity occurs in an active mine, showing how the development of space exploration technology can be used to contribute to addressing immediate Earth-based challenges. During the campaign, in collaboration with European Space Agency (ESA), MINAR was used for astronaut familiarization with future exploration tools and techniques. 
The campaign was also used to develop, on-site, curriculum materials for primary and secondary schools and the primary-to-secondary transition, focused on a classroom extravehicular activity simulation.
Traumatic events are associated with increased risk of psychotic experiences, but it is unclear whether this association is explained by mental disorders prior to psychotic experience onset.
Aims
To investigate the associations between traumatic events and subsequent psychotic experience onset after adjusting for post-traumatic stress disorder and other mental disorders.
Method
We assessed 29 traumatic event types and psychotic experiences from the World Mental Health surveys and examined the associations of traumatic events with subsequent psychotic experience onset with and without adjustments for mental disorders.
Results
Respondents who had experienced any traumatic event had three times the odds of subsequently developing psychotic experiences compared with other respondents (OR = 3.1, 95% CI 2.7–3.7), with variability in the strength of association across traumatic event types. These associations persisted after adjustment for mental disorders.
Conclusions
Exposure to traumatic events predicts subsequent onset of psychotic experiences even after adjusting for comorbid mental disorders.
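The estimate above (OR = 3.1, 95% CI 2.7–3.7) comes from adjusted models fitted to the survey data. As an illustrative sketch only, using an invented 2×2 table rather than the study's data, a crude odds ratio and its Wald confidence interval are computed like this:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Crude odds ratio and Wald 95% CI from a 2x2 table:
    a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases."""
    or_ = (a * d) / (b * c)
    se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lower = math.exp(math.log(or_) - z * se_log_or)
    upper = math.exp(math.log(or_) + z * se_log_or)
    return or_, lower, upper

# Invented counts: exposure = any traumatic event, outcome = psychotic experiences
or_, lower, upper = odds_ratio_ci(30, 70, 10, 90)
print(or_, lower, upper)
```

The published estimates additionally adjust for comorbid mental disorders and the survey design, which a crude 2×2 calculation cannot capture.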
This article describes a formal proof of the Kepler conjecture on dense sphere packings in a combination of the HOL Light and Isabelle proof assistants. This paper constitutes the official published account of the now completed Flyspeck project.
Impairments in learning and recall have been well established in amnestic mild cognitive impairment (aMCI). However, a relative dearth of studies has examined the profiles of memory strategy use in persons with aMCI relative to those with Alzheimer's disease (AD). Participants with aMCI, nonamnestic MCI, AD, and healthy older adults were administered the California Verbal Learning Test-II (CVLT-II). Measures of semantic clustering and recall were obtained across learning and delayed recall trials. In addition, we investigated whether deficits in semantic clustering were related to progression from healthy aging to aMCI and from aMCI to AD. The aMCI group displayed similar semantic clustering performance as the AD participants, whereas the AD group showed greater impairments on recall relative to the aMCI participants. Control participants who progressed to aMCI showed reduced semantic clustering at the short delay at baseline compared to individuals who remained diagnostically stable across follow-up visits. These findings show that the ability to engage in an effective memory strategy is compromised in aMCI, before AD has developed, suggesting that disruptions in semantic networks are an early marker of the disease. (JINS, 2014, 20, 1–11)
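Semantic clustering on list-learning tasks reflects the tendency to recall same-category words consecutively. CVLT-II scoring uses a chance-corrected index; as a deliberately simplified sketch (the word lists and function are invented for illustration), the raw ingredient, a count of adjacent same-category recalls, looks like this:

```python
def semantic_clustering_pairs(recall_order, category_of):
    """Count adjacent recalls drawn from the same semantic category.
    A simplified stand-in for clustering scores; formal CVLT-II scoring
    corrects this count for chance."""
    return sum(
        1 for prev, cur in zip(recall_order, recall_order[1:])
        if category_of[prev] == category_of[cur]
    )

category_of = {"apple": "fruit", "pear": "fruit", "plum": "fruit",
               "hammer": "tool", "saw": "tool", "drill": "tool"}
clustered = ["apple", "pear", "plum", "hammer", "saw", "drill"]
unclustered = ["apple", "hammer", "pear", "saw", "plum", "drill"]
print(semantic_clustering_pairs(clustered, category_of))    # 4
print(semantic_clustering_pairs(unclustered, category_of))  # 0
```

Lower clustering counts alongside relatively preserved recall, as reported for the aMCI group, would indicate reduced use of the semantic organisation strategy rather than a pure storage deficit.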
Several N-nitroso compounds (NOC) have been shown to be carcinogenic in a variety of laboratory animals, but evidence of their carcinogenicity in humans is lacking. We aimed to examine the association between NOC intake and colorectal cancer (CRC) risk and possible effect modification by vitamins C and E and protein in a large case–control study carried out in Newfoundland and Labrador and Ontario, Canada. A total of 1760 case patients with pathologically confirmed adenocarcinoma and 2481 population controls were asked to complete a self-administered FFQ to evaluate their dietary intakes 1 year before diagnosis (for cases) or interview (for controls). Adjusted OR and 95 % CI were calculated across the quintiles of NOC (measured by N-nitrosodimethylamine (NDMA)) intake and relevant food items using unconditional logistic regression. NDMA intake was found to be associated with a higher risk of CRC (highest v. lowest quintiles: OR 1·42, 95 % CI 1·03, 1·96; P for trend = 0·005), specifically for rectal carcinoma (OR 1·61, 95 % CI 1·11, 2·35; P for trend = 0·01). CRC risk also increased with the consumption of NDMA-containing meats when the highest tertile was compared with the lowest tertile (OR 1·47, 95 % CI 1·03, 2·10; P for trend = 0·20). There was evidence of effect modification between dietary vitamin E and NDMA. Individuals with high NDMA and low vitamin E intakes had a significantly higher risk than those with both low NDMA and low vitamin E intakes (OR 3·01, 95 % CI 1·43, 6·51; P for interaction = 0·017). The present results support the hypothesis that NOC intake may be positively associated with CRC risk in humans. Vitamin E, which inhibits nitrosation, could modify the effect of NDMA on CRC risk.
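Quintile-based analyses like the one above first rank participants by intake and split them into five near-equal groups. As an illustrative sketch (the function and intake values are invented; the study derived intakes from the FFQ and fitted logistic regression models), quintile assignment can be done like this:

```python
def quantile_bins(values, k=5):
    """Assign each value a quantile group from 1 (lowest) to k (highest)
    by ranking the values and splitting the ranks into k near-equal groups.
    Ties are broken by position here; real analyses handle ties more carefully."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    n = len(values)
    bins = [0] * n
    for rank, idx in enumerate(order):
        bins[idx] = rank * k // n + 1
    return bins

# Invented daily NDMA intakes for five participants
print(quantile_bins([5, 1, 9, 3, 7]))  # [3, 1, 5, 2, 4]
```

The odds ratios in the abstract then compare the top group (Q5) against the bottom group (Q1) within a regression model, a step this sketch does not cover.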