Written by an established climate change scientist, this book introduces readers to cutting-edge climate change science. Unlike many books on the topic that devote themselves to recent events, this volume provides a historical context and describes early research results as well as key modern scientific findings. It explains how the climate change issue has developed over many decades, how the science has progressed, how diplomacy has (so far) proven unable to find a means of limiting global emissions of heat-trapping substances, and how the forecast for future climate change has become more worrisome. A scientific or mathematical background is not necessary to read this book, which includes no equations, jargon, complex charts or graphs, or quantitative science at all. Anyone who can read a newspaper will understand this book. It is ideal for introductory courses on climate change, especially for non-science major students.
The next-generation radio astronomy instruments are providing a massive increase in sensitivity and coverage, largely through increasing the number of stations in the array and the frequency span sampled. The two primary problems encountered when processing the resultant avalanche of data are the need for abundant storage and the constraints imposed by I/O, as I/O bandwidths drop significantly on cold storage. An example of this is the data deluge expected from the SKA Telescopes of more than 60 PB per day, all to be stored on the buffer filesystem. While compressing the data is an obvious solution, the impacts on the final data products are hard to predict. In this paper, we chose an error-controlled compressor – MGARD – and applied it to simulated SKA-Mid and real pathfinder visibility data, in noise-free and noise-dominated regimes. As the data have an implicit error level in the system temperature, using an error bound in compression provides a natural metric for compression. MGARD ensures that the errors incurred by compression adhere to the user-prescribed tolerance. To measure the degradation of images reconstructed using the lossy compressed data, we proposed a list of diagnostic measures and, through a series of experiments, explored the trade-off between these error bounds and the corresponding compression ratios, as well as the impact on the science quality of the lossy compressed data products. We studied the global and local impacts on the output images for continuum and spectral line examples. We found that relative error bounds of as much as 10%, which provide compression ratios of about 20, have a limited impact on continuum imaging, as the increased noise is less than the image RMS, whereas a 1% error bound (compression ratio of 8) introduces an increase in noise about an order of magnitude less than the image RMS. For extremely sensitive observations and for very precious data, we recommend a 0.1% error bound, with compression ratios of about 4; the noise impact is then two orders of magnitude less than the image RMS levels. At these levels, the limits are due to instabilities in the deconvolution methods. We compared the results to the alternative compression tool DYSCO, both in the impacts on the images and in the relative flexibility. MGARD provides better compression for similar error bounds and has a host of potentially powerful additional features.
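As a rough illustration of the two quantities the paper trades off – the relative error bound and the compression ratio – here is a minimal numpy sketch. The function names and the range-based definition of relative error are assumptions for illustration only; this does not use MGARD's actual API.

```python
import numpy as np

def within_relative_error_bound(original, reconstructed, rel_bound):
    """Check that the point-wise reconstruction error stays within a
    relative error bound, defined here against the data's value range
    (one common convention; the paper's exact definition may differ)."""
    value_range = original.max() - original.min()
    max_abs_error = np.abs(original - reconstructed).max()
    return max_abs_error <= rel_bound * value_range

def compression_ratio(original_bytes, compressed_bytes):
    """Compression ratio = original size / compressed size."""
    return original_bytes / compressed_bytes

# Placeholder data, not from the paper: check a 1% bound and a ratio of ~8.
rng = np.random.default_rng(0)
data = rng.normal(size=1_000_000).astype(np.float32)
reconstructed = data + 0.001 * rng.normal(size=data.size).astype(np.float32)
print(within_relative_error_bound(data, reconstructed, rel_bound=0.01))
print(compression_ratio(original_bytes=data.nbytes, compressed_bytes=data.nbytes / 8))
```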
Patients with posttraumatic stress disorder (PTSD) exhibit smaller regional brain volumes in commonly reported regions including the amygdala and hippocampus, regions associated with fear and memory processing. In the current study, we have conducted a voxel-based morphometry (VBM) meta-analysis using whole-brain statistical maps with neuroimaging data from the ENIGMA-PGC PTSD working group.
Methods
T1-weighted structural neuroimaging scans from 36 cohorts (PTSD n = 1309; controls n = 2198) were processed using a standardized VBM pipeline (ENIGMA-VBM tool). We meta-analyzed the resulting statistical maps for voxel-wise differences in gray matter (GM) and white matter (WM) volumes between PTSD patients and controls, performed subgroup analyses considering the trauma exposure of the controls, and examined associations between regional brain volumes and clinical variables including PTSD (CAPS-4/5, PCL-5) and depression severity (BDI-II, PHQ-9).
Results
PTSD patients exhibited smaller GM volumes across the frontal and temporal lobes and cerebellum, with the most significant effect in the left cerebellum (Hedges’ g = 0.22, corrected p = .001), and smaller cerebellar WM volume (peak Hedges’ g = 0.14, corrected p = .008). We observed similar regional differences when comparing patients to trauma-exposed controls, suggesting these structural abnormalities may be specific to PTSD. Regression analyses revealed PTSD severity was negatively associated with GM volumes within the cerebellum (corrected p = .003), while depression severity was negatively associated with GM volumes within the cerebellum and superior frontal gyrus in patients (corrected p = .001).
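For readers unfamiliar with the effect-size metric quoted above, Hedges’ g is the standardized mean difference with a small-sample bias correction (the standard definition, not a detail specific to the ENIGMA-VBM pipeline):

$$
g \;=\; \Bigl(1 - \tfrac{3}{4(n_1+n_2)-9}\Bigr)\,\frac{\bar{x}_1 - \bar{x}_2}{s_p},
\qquad
s_p \;=\; \sqrt{\frac{(n_1-1)s_1^2 + (n_2-1)s_2^2}{n_1+n_2-2}},
$$

where $n_1, n_2$ are the group sizes, $\bar{x}_1, \bar{x}_2$ the group means, and $s_p$ the pooled standard deviation.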
Conclusions
PTSD patients exhibited widespread, regional differences in brain volumes where greater regional deficits appeared to reflect more severe symptoms. Our findings add to the growing literature implicating the cerebellum in PTSD psychopathology.
The macro-social and environmental conditions in which people live, such as the level of a country’s development or inequality, are associated with brain-related disorders. However, the relationship between these systemic environmental factors and the brain remains unclear. We aimed to determine the association between the level of development and inequality of a country and the brain structure of healthy adults.
Methods
We conducted a cross-sectional study pooling brain imaging (T1-based) data from 145 magnetic resonance imaging (MRI) studies in 7,962 healthy adults (4,110 women) in 29 different countries. We used a meta-regression approach to relate the brain structure to the country’s level of development and inequality.
Results
Higher human development was consistently associated with larger hippocampi and greater global cortical surface area, particularly in frontal areas. Greater inequality was most consistently associated with smaller hippocampal volume and reduced cortical thickness across the brain.
Conclusions
Our results suggest that the macro-economic conditions of a country are reflected in its inhabitants’ brains and may explain the different incidence of brain disorders across the world. The observed variability of brain structure in health across countries should be considered when developing tools in the field of personalized or precision medicine that are intended to be used across the world.
Current evidence underscores a need to transform how we do clinical research: shifting from academic-driven priorities to co-led, community-partnered programs; creating accessible and relevant career pathway programs that expand opportunities for career development; and designing trainings and practices that develop cultural competence among research teams. Failures of equitable research translation contribute to health disparities. Drivers of this failed translation include a lack of diversity among both researchers and participants, a lack of alignment between research institutions and the communities they serve, and a lack of attention to structural sources of inequity and drivers of mistrust of science and research. The Duke University Research Equity and Diversity Initiative (READI) is a program designed to better align clinical research programs with community health priorities through community engagement. Organized around three specific aims, READI supports programs targeting increased workforce diversity, workforce training in community engagement and cultural competence, inclusive research engagement principles, and the development of trustworthy partnerships.
This study investigates two clayey facies from the Bomkoul area in the littoral region of Cameroon for their suitability as fired clay building products. The field study consisted of a geological survey and a geotechnical mission (G0). Assessment of the raw clayey materials included their mineralogy, particle size, determination of Atterberg limits, density and shear stress. Firing properties (shrinkage, water absorption and flexural strength) at 900–1100°C were also determined. The two main facies observed in the field are the mottled red/yellow grey clays from surface ‘A’ with a thickness of 2.0–2.5 m and the deep blackish fossiliferous schistose grey clays ‘B’ with a thickness of 8–10 m. Estimation based on boreholes revealed a minimum of 1,400,000 tons of clayey materials. These reserves will supply a small brick-manufacturing unit for a minimum period of 25 years at an extraction rate of 50,000 tons per year. The main clay minerals of both samples are kaolinite (35% and 49%) and illite (1–11%). Both samples contain quartz (47% and 49%) as non-clay minerals, associated with a small amount of anatase (0.5–2.6%) and trace hematite (<1%). The major oxides are SiO2 (71–76%) and Al2O3 (14%). The raw clayey material ‘A’ was finer and more plastic than the ‘B’ facies. The technological properties of the fired bricks obtained from the ‘A’ facies showed greater potential than those from the ‘B’ facies in terms of sonority and flexural strength. A mixture made of 40% ‘A’ and 60% ‘B’ yielded satisfactory brick properties at 1050°C.
Treatment guidelines recommend evidence-based psychological therapies for adults with intellectual disabilities and co-occurring anxiety or depression. No previous research has explored the effectiveness of these therapies in mainstream psychological therapy settings or outside specialist settings.
Aims
To evaluate the effectiveness of psychological therapies delivered in routine primary care settings for people with intellectual disability who are experiencing co-occurring depression or anxiety.
Method
This study used linked electronic healthcare records of 2 048 542 adults who received a course of NHS Talking Therapies for anxiety and depression in England between 2012 and 2019 to build a retrospective, observational cohort of individuals with intellectual disability, matched 1:2 with individuals without intellectual disability. Logistic regressions were used to compare metrics of symptom improvement and deterioration used in the national programme, on the basis of depression and anxiety measures collected before and at the last attended therapy session.
Results
The study included 6870 adults with intellectual disability and 2 041 672 adults without intellectual disability. In unadjusted analyses, symptoms improved on average for people with intellectual disability after a course of therapy, but these individuals experienced poorer outcomes compared with those without intellectual disability (reliable improvement 60.2% for people with intellectual disability v. 69.2% for people without intellectual disability, odds ratio 0.66, 95% CI 0.63–0.70; reliable deterioration 10.3% for people with intellectual disability v. 5.7% for those without intellectual disability, odds ratio 1.89, 95% CI 1.75–2.04). After propensity score matching, some differences were attenuated (reliable improvement, adjusted odds ratio 0.97, 95% CI 0.91–1.04), but some outcomes remained poorer for people with intellectual disability (reliable deterioration, adjusted odds ratio 1.28, 95% CI 1.16–1.42).
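As a quick arithmetic check, the unadjusted odds ratios above can be approximately recovered from the reported proportions (a minimal Python sketch; the small discrepancies come from rounding of the percentages):

```python
def odds_ratio(p_group, p_comparison):
    """Odds ratio for a binary outcome given two proportions."""
    return (p_group / (1 - p_group)) / (p_comparison / (1 - p_comparison))

# Reliable improvement: 60.2% (intellectual disability) v. 69.2% (without)
print(round(odds_ratio(0.602, 0.692), 2))  # ~0.67; the abstract reports 0.66

# Reliable deterioration: 10.3% v. 5.7%
print(round(odds_ratio(0.103, 0.057), 2))  # ~1.90; the abstract reports 1.89
```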
Conclusions
Evidence-based psychological therapies may be effective for adults with intellectual disability, but their outcomes may be similar to (for improvement and recovery) or poorer than (for deterioration) those for adults without intellectual disability. Future work should investigate the impact of adaptations of therapies for those with intellectual disability to make such interventions more effective and accessible for this population.
Paternity leave may promote greater gender equality in domestic labour. Though numerous studies show that paternity leave promotes fathers’ greater involvement in childcare, less is known about whether paternity leave-taking may facilitate fathers’ involvement in other forms of domestic labour, such as housework. Using repeated cross-sectional data on different-gender partnered US parents from the Study on Parents’ Divisions of Labor During COVID-19 (SPDLC), this study examines the extent to which paternity leave-taking and length of paternity leave are associated with US fathers’ shares of, and time spent on, housework. Findings suggest that paternity leave-taking is positively associated with fathers’ shares of, and time spent on, housework tasks. Longer paternity leaves are also associated with fathers performing greater shares of housework. Overall, this study indicates that the benefits of paternity leave likely extend to fathers’ greater participation in housework, providing additional support for the belief that increased use of paternity leave may help to promote gender equality in domestic labour.
Is Nine Men’s Morris, in the hands of perfect players, a win for white or for black, or a draw? Can king, rook, and knight always defeat king and two knights in chess? What can Go players learn from economists? What are nimbers, tinies, switches and minies? This book deals with combinatorial games, that is, games not involving chance or hidden information. Their study is at once old and young: though some games, such as chess, have been analyzed for centuries, the first full analysis of a nontrivial combinatorial game (Nim) only appeared in 1902. The first part of this book will be accessible to anyone, regardless of background: it contains introductory expositions, reports of unusual tournaments, and a fascinating article by John H. Conway on the possibly everlasting contest between an angel and a devil. For those who want to delve more deeply, the book also contains combinatorial studies of chess and Go; reports on computer advances such as the solution of Nine Men’s Morris and Pentominoes; and theoretical approaches to such problems as games with many players. If you have read and enjoyed Martin Gardner, or if you like to learn and analyze new games, this book is for you.
During 1996–7 MSRI held a full academic year program on Combinatorics, with special emphasis on the connections with other branches of mathematics, such as algebraic geometry, topology, commutative algebra, representation theory, and convex geometry. The rich combinatorial problems arising from the study of various algebraic structures are the subject of this book, which represents work done or presented at seminars during the program. It contains contributions on matroid bundles, combinatorial representation theory, lattice points in polyhedra, bilinear forms, combinatorial differential topology and geometry, Macdonald polynomials and geometry, enumeration of matchings, the generalized Baues problem, and Littlewood–Richardson semigroups. These expository articles, written by some of the most respected researchers in the field, will continue to be of use to graduate students and researchers in combinatorics as well as algebra, geometry, and topology.
Actuaries must model mortality to understand, manage and price risk. Continuous-time methods offer considerable practical benefits to actuaries analysing portfolio mortality experience. This paper discusses six categories of advantage: (i) reflecting the reality of data produced by everyday business practices, (ii) modelling rapid changes in risk, (iii) modelling time- and duration-varying risk, (iv) competing risks, (v) data-quality checking and (vi) management information. Specific examples are given where continuous-time models are more useful in practice than discrete-time models.
We reprise some common statistical models for actuarial mortality analysis using grouped counts. We then discuss the benefits of building mortality models from the most elementary items. This has two facets. First, models are better based on the mortality of individuals, rather than groups. Second, models are better defined in continuous time, rather than over fixed intervals like a year. We show how Poisson-like likelihoods at the “macro” level are built up by product integration of sequences of infinitesimal Bernoulli trials at the “micro” level. Observed data is represented through a stochastic mortality hazard rate, and counting processes provide the natural notation for left-truncated and right-censored actuarial data, individual or age-grouped. Together these explain the “pseudo-Poisson” behaviour of survival model likelihoods.
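As a sketch of that construction in standard survival-analysis notation (an illustration of the idea, not a quotation from the paper): for an individual observed from age x over duration t_i, with death indicator d_i, product integration of the infinitesimal Bernoulli trials gives the likelihood contribution

$$
L_i \;=\; \prod_{s\in(0,\,t_i)}\bigl(1-\mu_{x+s}\,\mathrm{d}s\bigr)\;\mu_{x+t_i}^{\,d_i}
\;=\; \exp\!\Bigl(-\int_0^{t_i}\mu_{x+s}\,\mathrm{d}s\Bigr)\,\mu_{x+t_i}^{\,d_i},
$$

which has the same algebraic form as a Poisson likelihood with the integrated hazard as its mean, hence the “pseudo-Poisson” behaviour of survival-model likelihoods referred to above.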
Recent changes to US research funding are having far-reaching consequences that imperil the integrity of science and the provision of care to vulnerable populations. Resisting these changes, the BJPsych Portfolio reaffirms its commitment to publishing mental science and advancing psychiatric knowledge that improves the mental health of one and all.