The concept of ‘causal crypticity’ was introduced by Richardson (2021) to describe the DOHaD field’s high tolerance for both causes and effects that are challenging to observe in nature, show small effect sizes, and are unstable across study populations and contexts. Causal crypticity can be understood in three ways: as an epistemic norm; as a boundary-delimiting signature of field culture or epistemic style; and as a promissory mode. Increasingly, causal crypticity characterises many fields of the big data-rich, postgenomic life sciences, making DOHaD science a useful index case for scholars of the history, philosophy, and social studies of science interested in the epistemic terrain and social implications of postgenomic sciences. The chapter concludes with a discussion of ethical and accountable claims-making in DOHaD science under conditions of causal crypticity.
Both impulsivity and compulsivity have been identified as risk factors for problematic use of the internet (PUI). Yet little is known about the relationship between impulsivity, compulsivity and individual PUI symptoms, limiting a more precise understanding of mechanisms underlying PUI.
Aims
The current study is the first to use network analysis to (a) examine the unique associations among impulsivity, compulsivity and PUI symptoms, and (b) identify the most influential drivers in relation to the PUI symptom community.
Method
We estimated a Gaussian graphical model consisting of five facets of impulsivity, compulsivity and individual PUI symptoms among 370 Australian adults (51.1% female, mean age = 29.8, s.d. = 11.1). Network structure and bridge expected influence were examined to elucidate differential associations among impulsivity, compulsivity and PUI symptoms, as well as identify influential nodes bridging impulsivity, compulsivity and PUI symptoms.
Results
Results revealed that four facets of impulsivity (i.e. negative urgency, positive urgency, lack of premeditation and lack of perseverance) and compulsivity were related to different PUI symptoms. Further, compulsivity and negative urgency were the most influential nodes in relation to the PUI symptom community due to their highest bridge expected influence.
Conclusions
The current findings delineate distinct relationships across impulsivity, compulsivity and PUI, which offer insights into potential mechanistic pathways and targets for future interventions in this space. To realise this potential, future studies are needed to replicate the identified network structure in different populations and determine the directionality of the relationships among impulsivity, compulsivity and PUI symptoms.
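The network approach described in the Method can be sketched in outline. The study estimated a regularised Gaussian graphical model; the minimal, unregularised version below (all function names are illustrative, not from the study) derives partial correlations from the inverse covariance matrix and computes bridge expected influence as the summed edge weights linking a node to nodes outside its own community:

```python
import numpy as np

def partial_correlation_network(data):
    """Unregularised Gaussian graphical model: partial correlations
    obtained by standardising the inverse covariance (precision) matrix."""
    precision = np.linalg.inv(np.cov(data, rowvar=False))
    d = np.sqrt(np.diag(precision))
    pcor = -precision / np.outer(d, d)
    np.fill_diagonal(pcor, 0.0)  # no self-loops
    return pcor

def bridge_expected_influence(network, node, community):
    """Sum of edge weights from `node` to every node outside its community
    (1-step bridge expected influence)."""
    outside = [j for j in range(network.shape[0]) if j not in community]
    return network[node, outside].sum()

# Illustrative use on synthetic data (5 variables, 200 observations).
rng = np.random.default_rng(0)
net = partial_correlation_network(rng.normal(size=(200, 5)))
```

In applied work this is typically done with regularised estimators (e.g. graphical lasso) and dedicated bridge-centrality routines; the sketch only shows the underlying quantities.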
Charles W. Mills’s persistent criticisms of Rawls’s ideal theory aptly motivated a turn to non-ideal theory, but they also have deep, unexplored implications for Rawls’s ideal theory itself. Since ideal theory will remain useful as a supplement to non-ideal theory, these are worth pursuing. The criticisms put forward by Mills reveal two serious flaws in Rawls’s ideal theory. First, Rawls’s well-known focus on the “basic structure of society” as the primary subject of social justice places too much weight on the legal regulation of society, correspondingly ignoring other powerful types of social norms. Second, and relatedly, Rawls’s conception of social power is too highly moralized (too Hohfeldian) to enable him to come to grips with oppressive social power. These flaws need to be corrected, and the limitations they entail must be overcome, for a Rawlsian ideal society to be sufficiently resistant to breeding new forms of bigoted oppression.
Background: Pain is a common symptom in adult-onset idiopathic dystonia (AOID). An appropriate tool to assess this symptom is needed to improve the care of patients with AOID. We developed a rating instrument for pain in AOID and validated it in cervical dystonia (CD). Methods: The Pain in Dystonia Scale (PIDS) was developed and validated in three phases: (1) international experts and participants generated and evaluated the preliminary items for content validity; (2) the PIDS was drafted and revised, followed by cognitive interviews to ensure suitability for self-administration; and (3) the clinimetric properties of the final PIDS were assessed in 85 participants. Results: The PIDS evaluates pain severity (by body part), functional impact and external modulating factors. It showed high test–retest reliability for the total score (0.9, p<0.001), intraclass correlation coefficients above 0.7 for all items and high internal consistency (Cronbach’s alpha 0.9). Convergent validity analysis revealed a strong correlation between the PIDS severity score and the TWSTRS pain subscale (0.8, p<0.001), the Brief Pain Inventory short form (0.7, p<0.001) and the impact of pain on daily functioning (0.7, p<0.001). Conclusions: The PIDS is the first questionnaire developed specifically to evaluate pain in patients with AOID, with high-level clinimetric properties in people with CD.
To support school foods programmes by evaluating the relationship between nutritional quality, cost, student consumption and the environmental impacts of menus.
Design:
Using linear programming and data from previously served menu items, the relationships between the nutritional quality, cost, student consumption and the environmental impacts of lunch menus were investigated. Optimised lunch menus with the maximum potential student consumption and nutritional quality and lowest costs and environmental impacts were developed and compared with previously served menus (baseline).
Setting:
Boston Public Schools (BPS), Boston Massachusetts, USA.
Participants:
Menu items served on the 2018–2019 BPS lunch menu (n 142).
Results:
Using single-objective models, trade-offs were observed between most interests, but the use of multi-objective models minimised these trade-offs. Compared with the current weekly menus offered, multi-objective models increased potential caloric intake by up to 27 % and Healthy Eating Index scores by up to 19 % and reduced costs and environmental impacts by up to 13 % and 71 %, respectively. Improvements were made by reducing the frequency of beef and cheese entrées and increasing the frequency of fish and legume entrées on weekly menus.
Conclusions:
This work can be extrapolated to monthly menus to provide further direction for school districts, and the methods can be employed with different recipes and constraints. Future research should test the implementation of optimised menus in schools and consider the broader implications of implementation.
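The multi-objective optimisation described above can be sketched as a weighted-sum linear program. The data and weights below are purely illustrative (not from the BPS study), and real menu planning would add nutrient constraints and integer serving counts; this LP relaxation only shows the structure of trading off consumption and nutritional quality against cost and emissions:

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical per-serving data for four menu items (all values illustrative).
consumption = np.array([0.8, 0.6, 0.9, 0.7])   # expected fraction consumed
hei         = np.array([60., 75., 50., 80.])   # Healthy Eating Index contribution
cost        = np.array([1.2, 0.9, 1.5, 1.1])   # $ per serving
emissions   = np.array([3.0, 0.8, 4.5, 0.6])   # kg CO2e per serving

# Weighted-sum multi-objective: maximise consumption + HEI,
# minimise cost + emissions (weights are assumptions, not from the study).
w = np.array([1.0, 0.05, 1.0, 0.2])
c = -(w[0] * consumption + w[1] * hei) + w[2] * cost + w[3] * emissions

# Serve exactly 5 entrees per week; each item may appear 0-3 times.
res = linprog(c, A_eq=[[1, 1, 1, 1]], b_eq=[5], bounds=[(0, 3)] * 4)
servings = res.x  # optimal (fractional) weekly serving counts
```

Sweeping the weights traces out the trade-off frontier between the single-objective solutions that the abstract contrasts with the multi-objective models.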
We present a time series of ¹⁴CO₂ for the period 1910–2021 recorded by annual plants collected in the southwestern United States, centered near Flagstaff, Arizona. This time series is dominated by five commonly occurring annual plant species in the region, which is considered broadly representative of the southern Colorado Plateau. Most samples (1910–2015) were previously archived herbarium specimens, with additional samples harvested from field experiments in 2015–2021. We used this novel time series to develop a smoothed local record, with uncertainties, for “bomb spike” ¹⁴C dating of recent terrestrial organic matter. Our results highlight the potential importance of local records, as we document a delayed arrival of the 1963–1964 bomb spike peak, lower values in the 1980s, and elevated values in the last decade relative to the most current Northern Hemisphere Zone 2 record. It is impossible to retroactively collect atmospheric samples, but archived annual plants serve as faithful scribes: samples from herbaria around the Earth may be an under-utilized resource for improving understanding of the modern carbon cycle.
To monitor for drug-related cardiac arrhythmias, psychiatrists regularly perform and interpret 12-lead (12L) and, increasingly often, six-lead (6L) electrocardiograms (ECGs). It is not known how training on this complex skill is updated or how well psychiatrists can interpret relevant arrhythmias on either device.
We conducted an online survey and ECG interpretation test of cardiac rhythms relevant to psychiatrists.
A total of 183 prescribers took part; 75% did not regularly update their ECG interpretation skills, and only 22% felt confident in interpreting ECGs. Most participants were able to recognise normal ECGs. For both 6L and 12L ECGs, the majority of participants were able to recognise abnormal ECGs, but fewer than 50% were able to correctly identify relevant arrhythmias (complete heart block and long QTc). A small number prescribed in the presence of potentially fatal arrhythmias. These findings suggest a need for mandatory ECG interpretation training to improve safe prescribing practice.
The impact of the coronavirus disease 2019 (COVID-19) pandemic on mental health is still being unravelled. It is important to identify which individuals are at greatest risk of worsening symptoms. This study aimed to examine changes in depression, anxiety and post-traumatic stress disorder (PTSD) symptoms using prospective and retrospective assessments of symptom change, and to identify key risk factors and examine their effects.
Method
Online questionnaires were administered to 34 465 individuals (aged 16 years or above) in April/May 2020 in the UK, recruited from existing cohorts or via social media. Around one-third (n = 12 718) of included participants had prior diagnoses of depression or anxiety and had completed pre-pandemic mental health assessments (between September 2018 and February 2020), allowing prospective investigation of symptom change.
Results
Prospective symptom analyses showed small decreases in depression (PHQ-9: −0.43 points) and anxiety [Generalised Anxiety Disorder scale – 7 items (GAD-7): −0.33 points] and a small increase in PTSD symptoms (PCL-6: +0.22 points). Conversely, retrospective symptom analyses demonstrated significant large increases (PHQ-9: +2.40 points; GAD-7: +1.97 points), with 55% reporting worsening mental health since the beginning of the pandemic on a global change rating. Across both prospective and retrospective measures of symptom change, worsening depression, anxiety and PTSD symptoms were associated with prior mental health diagnoses, female gender, young age and unemployed/student status.
Conclusions
We highlight the effect of prior mental health diagnoses on worsening mental health during the pandemic and confirm previously reported sociodemographic risk factors. Discrepancies between prospective and retrospective measures of changes in mental health may be related to recall bias-related underestimation of prior symptom severity.
Supplemental food from anthropogenic sources is a source of conflict with humans for many wildlife species. Food-seeking behaviours by black bears Ursus americanus and brown bears Ursus arctos can lead to property damage, human injury and mortality of the offending bears. Such conflicts are a well-known conservation management issue wherever people live in bear habitats. In contrast, the use of anthropogenic foods by the polar bear Ursus maritimus is less common historically but is a growing conservation and management issue across the Arctic. Here we present six case studies that illustrate how negative food-related interactions between humans and polar bears can become either chronic or ephemeral and unpredictable. Our examination suggests that attractants are an increasing problem, exacerbated by climate change-driven sea-ice losses that cause increased use of terrestrial habitats by bears. Growing human populations and increased human visitation increase the likelihood of human–polar bear conflict. Efforts to reduce food conditioning in polar bears include attractant management, proactive planning and adequate resources for northern communities to reduce conflicts and improve human safety. Permanent removal of unsecured sources of nutrition, to reduce food conditioning, should begin immediately at the local level as this will help to reduce polar bear mortality.
Primary care providers (PCPs) are expected to help patients with obesity to lose weight through behavior change counseling and patient-centered use of available weight management resources. Yet, many PCPs face knowledge gaps and clinical time constraints that hinder their ability to successfully support patients’ weight loss. Fortunately, a small and growing number of physicians are now certified in obesity medicine through the American Board of Obesity Medicine (ABOM) and can provide personalized and effective obesity treatment to individual patients. Little is known, however, about how to extend the expertise of ABOM-certified physicians to support PCPs and their many patients with obesity.
Aim:
To develop and pilot test an innovative care model – the Weight Navigation Program (WNP) – to integrate ABOM-certified physicians into primary care settings and to enhance the delivery of personalized, effective obesity care.
Methods:
Quality improvement program with an embedded, 12-month, single-arm pilot study. Patients with obesity and ≥1 weight-related co-morbidity may be referred to the WNP by PCPs. All patients seen within the WNP during the first 12 months of clinical operations will be compared to a matched cohort of patients from another primary care site. We will recruit a subset of WNP patients (n = 30) to participate in a remote weight monitoring pilot program, which will include surveys at 0, 6, and 12 months, qualitative interviews at 0 and 6 months, and use of an electronic health record (EHR)-based text messaging program for remote weight monitoring.
Discussion:
Obesity is a complex chronic condition that requires evidence-based, personalized, and longitudinal care. To deliver such care in general practice, the WNP leverages the expertise of ABOM-certified physicians, health system and community weight management resources, and EHR-based population health management tools. The WNP is an innovative model with the potential to be implemented, scaled, and sustained in diverse primary care settings.
Poor mental health is a state of psychological distress that is influenced by lifestyle factors such as sleep, diet, and physical activity. Compulsivity is a transdiagnostic phenotype cutting across a range of mental illnesses, including obsessive–compulsive disorder and substance-related and addictive disorders, and is also influenced by lifestyle. Yet how lifestyle relates to compulsivity is presently unknown, and understanding this relationship is important for gaining insight into individual differences in mental health. We assessed (a) the relationships between compulsivity and diet quality, sleep quality, and physical activity, and (b) whether psychological distress statistically contributes to these relationships.
Methods
We collected harmonized data on compulsivity, psychological distress, and lifestyle from two independent samples (Australian n = 880 and US n = 829). We used mediation analyses to investigate bidirectional relationships between compulsivity and lifestyle factors, and the role of psychological distress.
Results
Higher compulsivity was significantly related to poorer diet and sleep. Psychological distress statistically mediated the relationship between poorer sleep quality and higher compulsivity, and partially statistically mediated the relationship between poorer diet and higher compulsivity.
Conclusions
Lifestyle interventions in compulsivity may target psychological distress in the first instance, followed by sleep and diet quality. As psychological distress links aspects of lifestyle and compulsivity, focusing on mitigating and managing distress may offer a useful therapeutic approach to improve physical and mental health. Future research may focus on the specific sleep and diet patterns which may alter compulsivity over time to inform lifestyle targets for prevention and treatment of functionally impairing compulsive behaviors.
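The mediation analyses described in the Methods can be illustrated with a bare-bones product-of-coefficients sketch (the study's actual models were more elaborate; the function and variable names here are illustrative): the indirect effect of a predictor x on an outcome y through a mediator m is the product of the x→m path (a) and the m→y path controlling for x (b):

```python
import numpy as np

def mediation_effects(x, m, y):
    """Simple product-of-coefficients mediation sketch.
    a  = OLS slope of mediator m on predictor x;
    b  = effect of m on outcome y, controlling for x;
    c' = direct effect of x on y, controlling for m."""
    a = np.polyfit(x, m, 1)[0]
    X = np.column_stack([np.ones_like(x), x, m])
    cprime, b = np.linalg.lstsq(X, y, rcond=None)[0][1:]
    return {"indirect": a * b, "direct": cprime}

# Synthetic check: build data where the true indirect effect is 2 * 3 = 6.
rng = np.random.default_rng(1)
x = rng.normal(size=500)
m = 2.0 * x + 0.1 * rng.normal(size=500)
y = 3.0 * m + 0.5 * x
effects = mediation_effects(x, m, y)
```

Inference on the indirect effect would normally use bootstrapped confidence intervals rather than the point estimate alone.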
The COVID-19 pandemic has shone a spotlight on how health outcomes are unequally distributed among different population groups, with disadvantaged communities and individuals being disproportionality affected in terms of infection, morbidity and mortality, as well as vaccine access. Recently, there has been considerable debate about how social disadvantage and inequality intersect with developmental processes to result in a heightened susceptibility to environmental stressors, economic shocks and large-scale health emergencies. We argue that DOHaD Society members can make important contributions to addressing issues of inequality and improving community resilience in response to COVID-19. In order to do so, it is beneficial to engage with and adopt a social justice framework. We detail how DOHaD can align its research and policy recommendations with a social justice perspective to ensure that we contribute to improving the health of present and future generations in an equitable and socially just way.
Exotic conifers are rapidly spreading in many regions of New Zealand, as well as in many other countries, with detrimental impacts on both natural ecosystems and some productive sector environments. Herbicides, in particular the active ingredient triclopyr, are an important tool to manage invasive conifers, yet there is a paucity of information that quantifies the amount of herbicide required to kill trees of different sizes when applied as a basal bark treatment. Two sequential experiments were conducted to define the amount of triclopyr required to kill individual invasive lodgepole pine (Pinus contorta Douglas ex Loudon) trees of different sizes when applied in a methylated seed oil to bark (either the whole stem or base of the tree) and to determine which tree size variates (height, diameter at breast height [DBH], crown diameter [CD]) or derived attributes (crown area, crown volume index) best characterized this dose–response relationship. The outcomes of the dose–response research were compared with field operations where triclopyr was applied to the bark of trees from an aerial platform. Applying the herbicide to the whole stem, as opposed to the base of the tree only, significantly increased treatment efficacy. The tree size variates DBH, CD, crown area, and crown volume index all provided good fits to the tree mortality data, with >91% prediction accuracy. Of these variates, CD provided the most practical measure of tree size for ease of in-field calculation of dose by an operator. Herbicide rates used in field operations were seven to eight times higher than lethal doses calculated from experimental data. Our results highlight the potential for substantial reductions in herbicide rates for exotic conifer control, especially if dose–response data are combined with remotely sensed quantitative measurements of canopy area or volume using new precision technologies such as unmanned aerial vehicles.
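The dose–response relationship underlying lethal-dose calculations of this kind can be sketched with a logistic mortality curve (the coefficients and functional form below are illustrative assumptions, not the fitted values from the study): once the curve is fitted, it is inverted to recover the dose giving a target kill probability such as LD95:

```python
import numpy as np

def mortality_prob(dose, b0, b1):
    """Logistic dose-response: probability of tree mortality
    as a function of log herbicide dose."""
    return 1.0 / (1.0 + np.exp(-(b0 + b1 * np.log(dose))))

def lethal_dose(p, b0, b1):
    """Invert the logistic curve: the dose giving kill probability p
    (e.g. p = 0.95 for LD95)."""
    return np.exp((np.log(p / (1.0 - p)) - b0) / b1)

# Illustrative coefficients (assumed, not from the paper).
b0, b1 = -5.0, 2.0
ld95 = lethal_dose(0.95, b0, b1)
```

In practice dose is scaled by a tree-size variate (e.g. divided by crown diameter), so the same curve yields a per-tree lethal dose that an operator can compute in the field.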
To develop a fully automated algorithm using data from the Veterans’ Affairs (VA) electronic medical record (EMR) to identify deep-incisional surgical site infections (SSIs) after cardiac surgeries and total joint arthroplasties (TJAs) to be used for research studies.
Design:
Retrospective cohort study.
Setting:
This study was conducted in 11 VA hospitals.
Participants:
Patients who underwent coronary artery bypass grafting or valve replacement between January 1, 2010, and March 31, 2018 (cardiac cohort) and patients who underwent total hip arthroplasty or total knee arthroplasty between January 1, 2007, and March 31, 2018 (TJA cohort).
Methods:
Relevant clinical information and administrative code data were extracted from the EMR. The outcomes of interest were mediastinitis, endocarditis, or deep-incisional or organ-space SSI within 30 days after surgery. Multiple logistic regression analysis with a repeated regular bootstrap procedure was used to select variables and to assign points in the models. Sensitivities, specificities, positive predictive values (PPVs) and negative predictive values were calculated with comparison to outcomes collected by the Veterans’ Affairs Surgical Quality Improvement Program (VASQIP).
Results:
Overall, 49 (0.5%) of the 13,341 cardiac surgeries were classified as mediastinitis or endocarditis, and 83 (0.6%) of the 12,992 TJAs were classified as deep-incisional or organ-space SSIs. With at least 60% sensitivity, the PPVs of the SSI detection algorithms after cardiac surgeries and TJAs were 52.5% and 62.0%, respectively.
Conclusions:
Considering the low prevalence rate of SSIs, our algorithms were successful in identifying a majority of patients with a true SSI while simultaneously reducing false-positive cases. As a next step, validation of these algorithms in different hospital systems with EMR will be needed.
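The repeated-bootstrap variable-selection step described in the Methods can be outlined as follows. This is a generic sketch (the function names and the idea of thresholding on selection frequency are illustrative, not the study's exact procedure): the selection routine is refitted on many bootstrap resamples and each predictor's retention frequency is recorded:

```python
import numpy as np

def bootstrap_selection_frequency(X, y, fit_select, n_boot=200, seed=0):
    """Refit a variable-selection procedure on bootstrap resamples and
    return, for each predictor, the fraction of resamples in which it
    was retained. `fit_select(X, y)` must return a 0/1 mask of length p."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    counts = np.zeros(p)
    for _ in range(n_boot):
        idx = rng.integers(0, n, size=n)      # resample rows with replacement
        counts += fit_select(X[idx], y[idx])  # accumulate retained predictors
    return counts / n_boot

# Illustrative use with a dummy selector that always keeps columns 0 and 2.
X_demo = np.zeros((10, 3))
y_demo = np.zeros(10)
freq = bootstrap_selection_frequency(
    X_demo, y_demo, lambda X, y: np.array([1, 0, 1]), n_boot=50)
```

Predictors retained in a high fraction of resamples would then enter the final logistic model, which is scored against chart-reviewed VASQIP outcomes to obtain the reported sensitivities and PPVs.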
This essay seeks to get beyond the narrow debate between two candidate grounds for indexing advantage in accounts of justice: the Rawlsian primary goods of income and wealth and capability or capabilities. Rawls is more deeply committed to multidimensionality than this debate has tended to recognize. Commitment to multidimensionality is shallow if each of the multiple dimensions is seen as contributory to something sought only for its own sake that can be adequately represented along a single dimension, such as welfare or well-being as they are sometimes conceived. To avoid treating multidimensionality shallowly — whether within the domain of justice or outside it — defenders of appealing to capabilities would do well to follow Rawls in recognizing a division of moral labour among multiple principles, with the different principles serving different social values and addressing different sets of social institutions. This approach offers an attractive and flexible alternative to single-principle outcome-ranking approaches. Along the way, in reference to the older debates, it is shown that there is, for Rawls, no single currency of justice and that he has serious reasons, grounded in respect for the fact of pluralism, to avoid resting too much theoretical weight on the idea of well-being.
Media coverage of non-suicidal self-injury (NSSI) ranges from providing helpful education to displaying graphic images. We offer the first research-informed, consensus-based guidelines for the responsible reporting and depicting of NSSI in the media, while also advising on ideas for dissemination and collaboration between media professionals and healthcare experts.
Wild radish (Raphanus raphanistrum L.) is a weed found globally in agricultural systems. The facultative winter annual nature of this plant and high genetic variability makes modeling its growth and phenology difficult. In the present study, R. raphanistrum natural seedbanks exhibited a biphasic pattern of emergence, with emergence peaks occurring in both fall and spring. Traditional sigmoidal models were inadequate to fit this pattern, regardless of the predictive environmental variable, and a corresponding biphasic model (sigmoidal + Weibull) was used to describe emergence based on the best parameters. Each best-fit chronological, thermal, and hydrothermal model accounted for at least 85% of the variation of the validation data. Observations on phenology progression from four cohorts were used to create a common model that described all cohorts adequately. Different phenological stages were described using chronological, thermal, hydrothermal, daylength-dependent thermal time, and daylength-dependent hydrothermal time. Integrating daylength and temperature into the models was important for predicting reproductive stages of R. raphanistrum.
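The biphasic (sigmoidal + Weibull) emergence model described above can be sketched as a sum of two cumulative terms, one per flush. The parameter values and exact parameterisation below are illustrative assumptions, not the paper's fitted estimates; the point is only the shape of a logistic fall flush plus a delayed Weibull spring flush on a thermal-time axis:

```python
import numpy as np

def biphasic_emergence(tt, a1, b1, c1, a2, b2, c2):
    """Cumulative emergence (%) versus accumulated thermal time `tt`:
    a logistic term (first flush) plus a shifted Weibull term (second flush).
    a1, a2 = flush sizes; b1, b2 = timing; c1, c2 = spread parameters."""
    logistic = a1 / (1.0 + np.exp(-(tt - b1) / c1))
    weibull = a2 * (1.0 - np.exp(-(np.maximum(tt - b2, 0.0) / c2) ** 2))
    return logistic + weibull

# Illustrative curve: 60% fall flush centred at 200 degree-days,
# 40% spring flush starting near 900 degree-days (assumed values).
tt = np.linspace(0.0, 2000.0, 500)
emergence = biphasic_emergence(tt, 60.0, 200.0, 30.0, 40.0, 900.0, 150.0)
```

In practice the predictor `tt` would be chronological, thermal, or hydrothermal time, and the parameters would be estimated by nonlinear least squares against observed cumulative emergence.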