Adolescence is a critical developmental phase during which young people are vulnerable to mental ill-health and social exclusion (spanning domains including education and employment, housing, finances, and social support and relationships). The aims of this study were to (i) understand the relationships between social exclusion, mental health and wellbeing among young people; and (ii) identify potentially modifiable targets, or population groups that require greater or more targeted support.
Methods
Data were obtained from the Mission Australia 2022 Youth Survey, Australia’s largest annual population-wide survey of young people aged 15–19 years (n = 18,800). Participants’ experiences of social exclusion in different domains were explored (e.g., prevalence, co-occurrence and variation across demographic characteristics). Multivariable linear regression models were used to map the relationships between social exclusion domains and mental health and wellbeing, controlling for confounding factors where necessary.
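Purely as an illustration of the modelling step described above, the sketch below fits a multivariable linear regression (ordinary least squares) of a wellbeing score on an exclusion indicator plus a demographic covariate. All variable names and data are hypothetical, not drawn from the Youth Survey.

```python
# Minimal OLS sketch of the kind of model described in the Methods:
# a wellbeing score regressed on a social-exclusion indicator plus a
# demographic covariate. Data and names are illustrative only.

def ols(X, y):
    """Solve the normal equations (X'X) b = X'y by Gaussian elimination."""
    k = len(X[0])
    # Build X'X and X'y
    A = [[sum(row[i] * row[j] for row in X) for j in range(k)] for i in range(k)]
    b = [sum(row[i] * yi for row, yi in zip(X, y)) for i in range(k)]
    # Forward elimination with partial pivoting
    for col in range(k):
        piv = max(range(col, k), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, k):
            f = A[r][col] / A[col][col]
            for c in range(col, k):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    # Back-substitution
    coef = [0.0] * k
    for r in range(k - 1, -1, -1):
        coef[r] = (b[r] - sum(A[r][c] * coef[c] for c in range(r + 1, k))) / A[r][r]
    return coef

# Toy data: wellbeing = 70 - 8*excluded + 2*age_centred (exact, no noise)
rows = [(0, -1), (0, 0), (0, 1), (1, -1), (1, 0), (1, 1)]
X = [[1.0, ex, age] for ex, age in rows]   # intercept, exclusion, age
y = [70 - 8 * ex + 2 * age for ex, age in rows]
intercept, b_excl, b_age = ols(X, y)
print(round(intercept, 3), round(b_excl, 3), round(b_age, 3))  # → 70.0 -8.0 2.0
```

In a real analysis the coefficient on the exclusion indicator would be read as the adjusted difference in wellbeing associated with that exclusion domain, holding the covariates fixed.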
Results
Sixty per cent of all young people experienced social exclusion in at least one domain, and 25% in multiple domains. Young people who identified as gender diverse, Indigenous, living in a remote/rural or socio-economically disadvantaged area, or having a culturally diverse background were more likely to report social exclusion. Strong associations were seen between all domains of social exclusion and poor mental health (e.g., higher psychological distress and loneliness, reduced personal wellbeing, a reduced sense of control over one’s life and a more negative outlook on the future). Notably, difficulties in socialising and obtaining social support were critical factors linked to increased psychological distress and reduced wellbeing.
Conclusions
Findings underscore the need to address multiple domains of social exclusion concurrently, and in collaboration with youth mental healthcare. Prevention efforts aimed at early identification and intervention should be prioritised to support young people vulnerable to social exclusion. Screening approaches are needed to identify individuals and groups of young people in need of support, and to facilitate care coordination across multiple providers.
Late-life depression (LLD) is characterized by recurrent depressive episodes even with maintenance treatment. It is unclear which clinical and cognitive phenotypic characteristics present during remission predict future recurrence.
Methods:
Participants (135 with remitted LLD and 69 comparison subjects across three institutions) completed baseline phenotyping, including psychiatric, medical, and social history, psychiatric symptom and personality trait assessment, and neuropsychological testing. Participants were clinically assessed every two months for two years while receiving standard antidepressant treatment. Analyses examined group differences in phenotypic measures using general linear models. Concurrent associations between phenotypic measures and diagnostic groups were examined using LASSO logistic regression.
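As a hedged sketch of the variable-selection idea behind the LASSO logistic regression mentioned above (not the study's actual implementation), the following fits an L1-penalised logistic model by proximal gradient descent on toy data; the penalty drives the weight of an uninformative feature exactly to zero, which is how such models retain only a small set of informative phenotypic measures.

```python
import math

def lasso_logistic(X, y, lam=0.1, lr=0.1, iters=2000):
    """L1-penalised logistic regression fitted by proximal gradient descent
    (ISTA). Returns weights; the intercept (index 0) is left unpenalised."""
    n, k = len(X), len(X[0])
    w = [0.0] * k
    for _ in range(iters):
        # Gradient of the mean logistic loss
        grad = [0.0] * k
        for xi, yi in zip(X, y):
            z = sum(wj * xj for wj, xj in zip(w, xi))
            p = 1.0 / (1.0 + math.exp(-z))
            for j in range(k):
                grad[j] += (p - yi) * xi[j] / n
        # Gradient step, then soft-threshold all but the intercept
        for j in range(k):
            w[j] -= lr * grad[j]
            if j > 0:
                w[j] = math.copysign(max(abs(w[j]) - lr * lam, 0.0), w[j])
    return w

# Toy data: the outcome depends on feature 1 only; feature 2 is noise,
# so the L1 penalty should shrink its weight to exactly zero.
X = [[1, x1, x2] for x1, x2 in
     [(-2, 1), (-1, -1), (-1, 1), (1, -1), (1, 1), (2, -1)]]
y = [0, 0, 0, 1, 1, 1]
w = lasso_logistic(X, y, lam=0.3)
print(w[2] == 0.0)  # → True: the noise feature is selected out
```

The surviving nonzero coefficients play the role of the predictors the abstract reports as "informing" group classification.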
Results:
Sixty (44%) LLD participants experienced a relapse over the two-year period. Numerous phenotypic measures across all domains differed between remitted LLD and comparison participants. Only residual depressive symptom severity, rumination, medical comorbidity, and executive dysfunction significantly predicted LLD classification. Fewer measures differed between relapsing and sustained remission LLD subgroups, with the relapsing group exhibiting greater antidepressant treatment intensity, greater fatigue, rumination, and disability, higher systolic blood pressure, greater life stress and lower instrumental social support. Relapsing group classification was informed by antidepressant treatment intensity, lower instrumental social support, and greater life stress.
Conclusions:
A wide range of phenotypic factors differed between remitted LLD and comparison groups. Fewer measures differed between relapsing and sustained remission LLD subgroups, with less social support and greater stress informing vulnerability to subsequent relapse. This research suggests potential targets for relapse prevention and emphasizes the need for clinically translatable relapse biomarkers to inform care.
Alternative strategies to fumigation are needed to manage weeds and improve strawberry fruit yield in annual hill plasticulture production systems. Field experiments were conducted in Blackstone, VA, for two consecutive growing seasons, 2013/14 and 2014/15, to assess the efficacy of 4-wk and 8-wk soil solarization (SS) and application of mustard seed meal (MSM) at 1,121 kg ha−1, alone and in combination, for weed control and crop yield in this production system. These treatments were compared to the use of 1,3-dichloropropene (1,3-D) + chloropicrin (Pic) as a fumigation standard at 188 kg ha−1 and an untreated control (UTC). Over both growing seasons, compared to 1,3-D + Pic, the SS-MSM-8wk and SS-8wk treatments provided equivalent or reduced cumulative weed counts, including counts of several dominant weed species such as annual ryegrass, speedwell, common chickweed, and cudweed. The SS-4wk and MSM-4wk treatments did not affect weed density compared with the UTC. The MSM-8wk and MSM-4wk treatments reduced cumulative weed counts relative to the UTC. In the second growing season, the total yield was significantly higher after the 1,3-D + Pic fumigation treatment than after the other treatments. The SS-4wk, MSM-4wk, and MSM-8wk treatments did not improve the total or marketable yield compared with the UTC. The marketable yield after the SS-MSM-8wk treatment was similar to that after the 1,3-D + Pic treatment. In conclusion, the SS-8wk and SS-MSM-8wk treatments may be effective weed management strategies for organic growers, small farms, or growers who cannot use chemical fumigants due to new regulations and potential risks to human health.
The term ‘natural theology’ provokes a variety of reactions, spanning from whole-hearted endorsement to passionate rejection. Charged as it is with polemical and pejorative undertones, the debate calls for intervention. If the scholarly community is to engage constructively with the concept and practice of natural theology – whether by way of acceptance, rejection, or something in between – clarity in its definition and identification is imperative. The aim of this paper is to shed light on three of the most common definitions of ‘natural theology’ in contemporary scholarship, to clarify the ways in which they differ, and to propose some conceptual refinements in the hope that, if adopted, more fruitful discourse may take place around this much-debated and interdisciplinary phrase.
Effectively addressing climate change requires new approaches to action, implementation and social change. Urban societies are profoundly shaped by faith, with religion influencing the physical environment, institutional structures and lives of citizens. Consequently, there is a need to consider seriously religion's role in mobilizing or constraining climate action in cities. Research is presented that shows the potential of faith-based organizations and faith perspectives to minimize and adapt to climate impacts. A framework for sensitively engaging faith communities in urban climate policy is developed, based on the power of shared values among diverse stakeholder groups to mobilize climate action through partnerships.
Technical summary
Global environmental research and policy frameworks have begun to emphasize the importance of culture and multi-sector partnerships for urban sustainability governance. However, there has been little explicit attention paid to religion and belief as ubiquitous urban socio-cultural phenomena. This article reviews literature on the intersection of religion and climate change in the context of cities. Religious responses to climate change are presented as a typology spanning physicalities, practices, ‘prophetic’ imagination and policy arenas. Key themes are then intersected with areas of focal activity presented in the most recent IPCC reports. Religion is shown to offer both opportunities and barriers for effective urban climate adaptation and mitigation. A new model of religious-civic partnership is then developed as a framework for guiding urban climate policy implementation. This model presents religion as vital to shaping the ‘value landscape’ of cities and calls for collaborative action based on identifying, enriching and mobilizing shared values. As cities become increasingly populous, heterogeneous, globally teleconnected and exposed to climate impacts, there is an urgent need for research and policy that effectively engages with the historic and evolving presence and impact of religion within urban environments.
Social media summary
Effective action on climate change in cities requires new modes of engagement with religious perspectives, grounded in shared values.
The host-protective immune response to infection with gastrointestinal (GI) nematodes involves a range of interacting processes that begin with recognition of the parasite's antigens and culminate in an inflammatory reaction in the intestinal mucosa. Precisely which immune effectors are responsible for the loss of specific worms is still not known although many candidate effectors have been proposed. However, it is now clear that many different genes regulate the response and that differences between hosts (fast or strong versus slow or weak responses) can be explained by allelic variation in crucial genes associated with the gene cascade that accompanies the immune response and/or genes encoding constitutively expressed receptor/signalling molecules. Major histocompatibility complex (MHC) genes have been recognized for some time as decisive in controlling immunity, and evidence that non-MHC genes are equally, if not more important in this respect has also been available for two decades. Nevertheless, whilst the former have been mapped in mice, only two candidate loci have been proposed for non-MHC genes and relatively little is known about their roles. Now, with the availability of microsatellite markers, it is possible to exploit linkage mapping techniques to identify quantitative trait loci (QTL) responsible for resistance to GI nematodes. Four QTL for resistance to Heligmosomoides polygyrus, and additional QTL affecting faecal egg production by the worms and the accompanying immune responses, have been identified. Fine mapping and eventually the identification of the genes (and their alleles) underlying QTL for resistance/susceptibility will permit informed searches for homologues in domestic animals, and human beings, through comparative genomic maps. This information in turn will facilitate targeted breeding to improve resistance in domestic animals and, in human beings, focused application of treatment and control strategies for GI nematodes.
Thin section, XRD, SEM, and isotopic techniques have been used to study authigenic kaolinite occurring in reservoir sandstones of the Lower Permian Aldebaran Sandstone. Where the unit is no longer an active aquifer, kaolinite is an intermediate-stage phase, and is highly depleted in deuterium (δDSMOW = −115 to −99‰) and 18O (δ18OSMOW = +7.8 to +8.9‰), indicating that precipitation must have been from meteoric water. Deep penetration of this water is linked to Late Triassic deformation and uplift of the Denison Trough sequence, an event which led to exposure of the Aldebaran Sandstone by the Early Jurassic prior to its re-burial beneath Jurassic and Cretaceous sedimentary rocks. The same water was probably involved in the creation of secondary porosity in the interval.
Where the Aldebaran Sandstone is presently undergoing meteoric flushing, kaolinite is relatively enriched in deuterium (δDSMOW = −104 to −93‰) and 18O (δ18OSMOW = +11.7 to +14.6‰), reflecting precipitation largely from post-Mesozoic meteoric water which was isotopically heavier than the Mesozoic water involved in intermediate-stage kaolinite precipitation. This temporal shift in meteoric water isotopic composition is related to the northward drift of the Australian continent to lower latitudes since the Mesozoic Era.
We collected infant food samples from 714 households in Kisumu, Kenya, and estimated the prevalence and concentration of Enterococcus, an indicator of food hygiene conditions. In a subset of 212 households, we quantified the change in concentration in stored food between a morning and afternoon feeding time. In addition, household socioeconomic characteristics and hygiene practices of the caregivers were documented. The prevalence of Enterococcus in infant foods was 50% (95% confidence interval: 46.1–53.4), and the mean log10 colony-forming units (CFU) was 1.1 (SD = 1.4). No risk factors were significantly associated with the prevalence and concentration of Enterococcus in infant foods. The mean log10 CFU of Enterococcus was 0.47 in morning foods and 0.73 in afternoon foods, with a 0.64 log10 mean increase in matched samples during storage. Although no factors were statistically associated with the prevalence and concentration of Enterococcus in infant foods, household flooring type was significantly associated with an increase in concentration during storage, with finished floors associated with 1.5 times higher odds of a concentration increase compared to unfinished floors. Our study revealed a high prevalence but low concentration of Enterococcus in infant food in low-income Kisumu households, although concentrations increased during storage, implying a potential increase in the risk of exposure to foodborne pathogens over the day. Further studies aimed at investigating contamination of infant foods with pathogenic organisms and identifying effective mitigation measures are required to ensure infant food safety.
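To make the log10 arithmetic above concrete: a mean increase of d log10 units in matched samples corresponds to a 10**d-fold rise in concentration, so the reported 0.64 log10 increase implies roughly a 4.4-fold rise in Enterococcus CFU during storage. The CFU values in the sketch below are illustrative, not study data.

```python
import math

def log10_change(cfu_before, cfu_after):
    """Change in log10 CFU between two matched samples."""
    return math.log10(cfu_after) - math.log10(cfu_before)

# e.g. a hypothetical 200 CFU in the morning rising to 873 CFU by afternoon
d = log10_change(200, 873)
print(round(d, 2))        # → 0.64 (log10 units)
print(round(10 ** d, 1))  # → 4.4 (fold increase in concentration)
```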
The identification of Assyrian personal names in Babylonian sources poses a challenge because there is a considerable degree of overlap between the name repertoires of Babylonia and Assyria in the first millennium BCE. As a first step, this chapter identifies three relevant categories of names attested in Babylonian sources: distinctively Assyrian names, distinctively Babylonian names, and names common to both Assyria and Babylonia. The next step is to isolate names belonging to the first category: those that are distinctively Assyrian. To this end, the chapter identifies four diagnostic features which may occur separately or in combination: (i) Assyrian divine elements; (ii) Assyrian toponyms; (iii) Assyrian dialectal forms, and (iv) vocabulary particular to the Neo-Assyrian onomasticon. Orthography and phonology, including the treatment of sibilants, are further considerations. The chapter also addresses the historical background since this provides important context for investigating the presence of Assyrian name-bearers in Babylonia, both before and after the fall of Assyria in 612 BCE.
The nano-aluminosilicate mineral allophane is common in soils formed from parent materials containing volcanic ash and often contains Fe. Due to its lack of long-range order, the structure of allophane is still not completely understood. In the present study, Fe K-edge X-ray absorption fine structure (XAFS) was used to examine Fe-containing natural and synthetic allophane and imogolite samples. Results indicated that Fe substitutes for octahedrally coordinated Al in allophane, and that Fe exhibits a clustered distribution within the octahedral sheet. Iron adsorbed on allophane surfaces is characterized by spectral features distinct from those of isomorphically substituted Fe and of ferrihydrite. Fe adsorbed on the allophane surfaces probably exists as small polynuclear complexes exhibiting Fe-Fe edge sharing, similar to poorly crystalline Fe oxyhydroxides. The XAFS spectra of natural allophane and imogolite indicate that the Fe in the minerals is a combination of isomorphically substituted and surface-adsorbed Fe. In the synthetic Fe-substituted allophanes, the Fe XAFS spectra did not vary with the Al:Si ratio. Theoretical fits of the extended XAFS (EXAFS) spectra suggest that local atomic structure around octahedral Fe in allophanes is more similar to Fe in a smectite-like structure than to a published theoretical nanoball structure.
The Eighth World Congress of Pediatric Cardiology and Cardiac Surgery (WCPCCS) will be held in Washington DC, USA, from Saturday, 26 August, 2023 to Friday, 1 September, 2023, inclusive. The Eighth World Congress of Pediatric Cardiology and Cardiac Surgery will be the largest and most comprehensive scientific meeting dedicated to paediatric and congenital cardiac care ever held. At the time of the writing of this manuscript, the Eighth World Congress of Pediatric Cardiology and Cardiac Surgery has 5,037 registered attendees (and rising) from 117 countries, a truly diverse and international faculty of over 925 individuals from 89 countries, over 2,000 individual abstracts and poster presenters from 101 countries, and a Best Abstract Competition featuring 153 oral abstracts from 34 countries. For information about the Eighth World Congress of Pediatric Cardiology and Cardiac Surgery, please visit the following website: www.WCPCCS2023.org. The purpose of this manuscript is to review the activities related to global health and advocacy that will occur at the Eighth World Congress of Pediatric Cardiology and Cardiac Surgery.
Acknowledging the need for urgent change, we wanted to take the opportunity to bring a common voice to the global community and issue the Washington DC WCPCCS Call to Action on Addressing the Global Burden of Pediatric and Congenital Heart Diseases. A copy of this Washington DC WCPCCS Call to Action is provided in the Appendix of this manuscript. This Washington DC WCPCCS Call to Action is an initiative aimed at increasing awareness of the global burden, promoting the development of sustainable care systems, and improving access to high quality and equitable healthcare for children with heart disease as well as adults with congenital heart disease worldwide.
Background: The Canadian Registry for Amyloidosis Research (CRAR) is a nationwide disease registry of transthyretin (ATTR) and light-chain (AL) amyloidosis. Recent advances in disease-modifying therapy have improved prognosis, however there is a critical need for real-world evidence to address knowledge gaps, particularly longer-term therapeutic outcomes and surveillance strategies. Methods: A multi-stakeholder process was undertaken to develop a consensus dataset for ATTR- and AL-amyloidosis. This process included surveys to rank the importance of potential data items, and a consensus meeting of the CRAR steering committee, (comprised of multidisciplinary clinical experts, and patient organization representatives). Patients and patient organizations supported the development and implementation of a patient-reported dataset. Results: Consensus data items include disease onset, progression, severity, treatments, and outcomes, as well as patient-reported outcomes. Both prospective and retrospective (including deceased) patient cohorts are included. Further baseline data will be presented on an initial cohort of patients. Conclusions: CRAR has been established to collect a longitudinal, multidisciplinary dataset that will evaluate amyloidosis care and outcomes. CRAR has launched at multiple specialty amyloidosis centers nationally and is continually expanding. The growth of this program will promote opportunities to assess real-world safety and efficacy and inform the cost-effectiveness of therapies while supporting patient recruitment for research.
Good judgment is often gauged against two gold standards – coherence and correspondence. Judgments are coherent if they demonstrate consistency with the axioms of probability theory or propositional logic. Judgments are correspondent if they agree with ground truth. When gold standards are unavailable, silver standards such as consistency and discrimination can be used to evaluate judgment quality. Individuals are consistent if they assign similar judgments to comparable stimuli, and they discriminate if they assign different judgments to dissimilar stimuli. We ask whether “superforecasters”, individuals with noteworthy correspondence skills (see Mellers et al., 2014), show superior performance on laboratory tasks assessing other standards of good judgment. Results showed that superforecasters either tied or outperformed less correspondent forecasters and undergraduates with no forecasting experience on tests of consistency, discrimination, and coherence. While multifaceted, good judgment may be a more unified concept than previously thought.
People who possess greater mathematical skills (i.e., numeracy) are generally more accurate in interpreting numerical data than less numerate people. However, recent evidence has suggested that more numerate people may use their numerical skills to interpret data only if their initial interpretation conflicts with their worldview. That is, if an initial, intuitive (but incorrect) interpretation of data appears to disconfirm one’s beliefs, then numerical skills are used to further process the data and reach the correct interpretation, whereas numerical skills are not used in situations where an initial incorrect interpretation of the data appears to confirm one’s beliefs (i.e., motivated numeracy). In the present study, participants were presented with several data problems, some with correct answers confirming their political views and others disconfirming their views. The difficulty of these problems was manipulated to examine how numeracy would influence the rate of correct responses on easier vs. more difficult problems. Results indicated that participants were more likely to answer problems correctly if the correct answer confirmed rather than disconfirmed their political views, and this response pattern did not depend on problem difficulty or numerical skill. Although more numerate participants were more accurate overall, this was true both for problems in which the correct answer confirmed and disconfirmed participants’ political views.
Norway rats (Rattus norvegicus) are considered one of the most significant vertebrate pests globally, because of their impacts on human and animal health. There are legal and moral obligations to minimise the impacts of wildlife management on animal welfare, yet there are few data on the relative welfare impacts of rat trapping and baiting methods used in the UK with which to inform management decisions. Two stakeholder workshops were facilitated to assess the relative welfare impacts of six lethal rat management methods using a welfare assessment model. Fifteen stakeholders including experts in wildlife management, rodent management, rodent biology, animal welfare science, and veterinary science and medicine, participated. The greatest welfare impacts were associated with three baiting methods, anticoagulants, cholecalciferol and non-toxic cellulose baits (severe to extreme impact for days), and with capture on a glue trap (extreme for hours) with concussive killing (mild to moderate for seconds to minutes); these methods should be considered last resorts from a welfare perspective. Lower impacts were associated with cage trapping (moderate to severe for hours) with concussive killing (moderate for minutes). The impact of snap trapping was highly variable (no impact to extreme for seconds to minutes). Snap traps should be regulated and tested to identify those that cause rapid unconsciousness; such traps might represent the most welfare-friendly option assessed for killing rats. Our results can be used to integrate consideration of rat welfare alongside other factors, including cost, efficacy, safety, non-target animal welfare and public acceptability when selecting management methods. We also highlight ways of reducing welfare impacts and areas where more data are needed.
Make-at-home nasal irrigation solutions are often recommended for treating chronic rhinosinusitis. Many patients will store pre-made solution for convenient use. This study investigated the microbiological properties of differing recipes and storage temperatures.
Method
Three irrigation recipes (containing sodium chloride, sodium bicarbonate and sucrose) were stored at 5 °C and 22 °C. Further samples were inoculated with Staphylococcus aureus and Pseudomonas aeruginosa. Sampling and culturing were conducted at intervals from day 0 to day 12 to examine for bacterial presence or persistence.
Results
No significant bacterial growth was detected in any control solution stored at 5 °C. Saline solutions remained relatively bacteria-free, with poor survival of inoculated bacteria, which may be related to either lower pH or lower osmolality. Storage at room temperature increased the risk of contamination in control samples, particularly from Pseudomonas.
Conclusion
If refrigerated, pre-made nasal irrigation solutions can be stored safely for up to 12 days without risking cross-contamination to irrigation equipment or patients.
Data from neurocognitive assessments may not be accurate in the context of factors impacting validity, such as disengagement, unmotivated responding, or intentional underperformance. Performance validity tests (PVTs) were developed to address these phenomena and assess underperformance on neurocognitive tests. However, PVTs can be burdensome, rely on cutoff scores that reduce information, do not examine potential variations in task engagement across a battery, and are typically not well-suited to acquisition of large cognitive datasets. Here we describe the development of novel performance validity measures that could address some of these limitations by leveraging psychometric concepts using data embedded within the Penn Computerized Neurocognitive Battery (PennCNB).
Methods:
We first developed these validity measures using simulations of invalid response patterns with parameters drawn from real data. Next, we examined their application in two large, independent samples: 1) children and adolescents from the Philadelphia Neurodevelopmental Cohort (n = 9498); and 2) adult servicemembers from the Marine Resiliency Study-II (n = 1444).
Results:
Our performance validity metrics detected patterns of invalid responding in simulated data, even at subtle levels. Furthermore, a combination of these metrics significantly predicted previously established validity rules for these tests in both developmental and adult datasets. Moreover, most clinical diagnostic groups did not show reduced validity estimates.
Conclusions:
These results provide proof-of-concept evidence for multivariate, data-driven performance validity metrics. These metrics offer a novel method for determining the performance validity for individual neurocognitive tests that is scalable, applicable across different tests, less burdensome, and dimensional. However, more research is needed into their application.
Objective:
To determine the proportion of hospitals that implemented 6 leading practices in their antimicrobial stewardship programs (ASPs).
Design:
Cross-sectional observational survey.
Setting:
Acute-care hospitals.
Participants:
ASP leaders.
Methods:
Advance letters and electronic questionnaires were initiated February 2020. Primary outcomes were percentage of hospitals that (1) implemented facility-specific treatment guidelines (FSTG); (2) performed interactive prospective audit and feedback (PAF) either face-to-face or by telephone; (3) optimized diagnostic testing; (4) measured antibiotic utilization; (5) measured C. difficile infection (CDI); and (6) measured adherence to FSTGs.
Results:
Of 948 hospitals invited, 288 (30.4%) completed the questionnaire. Among them, 82 (28.5%) had ≤99 beds, 162 (56.3%) had 100–399 beds, and 44 (15.2%) had ≥400 beds. Also, 230 (79.9%) were healthcare system members. Moreover, 161 hospitals (54.8%) reported implementing FSTGs; 214 (72.4%) performed interactive PAF; 105 (34.9%) implemented procedures to optimize diagnostic testing; 235 (79.8%) measured antibiotic utilization; 258 (88.2%) measured CDI; and 110 (37.1%) measured FSTG adherence. Small hospitals performed less interactive PAF (61.0%; P = .0018). Small and nonsystem hospitals were less likely to optimize diagnostic testing: 25.2% (P = .030) and 21.0% (P = .0077), respectively. Small hospitals were less likely to measure antibiotic utilization (67.8%; P = .0010) and CDI (80.3%; P = .0038). Nonsystem hospitals were less likely to implement FSTGs (34.3%; P < .001).
Conclusions:
Significant variation exists in the adoption of ASP leading practices. A minority of hospitals have taken action to optimize diagnostic testing and measure adherence to FSTGs. Additional efforts are needed to expand adoption of leading practices across all acute-care hospitals with the greatest need in smaller hospitals.