Compared to our appreciation of the epistemology of the sciences and mathematics, we have a relatively poor understanding of the epistemology of logic. This Element highlights three causes of this lack of progress: (i) failure to distinguish between the epistemology of logical theorising and that of good (logical) reasoning; (ii) hesitancy to base the epistemology of logic on how logicians actually justify their logics, rather than on our own presumptions about logic; and (iii) a presumption that the epistemology of logic must be significantly different from that of other research areas, such as the recognised sciences. The Element ends by highlighting what can be achieved by avoiding these pitfalls, presenting an account of theory-choice in logic, logical predictivism, motivated by actual logical practice, which suggests that the mechanisms of theory-choice in logic are not that different from those in the recognised sciences. This title is also available as Open Access on Cambridge Core.
Targeting the glutamatergic system is posited as a potentially novel therapeutic strategy for psychotic disorders. While studies in patient populations indicate that antipsychotic medication reduces brain glutamatergic measures, they were unable to disentangle clinical changes from drug effects.
Aims
To address this, we investigated the effects of a dopamine D2 receptor partial agonist (aripiprazole) and a dopamine D2 receptor antagonist (amisulpride) on glutamatergic metabolites in the anterior cingulate cortex (ACC), striatum and thalamus in healthy controls.
Method
A double-blind, within-subject, cross-over, placebo-controlled study with two arms (n = 25 per arm) was conducted. Healthy volunteers received either aripiprazole (up to 10 mg/day) or amisulpride (up to 400 mg/day) for 7 days, plus a corresponding period of placebo treatment, in a pseudo-randomised order. Proton magnetic resonance spectroscopy (1H-MRS) was used to measure glutamatergic metabolite levels at three time points: baseline, after 1 week of drug treatment and after 1 week of placebo. Values were analysed as a combined measure across the ACC, striatum and thalamus.
Results
Aripiprazole significantly increased glutamate + glutamine (Glx) levels compared with placebo (β = 0.55, 95% CI [0.15, 0.95], P = 0.007). At baseline, the mean Glx level was 8.14 institutional units (s.d. = 2.15); following aripiprazole treatment, the mean Glx level was 8.16 institutional units (s.d. = 2.40) compared with 7.61 institutional units (s.d. = 2.36) for placebo. This effect remained significant after adjusting for plasma parent and active metabolite drug levels. An increase was also observed with amisulpride, but it did not reach statistical significance.
Conclusions
One week of aripiprazole administration in healthy participants altered brain Glx levels as compared with placebo administration. These findings provide novel insights into the relationship between antipsychotic treatment and brain metabolites in a healthy participant cohort.
To improve understanding of prototyping practice at the fuzzy front end of the design process, this article presents an analysis of a prototyping dataset captured during the IDEA challenge – a 4-day virtually hosted hackathon – using Pro2booth, a web-based prototype capture tool. The dataset comprised 203 prototypes created by four independent teams working in university labs across Europe, supported by interviews carried out with each team after the event. The results of the study provide nine key findings about prototyping at hackathons. These include elucidation of the purposes of prototypes in physical, digital and sketch domains and characterisation of teams’ prototyping practices and strategies. The most successful strategy focused on learning about the problem or solution space, often via physical prototypes rather than following more prescriptive ‘theoretical’ methodologies. Recommendations on prototyping strategies in hackathons or similar scenarios are provided, highlighting the importance of practical strategies that prioritise learning and adaptation. The results of this study raise a broader question for the wider research community: how should design research and teaching balance high-level strategic approaches with more hands-on ‘operational’ prototyping?
Declining labor force participation of older men throughout the 20th century and recent increases in participation have generated substantial interest in understanding the effect of public pensions on retirement. The National Bureau of Economic Research's International Social Security (ISS) Project, a long-term collaboration among researchers in a dozen developed countries, has explored this and related questions. The project employs a harmonized approach to conduct within-country analyses that are combined for meaningful cross-country comparisons. The key lesson is that the choices of policy makers affect the incentive to work at older ages and these incentives have important effects on retirement behavior.
This paper examines a Greek Middle Geometric II pottery assemblage recovered during the Tunisian-Spanish excavations in the ancient city of Utica, Tunisia. The ceramics come from the deposit that sealed Well 200017, which further contained animal bones representing the remains of a ritual collective banquet. The ceramics are mainly of Phoenician, Libyan and Sardinian as well as Greek, Italic and Iberian origin. Most of the sherds come from bowls for consumption of food and drinks; there are also a few vessels for serving food and amphoras, while cooking vessels are very scarce. Based on our radiocarbon evidence, the context dates between 965–903 cal BCE, with a lower interval at 832 cal BCE. Neutron Activation Analysis (NAA) was carried out on forty-five samples mainly of Geometric pottery in two campaigns. This paper presents the NAA results of the pottery from Utica’s well that were sampled during the first campaign in 2015.
In Fused Filament Fabrication (FFF), there is increasing interest in the potential of composite filaments for producing complex, load-bearing components. Carbon-fibre-filled polyamide (PA-CF) currently offers the highest available strength and stiffness, but the most promising variants are not available in filament form. This paper investigates filament production from commercially available, highly filled PA-CF pellets by modifying a tabletop filament extruder. We show that filament production is possible by improving cooling. The FFF-printed specimens show an average ultimate tensile strength (UTS) of 135.5 MPa, higher than that of most commercially available filaments.
This study investigates the relationship between the number and type of prototypes developed in rapid prototyping contexts, a team's self-estimated performance, and its final actual performance. Findings suggest a strong correlation between each of these elements, with the converse also found to be true, motivating the introduction of the concept of Design Delusion: a type of cognitive dissonance arising from differences between perceived and actual states. The paper suggests that early prototyping helps identify and mitigate design delusion, improving design decisions and preventing technical debt.
We study cofinal systems of finite subsets of $\omega _1$. We show that while such systems can be NIP, they cannot be defined in an NIP structure. We deduce a positive answer to a question of Chernikov and Simon from 2013: In an NIP theory, any uncountable externally definable set contains an infinite definable subset. A similar result holds for larger cardinals.
We identify a set of essential recent advances in climate change research with high policy relevance, across natural and social sciences: (1) looming inevitability and implications of overshooting the 1.5°C warming limit, (2) urgent need for a rapid and managed fossil fuel phase-out, (3) challenges for scaling carbon dioxide removal, (4) uncertainties regarding the future contribution of natural carbon sinks, (5) intertwinedness of the crises of biodiversity loss and climate change, (6) compound events, (7) mountain glacier loss, (8) human immobility in the face of climate risks, (9) adaptation justice, and (10) just transitions in food systems.
Technical summary
The Intergovernmental Panel on Climate Change Assessment Reports provide the scientific foundation for international climate negotiations and constitute an unmatched resource for researchers. However, the assessment cycles take multiple years. As a contribution to cross- and interdisciplinary understanding of climate change across diverse research communities, we have established a streamlined annual process to identify and synthesize significant research advances. We collected input from experts in various fields using an online questionnaire and prioritized a set of 10 key research insights with high policy relevance. This year, we focus on: (1) the looming overshoot of the 1.5°C warming limit, (2) the urgency of a fossil fuel phase-out, (3) the challenges of scaling up carbon dioxide removal, (4) uncertainties regarding future natural carbon sinks, (5) the need for joint governance of biodiversity loss and climate change, (6) advances in understanding compound events, (7) accelerated mountain glacier loss, (8) human immobility amidst climate risks, (9) adaptation justice, and (10) just transitions in food systems. We present a succinct account of these insights, reflect on their policy implications, and offer an integrated set of policy-relevant messages. This science synthesis and communication effort also forms the basis for a policy report that helps bring climate science to the fore each year in time for the United Nations Climate Change Conference.
Social media summary
We highlight recent and policy-relevant advances in climate change research – with input from more than 200 experts.
Bacterial antimicrobial resistance (AMR) is among the leading global health challenges of the century. Animals and their products are known contributors to the human AMR burden, but the extent of this contribution is not clear. This systematic literature review aimed to identify studies investigating the direct impact of animal sources, defined as livestock, aquaculture, pets, and animal-based food, on human AMR. We searched four scientific databases and identified 31 relevant publications, including 12 risk assessments, 16 source attribution studies, and three other studies. Most studies were published between 2012 and 2022, and most came from Europe and North America, but we also identified five articles from South and South-East Asia. The studies differed in their methodologies, conceptual approaches (bottom-up, top-down, and complex), definitions of the AMR hazard and outcome, the number and type of sources they addressed, and the outcome measures they reported. The most frequently addressed animal source was chicken, followed by cattle and pigs. Most studies investigated bacteria–resistance combinations. Overall, studies on the direct contribution of animal sources of AMR are rare but increasing. More recent publications tailor their methodologies increasingly towards the AMR hazard as a whole, providing grounds for future research to build on.
The International Design Engineering Annual (IDEA) Challenge is a virtually hosted hackathon for Engineering Design researchers with the aims of: i) generating open access datasets; ii) fostering community between researchers; and iii) applying great design minds to develop solutions to real design problems. This paper presents the 2022 IDEA challenge and elements of the captured dataset with the aim of providing insights into prototyping behaviours at virtually hosted hackathons, comparing it with the 2021 challenge dataset, and providing reflections and learnings from two years of running the challenge. The dataset is shown to provide valuable insights into how designers spend their time at hackathon events and how, why and when prototypes are used during their design processes. The dataset also corroborates the findings from the 2021 dataset, demonstrating the complementarity of physical and sketch prototypes. With this paper, we also invite the wider community to contribute to the IDEA Challenge in future years, either as participants or in using the platform to run their own design studies.
Corded Ware is one of the main archaeological phenomena of the third millennium before the common era (BCE), with a wide geographic spread across much of central and northeastern Europe, from Denmark, the Rhineland, and Switzerland in the west to the Baltic and Western Russia in the east, and broadly restricted to the temperate, continental zones north of the Alps, the Carpathians, and the steppe/forest steppe border to the east (Glob 1944; Strahm and Buchvaldek 1991; Furholt 2014).
The surface of the Greenland Ice Sheet is darkening, which accelerates its surface melt. The role of glacier ice algae in reducing surface albedo is widely recognised but not well quantified and the feedbacks between the algae and the weathering crust remain poorly understood. In this letter, we summarise recent advances in the study of the biological darkening of the Greenland Ice Sheet and highlight three key research priorities that are required to better understand and forecast algal-driven melt: (i) identifying the controls on glacier ice algal growth and mortality, (ii) quantifying the spatio-temporal variability in glacier ice algal biomass and processes involved in cell redistribution and (iii) determining the albedo feedbacks between algal biomass and weathering crust characteristics. Addressing these key research priorities will allow us to better understand the supraglacial ice-algal system and to develop an integrated model incorporating the algal and physical controls on ice surface albedo.
The Eurasian beaver has returned to Britain, presenting fundamental challenges and opportunities for all involved. Beavers will inevitably expand throughout British freshwater systems and provide significant benefits. Unofficial releases have presented challenges in terms of sourcing and genetics, health status and disease risks, the risk of introducing the non-native North American beaver species, and the lack of engagement with communities and resulting conflict. Agreed approaches require development using multi-stakeholder approaches to recognise and promote benefits whilst sensitively managing beavers’ impacts on people’s livelihoods.
Considerable attention recently has been paid to anti-exceptionalism about logic (AEL), the thesis that logic is more similar to the sciences in important respects than traditionally thought. One of AEL’s prominent claims is that logic’s methodology is similar to that of the recognised sciences, with part of this proposal being that logics provide explanations in some sense. However, insufficient attention has been given to what this proposal amounts to, and the challenges that arise in providing an account of explanations in logic. This paper clarifies these challenges, and shows how the practice-based approach is best placed to meet them.
Yarkoni's analysis clearly articulates a number of concerns limiting the generalizability and explanatory power of psychological findings, many of which are compounded in infancy research. ManyBabies addresses these concerns via a radically collaborative, large-scale and open approach to research that is grounded in theory-building, committed to diversification, and focused on understanding sources of variation.
Gallium (Ga) and germanium (Ge) are technologically important critical elements. Lead blast furnace slags from Tsumeb, Namibia, comprise over two million metric tons of material that contains high levels of Ga (135–156 ppm) and Ge (128–441 ppm) in addition to significant Zn concentrations (up to 11 wt.%) and represent a potential resource for these elements. A combination of mineralogical and chemical methods (PXRD, FEG-SEM-EPMA and LA-ICP-MS) indicated different partitioning of Ga and Ge within the individual slag phases. Gallium is predominantly bound in small euhedral crystals of Zn–Fe–Al spinels (<10 μm in size), exhibiting concentrations in the range of 480–1370 ppm (up to 0.004 atoms per formula unit, apfu). Concentrations of Ga in other phases (e.g. melilite) are systematically below 90 ppm. The principal host of Ge is the silicate glass and, to a lesser extent, silicates (melilite and olivine group phases). Concentrations of Ge in glass reached 470 ppm (EPMA), but LA-ICP-MS analysis of glass matrix containing submicrometre spinel crystallites indicated that average Ge levels vary in the range of 113–394 ppm. For the potential extraction of Ga and Ge, the results indicate that ultrafine milling is needed to liberate the Ga- and Ge-hosting phases prior to metallurgical processing of the slag.
In the era of widespread resistance, there are 2 time points at which most empiric prescription errors occur among hospitalized adults: (1) upon admission (UA) when treating patients at risk of multidrug-resistant organisms (MDROs) and (2) during hospitalization, when treating patients at risk of extensively drug-resistant organisms (XDROs). These errors adversely influence patient outcomes and the hospital’s ecology.
Design and setting:
Retrospective cohort study, Shamir Medical Center, Israel, 2016.
Patients:
Adult patients (aged >18 years) hospitalized with sepsis.
Methods:
Logistic regressions were used to develop predictive models for (1) MDRO UA and (2) nosocomial XDRO. Their performances on the derivation data sets, and on 7 other validation data sets, were assessed using the area under the receiver operating characteristic curve (ROC AUC).
Results:
In total, 4,114 patients were included: 2,472 patients with sepsis UA and 1,642 with nosocomial sepsis. The MDRO UA score included 10 parameters, and with a cutoff of ≥22 points, it had an ROC AUC of 0.85. The nosocomial XDRO score included 7 parameters, and with a cutoff of ≥36 points, it had an ROC AUC of 0.87. The range of ROC AUCs for the validation data sets was 0.7–0.88 for the MDRO UA score and 0.66–0.75 for the nosocomial XDRO score. We created a free web calculator (https://assafharofe.azurewebsites.net).
Conclusions:
A simple electronic calculator could aid with empiric prescription during an encounter with a septic patient. Future implementation studies are needed to evaluate its utility in improving patient outcomes and in reducing overall resistances.
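To make the evaluation above concrete, here is a minimal sketch of how a points-based risk score with a cutoff is assessed for discrimination via the ROC AUC (rank-sum formulation). The scores, outcome labels and the reuse of the ≥22-point cutoff below are purely illustrative assumptions, not data or parameters from the study.

```python
# Illustrative sketch only: scoring hypothetical patients with a
# points-based risk score, applying a cutoff, and summarising
# discrimination with the ROC AUC (Mann-Whitney U / rank-sum form).

def roc_auc(scores, labels):
    """ROC AUC as the probability that a randomly chosen positive
    case scores higher than a randomly chosen negative case."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = 0.0
    for p in pos:
        for n in neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5  # ties count as half a win
    return wins / (len(pos) * len(neg))

# Hypothetical risk scores (points) and observed outcomes (1 = MDRO)
scores = [10, 15, 22, 25, 30, 8, 18, 24, 36, 12]
labels = [0, 0, 1, 1, 1, 0, 1, 0, 1, 0]

auc = roc_auc(scores, labels)
# Binary prediction at an assumed cutoff of >= 22 points
predictions = [1 if s >= 22 else 0 for s in scores]
```

An AUC near 1 indicates the score ranks true MDRO cases above non-MDRO cases almost perfectly; the cutoff then trades sensitivity against specificity for the actual prescribing decision.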
During the last few decades, bed-elevation profiles from radar sounders have been used to quantify bed roughness. Various methods have been employed, such as the ‘two-parameter’ technique that considers vertical and slope irregularities in topography, but they struggle to incorporate roughness at multiple spatial scales, leading to a breakdown in their depiction of bed roughness where the relief is most complex. In this article, we describe a new algorithm, analogous to wavelet transformations, to quantify bed roughness at multiple scales. The ‘Self-Adaptive Two-Parameter’ system calculates the roughness of a bed profile using a frequency-domain method, allowing the extraction of three characteristic factors: (1) slope, (2) skewness and (3) coefficient of variation. The multi-scale roughness is derived by a weighted summation of these frequency-related factors. We first validate the algorithm using idealized bed elevations, and then use real bed-elevation data to compare the new roughness index with other methods. We show the new technique is an effective tool for quantifying bed roughness from radar data, paving the way for improved continent-wide depictions of bed roughness and for the incorporation of this information into ice flow models.
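For readers unfamiliar with the frequency-domain approach mentioned above, the following sketch computes the classic ‘two-parameter’ roughness measures (total spectral power of the detrended bed profile and of its along-track slope) for a single radar profile. This illustrates the general idea only; it is not the Self-Adaptive Two-Parameter algorithm, and the function name, detrending choice and normalisation are assumptions.

```python
# Sketch of the classic 'two-parameter' bed-roughness calculation:
# vertical irregularity = spectral power of the detrended elevation,
# slope irregularity = spectral power of the along-track slope.
import numpy as np

def two_parameter_roughness(z, dx):
    """Return (xi, eta): total vertical and slope spectral power.

    z  : 1-D bed-elevation profile (metres)
    dx : horizontal sample spacing (metres)
    """
    z = np.asarray(z, dtype=float)
    x = np.arange(z.size)
    z = z - np.polyval(np.polyfit(x, z, 1), x)  # remove linear trend
    spec = np.abs(np.fft.rfft(z)) ** 2 / z.size  # elevation power spectrum
    slope = np.gradient(z, dx)                   # along-track slope
    slope_spec = np.abs(np.fft.rfft(slope)) ** 2 / slope.size
    xi = spec[1:].sum()        # vertical-irregularity parameter (skip DC)
    eta = slope_spec[1:].sum() # slope-irregularity parameter
    return xi, eta
```

A smooth (linear) profile yields roughness near zero, while an undulating profile yields large values; in a multi-scale scheme such spectral quantities would be computed per frequency band and combined by weighted summation.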