This study outlines design principles for a generic thermal management system (TMS) using a multi-design-point approach. Four TMS configurations were analysed for a regional turboprop and a short-haul aircraft, focusing on total system gross power as an indicator of cost and environmental impact. Mechanisms were introduced to prevent coolant freezing. Results highlight the puller fan configuration as the most beneficial, leveraging temperature differentials and using the Meredith effect for increased operating capability. The ram-air configuration is slightly more efficient than the puller fan configuration for regional aircraft, but only for high system efficiencies and with operational constraints (taxiing in hot-day conditions). Dual fan configurations offer significant thrust but also increased mass. The dual fan configuration shows total system gross power comparable to the puller fan for short-haul aircraft in cruise conditions, but not for regional aircraft. The pusher fan is not optimal for either aircraft type, since the radiator significantly increases the heat exchanger inlet temperature, resulting in greater penalties in mass and total system gross power. In conclusion, the study emphasises the necessity of considering all relevant effects in the TMS design, such as drag, mass and efficiency, to allow the design of an optimal overall system.
The monotone homogeneity model (MHM—also known as the unidimensional monotone latent variable model) is a nonparametric IRT formulation that provides the underpinning for partitioning a collection of dichotomous items to form scales. Ellis (Psychometrika 79:303–316, 2014, doi:10.1007/s11336-013-9341-5) has recently derived inequalities that are implied by the MHM, yet require only the bivariate (inter-item) correlations. In this paper, we incorporate these inequalities within a mathematical programming formulation for partitioning a set of dichotomous scale items. The objective criterion of the partitioning model is to produce clusters of maximum cardinality. The formulation is a binary integer linear program that can be solved exactly using commercial mathematical programming software. However, we have also developed a standalone branch-and-bound algorithm that produces globally optimal solutions. Simulation results and a numerical example are provided to demonstrate the proposed method.
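To make the maximum-cardinality selection concrete, here is a minimal branch-and-bound sketch in Python. It assumes the MHM-implied inequalities have already been reduced to a pairwise compatibility test on the inter-item correlations; this is a simplification (the paper's formulation is a binary integer linear program), and the `compatible` matrix below is a hypothetical input, not the Ellis (2014) inequalities themselves.

```python
# Minimal branch-and-bound sketch: select a maximum-cardinality subset of
# items in which every pair passes some MHM-implied screening test on the
# inter-item correlations. The Ellis (2014) inequalities are treated as a
# black-box predicate; `compatible` is a hypothetical precomputed matrix.

def max_compatible_subset(compatible):
    """compatible: symmetric list-of-lists of bools."""
    n = len(compatible)
    best = []

    def extend(chosen, candidates):
        nonlocal best
        if len(chosen) > len(best):
            best = chosen[:]
        for idx, c in enumerate(candidates):
            # Bound: even taking every remaining candidate cannot beat best.
            if len(chosen) + len(candidates) - idx <= len(best):
                return
            # Keep only candidates still compatible with item c.
            remaining = [d for d in candidates[idx + 1:] if compatible[c][d]]
            extend(chosen + [c], remaining)

    extend([], list(range(n)))
    return best

# Toy example: 5 items; items 0, 1, 3 are mutually compatible.
ok = [[True] * 5 for _ in range(5)]
for i, j in [(0, 2), (1, 2), (2, 3), (0, 4), (3, 4)]:
    ok[i][j] = ok[j][i] = False
print(max_compatible_subset(ok))  # -> [0, 1, 3]
```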
Dynamic programming methods for matrix permutation problems in combinatorial data analysis can produce globally-optimal solutions for matrices up to size 30×30, but are computationally infeasible for larger matrices because of enormous computer memory requirements. Branch-and-bound methods also guarantee globally-optimal solutions, but computation time considerations generally limit their applicability to matrix sizes no greater than 35×35. Accordingly, a variety of heuristic methods have been proposed for larger matrices, including iterative quadratic assignment, tabu search, simulated annealing, and variable neighborhood search. Although these heuristics can produce exceptional results, they are prone to converge to local optima where the permutation is difficult to dislodge via traditional neighborhood moves (e.g., pairwise interchanges, object-block relocations, object-block reversals, etc.). We show that a heuristic implementation of dynamic programming yields an efficient procedure for escaping local optima. Specifically, we propose applying dynamic programming to reasonably-sized subsequences of consecutive objects in the locally-optimal permutation, identified by simulated annealing, to further improve the value of the objective function. Experimental results are provided for three classic matrix permutation problems in the combinatorial data analysis literature: (a) maximizing a dominance index for an asymmetric proximity matrix; (b) least-squares unidimensional scaling of a symmetric dissimilarity matrix; and (c) approximating an anti-Robinson structure for a symmetric dissimilarity matrix.
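A minimal sketch of the window-based dynamic programming idea, applied to objective (a), the dominance index: because a reordered window of consecutive objects stays contiguous, only within-window pairs change, so each window can be reordered exactly with a Held-Karp-style DP over subsets. The single-pass polishing loop and function names are illustrative, not the paper's implementation (which embeds this inside simulated annealing).

```python
def dominance(A, perm):
    # Dominance index: sum of above-diagonal entries of the reordered matrix.
    return sum(A[perm[i]][perm[j]]
               for i in range(len(perm)) for j in range(i + 1, len(perm)))

def dp_reorder_window(A, window):
    """Exactly reorder `window` (a list of object indices) to maximize the
    within-window dominance contribution, via DP over subsets. Pairs with
    one endpoint outside the window are unaffected (the window stays
    contiguous), so only within-window pairs need to be optimized."""
    k = len(window)
    full = (1 << k) - 1
    dp = [float('-inf')] * (1 << k)
    parent = [None] * (1 << k)
    dp[0] = 0.0
    for S in range(1 << k):
        if dp[S] == float('-inf'):
            continue
        for j in range(k):
            if S & (1 << j):
                continue
            # Placing window[j] after everything in S adds the pairs (i, j).
            gain = sum(A[window[i]][window[j]] for i in range(k) if S & (1 << i))
            if dp[S] + gain > dp[S | (1 << j)]:
                dp[S | (1 << j)] = dp[S] + gain
                parent[S | (1 << j)] = j
    # Reconstruct the optimal within-window order (last placed first).
    order, S = [], full
    while S:
        j = parent[S]
        order.append(window[j])
        S ^= (1 << j)
    return order[::-1]

def dp_polish(A, perm, k=10):
    """Slide a width-k window over a locally optimal permutation and replace
    each window with its exact DP reordering (one pass, never worsening)."""
    perm = perm[:]
    for start in range(len(perm) - k + 1):
        perm[start:start + k] = dp_reorder_window(A, perm[start:start + k])
    return perm
```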
Several authors have touted the p-median model as a plausible alternative to within-cluster sums of squares (i.e., K-means) partitioning. Purported advantages of the p-median model include the provision of “exemplars” as cluster centers, robustness with respect to outliers, and the accommodation of a diverse range of similarity data. We developed a new simulated annealing heuristic for the p-median problem and completed a thorough investigation of its computational performance. The salient findings from our experiments are that our new method substantially outperforms a previous implementation of simulated annealing and is competitive with the most effective metaheuristics for the p-median problem.
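A minimal simulated annealing sketch for the p-median problem, using the standard swap move (exchange one selected center for an unselected object) and the Metropolis acceptance rule. The cooling schedule and parameters are illustrative assumptions, not the tuned implementation reported in the paper.

```python
import math, random

def total_cost(D, centers):
    # Sum over objects of the distance to the nearest selected center.
    return sum(min(D[i][c] for c in centers) for i in range(len(D)))

def sa_pmedian(D, p, t0=1.0, cooling=0.95, iters_per_temp=100, t_min=1e-4,
               rng=random.Random(0)):
    """Simulated annealing sketch: D is an n x n dissimilarity matrix;
    each move swaps one selected center with one unselected object."""
    n = len(D)
    centers = rng.sample(range(n), p)
    best, best_cost = centers[:], total_cost(D, centers)
    cost, t = best_cost, t0
    while t > t_min:
        for _ in range(iters_per_temp):
            out_idx = rng.randrange(p)
            cand = rng.choice([i for i in range(n) if i not in centers])
            trial = centers[:]
            trial[out_idx] = cand
            trial_cost = total_cost(D, trial)
            delta = trial_cost - cost
            # Metropolis rule: always accept improvements; accept worsening
            # moves with probability exp(-delta / t).
            if delta < 0 or rng.random() < math.exp(-delta / t):
                centers, cost = trial, trial_cost
                if cost < best_cost:
                    best, best_cost = centers[:], cost
        t *= cooling
    return best, best_cost
```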
Although the K-means algorithm for minimizing the within-cluster sums of squared deviations from cluster centroids is perhaps the most common method for applied cluster analyses, a variety of other criteria are available. The p-median model is an especially well-studied clustering problem that requires the selection of p objects to serve as cluster centers. The objective is to choose the cluster centers such that the sum of the Euclidean distances (or some other dissimilarity measure) of objects assigned to each center is minimized. Using 12 data sets from the literature, we demonstrate that a three-stage procedure consisting of a greedy heuristic, Lagrangian relaxation, and a branch-and-bound algorithm can produce globally optimal solutions for p-median problems of nontrivial size (several hundred objects, five or more variables, and up to 10 clusters). We also report the results of an application of the p-median model to an empirical data set from the telecommunications industry.
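For illustration, a minimal sketch of a greedy "add" heuristic of the kind used in the first stage of such a procedure; the Lagrangian relaxation and branch-and-bound stages, which certify global optimality, are not shown, and the details below are assumptions rather than the paper's code.

```python
def greedy_pmedian(D, p):
    """Greedy 'add' heuristic sketch for the p-median problem.
    D: n x n dissimilarity matrix; returns p center indices."""
    n = len(D)
    # First center: the single object minimizing total distance (1-median).
    first = min(range(n), key=lambda c: sum(D[i][c] for i in range(n)))
    centers = [first]
    # nearest[i] = distance from object i to its closest selected center.
    nearest = [D[i][first] for i in range(n)]
    while len(centers) < p:
        best_c, best_drop = None, -1.0
        for c in range(n):
            if c in centers:
                continue
            # Total cost reduction if c were added as a new center.
            drop = sum(max(0.0, nearest[i] - D[i][c]) for i in range(n))
            if drop > best_drop:
                best_c, best_drop = c, drop
        centers.append(best_c)
        nearest = [min(nearest[i], D[i][best_c]) for i in range(n)]
    return centers
```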
The clique partitioning problem (CPP) requires the establishment of an equivalence relation for the vertices of a graph such that the sum of the edge costs associated with the relation is minimized. The CPP has important applications for the social sciences because it provides a framework for clustering objects measured on a collection of nominal or ordinal attributes. In such instances, the CPP incorporates edge costs obtained from an aggregation of binary equivalence relations among the attributes. We review existing theory and methods for the CPP and propose two versions of a new neighborhood search algorithm for efficient solution. The first version (NS-R) uses a relocation algorithm in the search for improved solutions, whereas the second (NS-TS) uses an embedded tabu search routine. The new algorithms are compared to simulated annealing (SA) and tabu search (TS) algorithms from the CPP literature. Although the heuristics yielded comparable results for some test problems, the neighborhood search algorithms generally yielded the best performances for large and difficult instances of the CPP.
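A minimal sketch of the relocation neighborhood that underlies NS-R: each vertex is moved to the cluster, or to a new singleton cluster, that most decreases the sum of within-cluster edge costs. The tabu search embedding of NS-TS is omitted, and the code is illustrative rather than the authors' implementation.

```python
def relocation_pass(W, labels):
    """One relocation pass for the CPP. W is a symmetric edge-cost matrix;
    costs may be negative (pairs that 'should' be grouped) or positive.
    labels[v] is the current cluster of vertex v. Returns True if any
    improving move was made."""
    n = len(W)
    improved = False
    for v in range(n):
        # Cost of v's ties to each existing cluster.
        tie = {}
        for u in range(n):
            if u != v:
                tie[labels[u]] = tie.get(labels[u], 0.0) + W[v][u]
        current = tie.get(labels[v], 0.0)
        best_label, best_cost = labels[v], current
        for lab, c in tie.items():
            if c < best_cost:
                best_label, best_cost = lab, c
        # A fresh singleton cluster has tie cost 0.
        if best_cost > 0.0:
            best_label, best_cost = max(labels) + 1, 0.0
        if best_label != labels[v]:
            labels[v] = best_label
            improved = True
    return improved

def neighborhood_search(W, labels):
    # Iterate relocation passes until no improving move remains.
    while relocation_pass(W, labels):
        pass
    return labels
```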
Compound-specific radiocarbon analysis (CSRA) makes it possible to date sample material at the molecular level. N-alkanes are considered compounds with high potential for CSRA. As these compounds originate from plant waxes, their radiocarbon (14C) analysis can provide valuable information about the age and origin of organic materials, helping to reconstruct and understand past environmental conditions and changes in vegetation. However, CSRA faces two main challenges: the small size of CSRA samples, which makes them extremely sensitive to blank effects, and the input of unknown amounts of extraneous carbon during the analytical procedure. Following the previous study by Sun and co-workers, we used different-sized aliquots of leaves of Fagus sylvatica (nC27, nC29) and Festuca rubra agg. (nC31, nC33) as modern standards and two commercial standards (nC26, nC28) as fossil standards for blank determination. A third commercial standard (nC27) with a predetermined radiocarbon content of F14C = 0.71 (14C age of 2700 BP) serves to evaluate the blank correction. We found that the blank assessment of Sun and co-workers is also applicable to n-alkanes, with a minimum sample size of 15 µg C for dependable CSRA dates. We determined that the blank introduced during the analytical procedure has a mass of (4.1 ± 0.7) µg C carrying a radiocarbon content of F14C = 0.25 ± 0.05. Applying the blank correction to a sediment sample from Lake Holzmaar (Germany) shows that all four isolated n-alkanes have similar 14C ages. However, the bulk material of the sediment and branches found in the sediment core are younger than the CSRA dates. We conclude that the older CSRA ages reflect a disparity between the actual deposition age of the analysed organic material and the age inferred from radiocarbon results, which can arise in sediment traps due to delayed deposition.
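For concreteness, a minimal sketch of the standard constant-contamination (mass-balance) blank correction, using the blank mass and F14C reported above as default values; the uncertainty propagation that a laboratory implementation would include is omitted here.

```python
def blank_correct(f_meas, m_meas, f_blank=0.25, m_blank=4.1):
    """Blank-correct a measured F14C value via a two-component mass balance:
        F_meas * m_meas = F_true * (m_meas - m_blank) + F_blank * m_blank
    Masses in µg C. Defaults are the constant-contamination values reported
    above (F14C = 0.25, 4.1 µg C); error propagation is not shown."""
    return (f_meas * m_meas - f_blank * m_blank) / (m_meas - m_blank)

# Example: a 40 µg C n-alkane sample measured at F14C = 0.70
print(round(blank_correct(0.70, 40.0), 3))  # -> 0.751
```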
Globally, forests are net carbon sinks that partly mitigate anthropogenic climate change. However, there is evidence of increasing weather-induced tree mortality, which needs to be better understood to improve forest management under future climate conditions. Disentangling drivers of tree mortality is challenging because of their interacting behavior over multiple temporal scales. In this study, we take a data-driven approach to the problem. We generate hourly temperate weather data using a stochastic weather generator to simulate 160,000 years of beech, pine, and spruce forest dynamics with a forest gap model. These data are used to train a generative deep learning model (a modified variational autoencoder) to learn representations of three-year-long monthly weather conditions (precipitation, temperature, and solar radiation) in an unsupervised way. We then associate these weather representations with years of high biomass loss in the forests and derive weather prototypes associated with such years. The identified prototype weather conditions are associated with 5–22% higher median biomass loss compared to the median of all samples, depending on the forest type and the prototype. When prototype weather conditions co-occur, these numbers increase to 10–25%. Our research illustrates how generative deep learning can discover compounding weather patterns associated with extreme impacts.
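To make the representation-learning step concrete, here is a minimal PyTorch sketch of a plain variational autoencoder over 36 months × 3 variables = 108 inputs. The architecture, layer sizes, and latent dimension are illustrative assumptions; the study uses a modified VAE whose details are not reproduced here.

```python
import torch
import torch.nn as nn

class WeatherVAE(nn.Module):
    """Minimal VAE sketch for 3-year monthly weather (36 months x 3
    variables = 108 inputs). All sizes are illustrative assumptions."""
    def __init__(self, n_in=108, n_hidden=64, n_latent=8):
        super().__init__()
        self.enc = nn.Sequential(nn.Linear(n_in, n_hidden), nn.ReLU())
        self.mu = nn.Linear(n_hidden, n_latent)
        self.logvar = nn.Linear(n_hidden, n_latent)
        self.dec = nn.Sequential(nn.Linear(n_latent, n_hidden), nn.ReLU(),
                                 nn.Linear(n_hidden, n_in))

    def forward(self, x):
        h = self.enc(x)
        mu, logvar = self.mu(h), self.logvar(h)
        # Reparameterization trick: z = mu + sigma * eps.
        z = mu + torch.exp(0.5 * logvar) * torch.randn_like(mu)
        return self.dec(z), mu, logvar

def vae_loss(x, recon, mu, logvar, beta=1.0):
    # Reconstruction error plus KL divergence to the unit Gaussian prior.
    rec = nn.functional.mse_loss(recon, x, reduction='sum')
    kld = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp())
    return rec + beta * kld
```

Weather prototypes could then, for instance, be derived by clustering the latent means `mu` of high-biomass-loss years; that downstream workflow is likewise an assumption, not the paper's stated procedure.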
Personality functioning, self-disorders, and their relationship to psychotic symptoms, on a continuum from mild attenuated experiences to manifest psychotic symptoms, are highly relevant for psychopathology, course of illness, and treatment planning in psychotic disorders, but empirical data are sparse.
Objectives
This study aims at exploring personality functioning and self-disorders in individuals at ultra-high risk for psychosis (UHR) and with first-episode psychosis (FEP), compared to a clinical control group of subjects with borderline personality disorder (BPD) and healthy controls (HC).
Methods
Personality functioning was measured in 107 participants (24 UHR, 29 FEP, 27 BPD, and 27 HC) using the Structured Interview for Personality Organization (STIPO) and the Level of Personality Functioning Scale (LPFS), and self-disorders were assessed using the Examination of Anomalous Self-Experience (EASE). A hierarchical cluster analysis was performed based on the seven STIPO dimensions.
Results
Significant impairment in personality functioning was found in UHR (M = 4.29, SD = .908), FEP (M = 4.83, SD = 1.002), and BPD individuals (M = 4.70, SD = .542) compared with HC (M = 1.63, SD = .565). FEP patients showed significantly worse overall personality functioning compared to UHR patients (p = .037). Patients with manifest psychosis (FEP) also exhibited significantly higher levels of self-disorders compared to BPD patients (p = .019). Self-disturbances in patients with milder forms of psychotic symptoms (UHR) were intermediate between the other diagnostic groups (FEP and BPD). Regardless of the main diagnoses, the three clusters of patients were found to differ in levels of personality functioning and self-disorders.
Conclusions
Impairment of personality functioning varies in different stages of psychotic disorders. The level of self-disorders may allow differentiation between manifest psychosis and borderline personality disorder. An in-depth assessment of personality functioning and self-disorders could be helpful in differentiating diagnoses, treatment planning, and establishing foci for psychotherapeutic treatment modalities.
Disclosure of Interest
M. Gruber: None Declared, J. Alexopoulos: None Declared, K. Feichtinger: None Declared, K. Parth: None Declared, A. Wininger: None Declared, N. Mossaheb: None Declared, F. Friedrich: None Declared, Z. Litvan: None Declared, B. Hinterbuchinger: None Declared, S. Doering: None Declared, V. Blüml: Grant / Research support from: Heigl-Foundation, Köhler-Foundation, International Psychoanalytical Association (IPA)
In early 2017, the University Medical Center Groningen, the Netherlands, had an outbreak of 2 strains of vancomycin-resistant enterococci (VRE) that spread to various wards. In the summer of 2018, the hospital was again hit by a VRE outbreak, which was detected and controlled early. However, during both outbreaks, fewer patients were admitted to the hospital and various costs were incurred. We quantified the costs of the 2017 and 2018 VRE outbreaks.
Design:
Using data from various sources in the hospital and interviews, we identified and quantified the costs of the 2 outbreaks, resulting from tests, closed beds (opportunity costs), cleaning, additional personnel, and patient isolation.
Setting:
The University Medical Center Groningen, an academic hospital in the Netherlands.
Results:
The total costs associated with the 2017 outbreak were estimated to be €335,278 (US $356,826); the total costs associated with the 2018 outbreak were estimated at €149,025 (US $158,602).
Conclusions:
The main drivers of the costs were the opportunity costs due to the reduction in admitted patients, testing costs, and cleaning costs. Although the second outbreak was considerably shorter, the costs per day were similar to those of the first outbreak. Major investments are associated with the VRE control measures, and an outbreak of VRE can lead to considerable costs for a hospital. Aggressively screening and isolating patients who may be involved in an outbreak of VRE may reduce the overall costs and improve the continuity of care within the hospital.
The aim of this study was to assess the interobserver reliability of the measures forming the Welfare Quality® animal welfare assessment protocol for sows and piglets. The study was carried out at nine farms in Northern Germany. Two trained observers evaluated identical animals simultaneously but independently in 40 joint farm visits. Interobserver reliability was calculated at individual animal level using Cohen's kappa, weighted kappa and the prevalence-adjusted, bias-adjusted kappa (PABAK) and at farm level using Spearman's rank correlation coefficient (RS), the intraclass correlation coefficient (ICC), smallest detectable change (SDC) and limits of agreement (LoA). While a direct comparison of the adjectives of the qualitative behaviour assessment showed poor interobserver reliability, a Principal Component Analysis detected good interobserver reliability. The assessment of social and exploratory behaviours showed acceptable interobserver reliability, while the assessment of stereotypies displayed good interobserver reliability. The human-animal relationship test showed only poor interobserver reliability at individual animal and farm levels. In most cases, measures of health and physical state assessed in sows and piglets exhibited acceptable or good interobserver reliability. In conclusion, after some measures are revised, particularly those examining the human-animal relationship, the Welfare Quality® protocol for sows and piglets will represent a reliable approach in terms of interobserver reliability to assess the welfare of sows and piglets.
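To make the individual-animal-level agreement statistics concrete, here is a minimal sketch computing Cohen's kappa and PABAK from a 2×2 two-observer table; the counts in the example are hypothetical, not data from the study.

```python
def kappa_stats(table):
    """Cohen's kappa and PABAK for a 2x2 agreement table between two
    observers: table[a][b] = number of animals scored a by observer 1 and
    b by observer 2 (binary measure; weighted kappa is analogous)."""
    n = sum(sum(row) for row in table)
    po = sum(table[k][k] for k in range(2)) / n            # observed agreement
    pe = sum((sum(table[k]) / n) * (sum(r[k] for r in table) / n)
             for k in range(2))                            # chance agreement
    kappa = (po - pe) / (1 - pe)
    pabak = 2 * po - 1   # prevalence-adjusted, bias-adjusted kappa
    return kappa, pabak

# Hypothetical counts: 80 animals, observers agree on 70 of them.
print(kappa_stats([[55, 5], [5, 15]]))  # -> (0.667, 0.75) approximately
```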
The FU Orionis (FUor) and EX Lupi (EXor) type objects are rare pre-main-sequence low-mass stars undergoing accretion outbursts. Maser emission is widespread and is a powerful probe of mass accretion and ejection on small scales in star-forming regions. However, very little is known about the overall prevalence of water masers towards FUors/EXors. We present results from our survey using the Effelsberg 100-m telescope to observe the largest sample of FUors and EXors to date, plus additional Gaia-alerted sources (candidate eruptive stars), for a total of 51 targets, observing the 22.2 GHz H2O maser line while simultaneously covering NH3 lines near 23 GHz.
Uchiyama et al. present a dual inheritance framework for conceptualizing how behavioural genetics and cultural evolution interact and affect heritability. We posit that to achieve a holistic and nuanced representation of the cultural environment and evolution against which genetic effects should be evaluated, it is imperative to consider the multiple geographic cultural layers impacting individuals and genetic heritability.
As refugees and asylum seekers are at high risk of developing mental disorders, we assessed the effectiveness of Self-Help Plus (SH+), a psychological intervention developed by the World Health Organization, in reducing the risk of developing any mental disorder at 12-month follow-up in refugees and asylum seekers resettled in Western Europe.
Methods
Refugees and asylum seekers with psychological distress (General Health Questionnaire-12 ⩾ 3) but without a mental disorder according to the Mini International Neuropsychiatric Interview (M.I.N.I.) were randomised to either SH+ or enhanced treatment as usual (ETAU). The frequency of mental disorders at 12 months was measured with the M.I.N.I., while secondary outcomes included self-identified problems, psychological symptoms and other outcomes.
Results
Of 459 participants randomly assigned to SH+ or ETAU, 246 agreed to be interviewed at 12 months. No difference in the frequency of any mental disorders was found (relative risk [RR] = 0.841; 95% confidence interval [CI] 0.389–1.819; p-value = 0.659). In the per protocol (PP) population, that is, in participants attending at least three group-based sessions, SH+ almost halved the frequency of mental disorders at 12 months compared to ETAU; however, so few participants and events contributed to this analysis that it yielded a non-significant result (RR = 0.528; 95% CI 0.180–1.544; p-value = 0.230). SH+ was associated with improvements at 12 months in psychological distress (p-value = 0.004), depressive symptoms (p-value = 0.011) and wellbeing (p-value = 0.001).
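For readers unfamiliar with the reported effect measure, a minimal sketch of how a relative risk and its Wald-type 95% CI are computed on the log scale; the cell counts below are hypothetical, since the abstract does not report them.

```python
import math

def relative_risk(a, n1, b, n2, z=1.96):
    """Relative risk with a Wald-type 95% CI on the log scale.
    a/n1: events/total in the intervention arm; b/n2: control arm.
    Counts here are hypothetical illustrations only."""
    rr = (a / n1) / (b / n2)
    se = math.sqrt(1/a - 1/n1 + 1/b - 1/n2)   # SE of log(RR)
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, (lo, hi)

print(relative_risk(10, 120, 12, 126))  # hypothetical counts
```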
Conclusions
The present study failed to show any long-term preventative effect of SH+ in refugees and asylum seekers resettled in Western European countries. Analysis of the PP population and of secondary outcomes provided signals of a potential long-term effect of SH+, which suggests the value of exploring the effects of booster sessions and strategies to increase SH+ adherence.
The radiocarbon (14C) calibration curve so far contains annually resolved data only for a short period of time. With accelerator mass spectrometry (AMS) matching the precision of decay counting, it is now possible to efficiently produce large datasets of annual resolution for calibration purposes using small amounts of wood. The radiocarbon intercomparison on single-year tree-ring samples presented here is the first to investigate possible offsets between AMS laboratories specifically at high precision. The results show that AMS laboratories are capable of measuring samples of Holocene age with an accuracy and precision comparable to, or even beyond, what is possible with decay counting, even though they require a thousand times less wood. They also show that not all AMS laboratories always produce results that are consistent with their stated uncertainties. The long-term benefit of studies of this kind is more accurate radiocarbon measurements with, in the future, better quantified uncertainties.
Radiocarbon (14C) ages cannot provide absolutely dated chronologies for archaeological or paleoenvironmental studies directly but must be converted to calendar age equivalents using a calibration curve compensating for fluctuations in atmospheric 14C concentration. Although calibration curves are constructed from independently dated archives, they invariably require revision as new data become available and our understanding of the Earth system improves. In this volume the international 14C calibration curves for both the Northern and Southern Hemispheres, as well as for the ocean surface layer, have been updated to include a wealth of new data and extended to 55,000 cal BP. Based on tree rings, IntCal20 now extends as a fully atmospheric record to ca. 13,900 cal BP. For the older part of the timescale, IntCal20 comprises statistically integrated evidence from floating tree-ring chronologies, lacustrine and marine sediments, speleothems, and corals. We utilized improved evaluation of the timescales and location-variable 14C offsets from the atmosphere (reservoir age, dead carbon fraction) for each dataset. New statistical methods have refined the structure of the calibration curves while maintaining a robust treatment of uncertainties in the 14C ages, the calendar ages and other corrections. The inclusion of modeled marine reservoir ages derived from a three-dimensional ocean circulation model has allowed us to apply more appropriate reservoir corrections to the marine 14C data rather than the previous use of constant regional offsets from the atmosphere. Here we provide an overview of the new and revised datasets and the associated methods used for the construction of the IntCal20 curve and explore potential regional offsets for tree-ring data. We discuss the main differences with respect to the previous calibration curve, IntCal13, and some of the implications for archaeology and geosciences ranging from the recent past to the time of the extinction of the Neanderthals.
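To illustrate the calibration step itself, a minimal sketch of probabilistic 14C calibration against a curve: the likelihood of each calendar age combines the measurement and curve uncertainties in quadrature. The three-point "curve" below is a toy stand-in, not IntCal20 data.

```python
import math

def calibrate(age_14c, sigma_14c, cal_ages, curve_14c, curve_sigma):
    """Sketch of probabilistic 14C calibration: for each calendar age theta
    on the grid, the (unnormalized) likelihood is a Gaussian in 14C space
    with measurement and curve uncertainties added in quadrature. Returns
    probabilities normalized over the grid (grid spacing ignored)."""
    post = []
    for mu, s in zip(curve_14c, curve_sigma):
        var = sigma_14c**2 + s**2
        post.append(math.exp(-(age_14c - mu)**2 / (2 * var)) / math.sqrt(var))
    total = sum(post)
    return {theta: p / total for theta, p in zip(cal_ages, post)}

cal_ages  = [3000, 3005, 3010]   # cal BP grid (toy values)
curve     = [2850, 2870, 2865]   # curve 14C ages (toy values)
curve_sig = [15, 15, 15]
print(calibrate(2860, 20, cal_ages, curve, curve_sig))
```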
We undertook a strengths, weaknesses, opportunities, and threats (SWOT) analysis of Northern Hemisphere tree-ring datasets included in IntCal20 in order to evaluate their strategic fit with the demands of archaeological users. Case studies on wiggle-matching single tree rings from timbers in historic buildings and Bayesian modeling of series of results on archaeological samples from Neolithic long barrows in central-southern England exemplify the archaeological implications that arise when using IntCal20. The SWOT analysis provides an opportunity to think strategically about future radiocarbon (14C) calibration so as to maximize the utility of 14C dating in archaeology and safeguard its reputation in the discipline.
Research with psychiatric patients raises frequently discussed ethical questions, one of which is: can psychiatric patients give consent to participation in research at all? To answer this and similar questions adequately, it is, according to our thesis, necessary first to analyse which theoretical assumptions are made in established practice.
To answer the question of whether consent is possible, compatible understandings of ‘disease’, ‘illness’ and ‘autonomy’ are crucial, but there is no consensual use of these terms in philosophy. We therefore first explain different concepts of ‘autonomy’ and ‘disease’. We then examine how the different conceptualizations of ‘autonomy’ and ‘disease’ can be related to each other, and how the reasonable combinations shape possible answers to the opening question. It will become apparent that an adequate analysis of ‘autonomy’ and ‘disease’ raises ethical dilemmas in psychiatry, for which we suggest possible solutions.