If current food consumption patterns continue, the agriculture sector must provide significantly more food in the coming years from the available land area. Some livestock systems engage in feed–food competition, as arable land is used to grow livestock feed rather than food crops, reducing the global supply of food. There is a growing argument that, to meet future food demands sustainably, feed–food competition must be minimized. To this end, we evaluated the effectiveness of two refined metrics for quantifying feed–food competition in three livestock systems in Ireland: dairy (and its associated beef), suckler beef and pig production. The metrics are the edible protein conversion ratio (EPCR) and the land-use ratio (LUR). The EPCR compares the amount of human-digestible protein (HDP) in livestock feed against the amount of HDP the livestock produce, measuring how efficiently a system produces HDP. The LUR, by contrast, compares the potential HDP from a crop system grown on the land used to produce the livestock's feed against the HDP the livestock system produced. In both metrics, a value <1 indicates an efficient system. The EPCR values classify the dairy beef (0.22) and suckler beef (0.29) systems as efficient producers, whereas pig production (1.51) is inefficient. The LUR values indicate that only dairy beef (0.58) is a net positive producer of HDP from the land used for its feed, with crop production yielding more HDP than suckler beef (1.34) and pig production (1.73). Consequently, the LUR can be deemed the more suitable metric for representing feed–food competition in livestock production.
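Both metrics reduce to simple ratios of human-digestible protein (HDP). A minimal sketch follows, with hypothetical input figures; only the metric definitions and the <1 efficiency threshold come from the abstract, while the function names and numbers are illustrative.

```python
def epcr(hdp_in_feed_kg: float, hdp_produced_kg: float) -> float:
    """Edible protein conversion ratio: human-digestible protein (HDP)
    fed to the livestock divided by the HDP the livestock produce.
    Values < 1 indicate an efficient system."""
    return hdp_in_feed_kg / hdp_produced_kg


def lur(potential_crop_hdp_kg: float, hdp_produced_kg: float) -> float:
    """Land-use ratio: potential HDP from a crop system grown on the
    land used for the livestock's feed, divided by the HDP the
    livestock system produced. Values < 1 indicate an efficient system."""
    return potential_crop_hdp_kg / hdp_produced_kg


# Hypothetical illustrative figures, not taken from the study:
print(epcr(22.0, 100.0))  # 0.22 -> efficient by the EPCR
print(lur(134.0, 100.0))  # 1.34 -> crops would yield more HDP than the system
```

Note that the two metrics can disagree, as in the abstract's suckler beef case: a system can convert feed protein efficiently (EPCR < 1) yet still occupy land that would yield more HDP under crops (LUR > 1).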
Formally, the Farming/Language Dispersal hypothesis as applied to Japan relates the introduction of agriculture to the spread of the Japanese language (ca. 500 BC–AD 800). We review current data from genetics, archaeology, and linguistics in relation to this hypothesis; however, the evidence bases for these disciplines are drawn from different periods. Genetic data have primarily been sampled from present-day Japanese and prehistoric Jōmon peoples (14,000–300 BC), preceding the introduction of rice agriculture. The best archaeological evidence for agriculture comes from western Japan during the Yayoi period (ca. 900 BC–AD 250), but little is known about northeastern Japan, which is a focal point here. Despite considerable hypothesizing about prehistoric language, the spread of historic languages/dialects through the islands is more accessible but difficult to relate to prehistory. Though the lack of Yayoi skeletal material available for DNA analysis greatly inhibits direct study of how the pre-agricultural Jōmon peoples interacted with rice agriculturalists, our review of Jōmon genetics sets the stage for further research into their relationships. Modern linguistic research plays an unexpected role in bringing Izumo (Shimane Prefecture) and the Japan Sea coast into consideration in the populating of northeastern Honshu by agriculturalists beyond the Kantō region.
A new fossil site in a previously unexplored part of western Madagascar (the Beanka Protected Area) has yielded remains of many recently extinct vertebrates, including giant lemurs (Babakotia radofilai, Palaeopropithecus kelyus, Pachylemur sp., and Archaeolemur edwardsi), carnivores (Cryptoprocta spelea), the aardvark-like Plesiorycteropus sp., and giant ground cuckoos (Coua). Many of these represent considerable range extensions. Extant species that were extirpated from the region (e.g., Prolemur simus) are also present. Calibrated radiocarbon ages for 10 bones from extinct primates span the last three millennia. The largely undisturbed taphonomy of bone deposits supports the interpretation that many specimens fell in from a rock ledge above the entrance. Some primates and other mammals may have been prey items of avian predators, but human predation is also evident. Strontium isotope ratios (87Sr/86Sr) suggest that fossils were local to the area. Pottery sherds and bones of extinct and extant vertebrates with cut and chop marks indicate human activity in previous centuries. Scarcity of charcoal and human artifacts suggests only occasional visitation to the site by humans. The fossil assemblage from this site is unusual in that, while it contains many sloth lemurs, it lacks ratites, hippopotami, and crocodiles typical of nearly all other Holocene subfossil sites on Madagascar.
Studies investigating the underlying mechanisms of hallucinations in patients with schizophrenia suggest that an imbalance between top-down expectations and bottom-up processing underlies these errors in perception. This study evaluates this hypothesis by testing whether individuals drawn from the general population who have experienced auditory hallucinations (AH) have more misperceptions in auditory language perception than those who have never hallucinated.
Methods
We used an online survey to determine the presence of hallucinations. Participants filled out the Questionnaire for Psychotic Experiences and participated in an auditory verbal recognition task to assess both correct perceptions (hits) and misperceptions (false alarms). A hearing test was performed to screen for hearing problems.
Results
A total of 5115 individuals from the general Dutch population participated in this study. Participants who reported AH in the week preceding the test had a higher false alarm rate in their auditory perception compared with those without such (recent) experiences. The more recently the AH had been experienced, the more mistakes participants made. While the presence of verbal AH (AVH) was predictive of the false alarm rate in auditory language perception, the presence of non-verbal or visual hallucinations was not.
Conclusions
The presence of AVH predicted the false alarm rate in auditory language perception, whereas the presence of non-verbal auditory or visual hallucinations did not, suggesting that enhanced top-down processing does not transfer across modalities. More false alarms were observed in participants who reported more recent AVH. This is in line with models of enhanced influence of top-down expectations in persons who hallucinate.
The role that vitamin D plays in pulmonary function remains uncertain. Epidemiological studies have reported mixed findings for the serum 25-hydroxyvitamin D (25(OH)D)–pulmonary function association. We conducted the largest cross-sectional meta-analysis of the 25(OH)D–pulmonary function association to date, based on nine European ancestry (EA) cohorts (n 22 838) and five African ancestry (AA) cohorts (n 4290) in the Cohorts for Heart and Aging Research in Genomic Epidemiology Consortium. Data were analysed using linear models by cohort and ancestry. Effect modification by smoking status (current/former/never) was tested. Results were combined using fixed-effects meta-analysis. Mean serum 25(OH)D was 68 (sd 29) nmol/l for EA and 49 (sd 21) nmol/l for AA. For each 1 nmol/l higher 25(OH)D, forced expiratory volume in the 1st second (FEV1) was higher by 1·1 ml in EA (95 % CI 0·9, 1·3; P<0·0001) and 1·8 ml (95 % CI 1·1, 2·5; P<0·0001) in AA (P for race difference=0·06), and forced vital capacity (FVC) was higher by 1·3 ml in EA (95 % CI 1·0, 1·6; P<0·0001) and 1·5 ml (95 % CI 0·8, 2·3; P=0·0001) in AA (P for race difference=0·56). Among EA, the 25(OH)D–FVC association was stronger in smokers: per 1 nmol/l higher 25(OH)D, FVC was higher by 1·7 ml (95 % CI 1·1, 2·3) for current smokers and 1·7 ml (95 % CI 1·2, 2·1) for former smokers, compared with 0·8 ml (95 % CI 0·4, 1·2) for never smokers. In summary, the 25(OH)D associations with FEV1 and FVC were positive in both ancestries. In EA, a stronger association was observed for smokers compared with never smokers, which supports the importance of vitamin D in vulnerable populations.
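Cohort-level results were pooled by fixed-effects meta-analysis. A minimal sketch of standard inverse-variance weighting follows, with hypothetical per-cohort slopes; none of the numbers below are from the study.

```python
import math


def fixed_effects_meta(estimates, std_errors):
    """Inverse-variance fixed-effects meta-analysis: each cohort's
    effect estimate is weighted by 1/SE^2. Returns the pooled
    estimate and its standard error."""
    weights = [1.0 / se ** 2 for se in std_errors]
    pooled = sum(w * b for w, b in zip(weights, estimates)) / sum(weights)
    pooled_se = math.sqrt(1.0 / sum(weights))
    return pooled, pooled_se


# Hypothetical per-cohort slopes (ml FEV1 per 1 nmol/l higher 25(OH)D)
# and their standard errors, for illustration only:
betas = [1.0, 1.2, 0.9]
ses = [0.2, 0.3, 0.25]
est, se = fixed_effects_meta(betas, ses)
ci95 = (est - 1.96 * se, est + 1.96 * se)  # approximate 95 % CI
```

Because weights are inverse variances, the most precisely estimated cohorts dominate the pooled slope, which is how per-cohort, per-ancestry linear model results are typically combined into the single effect sizes quoted above.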
The discovery of the first electromagnetic counterpart to a gravitational wave signal has generated follow-up observations by over 50 facilities world-wide, ushering in the new era of multi-messenger astronomy. In this paper, we present follow-up observations of the gravitational wave event GW170817 and its electromagnetic counterpart SSS17a/DLT17ck (IAU label AT2017gfo) by 14 Australian telescopes and partner observatories as part of Australian-based and Australian-led research programs. We report early- to late-time multi-wavelength observations, including optical imaging and spectroscopy, mid-infrared imaging, radio imaging, and searches for fast radio bursts. Our optical spectra reveal that the transient source emission cooled from approximately 6 400 K to 2 100 K over a 7-d period and produced no significant optical emission lines. The spectral profiles, cooling rate, and photometric light curves are consistent with the expected outburst and subsequent processes of a binary neutron star merger. Star formation in the host galaxy probably ceased at least a Gyr ago, although there is evidence for a galaxy merger. Binary pulsars with short (100 Myr) decay times are therefore unlikely progenitors, but pulsars like PSR B1534+12 with its 2.7 Gyr coalescence time could produce such a merger. The displacement (~2.2 kpc) of the binary star system from the centre of the main galaxy is not unusual for stars in the host galaxy or stars originating in the merging galaxy, and therefore any constraints on the kick velocity imparted to the progenitor are poor.
To characterize meal patterns across ten European countries participating in the European Prospective Investigation into Cancer and Nutrition (EPIC) calibration study.
Design
Cross-sectional study utilizing dietary data collected through a standardized 24 h diet recall during 1995–2000. Eleven predefined intake occasions across a 24 h period were assessed during the interview. In the present descriptive report, meal patterns were analysed in terms of daily number of intake occasions, the proportion reporting each intake occasion and the energy contributions from each intake occasion.
Setting
Twenty-seven centres across ten European countries.
Subjects
Women (64 %) and men (36 %) aged 35–74 years (n 36 020).
Results
Pronounced differences in meal patterns emerged both across centres within the same country and across different countries, with a trend for fewer intake occasions per day in Mediterranean countries compared with central and northern Europe. Differences were also found for daily energy intake provided by lunch, with 38–43 % for women and 41–45 % for men within Mediterranean countries compared with 16–27 % for women and 20–26 % for men in central and northern European countries. Likewise, a south–north gradient was found for daily energy intake from snacks, with 13–20 % (women) and 10–17 % (men) in Mediterranean countries compared with 24–34 % (women) and 23–35 % (men) in central/northern Europe.
Conclusions
We found distinct differences in meal patterns with marked diversity for intake frequency and lunch and snack consumption between Mediterranean and central/northern European countries. Monitoring of meal patterns across various cultures and populations could provide critical context to the research efforts to characterize relationships between dietary intake and health.
People with dementia may benefit from palliative care which specifically addresses the needs of patients and families affected by this life-limiting disease. On behalf of the European Association for Palliative Care (EAPC), we recently performed a Delphi study to define domains for palliative care in dementia and to provide recommendations for optimal care. An international panel of experts in palliative care, dementia care or both, achieved consensus on almost all domains and recommendations, but the domain concerning the applicability of palliative care to dementia required revision.
Methods:
To examine in detail the opinions of the international panel of 64 experts on the applicability of palliative care, we explored the feedback they provided in the Delphi process. To examine which experts found the domain less important or less applicable, ordinal regression analyses related characteristics of the panelists to their ratings of the overall importance of the applicability domain, and to their agreement with the domain's four recommendations.
Results:
Some experts expressed concerns about bringing up end-of-life issues prematurely and about relabeling dementia care as palliative care. Multivariable analyses with the two outcomes of importance and agreement with applicability indicated that younger or less experienced experts and those whose expertise was predominantly in dementia care found palliative care in dementia less important and less applicable.
Conclusions:
Benefits of palliative care in dementia are acknowledged by experts worldwide, but there is some controversy around its early introduction. Further studies should weigh the concerns expressed about care receiving a “palliative” label against the benefits of applying palliative care early.
High dietary Na intake is associated with multiple health risks, making accurate assessment of population dietary Na intake critical. In the present study, the reporting accuracy of dietary Na intake assessed with the EPIC-Soft 24 h dietary recall (24-HDR) was evaluated against 24 h urinary Na excretion. Participants from a subsample of the European Food Consumption Validation study (n 365; countries: Belgium, Norway and the Czech Republic), aged 45–65 years, completed two 24 h urine collections and two 24-HDR. Reporting accuracy was calculated as the ratio of reported Na intake to that estimated from the urinary biomarker. A questionnaire on salt use was completed to assess the discretionary use of table and cooking salt. Reporting accuracy was assessed under two scenarios: (1) with a salt adjustment procedure using data from the salt questionnaire; and (2) without salt adjustment. Overall, reporting accuracy improved when data from the salt questionnaire were included. The mean reporting accuracy was 0·67 (95 % CI 0·62, 0·72), 0·73 (95 % CI 0·68, 0·79) and 0·79 (95 % CI 0·74, 0·85) for Belgium, Norway and the Czech Republic, respectively. Reporting accuracy decreased with increasing BMI among male subjects in all three countries. For women from Belgium and Norway, reporting accuracy was highest among those classified as obese (BMI ≥ 30 kg/m²: 0·73, 95 % CI 0·67, 0·81 and 0·81, 95 % CI 0·77, 0·86, respectively). The findings from the present study showed considerable underestimation of dietary Na intake assessed using two 24-HDR. The questionnaire-based salt adjustment procedure improved reporting accuracy by 7–13 %. Further development of both the questionnaire and the EPIC-Soft databases (e.g. inclusion of a facet to describe salt content) is necessary to estimate population dietary Na intakes accurately.
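In code, reporting accuracy is just the ratio described above, with the salt adjustment scenario adding questionnaire-derived discretionary salt Na to the numerator. A minimal sketch follows; the function names and figures are hypothetical, not taken from the study.

```python
def reporting_accuracy(reported_na_mg: float, urinary_na_mg: float,
                       discretionary_salt_na_mg: float = 0.0) -> float:
    """Ratio of reported dietary Na to Na estimated from the 24 h
    urinary biomarker. Scenario 1 adds questionnaire-based
    discretionary (table and cooking) salt Na to the reported intake;
    scenario 2 leaves the default of 0 (no salt adjustment)."""
    return (reported_na_mg + discretionary_salt_na_mg) / urinary_na_mg


# Hypothetical figures in mg Na/day, for illustration only:
without_adjustment = reporting_accuracy(2400.0, 3600.0)        # ~0.67
with_adjustment = reporting_accuracy(2400.0, 3600.0, 300.0)    # 0.75
```

A ratio of 1 would mean the recall fully captured biomarker-estimated intake; values well below 1, as observed in all three countries, indicate underestimation.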
Metacognitive training (MCT) for patients with psychosis is a psychological group intervention that aims to educate patients about common cognitive biases underlying delusion formation and maintenance, and to highlight their negative consequences in daily functioning.
Method
In this randomized controlled trial, 154 schizophrenia spectrum patients with delusions were randomly assigned to either MCT + treatment as usual (TAU) or TAU alone. Both groups were assessed at baseline, and again after 8 and 24 weeks. The trial was completed fully by 111 patients. Efficacy was measured with the Psychotic Symptom Rating Scales (PSYRATS) Delusions Rating Scale (DRS), and with specific secondary measures referring to persecutory ideas and ideas of social reference (the Green Paranoid Thoughts Scale, GPTS), cognitive insight (the Beck Cognitive Insight Scale, BCIS), subjective experiences of cognitive biases (the Davos Assessment of Cognitive Biases Scale, DACOBS) and metacognitive beliefs (the 30-item Metacognitions Questionnaire, MCQ-30). Economic analysis focused on the balance between societal costs and health outcomes (quality-adjusted life years, QALYs).
Results
Both conditions showed a decrease in delusions. MCT was not more efficacious in terms of reducing delusions, nor did it change subjective paranoid thinking and ideas of social reference, cognitive insight, or the subjective experience of cognitive biases and metacognitive beliefs. The results of the economic analysis were not in favour of MCT + TAU.
Conclusions
In the present study, MCT did not affect delusion scores and self-reported cognitive insight, or subjective experience of cognitive biases and metacognitive beliefs. MCT was not cost-effective.
Multilocus sequence types (STs) were determined for 232 and 737 Campylobacter jejuni/coli isolates from Dutch travellers and domestically acquired cases, respectively. Putative risk factors for travel-related campylobacteriosis, and for domestically acquired campylobacteriosis caused by exotic STs (putatively carried by returning travellers), were investigated. Travelling to Asia, Africa, Latin America and the Caribbean, and Southern Europe significantly increased the risk of acquiring campylobacteriosis compared to travelling within Western Europe. Besides eating chicken, using antacids, and having chronic enteropathies, we identified eating vegetable salad outside Europe, drinking bottled water in high-risk destinations, and handling/eating undercooked pork as possible risk factors for travel-related campylobacteriosis. Factors associated with domestically acquired campylobacteriosis caused by exotic STs involved predominantly person-to-person contacts around popular holiday periods. We concluded that putative determinants of travel-related campylobacteriosis differ from those of domestically acquired infections and that returning travellers may carry several exotic strains that might subsequently spread to domestic populations even through limited person-to-person transmission.
HERMES is a new high-resolution multi-object spectrograph on the Anglo-Australian Telescope. The primary science driver for HERMES is the GALAH survey (GALactic Archaeology with HERMES). We are planning a spectroscopic survey of about a million stars, aimed at using chemical tagging techniques to reconstruct the star-forming aggregates that built up the disk, bulge and halo of the Galaxy. This project will benefit greatly from the stellar distances and transverse motions provided by the Gaia mission.
A recent study by Wylie et al. (2006) revealed that s-process element abundances are enhanced relative to iron in both red giant branch and asymptotic giant branch stars of 47 Tuc. A more detailed investigation of s-process element abundances throughout the colour-magnitude diagram of 47 Tuc is vital in order to determine whether the observed enhancements are intrinsic to the cluster. This paper explores this possibility through observational and theoretical means. The visibility of s- and r-process element lines in synthetic spectra of giant and dwarf stars throughout the colour-magnitude diagram of 47 Tuc was explored, and it was determined that a resolving power of 10 000 would be sufficient to observe s-process element abundance variations in globular cluster giant-branch stars. These synthetic results were compared with the spectra of eleven 47 Tuc giant-branch stars observed during the performance verification of the Robert Stobie Spectrograph on the Southern African Large Telescope. Three s-process elements (Zr, Ba and Nd) and one r-process element (Eu) were investigated. No abundance variations were found, with [X/Fe] = 0.0 ± 0.5 dex. It was concluded that the resolving power of these observations, R ∼ 5000, was not sufficient to obtain exact abundances, but upper limits on the s-process element abundances could be determined.
The global debate on poverty alleviation is increasingly framed in terms of enabling economic opportunities for the poor, in order to create sustainable economic growth in developing countries (World Resources Institute, 2007). Perhaps the most significant consequence of this shift is the increasing conviction that the private sector should be engaged in the challenge to create economic growth in developing countries. Economic and political developments, in particular, globalization and the increased influence of markets and private investments worldwide, have added to the belief that mobilizing existing private sector financial and intellectual resources is vital in order to achieve sustainable development, reduce poverty and reach ambitious development targets such as the Millennium Development Goals (MDGs) (Dicken, 2003; Wheeler and McKague, 2002).
This conviction, however, is not new, nor is it based on idealism. In the 1994 World Investment Report for example, multinational corporations (MNCs) are described as the main vehicle for the achievement of economic stability and prosperity in developing nations, as they stimulate growth and improve the host countries’ international competitiveness (UNCTAD, 1994). A relevant indicator of the importance of the private sector for developing countries is the fact that private sector investment in these countries has been growing for decades. In recent years, Foreign Direct Investment (FDI) by MNCs in developing countries has increased rapidly. For example, it increased from 20 billion USD in 1990 to 240 billion USD in 2000. In the years that followed FDI declined until 2003, but is currently on the rise again. In contrast, Official Development Assistance (ODA) to developing countries today totals about 55 billion USD annually, and has been declining slightly over the last decade. In the mid-1990s, FDI surpassed ODA, and today the sheer scale of foreign direct investment versus ODA has demanded that the role of MNCs in development be taken seriously (Wheeler and McKague, 2002; Dicken, 2003).
The private sector has long been seen as meriting a greater role in development. However, a catalyzing moment did not occur until the World Summit on Sustainable Development in Johannesburg in 2002, when emphasis was placed on the private and public sectors as key partners in solving problems on a global scale and improving the standard of living of the world's poor.
Infectious gastroenteritis causes a considerable burden of disease worldwide. Effective control should be targeted at diseases with the highest burden and costs. Therefore, an accurate understanding of the relative importance of the different microorganisms is needed. The objective of this study was to determine the incidence and aetiology of gastroenteritis in adults requiring hospital admission in The Netherlands. Five hospitals enrolled patients admitted with gastroenteritis for about 1 year during the period May 2008 to November 2009. Participants completed questionnaires and provided a faecal sample. The hospital completed a clinical questionnaire. In total, 44 adults hospitalized for gastroenteritis were included in the study. The cases had serious symptoms, with 31% subsequently developing kidney failure. One or more pathogens were found in 59% of cases. Overall, rotavirus (22%) was the most common infection. Co-infections were observed relatively often (22%). This study emphasizes that rotavirus can also cause serious illness in adults.
Studies using 24 h urine collections need to incorporate ways to validate the completeness of the urine samples. Models to predict urinary creatinine excretion (UCE) have been developed for this purpose; however, information on their usefulness to identify incomplete urine collections is limited. We aimed to develop a model for predicting UCE and to assess the performance of a creatinine index using para-aminobenzoic acid (PABA) as a reference. Data were taken from the European Food Consumption Validation study comprising two non-consecutive 24 h urine collections from 600 subjects in five European countries. Data from one collection were used to build a multiple linear regression model to predict UCE, and data from the other collection were used for performance testing of a creatinine index-based strategy to identify incomplete collections. Multiple linear regression (n 458) of UCE showed a significant positive association for body weight (β = 0·07), the interaction term sex × weight (β = 0·09, reference women) and protein intake (β = 0·02). A significant negative association was found for age (β = − 0·09) and sex (β = − 3·14, reference women). An index of observed-to-predicted creatinine resulted in a sensitivity to identify incomplete collections of 0·06 (95 % CI 0·01, 0·20) and 0·11 (95 % CI 0·03, 0·22) in men and women, respectively. Specificity was 0·97 (95 % CI 0·97, 0·98) in men and 0·98 (95 % CI 0·98, 0·99) in women. The present study shows that UCE can be predicted from weight, age and sex. However, the results revealed that a creatinine index based on these predictions is not sufficiently sensitive to exclude incomplete 24 h urine collections.
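The published model is a linear combination of the reported coefficients, and the index is a simple observed-to-predicted ratio. The sketch below plugs in the β values from the abstract; the function names, intercept and example inputs are ours (the abstract reports neither an intercept nor units), so it shows the form of the approach rather than the study's exact equation.

```python
def predicted_uce(weight_kg: float, age_y: float, protein_g: float,
                  male: bool, intercept: float = 10.0) -> float:
    """Predicted 24 h urinary creatinine excretion (UCE) built from the
    coefficients reported in the abstract (reference category: women).
    The abstract gives no intercept, so 10.0 is a placeholder and the
    output is illustrative only."""
    sex = 1.0 if male else 0.0
    return (intercept
            + 0.07 * weight_kg          # body weight
            + 0.09 * sex * weight_kg    # sex x weight interaction
            + 0.02 * protein_g          # protein intake
            - 0.09 * age_y              # age
            - 3.14 * sex)               # sex (men vs. women)


def creatinine_index(observed_uce: float, expected_uce: float) -> float:
    """Observed-to-predicted creatinine ratio; values well below 1
    would flag a possibly incomplete 24 h urine collection."""
    return observed_uce / expected_uce
```

The study's sensitivity/specificity figures describe how well a cut-off on this index separates incomplete from complete collections, with PABA recovery as the reference standard.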
The NIR Ca II triplet has proven to be an important tool for quantitative spectroscopy. Here we present results of a synthetic spectral analysis of the Ca II triplet for low-metallicity red giant stars, combined with observational data. Our results start to deviate strongly from the widely used, linear empirical calibrations below [Fe/H] = −2. We provide a new calibration for Ca II triplet studies that is valid down to [Fe/H] = −4 and apply it to current data sets. We suggest that the classical dwarf galaxies are not as devoid of extremely low-metallicity stars as was previously thought, and we discuss preliminary results and possibilities for follow-up observations of these extremely low-metallicity candidates.
The use of two non-consecutive 24 h recalls using EPIC-Soft for standardised dietary monitoring in European countries has previously been proposed in the European Food Consumption Survey Method consortium. Whether this methodology is sufficiently valid to assess nutrient intake in a comparable way, among populations with different food patterns in Europe, is the subject of study in the European Food Consumption Validation consortium. The objective of the study was to compare the validity of usual protein and K intake estimated from two non-consecutive standardised 24 h recalls using EPIC-Soft between five selected centres in Europe. A total of 600 adults, aged 45–65 years, were recruited in Belgium, the Czech Republic, France, The Netherlands and Norway. From each participant, two 24 h recalls and two 24 h urines were collected. The mean and distribution of usual protein and K intake, as well as the ranking of intake, were compared with protein and K excretions within and between centres. Underestimation of protein (range 2–13 %) and K (range 4–17 %) intake was seen in all centres, except in the Czech Republic. We found a fair agreement between prevalences estimated based on the intake and excretion data at the lower end of the usual intake distribution ( < 10 % difference), but larger differences at other points. Protein and K intake was moderately correlated with excretion within the centres (ranges = 0·39–0·67 and 0·37–0·69, respectively). These were comparable across centres. In conclusion, two standardised 24 h recalls (EPIC-Soft) appear to be sufficiently valid for assessing and comparing the mean and distribution of protein and K intake across five centres in Europe as well as for ranking individuals.
Previous research has shown that a preoperative assessment clinic enhances hospital cost-efficiency. However, differences in the organization of patient flow have not been analysed. In this descriptive study, we evaluated the consequences of the organization of patient flow through a preoperative assessment clinic on its performance, by analysing two Dutch university hospitals whose clinics are organized in essentially different ways.
Methods
In the final analysis, the study included 880 patients who visited either academic centre. The performance of the two preoperative assessment clinics was evaluated by measuring patient flow time, various procedure times and the total waiting time. Patients’ age, ASA physical status and any preoperative tests requested by the physician were also recorded.
Results
There was a significant difference in patient flow time between the two preoperative assessment clinics. More time was needed for the preoperative assessment when patients’ ASA class was higher. The patient flow time was longer when the electrocardiogram and venepuncture were performed at the general outpatient laboratory than when they were performed at the preoperative assessment clinic, owing to longer waiting times. More tests were requested when they could be performed at the preoperative assessment clinic.
Conclusions
This study shows that the organization of patient flow is an important aspect of the logistic processes of the preoperative assessment clinic. It might influence patient flow times as well as the number of preoperative tests requested. Together with other aspects of logistic performance, patient satisfaction and quality of medical assessment, patient flow logistics can be used to assess the quality of a preoperative assessment clinic.