Low vitamin D status (circulating 25-hydroxyvitamin D [25(OH)D] concentration < 50 nmol/L) affects nearly one in four Australian adults(1). The primary source of vitamin D is sun exposure; however, a safe level of sun exposure for optimal vitamin D production has not been established. As supplement use is uneven, increasing vitamin D in food is the logical option for improving vitamin D status at a population level. The dietary supply of vitamin D is low since few foods are naturally rich in vitamin D. While there is no Australia-specific estimated average requirement (EAR) for vitamin D, the Institute of Medicine recommends an EAR of 10 μg/day for all ages. Vitamin D intake is low in Australia, with mean usual intake ranging from 1.8–3.2 μg/day across sex/age groups(2), suggesting a need for data-driven nutrition policy to improve the dietary supply of vitamin D. Food fortification has proven effective in other countries. We aimed to model four potential vitamin D fortification scenarios to determine an optimal strategy for Australia. We used food consumption data for people aged ≥ 2 years (n = 12,153) from the 2011–2012 National Nutrition and Physical Activity Survey, and analytical food composition data for vitamin D3, 25(OH)D3, vitamin D2 and 25(OH)D2(3). Certain foods are permitted for mandatory or voluntary fortification in Australia. As industry uptake of the voluntary option is low, Scenario 1 simulated addition of the maximum permitted amount of vitamin D to all foods permitted under the Australia New Zealand Food Standards Code (dairy products/plant-based alternatives, edible oil spreads, formulated beverages and permitted ready-to-eat breakfast cereals (RTEBC)). Scenarios 2–4 modelled higher concentrations than those permitted for fluid milk/alternatives (1 μg/100 mL) and edible oil spreads (20 μg/100 g) within an expanding list of food vehicles: Scenario 2—dairy products/alternatives, edible oil spreads, formulated beverages; Scenario 3—Scenario 2 plus RTEBC; Scenario 4—Scenario 3 plus bread (which is not permitted for vitamin D fortification in Australia). Usual intake was modelled for the four scenarios across sex and age groups using the National Cancer Institute Method(4). Assuming equal bioactivity of the D vitamers, the range of mean usual vitamin D intake across age groups for males for Scenarios 1 to 4, respectively, was 7.2–8.8, 6.9–8.3, 8.0–9.7 and 9.3–11.3 μg/day; the respective values for females were 5.8–7.5, 5.8–7.2, 6.4–8.3 and 7.5–9.5 μg/day. No participant exceeded the upper level of intake (80 μg/day) under any scenario. Systematic fortification of all foods permitted for vitamin D fortification could substantially improve vitamin D intake across the population. However, the optimal strategy would require permissions for bread as a food vehicle, and addition of higher than permitted concentrations of vitamin D to fluid milks/alternatives and edible oil spreads.
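As a rough illustration of the fortification modelling described above, the sketch below adds an assumed fortificant concentration to eligible food groups and recomputes simple per-person daily totals. All column names, food-group labels and levels are hypothetical placeholders, and the study itself estimated usual intake with the NCI Method rather than raw daily totals.

```python
# Minimal sketch: add an assumed vitamin D concentration to eligible food
# groups in a consumption file, then total each person's daily intake (ug).
# Column names and fortification levels are illustrative placeholders.
import pandas as pd

FORTIFICATION_UG_PER_100G = {
    "fluid_milk": 1.0,            # modelled level for fluid milks/alternatives
    "edible_oil_spread": 20.0,    # modelled level for edible oil spreads
    "formulated_beverage": 1.0,
}

def apply_scenario(records: pd.DataFrame) -> pd.Series:
    """Return per-person daily vitamin D intake (ug) under the scenario."""
    added = records["food_group"].map(FORTIFICATION_UG_PER_100G).fillna(0.0)
    per_food = (records["vit_d_ug_per_100g"] + added) * records["grams_consumed"] / 100.0
    return per_food.groupby(records["person_id"]).sum()
```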
With the food system estimated to be responsible for approximately one-third of greenhouse gas emissions(1), there is an urgent need to transition to healthy and more environmentally sustainable diets. Plant-based ‘milks’ are associated with lower greenhouse gas emissions than dairy milks(2), and many Australian consumers are making the substitution(3). The 2013 Australian Dietary Guidelines advise that plant-based ‘milks’ fortified with at least 100 mg of calcium per 100 mL (e.g., soy, rice or other cereal) can replace dairy milk in the diet(4). This study aimed to assess the likely population-wide nutritional implications of replacing dairy milk with the main categories of plant ‘milks’ available in Australian supermarkets in November 2023. We used computer simulation modelling of data from the 2011–2012 National Nutrition and Physical Activity Survey (n = 12,153 persons aged 2+ years)(5). Dairy milk (including from hot drinks) was replaced with each category of plant ‘milk’, and the likely impact on usual intake of key nutrients supplied by dairy milk was assessed across eight age groups (National Cancer Institute method). Mean usual protein intake was relatively unchanged when dairy milk was replaced by soy ‘milk’, but replacement by rice ‘milk’ led to reductions of 4–5% in older adults (71+ years), increasing the proportion of older men with an inadequate intake from 14% (95% margin of error 5.1) to 20% (8.1). Nine of 11 categories of plant ‘milks’ were not fortified with riboflavin. Replacement of dairy milk with these products would likely reduce mean usual riboflavin intake by 11% in older adults, increasing the proportion with an inadequate usual intake from 20% (6.2, 5.8) to 30–31% (9.9, 6.3). Nine of 11 plant ‘milk’ categories were not fortified with vitamin B12, and replacement of dairy milk with these products would likely reduce usual intake by 10–49% depending on the population group, causing the proportion of females aged at least 14 years with an inadequate usual intake of vitamin B12 to increase from between 5% (2.2) and 8% (4.0), depending on age, to between 11% (3.4) and 17% (5.4). No category of plant ‘milk’ was fortified with iodine. As a result, replacement of dairy milk with plant ‘milks’ by females aged at least 14 years would likely reduce mean iodine intake by 7–15% and increase the proportion with an inadequate intake from between 6% (4.2) and 12% (4.7), depending on age, to between 15% (8.1) and 24% (6.0). In conclusion, replacement of dairy milk with most types of plant-based ‘milk’ has the potential to adversely affect protein, riboflavin, vitamin B12 and iodine intakes in the Australian population. Advice about switching to plant-based ‘milks’ needs to consider the population group concerned and a range of nutrients, not just calcium.
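A minimal sketch of the substitution step, under the same caveats: the record layout and the plant-‘milk’ nutrient profile below are placeholder assumptions, not the surveyed products' actual composition.

```python
# Minimal sketch: replace the nutrient contributions of dairy-milk records
# with those of a chosen plant 'milk' profile (amounts per 100 g).
# Column names and profile values are illustrative placeholders.
import pandas as pd

def substitute_milk(records: pd.DataFrame, plant_profile: dict) -> pd.DataFrame:
    out = records.copy()
    is_dairy = out["food_group"] == "dairy_milk"
    for nutrient, per_100g in plant_profile.items():
        out.loc[is_dairy, nutrient] = per_100g * out.loc[is_dairy, "grams_consumed"] / 100.0
    return out

# e.g. an unfortified rice 'milk' (values assumed for illustration):
rice_profile = {"protein_g": 0.3, "riboflavin_mg": 0.0, "b12_ug": 0.0, "iodine_ug": 0.0}
```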
Considered a staple of the French press since at least the nineteenth century, the fait divers—a catch-all category for short, often sensational news items such as murders, petty crimes, and suicides—has been taken up and transformed in West African cultural production. This essay focuses on the transformations and transpositions of the fait divers tradition in the work of Senegalese writer Aminata Maïga Ka (1940–2005), arguing that her short stories and novels inflect earlier treatments of the journalistic genre while staging a broader critique of the liberalization of the media in Senegal during the 1970s and 1980s. Ka’s works offer a window onto the entangled histories of postcolonial literary production and the emergent popular press in Senegal. Specifically, she updates and expands Ousmane Sembène’s rescripting of the French fait divers in his short story “La Noire de …” (1961/1962) and the landmark film from 1966 by the same title.
Slavery persisted in Morocco well into the twentieth century and throughout the French Protectorate (1912–56), long after it was abolished in other French-occupied territories (1848). While work by historians has illuminated a previously shadowy history of race and slavery in Morocco, less attention has been paid to the growing corpus of literary texts representing enslaved subjectivities under the Protectorate. Through their literary excavations of the slave past, such works retell the history of Moroccan slavery from the perspective of those most affected. This essay takes translator Nouzha Fassi Fihri’s Dada l’Yakout (2010) as a case in point. Although marketed as a novel, the text is also a dense oral history that channels the voice of an enslaved woman who really existed: Jmia, who was abducted as a child at the beginning of the twentieth century and died in 1975. Considered as “Moroccan other-archive” (El Guabli 2023) and imaginative archeology, literary works chart a way forward for reckoning with the enduring legacies of slavery and the slave trade in Morocco.
The Positive and Negative Syndrome Scale (PANSS) has been used as a universal instrument for the clinical assessment of psychopathology in schizophrenia. Several studies have analyzed the factorial structure of this scale and suggested a five-factor model: positive, negative, excited, depressive, and cognitive/disorganized factors. Two of the most widely used solutions are Marder’s and Wallwork’s.
Objectives
The aim of this work was to study the correlations of the two cognitive factors (Marder’s and Wallwork’s) with a cognitive assessment performed using a standard cognitive battery, in a sample of patients with a first psychotic episode of schizophrenia.
Methods
Seventy-four patients with a first psychotic episode of schizophrenia (mean age 26.9 years, SD 7.8; 70.3% male) were included. Cognitive assessment was performed with the MATRICS Consensus Cognitive Battery (MCCB), which assesses seven cognitive domains: Speed of processing, Working memory, Attention/Vigilance, Verbal Learning, Visual Learning, Reasoning and Problem Solving, and Social cognition. Pearson correlations were computed between MCCB scores and Marder’s PANSS cognitive factor (P2, N5, G5, G10, G11, G13, G15) and Wallwork’s (P2, N5, G11).
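For illustration only, such factor-battery correlations could be computed as in the sketch below, assuming a hypothetical DataFrame with one column per PANSS item and per MCCB domain score (this is not the study's actual code):

```python
# Minimal sketch: a PANSS cognitive factor is the sum of its items; correlate
# it with an MCCB domain score. The DataFrame layout is a hypothetical assumption.
from scipy.stats import pearsonr

MARDER_ITEMS = ["P2", "N5", "G5", "G10", "G11", "G13", "G15"]
WALLWORK_ITEMS = ["P2", "N5", "G11"]

def factor_correlation(df, items, mccb_col):
    """Pearson r (and p) between a PANSS factor score and an MCCB score."""
    factor = df[items].sum(axis=1)
    return pearsonr(factor, df[mccb_col])
```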
Results
Correlations between MCCB scores and the Marder and Wallwork cognitive factors are shown in the table below.
| MCCB domain | Marder’s cognitive factor | Wallwork’s cognitive factor |
| --- | --- | --- |
| Speed of processing | r = -0.461; p < 0.001 | r = -0.455; p < 0.001 |
| Attention/Vigilance | r = -0.414; p < 0.001 | r = -0.415; p < 0.001 |
| Working memory | r = -0.449; p < 0.001 | r = -0.468; p < 0.001 |
| Verbal Learning | r = -0.511; p < 0.001 | r = -0.405; p < 0.001 |
| Visual Learning | r = -0.252; p = 0.024 | r = -0.254; p = 0.029 |
| Reasoning and Problem Solving | r = -0.244; p = 0.036 | r = -0.272; p = 0.019 |
| Social cognition | r = -0.268; p = 0.024 | r = -0.202; p = 0.091 |
Conclusions
Both PANSS cognitive factors show moderate correlations with Speed of processing, Working memory, Attention/Vigilance and Verbal Learning as assessed by the MCCB. Weaker correlations were found with Visual Learning, Reasoning and Problem Solving, and Social cognition (for Social cognition, the correlation with Wallwork’s cognitive factor was non-significant).
Acknowledgements. This study has been funded by Instituto de Salud Carlos III (ISCIII) through the project PI19/00766 and co-funded by the European Union.
Negative symptoms have classically been associated with cognition, psychosocial functioning and quality of life in patients with schizophrenia. However, negative symptoms are not a unitary construct: they encompass two different factors, diminished expression, and motivation and pleasure. Few studies have jointly examined the relationship between these two negative symptom factors and cognition (neurocognition and social cognition), psychosocial functioning and quality of life in patients with a first psychotic episode of schizophrenia.
Objectives
The objective of the present work was to study, in a sample of patients with a first psychotic episode of schizophrenia, the relationship between the two negative symptom factors (diminished expression, and motivation and pleasure) and neurocognition, social cognition, psychosocial functioning and quality of life.
Methods
The study was carried out with 82 outpatients with a first psychotic episode of schizophrenia from two Spanish hospitals (“12 de Octubre” University Hospital, Madrid and “Virgen de la Luz” Hospital, Cuenca). The patients were assessed with the Clinical Assessment Interview for Negative Symptoms (CAINS) for evaluating diminished expression (EXP) and motivation and pleasure (MAP) symptoms, the MATRICS Consensus Cognitive Battery (MCCB) for evaluating neurocognition and social cognition, the Social and Occupational Functioning Assessment Scale (SOFAS), and the Quality of Life Scale (QLS).
Results
A negative correlation was found between neurocognition and both negative symptom subscales, CAINS-EXP (r = -0.458, p < 0.001) and CAINS-MAP (r = -0.374, p < 0.001), whereas social cognition correlated only with CAINS-EXP (r = -0.236, p = 0.033). There was also a strong negative correlation between SOFAS scores and CAINS-MAP (r = -0.717, p < 0.001), and a moderate negative correlation with CAINS-EXP (r = -0.394, p < 0.001). Finally, QLS scores were strongly correlated with both CAINS subscales: CAINS-EXP (r = -0.681, p < 0.001) and CAINS-MAP (r = -0.770, p < 0.001).
Conclusions
This study found relationships between negative symptoms and neurocognition, social cognition, psychosocial functioning and quality of life in a sample of patients with a first psychotic episode of schizophrenia. However, the two negative symptom factors, diminished expression, and motivation and pleasure, were differentially associated with these outcomes, most strikingly for social cognition, which was related only to diminished expression symptoms.
Clark and Fischer's three levels of depiction of social robots can be conceptualized as cognitive schemas. When interacting with social robots, humans shift between schemas similarly to how they shift between identity category schemas when interacting with other humans. Perception of mind, context cues, and individual differences underlie perceptions of which level of depiction is most situationally relevant.
The Centre for Advanced Laser Applications in Garching, Germany, is home to the ATLAS-3000 multi-petawatt laser, dedicated to research on laser particle acceleration and its applications. A control system based on Tango Controls is implemented for both the laser and four experimental areas. The device server approach features high modularity, which, in addition to the hardware control, enables a quick extension of the system and allows for automated data acquisition of the laser parameters and experimental data for each laser shot. In this paper we present an overview of our implementation of the control system, as well as our advances in terms of experimental operation, online supervision and data processing. We also give an outlook on advanced experimental supervision and online data evaluation – where the data can be processed in a pipeline – which is being developed on the basis of this infrastructure.
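To give a flavour of the device-server approach, the sketch below shows a minimal PyTango server exposing a single per-shot parameter; the class name, attribute and units are illustrative assumptions, not CALA's actual device interface.

```python
# Minimal PyTango device-server sketch: one read-only attribute that a data
# acquisition client could poll for each laser shot. Names/units are assumptions.
from tango import AttrWriteType
from tango.server import Device, attribute, run

class LaserDiagnostics(Device):
    _pulse_energy_mj = 0.0  # would be updated from hardware in a real server

    pulse_energy = attribute(dtype=float, unit="mJ",
                             access=AttrWriteType.READ,
                             doc="Energy of the most recent laser shot")

    def read_pulse_energy(self):
        return self._pulse_energy_mj

if __name__ == "__main__":
    run((LaserDiagnostics,))
```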
From 2016 to 2019, dry bulb onions were the suspected cause of three multistate outbreaks in the United States. We investigated a large multistate outbreak of Salmonella Newport infections that caused illnesses in both the United States and Canada in 2020. Epidemiologic, laboratory and traceback investigations were conducted to determine the source of the infections, and data were shared among U.S. and Canadian public health officials. We identified 1127 U.S. illnesses from 48 states with illness onset dates ranging from 19 June to 11 September 2020. Sixty-six per cent of ill people reported consuming red onions in the week before illness onset. Thirty-five illness sub-clusters were identified during the investigation, and seventy-four per cent of sub-clusters served red onions to customers during the exposure period. Traceback for the source of onions in illness sub-clusters identified a common onion grower in Bakersfield, CA, as the source of red onions, and onions were recalled at that time. Although other strains of Salmonella Newport were identified in environmental samples collected at the Bakersfield, CA grower, extensive environmental and product testing did not yield the outbreak strain. This was the third largest U.S. foodborne Salmonella outbreak in the past 30 years. It was the first U.S. multistate outbreak with a confirmed link to dry bulb onions, and it was nearly 10-fold larger than prior outbreaks with a suspected link to onions. This outbreak is notable for its size and scope, as well as for the international data sharing that led to the implication of red onions as the primary cause of the outbreak. Although an environmental assessment at the grower identified several factors that likely contributed to the outbreak, no single main cause was identified. The expedient identification of the outbreak vehicle and the response of multiple public health agencies allowed for recall and removal of product from the marketplace and rapid messaging to both the public and industry on actions to protect consumers; these features contributed to a decrease in cases and an expeditious conclusion of the outbreak.
Bovine tuberculosis (bTB) is a chronic, infectious and zoonotic disease of domestic and wild animals caused mainly by Mycobacterium bovis. This study investigated farm management factors associated with recurrent bTB herd breakdowns (n = 2935) disclosed in the period 23 May 2016 to 21 May 2018, and is a follow-up to our 2020 paper, which examined long-duration bTB herd breakdowns. A case-control study design was used to construct an explanatory set of farm-level management factors associated with recurrent bTB herd breakdowns. In Northern Ireland, a Department of Agriculture, Environment and Rural Affairs (DAERA) veterinarian investigates bTB herd breakdowns using standardised guidelines to allocate a disease source. In this study, source was strongly linked to carryover of infection, suggesting that the diagnostic tests had failed to clear herd infection during the breakdown period. Other factors associated with recurrent bTB herd breakdowns were herd size and type (dairy herds accounted for 43% of cases), with these two variables intrinsically linked. Further associated risk factors were the time of slurry application, badger access to silage clamps, badger setts in the locality, cattle grazing silage fields immediately post-harvest, the number of land parcels the farmer associated with bTB, the number of land parcels used for grazing, and region of the country.
The Variables and Slow Transients Survey (VAST) on the Australian Square Kilometre Array Pathfinder (ASKAP) is designed to detect highly variable and transient radio sources on timescales from 5 s to $\sim\!5$ yr. In this paper, we present the survey description, observation strategy and initial results from the VAST Phase I Pilot Survey. This pilot survey consists of $\sim\!162$ h of observations conducted at a central frequency of 888 MHz between 2019 August and 2020 August, with a typical rms sensitivity of $0.24\ \mathrm{mJy\ beam}^{-1}$ and angular resolution of $12-20$ arcseconds. There are 113 fields, each of which was observed for 12 min integration time, with between 5 and 13 repeats, with cadences between 1 day and 8 months. The total area of the pilot survey footprint is 5 131 square degrees, covering six distinct regions of the sky. An initial search of two of these regions, totalling 1 646 square degrees, revealed 28 highly variable and/or transient sources. Seven of these are known pulsars, including the millisecond pulsar J2039–5617. Another seven are stars, four of which have no previously reported radio detection (SCR J0533–4257, LEHPM 2-783, UCAC3 89–412162 and 2MASS J22414436–6119311). Of the remaining 14 sources, two are active galactic nuclei, six are associated with galaxies and the other six have no multi-wavelength counterparts and are yet to be identified.
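For context, variability searches of this kind commonly rank light curves using a modulation index and a weighted chi-square statistic; a minimal sketch follows (the survey's actual pipeline and selection thresholds may differ):

```python
# Minimal sketch of two common radio-variability statistics for a light curve:
# the modulation index V and the reduced weighted chi-square eta about the
# weighted mean flux. Thresholds for flagging a source as variable not shown.
import numpy as np

def variability_stats(flux: np.ndarray, flux_err: np.ndarray):
    w = 1.0 / flux_err**2
    weighted_mean = np.sum(w * flux) / np.sum(w)
    V = np.std(flux, ddof=1) / np.mean(flux)                      # modulation index
    eta = np.sum(w * (flux - weighted_mean)**2) / (len(flux) - 1)  # weighted chi^2
    return V, eta
```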
Underrepresented minorities have higher attrition from the professoriate and have experienced greater negative impacts of the COVID-19 pandemic. The purpose of this study was to compare the impact of COVID-19 on the lives of 196 early-career physician-scientists versus PhD researchers who are underrepresented in biomedical research. Participants in the Building Up study answered questions on the impact of the COVID-19 pandemic on their personal and professional lives, and a mixed-methods approach was used to conduct the analysis. While most participants experienced increases in overall stress (72% of PhD researchers vs 76% of physician-scientists), physician-scientists reported that increased clinical demands, research delays, and the potential to expose family members to SARS-CoV-2 caused psychological distress, specifically. PhD researchers, more than physician-scientists, reported increased productivity (27% vs 9%), schedule flexibilities (49% vs 25%), and more quality time with friends and family (40% vs 24%). Future studies should consider assessing the effectiveness of programs addressing COVID-19-related challenges experienced by PhD researchers and physician-scientists, particularly those from underrepresented backgrounds.
Of those with schizophrenia, one-third develop treatment-resistant illness. Nearly 60% of these patients benefit from clozapine, the only antipsychotic medication licensed for this group.
Objectives
As treatment-resistant illness developed in the follow-up of a first-episode psychosis (FEP) cohort, clozapine was prescribed. This study retrospectively compared the clozapine prescribing patterns, within this cohort, to National Institute for Health and Care Excellence (NICE) guidelines. In addition, impact on hospitalisation, physical health monitoring and augmentation strategies employed following clozapine initiation were examined. Factors delaying initiation of clozapine treatment or contributing to its discontinuation were also explored.
Methods
The study included 339 individuals resident within an Irish community mental health team catchment area, referred with FEP from 1 January 2005 to 31 August 2016. Data were extracted from electronic medical records.
Results
Within the cohort, clozapine was prescribed to 32 individuals (9.4%). The mean number of adequate antipsychotic trials before starting clozapine was 2.74 (SD 1.13; range 1–5). The mean time to clozapine trial was 2.1 years (SD 1.95; range 0.17–6.25). Following initiation of clozapine, mean hospital admissions per year fell from 2.3 to 0.3 (p < 0.01), and mean inpatient days pre- versus post-clozapine also decreased (147 vs. 53; p < 0.01). In all, 18 patients ceased clozapine, 5 temporarily and 13 permanently.
Conclusions
Patients are being prescribed clozapine earlier than previously demonstrated. However, delayed treatment remains common, and many patients discontinue clozapine. Further research is necessary to describe and address factors which contribute to its discontinuation.
A recent genome-wide association study (GWAS) identified 12 independent loci significantly associated with attention-deficit/hyperactivity disorder (ADHD). Polygenic risk scores (PRS), derived from the GWAS, can be used to assess genetic overlap between ADHD and other traits. Using ADHD samples from several international sites, we derived PRS for ADHD from the recent GWAS to test whether genetic variants that contribute to ADHD also influence two cognitive functions that show strong association with ADHD: attention regulation and response inhibition, captured by reaction time variability (RTV) and commission errors (CE).
Methods
The discovery GWAS included 19 099 ADHD cases and 34 194 control participants. The combined target sample included 845 people with ADHD (age: 8–40 years). RTV and CE were available from reaction time and response inhibition tasks. ADHD PRS were calculated from the GWAS using a leave-one-study-out approach. Regression analyses were run to investigate whether ADHD PRS were associated with CE and RTV. Results across sites were combined via random effect meta-analyses.
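A minimal sketch of the per-site regression and a DerSimonian-Laird-style random-effects combination, with placeholder variable names (the study's covariates and software are not specified here):

```python
# Minimal sketch: regress an outcome (e.g. RTV) on ADHD PRS per site, then
# combine per-site betas with DerSimonian-Laird random effects. Covariates
# and variable names are placeholder assumptions.
import numpy as np
import statsmodels.api as sm

def site_beta(prs, outcome, covariates):
    """Return (beta, se) for the PRS term in an OLS regression."""
    X = sm.add_constant(np.column_stack([prs, covariates]))
    fit = sm.OLS(outcome, X).fit()
    return fit.params[1], fit.bse[1]

def random_effects_beta(betas, ses):
    """DerSimonian-Laird random-effects pooled estimate."""
    b, se = np.asarray(betas), np.asarray(ses)
    w = 1.0 / se**2
    fixed = np.sum(w * b) / np.sum(w)
    q = np.sum(w * (b - fixed) ** 2)
    c = np.sum(w) - np.sum(w**2) / np.sum(w)
    tau2 = max(0.0, (q - (len(b) - 1)) / c)   # between-site variance
    w_re = 1.0 / (se**2 + tau2)
    return np.sum(w_re * b) / np.sum(w_re)
```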
Results
When combining the studies in meta-analyses, results were significant for RTV (R2 = 0.011, β = 0.088, p = 0.02) but not for CE (R2 = 0.011, β = 0.013, p = 0.732). No significant association was found between ADHD PRS and RTV or CE in any sample individually (p > 0.10).
Conclusions
We detected a significant association between PRS for ADHD and RTV (but not CE) in individuals with ADHD, suggesting that common genetic risk variants for ADHD influence attention regulation.
Late Pleistocene and Early Holocene aeolian deposits in Tasmania are extensive in the present subhumid climate zone but also occur in areas receiving >1000 mm of rain annually. Thermoluminescence, optically stimulated luminescence, and radiocarbon ages indicate that most of the deposits formed during periods of cold climate. Some dunes are remnants of longitudinal desert dunes sourced from now-inundated continental shelves that were previously semi-arid. Others formed near source, often as lunettes east of seasonally dry lagoons in the previously semi-arid Midlands and southeast of Tasmania, as accumulations close to floodplains of major rivers, or as sandsheets in exposed areas. Burning of vegetation by the Aboriginal population after 40 ka is likely to have influenced sediment supply. A key site for determining climate variability in southern Tasmania is Maynes Junction, which records three periods of aeolian deposition (at ca. 90, 32 and 20 ka) interspersed with periods of hillslope instability. Whether wind speeds were higher than at present during the last glacial period is uncertain, but shells in the Mary Ann Bay sandsheet near Hobart and particle-size analysis of the Ainslie dunes in northeast Tasmania suggest that winds were then stronger than they are now.
Background: Infection prevention surveillance for cross-transmission is often performed by manual review of microbiologic culture results to identify geotemporally related clusters. However, the sensitivity and specificity of this approach remain uncertain. Whole-genome sequencing (WGS) analysis can provide a gold standard for identifying cross-transmission events. Objective: We employed a published WGS program, the Philips IntelliSpace Epidemiology platform, to compare the accuracy of two surveillance methods: (i) a virtual infection practitioner (VIP) with perfect recall and automated analysis of antibiotic susceptibility testing (AST), sample collection timing, and patient location data; and (ii) a novel clinical matching (CM) algorithm that suggests clusters based on a nuanced weighted analysis of AST data, timing of sample collection, and shared location stays between patients. Methods: WGS was performed routinely on inpatient and emergency department isolates of Enterobacter cloacae, Enterococcus faecium, Klebsiella pneumoniae, and Pseudomonas aeruginosa at an academic medical center. Single-nucleotide variants (SNVs) were compared within core genome regions on a per-species basis to determine cross-transmission clusters. One unique strain per patient was included in each analysis, and duplicates were excluded from the final results. Results: Between May 2018 and April 2019, clinical data from 121 patients were paired with WGS data from 28 E. cloacae, 21 E. faecium, 61 K. pneumoniae, and 46 P. aeruginosa isolates. Previously published SNV relatedness thresholds were applied to define genomically related isolates. Mapping of genomic relatedness defined the following clusters: 4 patients in 2 E. faecium clusters and 2 patients in 1 P. aeruginosa cluster. The VIP method identified 12 potential clusters involving 28 patients, all of which were pseudoclusters. The CM method identified 7 clusters consisting of 27 patients, including 1 true E. faecium cluster of 2 patients with genomically related isolates. Conclusions: In light of the WGS data, all of the potential clusters identified by the VIP were pseudoclusters, lacking sufficient genomic relatedness. In contrast, the CM method showed better sensitivity and specificity: it reduced the proportion of pseudoclusters by 14 percentage points and identified a related genomic cluster of E. faecium. These findings suggest that integrating clinical data analytics with WGS is likely to help institutions limit the expenditure of resources on pseudoclusters. WGS combined with surveillance approaches more sophisticated than the standard methods modeled by the VIP is needed to better identify and address true cross-transmission events.
Funding: This study was supported by Philips Healthcare.
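To illustrate the SNV-threshold clustering step described above, here is a minimal sketch that joins isolates into clusters whenever their pairwise core-genome SNV distance is at or below a species-specific threshold; the data layout and threshold value are hypothetical assumptions.

```python
# Minimal sketch: cluster isolates via connected components over pairs whose
# pairwise core-genome SNV distance is <= a species threshold (value assumed).
import networkx as nx

def snv_clusters(pairwise_snvs: dict, threshold: int):
    """pairwise_snvs maps (isolate_a, isolate_b) -> core-genome SNV count."""
    g = nx.Graph()
    for (a, b), snvs in pairwise_snvs.items():
        g.add_nodes_from((a, b))
        if snvs <= threshold:
            g.add_edge(a, b)
    # A cluster is any connected component with more than one isolate.
    return [c for c in nx.connected_components(g) if len(c) > 1]
```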
This study identified farm management factors associated with long-duration bovine tuberculosis (bTB) breakdowns disclosed in the period 23 May 2016 to 21 May 2018, in a study area not previously subject to investigation in Northern Ireland. A farm-level epidemiological investigation (n = 2935) was completed whenever one or more Single Intradermal Comparative Cervical Test (SICCT) reactors were disclosed, or one or more confirmed lesions (positive histological and/or bacteriological result) were found at routine slaughter. A case-control study design was used to construct an explanatory set of management factors associated with long-duration bTB herd breakdowns, with a case (n = 191) defined as an investigation into a breakdown lasting 365 days or longer. Purchase of infected animal(s) had the strongest association as the most likely source of infection for long-duration bTB herd breakdowns, followed by badgers and then cattle-to-cattle spread from contiguous herds. However, 73.5% (95% CI 61.1–85.9%) of the herds whose source was purchase of infection were beef-fattening herds. This result points to two subpopulations of prolonged bTB breakdowns: beef-fattening herds, whose main source was the continued purchase of infected animals, and primary production herds (dairy, beef cows and mixed) at risk from multiple sources.
The environment early in life may have a long-lasting impact on mental health through epigenetic mechanisms. We studied the effect of early life adversity (ELA) in subjects at high risk for major depressive disorder (MDD): 20 unaffected first-degree relatives of MDD patients (family history positive, FHP) and 20 controls (family history negative, FHN) underwent high-resolution MRI. ELA was assessed with the Childhood Trauma Questionnaire (CTQ). Manual tracing of hippocampal subregions and voxel-based morphometry (VBM) analysis were used. FHP individuals had reduced volumes in brain areas involved in emotional processing, particularly if they had a history of ELA. This suggests that ELA might influence brain structure via epigenetic mechanisms and that structural changes may precede MDD.
We then determined how the brain-derived neurotrophic factor (BDNF) Val66Met polymorphism and ELA affect volumetric measures of the hippocampus. Sixty-two MDD patients and 71 healthy controls underwent high-resolution MRI. We manually traced the hippocampi, assessed childhood adversity with the CTQ, and genotyped the BDNF Val66Met polymorphism. Met-allele carriers showed significantly smaller hippocampal volumes when they had a history of ELA, both in patients and in controls. Our results highlight the relevance of stress-gene interactions for hippocampal volume reductions.
A further 37 patients with MDD and 42 healthy participants underwent diffusion tensor imaging (DTI). Deterministic tractography was applied and the BDNF Val66Met polymorphism was genotyped. Patients carrying the BDNF met-allele had lower fractional anisotropy (FA) in the uncinate fasciculus (UF) than val-allele homozygotes and controls. The met allele of the BDNF polymorphism thus seems to render subjects more vulnerable to dysfunctions associated with the UF, a fibre tract closely involved in emotional and cognitive function.
The intrinsic oxygen fugacity of a planet profoundly influences a variety of its geochemical and geophysical aspects. Most rocky bodies in our solar system formed with oxygen fugacities approximately five orders of magnitude higher than that corresponding to a hydrogen-rich gas of solar composition. Here we derive oxygen fugacities of extrasolar rocky bodies from the elemental abundances in 15 white dwarf (WD) stars polluted by accretion of rocks. We find that the intrinsic oxygen fugacities of rocks accreted by the WDs are similar to those of terrestrial planets and asteroids in our solar system. This result suggests that at least some rocky exoplanets are geophysically and geochemically similar to Earth.
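For reference, the intrinsic oxygen fugacity of a rock is conventionally quoted relative to the iron-wüstite (IW) buffer; under the common assumption of ideal mixing (activities approximated by mole fractions), a standard expression is:

```latex
% Oxygen fugacity relative to the iron-wustite (IW) buffer, Fe + 1/2 O2 = FeO.
% Ideal mixing assumed: activities a are approximated by mole fractions x.
\Delta\mathrm{IW}
  = 2\log\!\left(\frac{a^{\mathrm{rock}}_{\mathrm{FeO}}}{a^{\mathrm{metal}}_{\mathrm{Fe}}}\right)
  \approx 2\log\!\left(\frac{x^{\mathrm{rock}}_{\mathrm{FeO}}}{x^{\mathrm{metal}}_{\mathrm{Fe}}}\right)
```

with the mole fractions inferred, in this setting, from the elemental abundances measured in the polluted WD atmospheres.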