A primary goal of prejudice and stereotyping research is to reduce intergroup disparities arising from various forms of bias. For the last thirty years, much, perhaps most, of this research has focused on implicit bias as the crucial construct of interest. There has been, however, considerable confusion and debate about what this construct is, how to measure it, whether it predicts behavior, how much it contributes to intergroup disparities, and what would signify successful intervention against it. We argue that this confusion arises in part because much work in this area has focused narrowly on the automatic processes of implicit bias without sufficient attention to other relevant psychological constructs and processes, such as people’s values, goals, knowledge, and self-regulation (Devine, 1989). We believe that basic research on implicit bias itself is important and can contribute to reducing intergroup disparities, but those potential contributions diminish if and when the research disregards controlled processes and the personal dilemma faced by sincerely nonprejudiced people who express bias unintentionally. We advocate a renewed focus on this personal dilemma as an important avenue for progress.
Previous studies in rodents suggest that a mismatch between fetal and postnatal nutrition predisposes individuals to metabolic diseases. We hypothesized that in nonhuman primates (NHP), fetal programming of maternal undernutrition (MUN) persists postnatally, with a dietary mismatch altering metabolic molecular systems that precede standard clinical measures. We used unbiased molecular approaches to examine response to a high-fat, high-carbohydrate diet plus sugar drink (HFCS) challenge in NHP juvenile offspring of MUN pregnancies compared with controls (CON). Pregnant baboons were fed ad libitum (CON) or a 30% calorie-reduced diet (MUN) from 0.16 gestation through lactation; weaned offspring were fed chow ad libitum. MUN offspring were growth restricted at birth. Liver, omental fat, and skeletal muscle gene expression, and liver glycogen, muscle mitochondria, and fat cell size were quantified. Before challenge, MUN offspring had lower body mass index (BMI) and liver glycogen, and consumed more sugar drink than CON. After HFCS challenge, MUN and CON BMIs were similar. Molecular analyses showed HFCS response differences between CON and MUN for muscle and liver, including hepatic splicing and unfolded protein response. Altered liver signaling pathways and glycogen content between MUN and CON at baseline indicate that in utero programming persists in MUN juveniles. MUN catch-up growth during consumption of HFCS suggests increased risk of obesity, diabetes, and cardiovascular disease. Greater sugar drink consumption in MUN demonstrates altered appetitive drive due to programming. Differences in blood leptin, liver glycogen, and tissue-specific molecular response to HFCS suggest MUN significantly impacts juvenile offspring's ability to manage an energy-rich diet.
This article explores the sudden spate of stories concerning the so-called “blue gum negro” (the Blue Gum) that circulated in the national press from the late 1880s to the late 1890s. These reports concerned purportedly blue-gummed, Black assailants, whose bite was alleged to be poisonous, and of whom African Americans were supposedly terrified. This article argues that, although these narratives reinforced white notions of Black criminality and credulity, they marked a particular moment of racialization, in which fears of bodily contagion, generated by the recent revolution in germ theory, were harnessed to notions of embodied racial difference, to express and galvanize white anxieties about racial impurity. Because Blue Gums embodied dysgenic menace, white journalists and writers were often reluctant to disavow their existence, instead capitalizing on the slippage between figurative and literal language that characterized discourse on race. However, in appropriating Black culture and presenting a figure from folklore as a racial type, white writers betrayed not only the essentially superstitious character of racial thought but also the interwoven nature of dominant and subjugated cultures in the United States.
For the past 25 years, European badgers (Meles meles) have been subject to culling in Britain in attempts to limit the spread of tuberculosis (TB) to cattle. As part of a far-reaching evaluation of the effectiveness and acceptability of badger culling as a TB control measure, this paper assesses one aspect of the welfare of badger populations subjected to culling: the killing of breeding females, which risks leaving their unweaned cubs to starve in the den. To avoid this possibility, a three-month closed season was adopted, running from 1st February to 30th April, based on the best available estimates of the timing of birth and weaning in British badgers. During May 1999–2003, when a total of 4705 adult badgers were culled, field teams failed to capture 12 unweaned litters when their mothers were despatched. In 31 other cases, lactating females were culled but litters of almost-weaned cubs were also caught and despatched at the same dens, usually within a day of capture of the mother. The number of unweaned cubs missed by culling teams — estimated at approximately nine per year on average — was dramatically lower than that projected by a badger welfare lobby group. Our data suggest that the closed season is effective in reducing the suffering of unweaned cubs in badger populations subject to culling, and we recommend that this measure be maintained should badger culling form a component of any future TB control policy.
For over 25 years, European badgers (Meles meles) have been subject to culling in Britain in attempts to limit the spread of tuberculosis (TB) to cattle. As part of a far-reaching evaluation of the effectiveness and acceptability of badger culling as a TB control measure, this paper assesses one aspect of the welfare of badger populations subjected to culling: the risk of badgers confined to cage traps prior to despatch becoming injured as a result of rubbing or biting on the cage. In a large-scale field trial, 88% of badgers received no detectable injuries as a result of being confined in the trap. Of those that were injured, 72% received only minor skin abrasions. A minority (1.8% of the total) acquired damage to the teeth or jaws that may have caused serious pain. Although trap rounds were commenced in the early morning, badgers were no more likely to sustain injuries when they remained in traps until later in the day. Coating of cage traps, intended to give the wire mesh a smoother surface, was associated with a reduction in the incidence of minor skin abrasions, although it may have slightly increased the frequency of less common but more serious abrasions. Modification of the door design reduced tooth damage. Traps will be further modified if appropriate. However, all aspects of the conduct of trapping operations must balance badger welfare with concerns for the health and safety of field staff.
Library cooperation on regional and statewide levels has helped to expand significantly the collections of individual research libraries through cooperative purchasing policies, cataloging networks, union lists, interlibrary loan services and other means. However, access to the contents of these materials is also essential if this vast body of information is to be utilized fully. Books on specific topics can be found with relative ease by searching library subject catalogs, printed bibliographies, and specialized databases. On the other hand, the wealth of information that appears regularly in periodical literature is all but lost unless it is adequately indexed.
Theories of early cooperation in human society often draw from a small sample of ethnographic studies of surviving populations of hunter–gatherers, most of which are now sedentary. Borneo hunter–gatherers (Punan, Penan) have seldom figured in comparative research because of a decades-old controversy about whether they are the descendants of farmers who adopted a hunting and gathering way of life. In 2018 we began an ethnographic study of a group of still-nomadic hunter–gatherers who call themselves Punan Batu (Cave Punan). Our genetic analysis clearly indicates that they are very unlikely to be the descendants of neighbouring agriculturalists. They also preserve a song language that is unrelated to other languages of Borneo. Dispersed travelling groups of Punan Batu with fluid membership use message sticks to stay in contact, co-operate and share resources as they journey between rock shelters and forest camps. Message sticks were once widespread among nomadic Punan in Borneo, but have largely disappeared in sedentary Punan villages. Thus the small community of Punan Batu offers a rare glimpse of a hunting and gathering way of life that was once widespread in the forests of Borneo, where prosocial behaviour extended beyond the face-to-face community, facilitating successful collective adaptation to the diverse resources of Borneo's forests.
Cognitive impairment is common in individuals presenting to alcohol and other drug (AOD) settings and the presence of biopsychosocial complexity and health inequities can complicate the experience of symptoms and access to treatment services. A challenge for neuropsychologists in these settings is to evaluate the likely individual contribution of these factors to cognition when providing an opinion regarding diagnoses such as acquired brain injury (ABI). This study therefore aimed to identify predictors of cognitive functioning in AOD clients attending for neuropsychological assessment.
Methods:
Clinical data from 200 clients with AOD histories who attended for assessment between 2014 and 2018 were analysed and a series of multiple regressions were conducted to explore predictors of cognitive impairment including demographic, diagnostic, substance use, medication, and mental health variables.
Results:
Regression modelling identified age, gender, years of education, age of first use, days of abstinence, sedative load, emotional distress and diagnoses of ABI and developmental disorders as contributing to aspects of neuropsychological functioning. Significant models were obtained for verbal intellectual functioning (Adj R² = 0.19), nonverbal intellectual functioning (Adj R² = 0.10), information processing speed (Adj R² = 0.20), working memory (Adj R² = 0.05), verbal recall (Adj R² = 0.08), visual recall (Adj R² = 0.22), divided attention (Adj R² = 0.14), and cognitive inhibition (Adj R² = 0.07).
Conclusions:
These findings highlight the importance of careful provision of diagnoses in clients with AOD histories who have high levels of unmet clinical needs. They demonstrate the interaction of premorbid and potentially modifiable comorbid factors such as emotional distress and prescription medication on cognition. Ensuring that modifiable risk factors for cognitive impairment are managed may reduce experiences of cognitive impairment and improve diagnostic clarity.
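The adjusted R² values reported in the Results above come from ordinary least-squares regression models. As a minimal sketch of how such a statistic is computed (the synthetic data and function name below are illustrative, not the study's actual analysis pipeline):

```python
import numpy as np

def adjusted_r_squared(X, y):
    """Fit ordinary least squares and return adjusted R²,
    which penalizes R² for the number of predictors k."""
    n, k = X.shape
    X1 = np.column_stack([np.ones(n), X])  # prepend an intercept column
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    ss_res = float(resid @ resid)
    ss_tot = float(((y - y.mean()) ** 2).sum())
    r2 = 1 - ss_res / ss_tot
    return 1 - (1 - r2) * (n - 1) / (n - k - 1)

# Illustrative: a perfectly linear outcome yields adjusted R² of 1.
X = np.arange(10, dtype=float).reshape(-1, 1)
y = 2 * X[:, 0] + 1
```

With noisy multi-predictor data, as in the study, adjusted R² falls below R² in proportion to how many predictors the model carries relative to the sample size.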
The objectives of this study were to develop and refine EMPOWER (Enhancing and Mobilizing the POtential for Wellness and Resilience), a brief manualized cognitive-behavioral, acceptance-based intervention for surrogate decision-makers of critically ill patients and to evaluate its preliminary feasibility, acceptability, and promise in improving surrogates’ mental health and patient outcomes.
Method
Part 1 involved obtaining qualitative stakeholder feedback from 5 bereaved surrogates and 10 critical care and mental health clinicians. Stakeholders were provided with the manual and prompted for feedback on its content, format, and language. Feedback was organized and incorporated into the manual, which was then re-circulated until consensus. In Part 2, surrogates of critically ill patients admitted to an intensive care unit (ICU) reporting moderate anxiety or close attachment were enrolled in an open trial of EMPOWER. Surrogates completed six 15–20-min modules, totaling 1.5–2 h. Surrogates were administered measures of peritraumatic distress, experiential avoidance, prolonged grief, distress tolerance, anxiety, and depression at pre-intervention, post-intervention, and at 1-month and 3-month follow-up assessments.
Results
Part 1 resulted in changes to the EMPOWER manual, including reducing jargon, improving navigability, making EMPOWER applicable for a range of illness scenarios, rearranging the modules, and adding further instructions and psychoeducation. Part 2 findings suggested that EMPOWER is feasible, with 100% of participants completing all modules. The acceptability of EMPOWER appeared strong, with high ratings of effectiveness and helpfulness (M = 8/10). Results showed immediate post-intervention improvements in anxiety (d = −0.41), peritraumatic distress (d = −0.24), and experiential avoidance (d = −0.23). At the 3-month follow-up assessments, surrogates exhibited improvements in prolonged grief symptoms (d = −0.94), depression (d = −0.23), anxiety (d = −0.29), and experiential avoidance (d = −0.30).
Significance of results
Preliminary data suggest that EMPOWER is feasible, acceptable, and associated with notable improvements in psychological symptoms among surrogates. Future research should examine EMPOWER with a larger sample in a randomized controlled trial.
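The effect sizes (d) reported above are standardized mean differences between pre- and post-intervention scores. One common convention for paired pre/post data divides the mean change by the standard deviation of the change scores; the sketch below uses that convention for illustration only, since the abstract does not state which denominator the authors used:

```python
import statistics

def cohens_d_paired(pre, post):
    """Standardized effect size for paired pre/post scores:
    mean change divided by the standard deviation of the changes.
    (One common convention; other denominators are also used.)"""
    diffs = [b - a for a, b in zip(pre, post)]
    return statistics.mean(diffs) / statistics.stdev(diffs)

# Illustrative scores only: a drop in symptom ratings gives a negative d,
# matching the sign convention of the improvements reported above.
example_d = cohens_d_paired(pre=[10, 12, 14], post=[8, 11, 12])
```

A negative d here, as in the abstract, indicates symptom scores moving downward (improvement) from pre- to post-intervention.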
This article traces the postbellum development and dissemination of the notion of “negro superstition.” By the end of Reconstruction, many whites across the nation, both liberal and conservative, shared in the belief that credulity was the keystone of African American culture. The formulation of superstition as innate racial trait served the conjoined causes of sectional reconciliation and white supremacy, eroding white support for black citizenship. As liberal estimations of black Christianity declined and conservative depictions of African American magical beliefs proliferated, “voodoo” gained traction as a potent imaginary, shorthand for racial atavism, unreason, and dangerous sexuality.
In this paper we consider the pricing and hedging of financial derivatives in a model-independent setting, for a trader with additional information, or beliefs, on the evolution of asset prices. In particular, we suppose that the trader wants to act in a way which is independent of any modelling assumptions, but that she observes market information in the form of the prices of vanilla call options on the asset. We also assume that both the payoff of the derivative, and the insider’s information or beliefs, which take the form of a set of impossible paths, are time-invariant. In this way we accommodate drawdown constraints, as well as information/beliefs on quadratic variation or on the levels hit by asset prices. Our setup allows us to adapt recent work of [12] to prove duality results and a monotonicity principle. This enables us to determine geometric properties of the optimal models. Moreover, for specific types of information, we provide simple conditions for the existence of consistent models for the informed agent. Finally, we provide an example where our framework allows us to compute the impact of the information on the agent’s pricing bounds.
Bisphenol-A (BPA) is associated with adverse health outcomes and is found in many canned foods. It is not understood whether some BPA contamination can be washed away by rinsing. The objective of this single-blinded crossover experiment was to determine whether BPA exposure, as measured by urinary concentrations, could be decreased by rinsing canned beans prior to consumption. Three types of hummus were prepared: from dried beans, from rinsed canned beans, and from unrinsed canned beans. Fourteen healthy participants ate two samples of each hummus over six experimental days and collected spot urine specimens for BPA measurement. The geometric mean (GM) urinary BPA level for dried beans (GM = 0.97 ng/ml, 95% CI = 0.74, 1.26) was significantly lower than for rinsed (GM = 1.89 ng/ml, 1.37, 2.59) and unrinsed (GM = 2.46 ng/ml, 1.44, 4.19) canned beans. Difference-in-difference estimates showed an increase in GM BPA from pre- to post-hummus between unrinsed and rinsed canned beans of 1.39 ng/ml, p-value = 0.0400. Rinsing canned beans was an effective method to reduce BPA exposure.
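The difference-in-difference estimate above compares the pre-to-post change in geometric mean (GM) BPA for unrinsed versus rinsed beans. A minimal sketch of that arithmetic, with entirely illustrative values rather than the study's data:

```python
import math

def geometric_mean(xs):
    """Geometric mean: exp of the arithmetic mean of the logs."""
    return math.exp(sum(math.log(x) for x in xs) / len(xs))

def diff_in_diff(pre_a, post_a, pre_b, post_b):
    """Difference-in-difference of geometric means:
    (change in group A) minus (change in group B)."""
    return ((geometric_mean(post_a) - geometric_mean(pre_a))
            - (geometric_mean(post_b) - geometric_mean(pre_b)))

# Illustrative values: group A's GM rises by 3, group B's by 1,
# so the difference-in-difference is 2.
example = diff_in_diff([1, 1], [4, 4], [1, 1], [2, 2])
```

Working on geometric means (i.e., averaging on the log scale before differencing) is standard for right-skewed biomarker concentrations such as urinary BPA.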
Calls to diversify second language acquisition (SLA) (e.g., Ortega, 2013) have led to increased interest in multilingualism and inclusion of groups less represented in samples of university students, such as individuals at older ages. Nevertheless, we still have more questions than we do answers. This article outlines a research agenda targeting older adult language learning and multilinguals at older ages, both in and beyond the classroom. Since a key difference between young and older adults is cognitive aging, I follow a cognitive approach, focusing on how individual differences in cognition may affect language and vice versa, and how relevant sociocultural factors add to the interplay between language and cognition. Notably, this is not always a story of decline and deficits, but instead of both strengths and weaknesses that differ from those of young adults.
Compound heterozygotes occur when different variants at the same locus on both maternal and paternal chromosomes produce a recessive trait. Here we present the tool VarCount for the quantification of variants at the individual level. We used VarCount to characterize compound heterozygous coding variants in patients with epileptic encephalopathy and in the 1000 Genomes Project participants. The Epi4k data contains variants identified by whole exome sequencing in patients with either Lennox-Gastaut Syndrome (LGS) or infantile spasms (IS), as well as their parents. We queried the Epi4k dataset (264 trios) and the phased 1000 Genomes Project data (2504 participants) for recessive variants. To assess enrichment, transcript counts were compared between the Epi4k and 1000 Genomes Project participants using minor allele frequency (MAF) cutoffs of 0.5 and 1.0%, and including all ancestries or only probands of European ancestry. In the Epi4k participants, we found enrichment for rare, compound heterozygous variants in six genes, including three involved in neuronal growth and development – PRTG (p = 0.00086, 1% MAF, combined ancestries), TNC (p = 0.022, 1% MAF, combined ancestries) and MACF1 (p = 0.0245, 0.5% MAF, EU ancestry). Due to the total number of transcripts considered in these analyses, the enrichment detected was not significant after correction for multiple testing and higher powered or prospective studies are necessary to validate the candidacy of these genes. However, PRTG, TNC and MACF1 are potential novel recessive epilepsy genes and our results highlight that compound heterozygous variants should be considered in sporadic epilepsy.
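The core operation described above, flagging a sample as compound heterozygous when it carries distinct rare heterozygous variants on both the maternal and paternal haplotypes of the same gene, can be sketched as follows. This is a simplified illustration, not the actual VarCount implementation; the record layout and function name are assumptions:

```python
from collections import defaultdict

def compound_het_samples(variants, maf_cutoff=0.01):
    """Return, per gene, the samples that are compound heterozygous:
    carrying at least one rare heterozygous variant on the maternal
    haplotype and at least one on the paternal haplotype.

    `variants` is a list of dicts with keys: gene, pos, maf, and
    genotypes -- a mapping sample -> (maternal_allele, paternal_allele),
    phased, where 1 means the alternate allele is present."""
    # For each gene and sample, collect variant positions per haplotype.
    hits = defaultdict(lambda: defaultdict(lambda: (set(), set())))
    for v in variants:
        if v["maf"] > maf_cutoff:
            continue  # keep only rare variants below the MAF cutoff
        for sample, (mat, pat) in v["genotypes"].items():
            if mat + pat != 1:
                continue  # heterozygous sites only
            side = 0 if mat else 1  # which haplotype carries the variant
            hits[v["gene"]][sample][side].add(v["pos"])
    return {
        gene: sorted(s for s, (m, p) in per_sample.items() if m and p)
        for gene, per_sample in hits.items()
    }
```

Comparing the counts returned for a patient cohort against those from a phased reference panel (as the abstract does with Epi4k versus the 1000 Genomes Project) is then a per-gene enrichment test over these sample lists.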
Much bilingualism research includes some consideration of codeswitching, which may be measured via self-report, an experimental task, or sociolinguistic interview; however, there is little triangulation across measures in either psycholinguistic or sociolinguistic approaches. To consider possible differences between self-report and oral production of codeswitching, Spanish–English bilinguals completed a codeswitching questionnaire and oral production in an autobiographical memory task. They also completed proficiency and executive function tests. We found that broad measures of self-reported and orally produced codeswitches were positively correlated, although relationships with proficiency and executive function were more complex. These findings may direct future studies’ operationalization of codeswitching.
Neutron diffraction texture goniometry indicates that naturally deformed polycrystalline pyrite ores from Mt. Lyell (Tasmania) and Degtiarka (Ural Mountains) have weak lattice preferred orientations. During experimental deformation involving dislocation flow at elevated temperatures and pressures, these initial fabrics have been modified to produce new lattice preferred orientations.
Polycrystalline pyrite from Mt. Lyell (B-1) has an initial <111>-fibre texture perpendicular to a grain-size layering. After 24% shortening perpendicular to the <111>-fibre axis at 700°C, a new but weak <100> texture has developed parallel to the shortening axis. The Degtiarka pyrite (PN-6) initially has two weak fibre components. The somewhat stronger component is a <100>-fibre texture, similar to that in the experimentally deformed B-1 pyrite. The other is a <111>-fibre texture similar to the initial B-1 preferred orientation. After 30% shortening oblique to both initial fibre axes at 600°C, weak <110>- and <111>-fibre textures have developed. The experimentally produced fabrics have developed during deformation involving dislocation flow, dynamic recrystallisation and some microcracking. Intergranular sliding may also have been involved. Differences between lattice preferred orientations developed in the 600°C and 700°C experiments are interpreted to indicate a change in the dominant flow mechanism with changing temperature.
In comparison with other cubic minerals that have been deformed experimentally by dislocation flow mechanisms, the pyrite shows an unusually weak preferred orientation which can be detected only by means of neutron diffraction texture goniometry.
We introduce a path theoretic framework for understanding the representation theory of (quantum) symmetric and general linear groups and their higher-level generalizations over fields of arbitrary characteristic. Our first main result is a ‘super-strong linkage principle’ which provides degree-wise upper bounds for graded decomposition numbers (this is new even in the case of symmetric groups). Next, we generalize the notion of homomorphisms between Weyl/Specht modules which are ‘generically’ placed (within the associated alcove geometries) to cyclotomic Hecke and diagrammatic Cherednik algebras. Finally, we provide evidence for a higher-level analogue of the classical Lusztig conjecture over fields of sufficiently large characteristic.
The period 1875–1925 was remarkable in the history of parasitology mainly for the elucidation of the life cycles of parasites causing important parasitic diseases and the incrimination of vectors in their transmission. These discoveries were made by a small number of scientists working in the tropics, a number of whom were Scots. Sir Patrick Manson, the discoverer of the mosquito transmission of filarial worms, was instrumental in directly or indirectly encouraging other Scots including Douglas Argyll-Robertson, David Blacklock, David Bruce, David Cunningham, Robert Leiper, William Leishman, George Low, Muriel Robertson and Ronald Ross, who all made significant discoveries across a wide spectrum of tropical diseases. Among these, William Leishman, Robert Leiper and Muriel Robertson were all graduates of the University of Glasgow, and their achievements in the fields of leishmaniasis, schistosomiasis, dracunculiasis and African sleeping sickness, together with subsequent developments in these fields, are the subjects of the ten papers in this Special Issue of Parasitology.