Critical action – action to dismantle oppression and seek justice – is often motivated by and in response to being subjected to racism. Indeed, critical action can be an adaptive coping response to racism, such that critical action might reduce the negative impacts of racism on the individual. Further, the goal of critical action, at its core, is to eliminate racism and its co-conspiring forms of oppression, eradicating the root source of harm to marginalized individuals and communities. In this chapter, we provide an overview of current research that has examined how racism is related to critical action for racially marginalized youth. We consider racism as a system of oppression that manifests through culture, institutions, and individuals, along with stress responses to racism. We then provide recommendations for future research and practice to extend our understanding of if, when, and how experiencing racism motivates or detracts from youth critical action.
Presents the latest research on the causes and consequences of British population change from the medieval period to the eve of the Industrial Revolution, in both town and countryside
The American crocodile Crocodylus acutus occurs across the Americas, with its northernmost distribution being in South Florida, USA. This species has undergone severe declines across its range and is categorized globally as Vulnerable on the IUCN Red List and as Threatened on the U.S. Federal Endangered Species List. Long-term monitoring studies in the USA have documented a shift in American crocodile nesting activity and an expansion of its range throughout the southern and eastern coasts of South Florida. However, no successful American crocodile nests have been recorded until now on the west coast of South Florida. Here we document the American crocodile nest monitoring conducted during 1997–2021 at Rookery Bay National Estuarine Research Reserve and the first successful nest from the west coast of South Florida for C. acutus. Marco Airport and McIlvane Marsh are the two main American crocodile nesting areas identified at the Reserve, with 92 nests and 3,586 eggs recorded during 1997–2021. We found most nests at Marco Airport (95.7%) and only four nests (4.3%) at McIlvane Marsh. To date, none of the nests found at Marco Airport have produced successful hatchlings. In contrast, hatchlings have been produced at McIlvane Marsh since nests were first documented there in 2020. We discuss the implications of our findings in terms of the future conservation of the species.
Lithic technologies dominate understanding of early humans, yet natural processes can fracture rock in ways that resemble artefacts made by Homo sapiens and other primates. Differentiating between fractures made by natural processes and primates is important for assessing the validity of early and controversial archaeological sites. Rather than depend on expert authority or intuition, the authors propose a null model of conchoidally fractured Antarctic rocks. As no primates have ever occupied the continent, Antarctica offers a laboratory for generating samples that could only have been naturally fractured. Examples that resemble artefacts produced by primates illustrate the potential of ‘archaeological’ research in Antarctica for the evaluation of hominin sites worldwide.
An enduring problem in North American archaeology concerns the nature of the transition between the Clovis and Folsom Paleoindian complexes in the West. Traditional models indicate a temporal hiatus between the two complexes, implying that Folsom was a population replacement for Clovis. Alternatively, if Folsom was an innovation that occurred within Clovis populations and subsequently spread, we would expect to see a temporal overlap. Here, we test these hypotheses using high-quality radiocarbon dates and Bayesian statistics to infer the temporal boundaries of the complexes. We show that the Folsom complex initially appears between 12,900 and 12,740 cal BP, whereas Clovis disappears between 12,720 and 12,490 cal BP. Therefore, Folsom may have appeared about 200 years before Clovis disappeared, and so the two complexes likely co-occurred in the West for nearly eight generations. This finding suggests that Folsom was a successful adaptive innovation that diffused through the western Clovis population, eventually going to fixation over multiple generations.
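The overlap arithmetic in this abstract can be sketched directly from the reported boundary estimates. Note the 25-year generation length is a conventional assumption, not a figure stated in the abstract:

```python
# Midpoint-based sketch of the Folsom/Clovis overlap, assuming a
# conventional 25-year human generation (an assumption, not from the text).

folsom_appears = (12_900, 12_740)     # cal BP bounds for Folsom's first appearance
clovis_disappears = (12_720, 12_490)  # cal BP bounds for Clovis's disappearance

folsom_mid = sum(folsom_appears) / 2      # midpoint: 12,820 cal BP
clovis_mid = sum(clovis_disappears) / 2   # midpoint: 12,605 cal BP

overlap_years = folsom_mid - clovis_mid   # ~215 years of co-occurrence
generations = overlap_years / 25          # ~8.6 generations

print(f"overlap ≈ {overlap_years:.0f} years ≈ {generations:.1f} generations")
```

Using the boundary midpoints gives roughly 200 years, which at ~25 years per generation is close to the "nearly eight generations" figure in the abstract.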
This was a longitudinal study utilising the Irish Longitudinal Study on Ageing (n 3849 aged ≥ 50 years) and investigated the relationship between blood plasma folate and B12 levels at baseline (wave 1) and incident depressive symptoms at 2 and 4 years (waves 2 and 3). A score ≥ 9 on the Center for Epidemiological Studies Depression Scale-8 at wave 2 or 3 was indicative of incident depressive symptoms. B12 status profiles (pmol/l) were defined as < 185, deficient low; 185 to < 258, low normal; > 258–601, normal; and > 601, high. Folate status profiles (nmol/l) were defined as ≤ 10·0, deficient low; > 10–23·0, low normal; > 23·0–45·0, normal; > 45·0, high. Logistic regression models were used to analyse the longitudinal associations. Both B12 and folate plasma concentrations were lower in the group with incident depressive symptoms v. non-depressed (folate: 21·4 v. 25·1 nmol/l; P = 0·0003; B12: 315·7 v. 335·9 pmol/l; P = 0·0148). Regression models demonstrated that participants with deficient-low B12 status at baseline had a significantly higher likelihood of incident depression 4 years later (OR 1·51, 95 % CI 1·01, 2·27, P = 0·043). This finding remained robust after controlling for relevant covariates. No associations of folate status with incident depression were observed. Older adults with deficient-low B12 status had a 51 % increased likelihood of developing depressive symptoms over 4 years. The findings highlight the need to further explore the low-cost benefits of optimising vitamin B12 status for depression in older adults.
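The status-profile definitions above amount to simple threshold classifiers. A minimal sketch, using the cut-offs as stated in the abstract (boundary handling at exactly 258 pmol/l is ambiguous in the source, so the choice below is an assumption):

```python
def b12_status(pmol_l: float) -> str:
    """Classify plasma B12 using the study's cut-offs (pmol/l)."""
    if pmol_l < 185:
        return "deficient low"
    if pmol_l < 258:
        return "low normal"
    if pmol_l <= 601:
        return "normal"
    return "high"

def folate_status(nmol_l: float) -> str:
    """Classify plasma folate using the study's cut-offs (nmol/l)."""
    if nmol_l <= 10.0:
        return "deficient low"
    if nmol_l <= 23.0:
        return "low normal"
    if nmol_l <= 45.0:
        return "normal"
    return "high"
```

For example, the depressed group's mean folate of 21·4 nmol/l falls in the "low normal" band, while its mean B12 of 315·7 pmol/l is still "normal"; only the deficient-low B12 band carried the elevated depression risk.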
This study reviewed all rhinology clinical negligence claims in the National Health Service in England between 2013 and 2018.
Method
All clinical negligence claims held by National Health Service Resolution relating to rhinology in England between 1 April 2013 and 1 April 2018 were reviewed.
Results
There were 171 rhinology related claims with a total estimated potential cost of £13.6 million. There were 119 closed claims (70 per cent) with a total cost of £2.3 million, of which 55 claims resulted in payment of damages. Over three quarters of all rhinology claims were associated with surgery (n = 132). Claims associated with endoscopic sinus surgery had the highest mean cost per claim (£172 978). Unnecessary pain (33.9 per cent) and unnecessary operation (28.1 per cent) were the most commonly cited patient injuries.
Conclusion
Patient education and consent have been highlighted as key areas for improvement from this review of rhinology related clinical negligence claims. A shift in clinical practice towards shared decision making could reduce litigation in rhinology.
Litigation in the National Health Service continues to rise, with a 9.4 per cent increase in clinical negligence claims from 2018–2019 to 2019–2020. The cost of these claims now accounts for 1.8 per cent of the National Health Service 2019 to 2020 budget. This study aimed to identify the characteristics of clinical negligence claims in the subspecialty of otology.
Methods
This study was a retrospective review of all clinical negligence claims in otology in England held by National Health Service Resolution between April 2013 and April 2018.
Results
There were 171 claims in otology, 24 per cent of all otolaryngology claims, with a potential cost of £24.5 million. Over half of these were associated with hearing loss. Stapedectomy was the highest mean cost per claim operation at £769 438. The most common reasons for litigation were failure or delay in treatment (23 per cent), failure or delay in diagnosis (20 per cent), intra-operative complications (15 per cent) and inadequate consent (13 per cent).
Conclusion
There is a risk of high-cost claims in otology, especially with objective injuries such as hearing loss and facial nerve injury.
Both patient composition and medical care received in clinical trials may not be representative of clinical practice, yet health technology assessments (HTAs) commonly use extrapolation results from trials to estimate incremental benefit. Due to data limitations, external validation of trial extrapolations is uncommon. With the goal of better estimating the benefit of new therapies in practice, we compared long-term survival estimated from real-world patients who received therapy similar to the comparator arm of the OAK trial, a phase III study of patients with advanced non-small cell lung cancer (aNSCLC) who progressed following initial chemotherapy, to standard estimation approaches.
Methods
We estimated long-term survival from: (i) direct extrapolation of trial survival curves; and (ii) aNSCLC patients from the United States Flatiron Health Electronic Health Record (EHR)-derived de-identified database diagnosed between January 2011 and August 2019 who received docetaxel monotherapy after platinum-doublet and had adequate organ function as well as functional status. Patients with unknown organ function and functional status were also included. Standard parametric extrapolations were applied and selected based on visual inspection and goodness-of-fit tests for each cohort.
Results
Using a log-logistic model to extrapolate the trial comparator arm (N = 425), estimated lifetime mean overall survival was 19.2 months (95% confidence interval [95% CI]: 16.5–22.6), and 14.4 months (95% CI: 12.4–17.0) for the real-world cohort (N = 415). Estimated 5-year overall survival rates were 5.4 percent (95% CI: 3.9–7.3) for the trial patients, compared to 3.7 percent (95% CI: 2.6–5.0) among real-world cohort patients.
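A log-logistic survival model of the kind used for this extrapolation has a closed-form mean when its shape parameter exceeds one. A minimal sketch; the parameter values in the usage note are illustrative assumptions, not the fitted values from the study:

```python
import math

def loglogistic_survival(t: float, alpha: float, beta: float) -> float:
    """Survival function S(t) = 1 / (1 + (t/alpha)^beta) for a
    log-logistic distribution with scale alpha and shape beta."""
    return 1.0 / (1.0 + (t / alpha) ** beta)

def mean_survival(alpha: float, beta: float) -> float:
    """Closed-form mean E[T] = alpha * (pi/beta) / sin(pi/beta).
    Defined only when beta > 1; otherwise the mean diverges."""
    assert beta > 1, "mean is undefined for beta <= 1"
    b = math.pi / beta
    return alpha * b / math.sin(b)
```

For illustration, `mean_survival(9, 1.6)` gives a mean of about 19 months, in the neighbourhood of the trial-arm estimate; the heavy right tail of the log-logistic is one reason direct trial extrapolation can exceed real-world experience.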
Conclusions
Our results suggest that directly extrapolating observed survival for trial patients may overestimate long-term survival compared to the experience of patients treated in routine practice. Our findings have implications for those wishing to estimate the incremental benefit of novel versus established treatments. We plan to compare our results to a generic patient cohort from a national cancer registry. Further EHR-based studies utilizing real-world data are needed to confirm our findings and to extend beyond this use case to other cancer types and anti-neoplastic therapies.
Type 2 diabetes results mainly from weight gain in adult life and affects one in twelve people worldwide. In the Diabetes REmission Clinical Trial (DiRECT), the primary care-led Counterweight-Plus weight management program achieved remission of type 2 diabetes (for up to six years) for forty-six percent of patients after one year and thirty-six percent after two years. The objective of this study was to estimate the implementation costs of the program, as well as its two-year within-trial cost effectiveness and lifetime cost effectiveness.
Methods
Within-trial cost effectiveness included the Counterweight-Plus costs (including training, practitioner appointments, and low-energy diet), medications, and all routine healthcare contacts, combined with achieved remission rates. Lifetime cost per quality-adjusted life-year (QALY) was estimated according to projected durations of remissions, assuming continued relapse rates as seen in year two of DiRECT and the consequent life expectancy, quality of life and healthcare costs.
Results
The two-year intervention cost was EUR 1,580 per participant, with over eighty percent of the costs incurred in year one. Compared with the control group, medication savings were EUR 259 (95% confidence interval [CI]: 166–352) for anti-diabetes drugs and EUR 29 (95% CI: 12–47) for anti-hypertensive medications. The intervention was modeled with a lifetime horizon to achieve a mean 0.06 (95% CI: 0.04–0.09) gain in QALYs for the DiRECT population and a mean total lifetime cost saving per participant of EUR 1,497 (95% CI: 755–2,331), with the intervention becoming cost-saving within six years.
Conclusions
The intensive weight loss and maintenance program reduced the cost of anti-diabetes drugs through improved metabolic control, achieved diabetes remission in over one-third of participants, and reduced total healthcare contacts and costs over two years. A substantial lifetime healthcare cost saving is anticipated from periods of diabetes remission and delaying complications. Healthcare resources could be shifted cost effectively to establish diabetes remission services, using the existing DiRECT intervention, even if remissions are only maintained for limited durations. However, more research investment is needed to further improve weight-loss maintenance and extend remissions.
Amphibian populations are experiencing declines globally, many of which are driven by the fungal pathogen Batrachochytrium dendrobatidis (Bd), but different species of amphibians, as well as divergent populations of the same species, can show drastically different responses to Bd invasion. We answer three questions: (1) what are the potential trajectories of amphibian host populations following Bd invasion; (2) how is each of these trajectories influenced by the transmission dynamics and load dynamics governing an amphibian–Bd system; and (3) how do ecological, evolutionary, and environmental factors affect both Bd transmission and load dynamics, which influence the amphibian hosts’ population levels? We build a general framework that identifies eight population-level trajectories that amphibian populations can take upon Bd invasion, arising from five different branch points. Each of these branch points is affected by either the transmission dynamics or the load dynamics underlying the system. Integrating relevant disease ecology theory and empirical data, this framework can be used to guide context-dependent management strategies for amphibian populations infected with Bd.
Recent work has demonstrated that Goshen points overlap in time with another group of unfluted lanceolate points from the Plains, Plainview points. This has raised the question of whether the two types should be kept separate or consolidated into a single type. We sought to resolve this issue by applying geometric morphometric methods to a sample of points from well-documented Goshen and Plainview assemblages. We found that their shapes were statistically indistinguishable, which indicates that Goshen and Plainview points should be assigned to the same type. Because Plainview points were recognized before Goshen points, it is the latter type name that should be dropped. Sinking Goshen into Plainview allows us to move beyond taxonomic issues and toward understanding both the spatiotemporal variation that exists among Plainview assemblages and what it can tell us about the adaptations and social dynamics of Plainview groups.
Preventing suicide and self-harm is a global health priority. Although there is a growing evidence base for the effectiveness of psychoanalytic and psychodynamic psychotherapies for a range of disorders, to date there has been no systematic review of their effectiveness in reducing suicidal and self-harming behaviours.
Aims
To systematically review randomised controlled trials of psychoanalytic and psychodynamic psychotherapies for suicide attempts and self-harm.
Method
We searched PubMed, PsycINFO, Psycharticles, CINAHL, EMBASE and the Cochrane Central Register of Controlled Trials for randomised controlled trials of psychoanalytic and psychodynamic psychotherapies for reducing suicide attempts and self-harm.
Results
Twelve trials (17 articles) were included in the meta-analyses. Psychoanalytic and psychodynamic therapies were effective in reducing the number of patients attempting suicide (pooled odds ratio, 0.469; 95% CI 0.274–0.804). We found some evidence for significantly reduced repetition of self-harm at 6-month but not 12-month follow-up. Significant treatment effects were also found for improvements in psychosocial functioning and reduction in number of hospital admissions.
Conclusions
Psychoanalytic and psychodynamic psychotherapies are indicated to be effective in reducing suicidal behaviour and to have short-term effectiveness in reducing self-harm. They can also be beneficial in improving psychosocial well-being. However, the small number of trials and moderate quality of the evidence means further high-quality trials are needed to confirm our findings and to identify which specific components of the psychotherapies are effective.
The preservation of compounds of biological origin (nucleic acids, proteins, carbohydrates, lipids, and resistant biopolymers) in terrigenous fossils and the chemical and structural changes that they undergo during fossilization are discussed over three critical stratigraphic levels or “time slices.” The youngest of these is the archeological record (e.g., <10 k.y. B.P.), when organic matter from living organisms undergoes the preliminary stages of fossilization (certain classes of biomolecule are selectively preserved while others undergo rapid degradation). The second time slice is the Tertiary. Well-preserved fossils of this age retain diagenetically modified biomarkers and biopolymers for which a product-precursor relationship with the original biological materials can still be identified. The final time slice is the Carboniferous. Organic material of this age has generally undergone such extensive diagenetic degradation that only the most resistant biopolymers remain and these have undergone substantial modification. Trends through time in the taphonomy and utility of ancient biomolecules in terrigenous fossils affect their potential for studies that involve chemosystematic and environmental data.
To identify the intracochlear electrode position in cochlear implant recipients and determine the correlation to speech perception for two peri-modiolar electrode arrays.
Methods
Post-operative cone-beam computed tomography images of 92 adult recipients of the ‘CI512’ electrode and 18 adult recipients of the ‘CI532’ electrode were analysed. Phoneme scores were recorded pre-implantation, and at 3 and 12 months post-implantation.
Results
All CI532 electrodes were wholly within scala tympani. Of the 79 CI512 electrodes intended to be in scala tympani, 58 (73 per cent) were in scala tympani, 14 (17 per cent) were translocated and 7 (9 per cent) were wholly in scala vestibuli. Thirteen CI512 electrodes were deliberately inserted into scala vestibuli. Speech perception scores for post-lingual recipients were higher in the scala tympani group (69.1 per cent) compared with the scala vestibuli (54.2 per cent) and translocation (50 per cent) groups (p < 0.05). Electrode location outside of scala tympani independently resulted in a 10.5 per cent decrease in phoneme scores.
Conclusion
Cone-beam computed tomography was valuable for demonstrating electrode position. The rate of scala tympani insertion was higher in CI532 than in CI512 electrodes. Scala vestibuli insertion and translocation were associated with poorer speech perception outcomes.
It has long been assumed that Folsom points are more standardized than Clovis points, although an adequate test of this proposition has yet to be undertaken. Here, we address that deficiency by using data from a sample of Folsom and Clovis points recovered from sites across the western United States. We used geometric morphometric techniques to capture point shape and then conducted statistical analyses of variability associated with Clovis and Folsom point bases and blades. Our results demonstrate that Folsom bases and blades are less variable than those on earlier Clovis points, indicating an increase in point standardization during the Early Paleoindian period. In addition, despite published claims to the contrary, Clovis and Folsom point bases are no more variable than blades. Based on these results, we conducted additional analyses to examine the modularity and size of Clovis and Folsom points. The results suggest Clovis points have more integrated base and blade segments than Folsom points. We suggest that several classes of Clovis points—intended for different functions—might have been in use during the Clovis period and that the later Folsom points might have served only as weapon tips, the shape of which was constrained by the fluting process.
Short-term hunter-gatherer residential camps have been a central feature of human settlement patterns and social structure for most of human evolutionary history. Recent analyses of ethnohistoric hunter-gatherer data show that across different environments, the average size of hunter-gatherer bands is remarkably constant and that bands are commonly formed by a small number of coresident families. Using ethnoarchaeological data, we examine the relationship between the physical infrastructure of camps and their social organization. We compiled a dataset of 263 ethnoarchaeologically observed hunter-gatherer camps from 13 studies in the literature. We focus on both the scale of camps, or their average size, structure, and composition, and the dynamics that governed their variation. Using a combination of inferential statistics and linear models, we show that the physical infrastructure of camps, measured by the number of household features, reflects the internal social organization of hunter-gatherer bands. Using scaling analyses, we then show that the variation among individual camps is related to a predictable set of dynamics between camp area, infrastructure, the number of occupants, and residence time. Moreover, the scale and dynamics that set the statistical variance in camp sizes are similar across different environments and have important implications for reconstructing prehistoric hunter-gatherer social organization and behavior from the archaeological record.
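The scaling analyses described above typically fit a power law, area = c · Nᵇ, by linear regression on log-transformed values. A minimal sketch on synthetic data (the camp sizes, exponent, and noise level below are hypothetical, chosen only to illustrate the method):

```python
import numpy as np

# Hypothetical illustration of a scaling analysis: fit area = c * N^b
# by ordinary least squares on log-log axes.
rng = np.random.default_rng(0)
occupants = np.array([5, 8, 12, 20, 30, 45, 60])  # hypothetical camp populations
true_b, true_c = 1.2, 40.0                        # assumed exponent and prefactor
area = true_c * occupants ** true_b * rng.lognormal(0.0, 0.1, occupants.size)

# polyfit on logs returns (slope, intercept); the slope is the scaling exponent.
b, log_c = np.polyfit(np.log(occupants), np.log(area), 1)
print(f"estimated exponent b ≈ {b:.2f}")  # should recover a value near 1.2
```

An exponent above 1 would indicate superlinear scaling (per-capita area grows with camp size); below 1, economies of scale in camp infrastructure.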
Kaolins have been separated from sandstones from the Pugu Hills deposit, Tanzania, using a 15 mm glass hydrocyclone and a laboratory-scale multiple unit consisting of six 10 mm cyclones. Two types of sandstone were treated—containing a disordered kaolin (Pugu D) and a well-ordered variety (Pugu K)—to see whether products could be obtained that met particle-size specifications for paper manufacture. Pugu D gave a product that would meet specifications for both paper-filling and -coating applications with minimal processing, although the extremely fine-grained nature of this kaolin could lead to high viscosities in suspension. Processing of Pugu K gave products containing up to 97% kaolin and particle-size distributions that would meet specifications for filler and, possibly, coating clay. However, it was impossible to eliminate 3% or so of fine-grained quartz from these products and this could militate against use in paper manufacture.
To investigate the effectiveness and usability of automated procedural guidance during virtual temporal bone surgery.
Methods:
Two randomised controlled trials were performed to evaluate the effectiveness, for medical students, of two presentation modalities of automated real-time procedural guidance in virtual reality simulation: full and step-by-step visual presentation of drillable areas. Presentation modality effectiveness was determined through a comparison of participants’ dissection quality, evaluated by a blinded otologist, using a validated assessment scale.
Results:
While the provision of automated guidance on procedure improved performance (full presentation, p = 0.03; step-by-step presentation, p < 0.001), usage of the two different presentation modalities was vastly different (full presentation, 3.73 per cent; step-by-step presentation, 60.40 per cent).
Conclusion:
Automated procedural guidance in virtual temporal bone surgery is effective in improving trainee performance. Step-by-step presentation of procedural guidance was engaging, and therefore more likely to be used by the participants.
The corticogeniculate circuit is an evolutionarily conserved pathway linking the primary visual cortex with the visual thalamus in the feedback direction. While the corticogeniculate circuit is anatomically robust, the impact of corticogeniculate feedback on the visual response properties of visual thalamic neurons is subtle. Accordingly, discovering the function of corticogeniculate feedback in vision has been a particularly challenging task. In this review, the morphology, organization, physiology, and function of corticogeniculate feedback are compared across mammals commonly studied in visual neuroscience: primates, carnivores, rabbits, and rodents. Common structural and organizational motifs are present across species, including the organization of corticogeniculate feedback into parallel processing streams in highly visual mammals.