Around 1000 years ago, Madagascar experienced the collapse of populations of large vertebrates that ultimately resulted in many species going extinct. The factors that led to this collapse appear to have differed regionally, but in some ways, key processes were similar across the island. This review evaluates four hypotheses that have been proposed to explain the loss of large vertebrates on Madagascar: overkill, aridification, synergy, and subsistence shift. We explore regional differences in the paths to extinction and the significance of a prolonged extinction window across the island. The data suggest that people who arrived early and depended on hunting, fishing, and foraging had little effect on Madagascar’s large endemic vertebrates. Megafaunal decline was triggered initially by aridification in the driest bioclimatic zone, and by the arrival of farmers and herders in the wetter bioclimatic zones. Ultimately, it was the expansion of agropastoralism across both wet and dry regions that drove large endemic vertebrates to extinction everywhere.
This article examines the death of Colin Roach in Stoke Newington Police Station, Hackney, in 1983, and explores the emotional politics of the campaigns that followed his death. These campaigns were focused on both determining the circumstances of Roach's death and highlighting tensions between the police and the Black community of Hackney. Using hitherto unpublished archival sources, local newspapers, and visual sources, the article documents racial politics in Hackney in the early 1980s and examines the relationship between race and policing at that time. The article argues that the experience and expression of grief and anger were critical to understanding the political problem of race and policing in London in the 1980s, to forming and mobilizing political communities, and to interrogating the power of the state. The article also argues that a critical element of the emotional economy around race in Hackney in 1983 was the indifference and lack of empathy of the police in Stoke Newington to ethnic minority communities. This lack of empathy not only illustrated the problem of race within the police force at this time but further fueled local campaigns to make the police accountable. This links the Roach case to a later turning point—the 1999 Macpherson inquiry into the murder of Stephen Lawrence, which characterized the Metropolitan Police as institutionally racist.
To evaluate the potential superiority of donanemab vs. aducanumab on the percentage of participants with amyloid plaque clearance (≤24.1 Centiloids [CL]) at 6 months in patients with early symptomatic Alzheimer's disease (AD) in the phase 3 TRAILBLAZER-ALZ-4 study. The amyloid cascade in AD involves the production and deposition of amyloid beta (Aβ) as an early and necessary event in the pathogenesis of AD.
Methods
Participants (n = 148) were randomized 1:1 to receive donanemab (700 mg IV Q4W [first 3 doses], then 1400 mg IV Q4W [subsequent doses]) or aducanumab (per USPI: 1 mg/kg IV Q4W [first 2 doses], 3 mg/kg IV Q4W [next 2 doses], 6 mg/kg IV Q4W [next 2 doses] and 10 mg/kg IV Q4W [subsequent doses]).
Results
Baseline demographics and characteristics were well balanced across treatment arms (donanemab [N = 71], aducanumab [N = 69]). Twenty-seven donanemab-treated and 28 aducanumab-treated participants were defined as having intermediate tau.
Upon assessment of florbetapir F18 PET scans at 6 months, 37.9% of donanemab-treated vs. 1.6% of aducanumab-treated participants achieved amyloid clearance (p < 0.001). In the intermediate tau subpopulation, 38.5% of donanemab-treated vs. 3.8% of aducanumab-treated participants achieved amyloid clearance (p = 0.008).
Percent change in brain amyloid levels was −65.2%±3.9% (baseline: 98.29 ± 27.83 CL) and −17.0%±4.0% (baseline: 102.40 ± 35.49 CL) in the donanemab and aducanumab arms, respectively (p < 0.001). In the intermediate tau subpopulation, percent change in brain amyloid levels was −63.9%±7.4% (baseline: 104.97 ± 25.68 CL) and −25.4%±7.8% (baseline: 102.23 ± 28.13 CL) in the donanemab and aducanumab arms, respectively (p ≤ 0.001).
Adverse events (AEs) were reported by 62.0% of donanemab-treated and 66.7% of aducanumab-treated participants. There were no serious AEs due to ARIA in the donanemab arm, whereas 1.4% serious AEs (one event) due to ARIA were reported in the aducanumab arm.
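As a rough illustration of the endpoint arithmetic, the sketch below (Python, my own illustration rather than the trial's analysis code) computes percent change from baseline in Centiloids and flags clearance at the ≤24.1 CL threshold named in the objective; the follow-up value in the usage comment is hypothetical, back-calculated from the reported −65.2% mean change.

```python
# Minimal sketch of the amyloid endpoint arithmetic (assumed, not the trial's
# analysis code): percent change from baseline in Centiloids (CL) and a
# clearance flag at the <=24.1 CL threshold stated in the objective.
def percent_change(baseline_cl: float, followup_cl: float) -> float:
    return 100.0 * (followup_cl - baseline_cl) / baseline_cl

def amyloid_cleared(followup_cl: float, threshold: float = 24.1) -> bool:
    return followup_cl <= threshold

# Hypothetical example: a follow-up of ~34.2 CL from a 98.29 CL baseline
# corresponds to roughly the -65.2% mean change reported for donanemab,
# yet does not by itself meet the clearance threshold:
# percent_change(98.29, 34.2) -> ~ -65.2; amyloid_cleared(34.2) -> False
```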
Conclusion
This study provides the first active comparator data on amyloid plaque clearance in patients with early symptomatic AD. A significantly higher proportion of participants reached amyloid clearance, and amyloid plaque reductions were greater, with donanemab vs. aducanumab at 6 months.
Previously presented at the Clinical Trials on Alzheimer's Disease - 15th Conference, 2022.
A terrestrial (lacustrine and fluvial) palaeoclimate record from Hoxne (Suffolk, UK) shows two temperate phases separated by a cold episode, correlated with MIS 11 subdivisions corresponding to isotopic events 11.3 (Hoxnian interglacial period), 11.24 (Stratum C cold interval), and 11.23 (warm interval with evidence of human presence). A robust, reproducible multiproxy consensus approach validates and combines quantitative palaeotemperature reconstructions from three invertebrate groups (beetles, chironomids, and ostracods) and plant indicator taxa with qualitative implications of molluscs and small vertebrates. Compared with the present, interglacial mean monthly air temperatures were similar or up to 4.0°C higher in summer, but similar or as much as 3.0°C lower in winter; the Stratum C cold interval, following prolonged nondeposition or erosion of the lake bed, experienced summers 2.5°C cooler and winters between 5°C and 10°C cooler than at present. Possible reworking of fossils into Stratum C from underlying interglacial assemblages is taken into account. Oxygen and carbon isotopes from ostracod shells indicate evaporatively enriched lake water during Stratum C deposition. Comparative evaluation shows that proxy-based palaeoclimate reconstruction methods are best tested against each other and, if validated, can be used to generate more refined and robust results through multiproxy consensus.
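To make the multiproxy consensus idea concrete, here is a minimal sketch of the general principle as I read it, not the paper's actual validation protocol: each proxy group yields a plausible temperature range, and the consensus estimate is the overlap of those ranges.

```python
# Hedged sketch of a multiproxy consensus temperature estimate: the consensus
# is taken here as the overlap of per-proxy (low, high) ranges in deg C.
# This illustrates the principle only; the study's combination procedure
# is more elaborate.
from typing import Optional, Sequence, Tuple

def consensus_range(
    proxy_ranges: Sequence[Tuple[float, float]]
) -> Optional[Tuple[float, float]]:
    lows, highs = zip(*proxy_ranges)
    low, high = max(lows), min(highs)
    return (low, high) if low <= high else None  # None means the proxies disagree

# Hypothetical summer ranges from beetles, chironomids and ostracods:
# consensus_range([(15.0, 19.0), (16.0, 20.0), (14.5, 18.5)]) -> (16.0, 18.5)
```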
This study aimed to identify a well-fitting and theoretically justified item-level latent factor structure for the Wechsler Memory Scales (WMS)-IV verbal paired associates (VerbalPA) subtest to facilitate the ease and accuracy of score interpretations for patients with lateralized temporal lobe epilepsy (TLE).
Methods:
Archival data were used from 250 heterogeneous neurosciences patients who were administered the WMS-IV as part of a standard neuropsychological assessment. Three theoretically motivated models for the latent structure of VerbalPA were tested using confirmatory factor analysis. The first model, based on cognitive principles of semantic processing from hub-and-spoke theory, tested whether performance is related to specific semantic features of target words. The second, motivated by the Cattell–Horn–Carroll (CHC) model of cognitive abilities, investigated whether the associative properties of items influence performance. A third, Hybrid model tested whether performance is related to both semantic and associative properties of items. The best-fitting model was tested for diagnostic group effects contrasting the heterogeneous neuroscience patients with subsets of left and right TLE (n = 51, n = 26, respectively) patients.
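For readers who want to reproduce this kind of model comparison, the sketch below shows a confirmatory factor analysis of a hypothetical two-factor (semantic plus associative) structure using the Python semopy package with lavaan-style syntax; the factor and item names are placeholders, not the actual VerbalPA item labels, and the study's own software may have differed.

```python
# Hedged sketch of a CFA in Python with the semopy package (lavaan-style
# syntax). Factor and item names are placeholders for illustration; they are
# not the WMS-IV VerbalPA items.
import pandas as pd
import semopy

HYBRID_MODEL = """
semantic    =~ item1 + item2 + item3 + item4
associative =~ item5 + item6 + item7 + item8
"""

def fit_cfa(data: pd.DataFrame) -> pd.DataFrame:
    model = semopy.Model(HYBRID_MODEL)
    model.fit(data)                   # estimate loadings from item-level data
    return semopy.calc_stats(model)   # fit indices (e.g. CFI, RMSEA) for comparison
```

Competing specifications (semantic-only, associative-only, hybrid) would each be written this way and compared on their fit indices, which is how a best-fitting model is selected.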
Results:
The Hybrid model was found to have the best fit. Patients with left TLE scored significantly less well than the heterogeneous neurosciences sample on selected semantic factor scores, although the effect size was small.
Conclusions:
Future editions of the WMS may consider implementing a semantic scoring structure for the VerbalPA to facilitate test score interpretation. Additionally, these results suggest that principles of hub-and-spoke theory may be integrated into CHC cognitive ability taxonomy.
Social anxiety disorder (SAD) is common. It usually starts in adolescence, and without treatment can disrupt key developmental milestones. Existing generic treatments are less effective for young people with SAD than with other anxiety disorders, but an adaptation of an effective adult therapy (CT-SAD-A) has shown promising results for adolescents.
Aims:
The aim of this study was to conduct a qualitative exploration to contribute towards the evaluation of CT-SAD-A for adoption into Child and Adolescent Mental Health Services (CAMHS).
Method:
We used interpretative phenomenological analysis (IPA) to analyse the transcripts of interviews with a sample of six young people, six parents and seven clinicians who were learning the treatment.
Results:
Three cross-cutting themes were identified: (i) endorsing the treatment; (ii) finding therapy to be collaborative and active, challenging but helpful; and (iii) navigating change in a complex setting. Young people and parents found the treatment to be useful and acceptable, although simultaneously challenging. This was echoed by the clinicians, with particular reference to integrating CT-SAD-A within community CAMHS settings.
Conclusions:
The acceptability of the treatment with young people, their parents and clinicians suggests further work is warranted in order to support its development and implementation within CAMHS settings.
Food security and school attendance are both important for health, well-being and academic performance of children and adolescents. However, their intersection remains underexamined, especially in the USA. The current study considered the association between elementary school-level absenteeism and household food insecurity.
Design:
The current study linked school-level absenteeism and household food insecurity rates using geographic information system mapping and applied the tobit regression model to examine their association.
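Because attendance rates are bounded (they cannot exceed 100 %), a tobit model treats observations at the bound as censored. Below is a minimal, hand-rolled sketch of a right-censored tobit likelihood in Python; the variable names and the censoring limit of 100 % are my assumptions for illustration, not details taken from the study.

```python
# Hedged sketch of a right-censored tobit regression (attendance capped at
# 100%), estimated by maximizing the censored-normal log-likelihood.
# Not the study's code; column names and the limit are illustrative.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def tobit_negloglik(params, X, y, upper=100.0):
    beta, log_sigma = params[:-1], params[-1]
    sigma = np.exp(log_sigma)          # keep sigma positive
    xb = X @ beta
    cens = y >= upper                  # observations at the ceiling
    ll = np.empty_like(y, dtype=float)
    ll[~cens] = norm.logpdf(y[~cens], loc=xb[~cens], scale=sigma)
    ll[cens] = norm.logsf(upper, loc=xb[cens], scale=sigma)  # P(latent > limit)
    return -ll.sum()

def fit_tobit(X, y):
    x0 = np.zeros(X.shape[1] + 1)
    res = minimize(tobit_negloglik, x0, args=(X, y), method="BFGS")
    return res.x[:-1], np.exp(res.x[-1])   # coefficient estimates, sigma

# Usage (hypothetical): X = [intercept, tract FI rate, covariates], y = average
# daily attendance (%) per school; the FI coefficient is the quantity of interest.
```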
Setting:
The Clark County, Nevada, public school district – the fifth largest in the USA and in a state with disproportionate food insecurity and chronic school absenteeism rates.
Participants:
Data consisted of school-level absenteeism rates from 185 elementary schools and census tract-level household food insecurity rates.
Results:
Average daily attendance rates were lower for schools with catchment areas that had higher average household food insecurity (FI), decreasing by 0·0232 % per 1 % increase in FI rate (P-value = 0·022). They were also significantly associated with most absenteeism risk factors. Average daily attendance rate was negatively associated with Free and Reduced Lunch eligibility percentage (−0·010 % per 1 % increase, P-value < 0·001) and Individualized Education Program participation percentage (−0·039 % per 1 % increase, P-value = 0·033), but positively associated with parent–teacher conference participation rate (0·006 % per 1 % increase, P-value = 0·025) and white student percentage (0·011 % per 1 % increase, P-value = 0·022).
Conclusions:
The current study suggests a link between household food insecurity and elementary school-level absenteeism. Understanding this link is important for policy and practice because schools are frequent settings for food insecurity mitigation interventions.
Media coverage of non-suicidal self-injury (NSSI) ranges from providing helpful education to displaying graphic images. We offer the first research-informed, consensus-based guidelines for the responsible reporting and depicting of NSSI in the media, while also advising on ideas for dissemination and collaboration between media professionals and healthcare experts.
Background: When control mechanisms such as water temperature and biocide level are insufficient, Legionella, the causative bacteria of Legionnaires’ disease, can proliferate in water distribution systems in buildings. Guidance and oversight bodies are increasingly prioritizing water safety programs in healthcare facilities to limit Legionella growth. However, ensuring optimal implementation in large buildings is challenging. Much is unknown, and sometimes assumed, about whether building and campus characteristics influence Legionella growth. We used an extensive real-world environmental Legionella data set in the Veterans Health Administration (VHA) healthcare system to examine infrastructure characteristics and Legionella positivity. Methods: VHA medical facilities across the country perform quarterly potable water sampling of healthcare buildings for Legionella detection as part of a comprehensive water safety program. Results are reported to a standardized national database. We performed an exploratory univariate analysis of facility-reported Legionella data from routine potable water samples taken in 2015 to 2018, in conjunction with infrastructure characteristics available in a separate national data set. The analysis examined the following characteristics: building height (number of floors), building age (reported construction year), and campus acreage. Results: The final data set included 201,936 water samples from 819 buildings. Buildings with 1–5 floors (n = 634) had a Legionella positivity rate of 5.3%, 6–10 floors (n = 104) had a rate of 6.4%, 11–15 floors (n = 36) had a rate of 8.1%, and 16–22 floors (n = 9) had a rate of 8.8%. All rates were significantly different from each other except 11–15 floors and 16–22 floors (P < .05, χ2). The oldest buildings (1800s) had significantly lower (P < .05, χ2) Legionella positivity than those built between 1900 and 1939 and between 1940 and 1979, but they were no different from the newest buildings (Fig. 1). In newer buildings (1980–2019), all decades had buildings with Legionella positivity (Fig. 1 inset). Campus acreage varied from ~3 acres to almost 500 acres. Although significant differences were found in Legionella positivity for different campus sizes, there was no clear trend, and campus acreage may not be a suitable proxy for the extent or complexity of water systems feeding buildings. Conclusions: The analysis of this large, real-world data set supports the assumption that taller buildings are more likely to be associated with Legionella detection, perhaps as a result of more extensive piping. In contrast, the assumption that newer buildings are less associated with Legionella was not fully supported. These results demonstrate the variability in Legionella positivity in buildings, and they also provide evidence that can inform implementation of water safety programs.
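A minimal sketch of the kind of pairwise comparison reported here (my reconstruction, with illustrative counts rather than the study's raw data): a χ² test on a 2×2 table of positive versus negative samples for two building strata.

```python
# Hedged sketch of a pairwise chi-square comparison of Legionella positivity
# between two building strata. Counts are illustrative only; the study
# compared sample-level positivity across height and construction-year groups.
from scipy.stats import chi2_contingency

def compare_positivity(pos_a: int, n_a: int, pos_b: int, n_b: int):
    table = [[pos_a, n_a - pos_a],
             [pos_b, n_b - pos_b]]
    chi2, p, dof, expected = chi2_contingency(table)
    return chi2, p

# e.g. compare_positivity(pos_a=530, n_a=10_000, pos_b=640, n_b=10_000)
# tests whether hypothetical 5.3% and 6.4% positivity rates differ significantly.
```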
Funding: None
Disclosures: Chetan Jinadatha, Principal Investigator/Co-I: Research: NIH/NINR, AHRQ, NSF. Principal Investigator: Research: Xenex Healthcare Services. Funds provided to institution. Inventor: Methods for organizing the disinfection of one or more items contaminated with biological agents. Owner: Department of Veterans Affairs. Licensed to Xenex Disinfection System, San Antonio, TX.
OBJECTIVES/GOALS: The detection of liver fibrotic changes at an early and reversible stage is essential to prevent its progression to end-stage cirrhosis and hepatocellular carcinoma. Liver biopsy, the current gold standard for fibrosis assessment, is accompanied by several complications due to its invasive nature, in addition to sampling errors and reader variability. In this study, we evaluate the use of quantitative parameters extracted from hybrid ultrasound and photoacoustic imaging to detect and monitor fibrotic changes in a DEN rat model. METHODS/STUDY POPULATION: Liver fibrotic changes were induced in 34 Wistar male rats by oral administration of diethylnitrosamine (DEN) for 12 weeks. Twenty-two rats were imaged with B-mode ultrasound at 3 time points (baseline, 10 weeks, and 13 weeks) to monitor liver texture changes. Texture features studied included tissue echointensity (liver brightness normalized to kidney brightness) and tissue heterogeneity. Twelve rats were imaged with photoacoustic imaging at 4 time points (baseline, 5 weeks, 10 weeks, and 13 weeks) to assess changes in tissue oxygenation. Hemoglobin oxygen saturation (sO2A) and hemoglobin concentration (HbT) in the right and left lobes of the liver were measured. Eight rats were used as controls. Liver tissue samples were obtained 13 weeks after the DEN start time for METAVIR histopathology staging of fibrosis. RESULTS/ANTICIPATED RESULTS: The texture features studied increased with time in DEN rats. Normalized echointensity increased from 0.28 ± 0.06 at baseline to 0.46 ± 0.10 at 10 weeks (p < 0.0005) and 0.53 ± 0.15 at 13 weeks (p < 0.0005). In the control rats, echointensity remained at an average of 0.25 ± 0.05 (p = 0.31). Tissue heterogeneity increased over time in the DEN-exposed rats from a baseline of 208.7 ± 58.3 to 344.6 ± 52.9 at 10 weeks (p < 0.0005) and 376.8 ± 54.9 at 13 weeks (p = 0.06), whereas it stayed constant at 225.7 ± 37.6 in control rats (p = 0.58). Quantitative analyses of the photoacoustic signals showed that blood oxygen saturation significantly increased with time. At 5 weeks, sO2AvT had increased by 53.83 % (± 0.25) and HbT by 35.31 % (± 0.07); following 10 weeks of DEN, sO2AvT had increased by 92.04 % (± 0.29) and HbT by 55.24 % (± 0.1). All increases were significant (p < 0.05). At 13 weeks, however, the values of these parameters were lower than at 10 weeks, although the decrease was not statistically significant. DISCUSSION/SIGNIFICANCE OF IMPACT: Quantitative features from B-mode ultrasound and photoacoustic imaging consistently increased over time as hepatic damage, inflammation, and fibrosis progressed. The use of this hybrid imaging method in clinical practice can help meet the significant need for noninvasive assessment of liver fibrosis.
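As one concrete reading of the two B-mode texture features, the sketch below computes echointensity as the mean liver ROI brightness normalized to a kidney ROI and heterogeneity as the intensity variance within the liver ROI; these definitions are my assumptions for illustration and may differ from the study's exact implementation.

```python
# Hedged sketch of the B-mode texture features described above:
# normalized echointensity = mean liver ROI brightness / mean kidney ROI
# brightness; heterogeneity = intensity variance within the liver ROI.
# The study's exact definitions may differ.
import numpy as np

def normalized_echointensity(liver_roi: np.ndarray, kidney_roi: np.ndarray) -> float:
    return float(liver_roi.mean() / kidney_roi.mean())

def tissue_heterogeneity(liver_roi: np.ndarray) -> float:
    return float(liver_roi.var())

# Usage (hypothetical pixel arrays): both features are computed per scan and
# then tracked across the baseline, 10-week and 13-week imaging time points.
```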
Stars form in clusters, while planets form in gaseous disks around young stars. Cluster dissolution occurs on longer time scales than disk dispersal. Planet formation thus typically takes place while the host star is still inside the cluster. We explore how the presence of other stars affects the evolution of circumstellar disks. Our numerical approach requires multi-scale and multi-physics simulations where the relevant components and their interactions are resolved. The simulations start with the collapse of a turbulent cloud, from which stars with disks form, which are able to influence each other. We focus on the effect of extinction due to residual cloud gas on the early evolution of circumstellar disks. We find that this extinction protects circumstellar disks against external photoevaporation, but these disks then become vulnerable to dynamic truncation by passing stars. We conclude that circumstellar disk evolution is heavily affected by the early evolution of the cluster.
Opioids are more commonly prescribed for chronic pain in rural settings in the USA, yet little is known about how the rural context influences efforts to improve opioid medication management.
Methods:
The Six Building Blocks is an evidence-based program that guides primary care practices in making system-based improvements in managing patients using long-term opioid therapy. It was implemented at 6 rural and rural-serving organizations with 20 clinic locations over a 15-month period. To gain further insight about their experience with implementing the program, interviews and focus groups were conducted with staff and clinicians at the six organizations at the end of the 15 months and transcribed. Team members used a template analysis approach, a form of qualitative thematic analysis, to code these data for barriers, facilitators, and corresponding subcodes.
Results:
Facilitators to making systems-based changes in opioid management within a rural practice context included a desire to help patients and their community, external pressures to make changes in opioid management, a desire to reduce workplace stress, external support for the clinic, supportive clinic leadership, and receptivity of patients. Barriers to making changes included competing demands on clinicians and staff, a culture of clinician autonomy, inadequate data systems, and a lack of patient resources in rural areas.
Discussion:
The barriers and facilitators identified here point to potentially unique determinants of practice that should be considered when addressing opioid prescribing for chronic pain in the rural setting.
We correlated antibiotic consumption measured by point prevalence survey with defined daily doses (DDD) across multiple hospitals. Point prevalence survey had a higher correlation (1) with monthly DDDs than annual DDDs, (2) in nonsurgical versus surgical wards, and (3) on high- versus low-utilization wards. Findings may be hospital specific owing to differences between hospitals.
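A minimal sketch of the correlation exercise with hypothetical ward-level data: pair each ward's point prevalence survey count with its monthly and annual DDDs and compare the two correlation coefficients. The column names and the use of Spearman's rho are assumptions for illustration; the summary above does not state which coefficient was used.

```python
# Hedged sketch of a ward-level correlation between point prevalence survey
# (PPS) measurements and DDD-based consumption. Column names and Spearman's
# rho are illustrative assumptions.
import pandas as pd
from scipy.stats import spearmanr

def pps_vs_ddd(df: pd.DataFrame) -> dict:
    # df columns assumed: 'pps_count', 'monthly_ddd', 'annual_ddd', one row per ward
    r_month, p_month = spearmanr(df["pps_count"], df["monthly_ddd"])
    r_year, p_year = spearmanr(df["pps_count"], df["annual_ddd"])
    return {"monthly": (r_month, p_month), "annual": (r_year, p_year)}
```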
The Pueblo population of Chaco Canyon during the Bonito Phase (AD 800–1130) employed agricultural strategies and water-management systems to enhance food cultivation in this unpredictable environment. Scepticism concerning the timing and effectiveness of this system, however, remains common. Using optically stimulated luminescence dating of sediments and LiDAR imaging, the authors located Bonito Phase canal features at the far west end of the canyon. Additional ED-XRF and strontium isotope (87Sr/86Sr) analyses confirm the diversion of waters from multiple sources during Chaco’s occupation. The extent of this water-management system raises new questions about social organisation and the role of ritual in facilitating responses to environmental unpredictability.
Almost 40 years ago Peter deLeon, editor of the journal Policy Sciences, made the following observation:
Throughout the government and private sectors, one hardly finds any office that does not have a staff ‘policy analyst’. Newly graduated baccalaureates engrave that title on their business cards and many senior government officials view themselves primarily as analysts…Clearly, policy analysis can be seen as a growth stock. Yet the pervasiveness of the genre leads one to question the heritage, present condition, and future of the discipline and the profession. (deLeon, 1981, p. 1)
Most of what deLeon wrote in his 1981 editorial remains true today. Although the number of people whose business cards proclaim them to be policy analysts is very difficult to determine, it is conceivable that in both Canada and the United States their numbers approach those for physicians or lawyers. The number of policy analysts has surely grown quite significantly since deLeon described policy analysis as a “growth stock”. However, the strong hint of scepticism that creeps into his conclusion is not entirely fair. I argue that the policy analysis profession is at least as influential as deLeon and other leaders of what was known as the policy sciences movement hoped it would become, but in ways that they did not expect and that probably would have disappointed them.
Even the approximate size of the policy analysis community in Canada is unknown (Howlett, 2009). In this respect, it is quite different from the medical and legal professions which have about 80,000 (CMA, 2017) and 95,000 (FLSC, 2014) members, respectively. Unlike these professions and such others as accountants, engineers, teachers, and nurses, there is no required certification before one can be recognized by others as a policy analyst. This, of course, has to do with the fact that the policy analysis profession is not linked to any particular discipline. Someone whose business card proclaims him or her to be a policy analyst may have training in economics, criminology, public health, women's studies, international security studies or any number of disciplinary backgrounds, some of which are by their very nature multidisciplinary.
Healthy adults (n 30) participated in a placebo-controlled, randomised, double-blinded, cross-over study consisting of two 28 d treatments (β2-1 fructan or maltodextrin; 3×5 g/d) separated by a 14-d washout. Subjects provided 1 d faecal collections at days 0 and 28 of each treatment. The ability of faecal bacteria to metabolise β2-1 fructan was common; eighty-seven species (thirty genera and four phyla) were isolated using anaerobic medium containing β2-1 fructan as the sole carbohydrate source. β2-1 Fructan altered the faecal community as determined through analysis of terminal restriction fragment length polymorphisms and 16S rRNA genes. Supplementation with β2-1 fructan reduced faecal community richness, and two patterns of community change were observed. In most subjects, β2-1 fructan reduced the content of phylotypes aligning within the Bacteroides, while increasing those aligning within bifidobacteria, Faecalibacterium and the family Lachnospiraceae. In the remaining subjects, supplementation increased the abundance of Bacteroidetes and, to a lesser extent, bifidobacteria, accompanied by decreases within Faecalibacterium and the family Lachnospiraceae. β2-1 Fructan had no impact on the metagenome or glycoside hydrolase profiles in faeces from four subjects. Few relationships were found between the faecal bacterial community and various host parameters: Bacteroidetes content correlated with faecal propionate; subjects whose faecal community contained higher Bacteroidetes produced more caproic acid independent of treatment; and subjects having lower faecal Bacteroidetes exhibited increased concentrations of serum lipopolysaccharide and lipopolysaccharide-binding protein independent of treatment. We found no evidence to support a defined health benefit for the use of β2-1 fructans in healthy subjects.
Palliative care for nursing home residents with advanced dementia is often sub-optimal due to poor communication and limited care planning. In a cluster randomized controlled trial, registered nurses (RNs) from 10 nursing homes were trained and funded to work as Palliative Care Planning Coordinators (PCPCs) to organize family case conferences and mentor staff. This qualitative sub-study aimed to explore PCPC and health professional perceptions of the benefits of facilitated case conferencing and identify factors influencing implementation.
Method:
Semi-structured interviews were conducted with the RNs in the PCPC role, other members of nursing home staff, and physicians who participated in case conferences. Analysis was conducted by two researchers using a thematic framework approach.
Results:
Interviews were conducted with 11 PCPCs, 18 other nurses, eight allied health workers, and three physicians. Perceived benefits of facilitated case conferencing included better communication between staff and families, greater multi-disciplinary involvement in case conferences and care planning, and improved staff attitudes and capabilities for dementia palliative care. Key factors influencing implementation included: staffing levels and time; support from management, staff and physicians; and positive family feedback.
Conclusion:
The facilitated approach explored in this study addressed known barriers to case conferencing. However, current business models in the sector make it difficult to resource case conferencing with the required levels of nursing qualification, training, and time. A collaborative nursing home culture and ongoing relationships with health professionals are also prerequisites for success. Further studies should document resident and family perceptions to harness consumer advocacy.
We argue that the CLASH model makes a number of questionable assumptions about the harshness and unpredictability of low-latitude environments, calling into question the life history strategy approach used, and that it is inconsistent with more nuanced global patterns of violence. We suggest an alternative account for less violence at high latitudes, based on a greater need for cooperation.
Late glacial and early Holocene summer temperatures were reconstructed from fossil chironomid assemblages at Lake Brazi (Retezat Mountains) using a joint Norwegian–Swiss transfer function, providing an important addition to the late glacial quantitative climate reconstructions from Europe. The pattern of late glacial temperature changes at Lake Brazi shows both similarities to and some differences from the NGRIP δ18O record and other European chironomid-based reconstructions. Our reconstruction indicates that at Lake Brazi (1740 m a.s.l.) summer air temperature increased by ~2.8°C at the Oldest Dryas/Bølling transition (GS-2/GI-1) and reached 8.1–8.7°C during the late glacial interstade. The onset of the Younger Dryas (GS-1) was characterized by a weak (<1°C) decrease in chironomid-inferred temperatures. Similarly, at the GS-1/Holocene transition no major changes in summer temperature were recorded. In the early Holocene, summer temperature increased in two steps and reached ~12.0–13.3°C during the Preboreal. Two short-term cold events were detected during the early Holocene, between 11,480–11,390 and 10,350–10,190 cal yr BP. The first cooling coincides with the Preboreal oscillation and shows a weak (0.7°C) temperature decrease, while the second is characterized by a 1°C cooling. Both cold events coincide with cooling events in the Greenland ice core records and other European temperature reconstructions.
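For readers unfamiliar with chironomid-based inference, the sketch below shows the core of a simple weighted-averaging (WA) transfer function; it illustrates the general principle only and is not the joint Norwegian–Swiss model applied here, which is more sophisticated and includes cross-validation, deshrinking, and error estimation.

```python
# Hedged sketch of a simple weighted-averaging (WA) transfer function for
# temperature reconstruction from taxon abundances. Textbook principle only,
# not the joint Norwegian-Swiss model used in the study.
import numpy as np

def wa_optima(train_abund: np.ndarray, train_temp: np.ndarray) -> np.ndarray:
    # Taxon optimum = abundance-weighted mean of observed lake temperatures.
    # train_abund: (n_lakes, n_taxa); train_temp: (n_lakes,)
    w = train_abund / train_abund.sum(axis=0, keepdims=True)
    return (w * train_temp[:, None]).sum(axis=0)

def wa_reconstruct(fossil_abund: np.ndarray, optima: np.ndarray) -> np.ndarray:
    # Inferred temperature = abundance-weighted mean of taxon optima per sample.
    # fossil_abund: (n_samples, n_taxa)
    w = fossil_abund / fossil_abund.sum(axis=1, keepdims=True)
    return w @ optima
```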