We reviewed published research on natural hazards and community disaster resilience to identify how relationships between people and their experiences of disaster interact to shape possibilities for positive transformative change. Research commonly analyzes processes within and across individual and collective or structural spheres of a social system, but rarely investigates interactions across all three. We present a framework focused on ‘spheres of influence’ to address this. The Framework shows how positive relationships that prioritize restoring shared, meaningful and purposeful identities can lead to expansive and incremental capacity for transformative outcomes for sustainability: a process we liken to the butterfly effect.
Technical Summary
Sustainability and disaster resilience frameworks commonly neglect the role of agentive social processes in influencing wider structural transformation for sustainability. We applied relational agency and social practice theory to conceptualize transformative pathways for enhanced sustainability through a review of peer-reviewed literature relating to natural hazards and community disaster resilience. We sought to answer two questions: 1. What are the social practices that influence transformative change for disaster resilience in the context of individual, collective and structural spheres of influence? 2. What are the social influencing processes involved, identified through relational agency? We found that empirical studies tend to focus on individual and collective or structural spheres but rarely offer a relational analysis across all three. Our findings highlight that positive relationships that prioritize restoring shared, meaningful and purposeful identities can act as a resource, which can lead to expansive and incremental transformative outcomes for sustainability: a process we liken to the butterfly effect. We present a Sphere of Influence Framework that highlights socialized practices influenced by relationality, which can be applied as a strategic planning tool to increase capacity for resilience. Future research should explore how socio-political practices (the structural sphere) influence distributed power within collective and individual spheres.
Social media summary
Disasters can generate extraordinary social dynamics. So, how can we optimize these dynamics for enhanced sustainability?
For an increasing proportion of Australian households, the Australian dream of home ownership is no longer an option. Neoliberal housing policy and the financialisation of housing have resulted in a housing affordability crisis. Historically, Australian housing policy has afforded only a limited role to local government. This article analyses the results of a nation-wide survey of Australian local governments’ perceptions of housing affordability in their local government area, the possibilities for their meaningful intervention, the challenges they face, the role of councillors and councils’ perceptions of which levels of government should take responsibility for housing. Almost all of the respondents from Sydney and Melbourne councils were clear that there is a housing affordability crisis in their local government area. We apply a framework analysing housing policy in the context of neoliberalism and the related financialisation of housing in order to analyse the housing affordability crisis in Sydney and Melbourne. We conclude that in order to begin resolving the housing crisis in Australia’s two largest cities there has to be an increased role for local government, a substantial increase in the building of social and affordable housing and a rollback of policies that encourage residential property speculation.
This book presents a novel interpretation of the nature, causes and consequences of sex inequality in the modern labour market. Employing a sophisticated new theoretical framework, and drawing on original fieldwork, the book develops a subtle account of the phenomenon of sex segregation and offers a major challenge to existing approaches.
During the COVID-19 pandemic, the antimicrobial stewardship module in our electronic medical record was reconfigured for the management of COVID-19 patients. This change allowed our subspecialist providers to review charts quickly to optimize potential therapy and management during the patient surge.
New approaches are needed to safely reduce emergency admissions to hospital by targeting interventions effectively in primary care. A predictive risk stratification tool (PRISM) identifies each registered patient's risk of an emergency admission in the following year, allowing practitioners to identify and manage those at higher risk. We evaluated the introduction of PRISM in primary care in one area of the United Kingdom, assessing its impact on emergency admissions and other service use.
METHODS:
We conducted a randomized stepped wedge trial with cluster-defined control and intervention phases, and participant-level anonymized linked outcomes. PRISM was implemented in eleven primary care practice clusters (total thirty-two practices) over a year from March 2013. We analyzed routine linked data outcomes for 18 months.
RESULTS:
We included outcomes for 230,099 registered patients, assigned to ranked risk groups.
Overall, the rate of emergency admissions was higher in the intervention phase than in the control phase: adjusted difference in number of emergency admissions per participant per year at risk, delta = .011 (95 percent Confidence Interval, CI .010, .013). Patients in the intervention phase spent more days in hospital per year: adjusted delta = .029 (95 percent CI .026, .031). Both effects were consistent across risk groups.
Primary care activity increased in the intervention phase overall: delta = .011 (95 percent CI .007, .014), except for the two highest risk groups, which showed a decrease in the number of days with recorded activity.
CONCLUSIONS:
Introduction of a predictive risk model in primary care was associated with increased emergency episodes across the general practice population and at each risk level, in contrast to the intended purpose of the model. Future evaluation work could assess the impact of targeting of different services to patients across different levels of risk, rather than the current policy focus on those at highest risk.
Emergency admissions to hospital are a major financial burden on health services. In one area of the United Kingdom (UK), we evaluated a predictive risk stratification tool (PRISM) designed to support primary care practitioners to identify and manage patients at high risk of admission. We assessed the costs of implementing PRISM and its impact on health services costs. At the same time as the study, but independent of it, an incentive payment (‘QOF’) was introduced to encourage primary care practitioners to identify high risk patients and manage their care.
METHODS:
We conducted a randomized stepped wedge trial in thirty-two practices, with cluster-defined control and intervention phases, and participant-level anonymized linked outcomes. We analysed routine linked data on patient outcomes for 18 months (February 2013 – September 2014). We assigned standard unit costs in pound sterling to the resources utilized by each patient. Cost differences between the two study phases were used in conjunction with differences in the primary outcome (emergency admissions) to undertake a cost-effectiveness analysis.
RESULTS:
We included outcomes for 230,099 registered patients. We estimated a PRISM implementation cost of GBP0.12 per patient per year.
Costs of emergency department attendances, outpatient visits, emergency and elective admissions to hospital, and general practice activity were higher per patient per year in the intervention phase than in the control phase (adjusted delta = GBP76, 95 percent Confidence Interval, CI GBP46, GBP106), an effect that was consistent across and generally increased with risk level.
CONCLUSIONS:
Despite low reported use of PRISM, it was associated with increased healthcare expenditure. This effect was unexpected and in the opposite direction to that intended. We cannot disentangle the effects of introducing the PRISM tool from those of imposing the QOF targets; however, since across the UK predictive risk stratification tools for emergency admissions have been introduced alongside incentives to focus on patients at risk, we believe that our findings are generalizable.
A predictive risk stratification tool (PRISM) to estimate a patient's risk of an emergency hospital admission in the following year was trialled in general practice in an area of the United Kingdom. PRISM's introduction coincided with a new incentive payment (‘QOF’) in the regional contract for family doctors to identify and manage the care of people at high risk of emergency hospital admission.
METHODS:
Alongside the trial, we carried out a complementary qualitative study of processes of change associated with PRISM's implementation. We aimed to describe how PRISM was understood, communicated, adopted, and used by practitioners, managers, local commissioners and policy makers. We gathered data through focus groups, interviews and questionnaires at three time points (baseline, mid-trial and end-trial). We analyzed data thematically, informed by Normalisation Process Theory (1).
RESULTS:
All groups showed high awareness of PRISM, but raised concerns about whether it could identify patients not yet known, and about whether there were sufficient community-based services to respond to care needs identified. All practices reported using PRISM to fulfil their QOF targets, but after the QOF reporting period ended, only two practices continued to use it. Family doctors said PRISM changed their awareness of patients and focused them on targeting the highest-risk patients, though they were uncertain about the potential for positive impact on this group.
CONCLUSIONS:
Though external factors supported its uptake in the short term, with a focus on the highest risk patients, PRISM did not become a sustained part of normal practice for primary care practitioners.
Depression is a prevalent long-term condition that is associated with substantial resource use. Telehealth may offer a cost-effective means of supporting the management of people with depression.
Aims
To investigate the cost-effectiveness of a telehealth intervention (‘Healthlines’) for patients with depression.
Method
A prospective patient-level economic evaluation conducted alongside a randomised controlled trial. Patients were recruited through primary care, and the intervention was delivered via a telehealth service. Participants with a confirmed diagnosis of depression and PHQ-9 score ≥10 were recruited from 43 English general practices. A series of up to 10 scripted, theory-led, telephone encounters with health information advisers supported participants to effect a behaviour change, use online resources, optimise medication and improve adherence. The intervention was delivered alongside usual care and was designed to support rather than duplicate primary care. Cost-effectiveness from a combined health and social care perspective was measured by net monetary benefit at the end of 12 months of follow-up, calculated from incremental cost and incremental quality-adjusted life years (QALYs). Cost–consequence analysis included cost of lost productivity, participant out-of-pocket expenditure and the clinical outcome.
Results
A total of 609 participants were randomised – 307 to receive the Healthlines intervention plus usual care and 302 to receive usual care alone. Forty-five per cent of participants had missing quality of life data, 41% had missing cost data and 51% of participants had missing data on either cost or utility, or both. Multiple imputation was used for the base-case analysis. The intervention was associated with incremental mean per-patient National Health Service/personal social services cost of £168 (95% CI £43 to £294) and an incremental QALY gain of 0.001 (95% CI −0.023 to 0.026). The incremental cost-effectiveness ratio was £132 630. Net monetary benefit at a cost-effectiveness threshold of £20 000 was –£143 (95% CI –£164 to –£122) and the probability of the intervention being cost-effective at this threshold value was 0.30. Productivity costs were higher in the intervention arm, but out-of-pocket expenses were lower.
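The incremental cost-effectiveness ratio and net monetary benefit reported above follow from two standard formulas: ICER = incremental cost / incremental QALYs, and NMB = threshold × incremental QALYs − incremental cost. A minimal sketch of that arithmetic, with the QALY gain back-calculated from the reported ICER because the abstract rounds it to 0.001:

```python
# Worked illustration of the cost-effectiveness arithmetic reported above.
# Point estimates are taken from the trial; the QALY gain is back-calculated
# from the reported ICER (~0.00127) since the abstract rounds it to 0.001.

incremental_cost = 168.0             # GBP, intervention minus usual care
incremental_qaly = 168.0 / 132630    # back-calculated from the reported ICER
threshold = 20000.0                  # GBP per QALY cost-effectiveness threshold

icer = incremental_cost / incremental_qaly
nmb = threshold * incremental_qaly - incremental_cost

print(f"ICER: GBP {icer:,.0f} per QALY")        # GBP 132,630
print(f"Net monetary benefit: GBP {nmb:,.0f}")  # about -GBP 143
```

A negative net monetary benefit at the chosen threshold is what drives the conclusion that the intervention is unlikely to be cost-effective.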
Conclusions
The Healthlines service was acceptable to patients as a means of condition management, and response to treatment after 4 months was higher for participants randomised to the intervention. However, the positive average intervention effect size was modest, and incremental costs were high relative to a small incremental QALY gain at 12 months. The intervention is not likely to be cost-effective in its current form.
Acting on harmful command hallucinations is a major clinical concern. Our COMMAND CBT trial approximately halved the rate of harmful compliance (OR = 0.45, 95% CI 0.23–0.88, p = 0.021). The therapy focused on a single mechanism, the power dimension of voice appraisal, which was also significantly reduced. We hypothesised that voice power differential (between voice and voice hearer) was the mediator of the treatment effect.
Methods
The trial sample (n = 197) was used. A logistic regression model predicting 18-month compliance was used to identify predictors, and an exploratory principal component analysis (PCA) of baseline variables was conducted to identify potential predictors (confounders) in their own right. Stata's paramed command was used to obtain estimates of the direct, indirect and total effects of treatment.
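To make the direct/indirect/total decomposition concrete: a mediation analysis splits the total treatment effect into a direct path (treatment → outcome) and an indirect path through the mediator (treatment → mediator → outcome). The sketch below uses the product-of-coefficients approximation on the log-odds scale with invented coefficients purely for illustration; the study itself used Stata's paramed with counterfactual definitions.

```python
# Hypothetical sketch of the decomposition a mediation analysis produces.
# All coefficient values below are invented for illustration only.

a = -1.2         # effect of CBT on the mediator (voice power differential)
b = 0.6          # effect of the mediator on log-odds of harmful compliance
c_prime = -0.05  # direct effect of CBT on compliance, bypassing the mediator

indirect = a * b            # mediated (indirect) effect: -0.72
total = c_prime + indirect  # total effect on the log-odds scale
proportion_mediated = indirect / total

print(f"indirect={indirect:.2f}, total={total:.2f}, "
      f"proportion mediated={proportion_mediated:.2f}")
```

A proportion mediated near 1, as in this illustration, corresponds to the paper's finding that the treatment effect was fully explained by the hypothesised mediator.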
Results
Voice omnipotence was the best predictor although the PCA identified a highly predictive cognitive-affective dimension comprising: voices’ power, childhood trauma, depression and self-harm. In the mediation analysis, the indirect effect of treatment was fully explained by its effect on the hypothesised mediator: voice power differential.
Conclusion
Voice power and treatment allocation were the best predictors of harmful compliance up to 18 months; post-treatment, voice power differential measured at nine months was the mediator of the effect of treatment on compliance at 18 months.
A glyphosate-resistant Palmer amaranth biotype was confirmed in central Georgia. In the field, glyphosate applied to 5- to 13-cm-tall Palmer amaranth at three times the normal use rate of 0.84 kg ae ha−1 controlled this biotype only 17%. The biotype was controlled 82% by glyphosate at 12 times the normal use rate. In the greenhouse, I50 values (rate necessary for 50% inhibition) for visual control and shoot fresh weight, expressed as percentage of the nontreated, were 8 and 6.2 times greater, respectively, with the resistant biotype compared with a known glyphosate-susceptible biotype. Glyphosate absorption and translocation and the number of chromosomes did not differ between biotypes. Shikimate was detected in leaf tissue of the susceptible biotype treated with glyphosate but not in the resistant biotype.
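An I50 is typically estimated by fitting a log-logistic dose-response curve, but the idea can be shown with simple log-linear interpolation between the two doses that bracket 50% response. The data points below are invented for demonstration and are not from the study:

```python
import math

# Hypothetical illustration of reading an I50 (the rate giving 50% of
# nontreated shoot fresh weight) off a dose-response series.
# All data values are invented for demonstration, not from the study.

doses = [0.42, 0.84, 1.68, 3.36, 6.72]     # glyphosate rate, kg ae/ha
response = [95.0, 88.0, 70.0, 45.0, 22.0]  # % of nontreated fresh weight

def interpolate_i50(doses, response, target=50.0):
    """Log-linear interpolation between the doses bracketing the target."""
    for i in range(len(doses) - 1):
        if response[i] >= target >= response[i + 1]:
            frac = (response[i] - target) / (response[i] - response[i + 1])
            log_dose = (math.log(doses[i])
                        + frac * (math.log(doses[i + 1]) - math.log(doses[i])))
            return math.exp(log_dose)
    raise ValueError("target response not bracketed by the data")

i50_resistant = interpolate_i50(doses, response)
print(f"I50 (illustrative): {i50_resistant:.2f} kg ae/ha")
```

With a susceptible-biotype I50 estimated the same way, the resistance ratio is simply the resistant I50 divided by the susceptible I50 (reported in the abstract as 6.2- to 8-fold).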
Conservation planning often relies on measures such as species richness and abundance to prioritize areas for protection. Nonetheless, alternative metrics such as functional traits have recently been shown to be useful complementary measures for detecting biological change. Timely conservation planning often precludes the collection of such detailed biological data, relying instead on remotely sensed habitat mapping as a surrogate for diversity. While there is evidence that habitat maps may predict taxonomic species richness and diversity in some coastal ecosystems, it is unknown whether similarly strong relationships exist for functional traits and functional multimetrics. We compared the performance of physical habitat structural complexity obtained from high-definition swath mapping in explaining variation in traditional taxonomic metrics as well as functional traits (e.g., maximum length, trophic level, gregariousness) and functional multimetrics (e.g., functional richness, dispersion) of fish assemblages. Reef complexity measures were good surrogates for fish species richness and abundance but not for functional traits or multimetrics, except functional richness at the scale of 1 m. Remotely sensed habitat maps may not be a good surrogate for predicting functional traits and multimetrics of fish assemblages, and must be used with caution when maximizing such aspects of assemblages is a priority for conservation planning.
The health of the people is the highest law. (Cicero, De Legibus, circa 40 BC)
As Jane Austen might have observed, it is a truth universally acknowledged that a government in pursuit of its role will seek to influence the behaviour of its populace. She might have added that this includes behaviours affecting the safety, health and wellbeing of individuals, communities, workforces and society at large. Governments exercise their influence in a variety of ways: through regulation by passing laws; persuasion using social marketing campaigns; incentivisation using fiduciary instruments and other financial inducements; education providing knowledge and skills; the provision of services; ecological changes to the physical, social and economic environments; and many others. Some of these levers are applied at national or international level, others at regional and local level. Many are discussed – and often criticised and questioned – during the various chapters of this book.
This chapter explores some of the ways in which governments use legislation to directly regulate individual behaviour in order to safeguard and promote the health of the people. We consider how the public responds to this approach, and we weigh up some of its benefits and pitfalls.
An historical perspective
Public health and regulation targeted at individuals are familiar bedfellows. They share a long history. Enforced restrictions on personal freedoms have historically been seen as a relatively quick and simple way to ensure changes in behaviour to protect the health and wellbeing of individuals and communities. The earliest examples were almost entirely concerned with health protection, personal and community safety, avoidance of environmental hazards and the prevention and control of communicable disease. In ancient times, religious rites about hand and foot washing and disposal of the dead through burial, embalming or funeral pyre had spin-off benefits for hygiene and health. In ancient Babylon, religious rules forbade the digging of wells near burial grounds and midden heaps. Throughout the Middle East and Europe, lepers were forcibly confined to closed colonies from the earliest times. During the Black Death that swept Europe in the 14th century, bubonic plague sufferers were compelled to stay in their homes, as indeed they were even more forcibly during the Great Plague that ravaged Europe two centuries later.
A computer program was developed to analyse the relative amount of EMG activity in an agonist-antagonist pair of muscles while subjects performed voluntary flexion-extension movements at the wrist to track a visual target. The data were presented to the subjects in the form of a vector display, the angle and length of which were determined from calculation of EMG power in the two muscles.
This new approach to EMG biofeedback was evaluated in two hemiplegic patients and three patients with cerebellar incoordination. Over a training period of several weeks, all the subjects were able to modify the pattern of EMG activity in the muscles to reduce the amount of inappropriate coactivation of flexors and extensors and to produce more sustained and regular activation of individual muscle groups.
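The abstract does not give the exact mapping from the two EMG powers to the displayed vector; a plausible sketch is to let the angle encode the flexor/extensor balance and the length encode the overall activity level, so that a 45-degree vector signals the coactivation the training aims to reduce:

```python
import math

# Illustrative mapping from an agonist-antagonist EMG power pair to a
# (angle, length) display vector. The exact mapping used in the study is
# not specified in the abstract; this is one plausible construction.

def emg_vector(power_flexor, power_extensor):
    """Return (angle_deg, length) for a two-muscle EMG power pair.

    0 degrees = pure flexor activity, 90 degrees = pure extensor activity;
    45 degrees therefore indicates full flexor-extensor coactivation.
    Length grows with overall EMG power in either muscle.
    """
    angle = math.degrees(math.atan2(power_extensor, power_flexor))
    length = math.hypot(power_flexor, power_extensor)
    return angle, length

angle, length = emg_vector(power_flexor=3.0, power_extensor=3.0)
print(f"angle={angle:.1f} deg, length={length:.2f}")  # 45.0 deg: coactivation
```

Under this mapping, successful training would show the vector moving away from 45 degrees toward one axis while keeping a steady length, matching the reported reduction in inappropriate coactivation.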
Morphological speech zone asymmetry in man cannot be due to environmental or developmental factors after birth. The functional implication of such a finding is not yet clear. Morphological asymmetry of the human brain is paralleled by electrophysiological evidence of cerebral hemispheric asymmetries. The results of our analysis of 50 infants suggest that clear occipital-temporal coherency asymmetry, similar but not identical to the adult pattern, also exists at or near birth. These asymmetries are generated by stimuli with no verbal content and in infants who presumably have no, or only an undeveloped, capability for language. It is suggested that language is only a part of much more fundamental asymmetries, which include the processing of auditory and visual information. Our results, and those of others, are consistent with the assumption that the left hemisphere is more able to relate stimuli to past experience, either short- or long-term, while the right hemisphere is more able to process stimuli which are not easily identifiable or referable. These capabilities would not be based on language, and hence would be expected to develop independently and possibly before speech. The demonstration that reversing electrophysiological asymmetries can be generated with non-speech stimuli in the visual and auditory modalities, and in neonates, supports such an assumption.
In this chapter, we discuss how social and biological studies of ageing can converge to provide a meaningful framework for progress in both understanding ageing and dealing with it in a positive manner. We start by discussing the meaning of the term ‘ageing’ and how it is in part defined by social context, and then, how psychosocial factors have an impact on both perception and the biological reality of ageing. From a theoretical perspective, we assess how ageing might have evolved, and how it is measured. The biological impacts of ageing are then described, moving from individual cells through tissues to major organ systems (immune, cardiovascular and nervous systems) (see Figure 2.1). What causes individual cells of the body to age is dealt with at both a cellular and molecular level, and we further discuss how studies of both extremely long-lived and short-lived humans have contributed significantly not only to our understanding of the biological processes of ageing, but also to the possibility of developing therapies to deal with the problems that cause greatest loss of quality of life in older age. We end by assessing the ethical case for intervening in those biological processes underpinning the development of those illnesses that so undermine health in later life.
Given the enormous scope and breadth of material that is covered, and the wide differences in perspective and language used by the diverse disciplines that contribute to this chapter, we have tried to avoid jargon wherever possible, and we provide simple definitions of unavoidable terminology as notes at the end of the chapter to assist readers who are not specialists in a particular field.
Not ‘what’ is ageing, but ‘how’?
In order to study ageing in any meaningful way, we need to understand how the term ‘ageing’ is being used. To biologists, it can mean damage to molecules of the cell, and to cells of the tissue; to the physiologist, alterations in organ function; to the clinician, increased frailty and accumulation of diverse diseases. For the older person, ageing is felt and experienced, with changes in physical abilities and social activities and status having both positive and negative effects on the quality of their later life. Ageing is thus not so much a thing, but rather multi-dimensional, underpinned by complex social and biological processes, made up of many different mechanisms.
Objective.
Evaluate antimicrobial stewardship interventions targeted at reducing highly active antiretroviral therapy (HAART)– or opportunistic infection (OI)–related medication errors and increasing error resolution.
Design.
Retrospective before-after study.
Setting.
Academic medical center.
Patients.
Inpatients who were prescribed antiretroviral therapy before the intervention (January 1, 2011, to October 31, 2011) and after the intervention (July 1, 2012, to December 31, 2012). Patients treated with lamivudine or tenofovir monotherapy for hepatitis B were excluded.
Methods.
Antimicrobial stewardship interventions included education, modification of electronic medication records, collaboration with the infectious diseases (ID) department, and prospective audit and review of HAART and OI regimens by an ID clinical pharmacist.
Results.
Data for 162 admissions from the preintervention period and 110 admissions from the postintervention period were included. The number of admissions with a medication error was significantly reduced after the intervention (81 [50%] of 162 admissions vs 37 [34%] of 110 admissions; P < .001). A total of 124 errors occurred in the preintervention group (mean no. of errors, 1.5 per admission), and 43 errors occurred in the postintervention group (mean no. of errors, 1.2 per admission). The most common error types were major drug interactions and dosing in the preintervention group and renal adjustment and OI-related errors in the postintervention group. A significantly higher error resolution rate was observed in the postintervention group (36% vs 74%; P < .001). After adjustment for potential confounders with logistic regression, admission in the postintervention group was independently associated with fewer medication errors (odds ratio, 0.4 [95% confidence interval, 0.24–0.77]; P = .005). Overall, the presence of an ID consultant was associated with a higher error resolution rate (32% without a consultation vs 68% with a consultation; P = .002).
Conclusions.
Multifaceted, multidisciplinary stewardship efforts reduced the rate and increased the overall resolution of HAART-related medication errors.