Gender quotas are used to elect most of the world’s legislatures. Still, critics contend that quotas are undemocratic, eroding institutional legitimacy. We examine whether quotas diminish citizens’ faith in political decisions and decision-making processes. Using survey experiments in 12 democracies with over 17,000 respondents, we compare the legitimacy-conferring effects of both quota-elected and non-quota-elected local legislative councils relative to all-male councils. Citizens strongly prefer gender balance, even when it is achieved through quotas. Though we observe a quota penalty, wherein citizens prefer gender balance attained without a quota over quota-elected institutions, this penalty is often small and insignificant, especially in countries with higher-threshold quotas. Quota debates are thus better framed around the most relevant counterfactual: the comparison is not between women’s descriptive representation with and without quotas, but between men’s political dominance and women’s inclusion.
One oft-cited reason for women's political underrepresentation is that women express less political ambition than men. We reframe the puzzle of women's ambition deficit, asking why men have an ambition surplus. Drawing on the concept of symbolic representation, we theorize that political symbols convey to men their capacity for exceptional political leadership. We test our expectations with a US-based survey experiment in which respondents watch one of three ‘two-minute civics lessons’. Men who watched a video featuring the accomplishments of the Founding Fathers reported significantly more political ambition than men assigned to the control group. Additional studies indicate that the effects are specific to the Founding Fathers (as compared to early American statesmen). Men are also more likely than women to identify the Founding Fathers as inspiring figures and to feel pride when considering them. Our findings suggest that how history is told contributes to men's persistent political overrepresentation.
A much-circulated image during the Donald Trump administration showed Vice President Mike Pence and members of the Republican House Freedom Caucus discussing the removal of maternity coverage from the Affordable Care Act—with not a single woman or person of color among them. In another image, white men watched approvingly as Trump signed an executive order reinstating the global gag rule, which bans foreign nongovernmental organizations that receive American aid from supporting abortion access. These images contrast with one from early in Joe Biden’s presidency. In his first address to Congress, Biden was backed by two women occupying the second- and third-most-powerful positions in the country, Vice President Kamala Harris and Speaker of the House Nancy Pelosi, respectively. After acknowledging “Madame Speaker, Madame Vice President,” Biden said, “No president ever said those words and it is about time.”
Previous work suggests that observing women officeholders increases women’s political ambition. Yet jumps in women’s representation in the United States’ “Years of the Woman”—following the Anita Hill testimonies and the election of Donald Trump—are linked to women’s exclusion from political decision-making. Drawing on focus groups with prospective women candidates, we theorize that exclusion, when combined with a gendered policy threat, increases women’s political ambition. Using survey experiments replicated across different samples, we show that women who read about an all-male city council poised to legislate on women’s rights report increased ambition compared with their pretreatment ambition levels and with women in other treatment groups. Women’s increased sense of political efficacy drives these results. When women’s rights are not under discussion, men’s overrepresentation does not move (or even depresses) women’s ambition. Seeing the policy consequences of their exclusion causes some women to seek a seat at the table.
Politics is increasingly dominated by crises, from pandemics to extreme weather events. These Critical Perspectives essays analyze crises’ gendered implications by focusing on their consequences for women’s descriptive and substantive representation. Covering multiple kinds of crises, including large-scale protests, climate shocks, and war and revolution, the contributions reveal three factors shaping both the theoretical conceptualization and empirical analysis of crisis and women’s representation: (1) the type of crisis, (2) the actors influenced by the crisis, and (3) the aftermath of the crisis. Together, the contributors urge scholars to “think crisis, think gender” far beyond the supply of and demand for women leaders.
As part of a quality improvement project beginning in October 2011, our centre introduced changes to reduce radiation exposure during paediatric cardiac catheterisations. This led to significant initial decreases in radiation to patients. Starting in April 2016, we sought to determine whether these initial reductions were sustained.
Methods:
After a 30-day trial period, we implemented (1) weight-based reductions in preset frame rates for fluoroscopy and angiography, (2) increased use of collimators and safety shields, (3) utilisation of stored fluoroscopy and virtual magnification, and (4) hiring of a dedicated radiation technician. We collected patient weight (kg), total fluoroscopy time (min), and procedure radiation dosage (cGy·cm²) for cardiac catheterisations between October 2011 and September 2019.
Results:
A total of 1889 procedures were evaluated (196 pre-intervention, 303 post-intervention, and 1400 long-term). Fluoroscopy times (18.3 ± 13.6 min pre-intervention; 19.8 ± 14.1 min post-intervention; 17.1 ± 15.1 min long-term; p = 0.782) were not significantly different between the three groups. Patient mean radiation dose per kilogram decreased significantly after the initial quality improvement intervention (39.7% reduction, p = 0.039) and was sustained over the long term (p = 0.043). Provider radiation exposure also decreased significantly from the onset of this project through the long-term period (overall decrease of 73%, p < 0.01), despite several changes in the interventional cardiologists who made up the team over this period.
Conclusion:
Introduction of technical and clinical practice changes can result in a significant reduction in radiation exposure for patients and providers in a paediatric cardiac catheterisation laboratory. These reductions can be maintained over the long term.
Background: Antibiotics targeted against Clostridioides difficile bacteria are necessary, but insufficient, to achieve a durable clinical response because they have no effect on C. difficile spores that germinate within a disrupted microbiome. ECOSPOR-III evaluated SER-109, an investigational, biologically derived microbiome therapeutic of purified Firmicute spores, for the treatment of recurrent C. difficile infection (rCDI). Herein, we present the interim analysis in the intention-to-treat (ITT) population at 8 and 12 weeks. Methods: Adults ≥18 years with rCDI (≥3 episodes in 12 months) were screened at 75 US and Canadian sites. CDI was defined as ≥3 unformed stools per day for <48 hours with a positive C. difficile assay. After completion of 10–21 days of vancomycin or fidaxomicin, adults with symptom resolution were randomized 1:1 to SER-109 (4 capsules × 3 days) or matching placebo and stratified by age (≥ or <65 years) and antibiotic received. Primary objectives were safety and efficacy at 8 weeks. The primary efficacy endpoint was rCDI (recurrent toxin-positive diarrhea requiring treatment); secondary endpoints included efficacy at 12 weeks after dosing. Results: Overall, 287 participants were screened and 182 were randomized (59.9% female; mean age, 65.5 years). The most common reason for screen failure was a negative C. difficile toxin assay. A significantly lower proportion of SER-109 participants had rCDI after dosing compared to placebo at week 8 (11.1% vs 41.3%, respectively; relative risk [RR], 0.27; 95% confidence interval [CI], 0.15–0.51; p < 0.001). Efficacy rates were significantly higher with SER-109 vs placebo in both stratified age groups (Figure 1). SER-109 was well tolerated, with a safety profile similar to placebo. The most common treatment-emergent adverse events (TEAEs) were gastrointestinal and were mainly mild to moderate. No serious TEAEs, infections, deaths, or drug discontinuations were deemed related to the study drug. Conclusions: SER-109, an oral live microbiome therapeutic, achieved high rates of sustained clinical response with a favorable safety profile. By enriching for Firmicute spores, SER-109 achieves high efficacy while mitigating the risk of transmitting infectious agents beyond donor screening alone. SER-109 represents a major paradigm shift in the clinical management of patients with recurrent CDI. Clinicaltrials.gov identifier: NCT03183128. These data were previously presented as a late-breaker at the American College of Gastroenterology 2020 meeting.
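As a back-of-the-envelope check on the reported effect, the relative risk and its confidence interval can be reproduced from the proportions above. The sketch below assumes equal arm sizes of 91 (182 randomized 1:1; the exact split is not stated in the abstract) and the standard log-scale Wald interval; it is an illustration, not the trial's actual analysis.

```python
import math

# Proportions with recurrence at week 8, as reported in the abstract
p_ser109, p_placebo = 0.111, 0.413

# Arm sizes are an assumption: 182 randomized 1:1, exact split not stated
n_ser109 = n_placebo = 91

rr = p_ser109 / p_placebo  # relative risk, ~0.27 as reported

# Standard log-scale Wald 95% CI for a relative risk
se_log_rr = math.sqrt((1 - p_ser109) / (n_ser109 * p_ser109)
                      + (1 - p_placebo) / (n_placebo * p_placebo))
ci_low = math.exp(math.log(rr) - 1.96 * se_log_rr)
ci_high = math.exp(math.log(rr) + 1.96 * se_log_rr)
print(f"RR = {rr:.2f}, 95% CI {ci_low:.2f}-{ci_high:.2f}")  # ~0.27 (0.14-0.51)
```

The resulting interval, roughly 0.14–0.51, closely matches the reported 0.15–0.51, consistent with near-equal arm sizes.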
Mental health problems have a significant impact globally in terms of social and economic costs. Increasing access to and uptake of mental health interventions (particularly by men) remains a challenge for service providers. The current study sought to examine the efficacy of delivering a Stress Control intervention in partnership with a community sporting organisation (the Gaelic Athletic Association, GAA) in ameliorating mental health difficulties in a general population. Measures of anxiety, depression and quality of life were administered before and after the delivery of the 6-week programme. A focus group was conducted afterwards to gather qualitative data on participants’ experiences of the intervention. Statistically significant decreases in depression scores were found following attendance at the course: t(94) = 3.14, p = .002, with a large effect size (0.5) (n = 95). There was an increase in the number of male attendees compared with clinic-based courses. Thematic analysis of the focus group data revealed a number of key themes, including increased accessibility in terms of the scale and context of the delivery of the course. Delivering large-scale psychoeducational courses like Stress Control in partnership with the GAA represents a promising avenue for increasing access (for males in particular) to an effective intervention for improving mental health outcomes.
Key learning aims
(1) To gain an understanding of the impact of delivering a large-scale psychological intervention in partnership with a community sports organisation on accessibility and stigma reduction for participants.
(2) To become aware of the potential benefits of considering non-clinic-based locations in running public mental health interventions.
(3) To understand the key role of the normalisation of the experience of common mental health problems and the impact on intervention uptake.
Aging is associated with numerous stressors that negatively impact older adults’ well-being. Resilience improves the ability to cope with stressors and can be enhanced in older adults. Senior housing communities are promising settings for delivering positive psychiatry interventions, given their rising resident populations and the potential impact of delivering interventions directly in the community. However, few intervention studies have been conducted in these communities. We present a pragmatic stepped-wedge trial of a novel psychological group intervention intended to improve resilience among older adults in senior housing communities.
Design:
A pragmatic modified stepped-wedge trial design.
Setting:
Five senior housing communities in three states in the US.
Participants:
Eighty-nine adults over age 60 years residing in the independent living sector of senior housing communities.
Intervention:
Raise Your Resilience, a manualized 1-month group intervention that incorporated savoring, gratitude, and engagement in value-based activities, administered by unlicensed residential staff trained by researchers. There was a 1-month control period and a 3-month post-intervention follow-up.
Measurements:
Validated self-report measures of resilience, perceived stress, well-being, and wisdom collected at months 0 (baseline), 1 (pre-intervention), 2 (post-intervention), and 5 (follow-up).
Results:
Treatment adherence and satisfaction were high. Compared to the control period, perceived stress and wisdom improved from pre-intervention to post-intervention, while resilience improved from pre-intervention to follow-up. Effect sizes were small in this sample, which had relatively high baseline resilience. Physical and mental well-being did not improve significantly, and no significant moderators of change in resilience were identified.
Conclusion:
This study demonstrates feasibility of conducting pragmatic intervention trials in senior housing communities. The intervention resulted in significant improvement in several measures despite ceiling effects. The study included several features that suggest high potential for its implementation and dissemination across similar communities nationally. Future studies are warranted, particularly in samples with lower baseline resilience or in assisted living facilities.
Wildlife is an essential component of all ecosystems. Yet most places in the world lack local, timely information on which species are present or how their populations are changing. With the arrival of new technologies, camera traps have become a popular way to collect wildlife data. However, data collection has increased at a much faster rate than the development of tools to manage, process and analyse these data. Without such tools, wildlife managers and other stakeholders have little information with which to effectively manage, understand and monitor wildlife populations. We identify four barriers that are hindering the widespread use of camera trap data for conservation. We propose specific solutions to remove these barriers, integrated into a modern technology platform called Wildlife Insights. We present an architecture for this platform and describe its main components. We discuss the potential risks of publishing shared biodiversity data and present a framework to mitigate those risks. Finally, we discuss a strategy to ensure that platforms like Wildlife Insights are sustainable and have an enduring impact on the conservation of wildlife.
Important Bird and Biodiversity Areas (IBAs) are sites identified as being globally important for the conservation of bird populations on the basis of an internationally agreed set of criteria. We present the first review of the development and spread of the IBA concept since it was launched by BirdLife International (then ICBP) in 1979 and examine some of the characteristics of the resulting inventory. Over 13,000 global and regional IBAs have so far been identified and documented in terrestrial, freshwater and marine ecosystems in almost all of the world’s countries and territories, making this the largest global network of sites of significance for biodiversity. IBAs have been identified using standardised, data-driven criteria that have been developed and applied at global and regional levels. These criteria capture multiple dimensions of a site’s significance for avian biodiversity and relate to populations of globally threatened species (68.6% of the 10,746 IBAs that meet global criteria), restricted-range species (25.4%), biome-restricted species (27.5%) and congregatory species (50.3%); many global IBAs (52.7%) trigger two or more of these criteria. IBAs range in size from <1 km² to over 300,000 km² and have an approximately log-normal size distribution (median = 125.0 km², mean = 1,202.6 km²). They cover approximately 6.7% of the terrestrial, 1.6% of the marine and 3.1% of the total surface area of the Earth. The launch in 2016 of the KBA Global Standard, which aims to identify, document and conserve sites that contribute to the global persistence of wider biodiversity, and whose criteria for site identification build on those developed for IBAs, is a logical evolution of the IBA concept. The role of IBAs in conservation planning, policy and practice is reviewed elsewhere. Future technical priorities for the IBA initiative include completion of the global inventory, particularly in the marine environment, keeping the dataset up to date, and improving the systematic monitoring of these sites.
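As an aside on the reported size distribution: if IBA sizes were exactly log-normal (the abstract says approximately), the quoted median and mean alone would determine the distribution's parameters. A minimal sketch of that implication:

```python
import math

# Reported summary statistics for global IBA sizes
median_km2, mean_km2 = 125.0, 1202.6

# For an exact log-normal: median = exp(mu), mean = exp(mu + sigma**2 / 2)
mu = math.log(median_km2)
sigma = math.sqrt(2 * (math.log(mean_km2) - mu))
print(f"mu = {mu:.3f}, sigma = {sigma:.3f}")  # mu ~ 4.828, sigma ~ 2.128
```

A sigma above 2 on the log scale reflects the enormous spread the abstract describes, from under 1 km² to over 300,000 km².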
Rapid identification of esophageal intubations is critical to avoid patient morbidity and mortality. Continuous waveform capnography remains the gold standard for endotracheal tube (ETT) confirmation, but it has limitations. Point-of-care ultrasound (POCUS) may be a useful alternative for confirming ETT placement. The objective of this study was to determine the accuracy of paramedic-performed POCUS identification of esophageal intubations with and without ETT manipulation.
Methods
A prospective, observational study using a cadaver model was conducted. Local paramedics were recruited as subjects, and each completed a survey of their demographics, employment history, intubation experience, and prior POCUS training. Subjects participated in a didactic session in which they learned POCUS identification of ETT location. During each study session, investigators randomly placed an ETT in either the trachea or esophagus of four cadavers, confirmed with direct laryngoscopy. Subjects then attempted to determine ETT position using POCUS, both without and with manipulation of the ETT; manipulation consisted of twisting the tube. Descriptive statistics and logistic regression were used to assess the results and the effects of previous paramedic experience.
Results
During 12 study sessions, from March 2014 through December 2015, 57 subjects participated, evaluating a total of 228 intubations: 113 tracheal and 115 esophageal. Subjects were 84.0% male, with a mean age of 39 years (range: 22–62 years) and median experience of seven years (range: 0.6–39 years). Paramedics correctly identified ETT location in 158 (69.3%) cases without and 194 (85.1%) with ETT manipulation. The sensitivity of identifying esophageal location increased from 52.2% (95% confidence interval [CI], 43.0–61.0) without ETT manipulation to 87.0% (95% CI, 81.0–93.0) with manipulation (P < .0001), while specificity was not significantly affected (86.7% [95% CI, 81.0–93.0] without vs 83.2% [95% CI, 76.0–90.0] with manipulation; P = .45). Subjects correctly identified 41 previously incorrectly identified esophageal intubations. Paramedic experience, previous intubations, and POCUS experience did not correlate with ability to identify tube location.
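For clarity on how these percentages arise, the sketch below recomputes sensitivity and specificity from confusion-matrix counts back-calculated from the reported totals (115 esophageal and 113 tracheal intubations); the individual cell counts are an inference consistent with the reported percentages, shown for illustration only.

```python
def sens_spec(true_pos, false_neg, true_neg, false_pos):
    """Sensitivity = TP / (TP + FN); specificity = TN / (TN + FP)."""
    sensitivity = true_pos / (true_pos + false_neg)
    specificity = true_neg / (true_neg + false_pos)
    return sensitivity, specificity

# Without ETT manipulation: 60/115 esophageal and 98/113 tracheal correct
print(sens_spec(60, 55, 98, 15))   # ~(0.522, 0.867)

# With ETT manipulation: 100/115 esophageal and 94/113 tracheal correct
print(sens_spec(100, 15, 94, 19))  # ~(0.870, 0.832)
```

Note that the two "correct" totals (60 + 98 = 158 and 100 + 94 = 194) match the overall identification counts reported above.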
Conclusion:
Paramedics can accurately identify esophageal intubations with POCUS, and manipulation improves identification. Further studies of paramedic use of dynamic POCUS to identify inadvertent esophageal intubations are needed.
Lema PC, O’Brien M, Wilson J, St. James E, Lindstrom H, DeAngelis J, Caldwell J, May P, Clemency B. Avoid the Goose! Paramedic Identification of Esophageal Intubation by Ultrasound. Prehosp Disaster Med. 2018;33(4):406–410.
An original cohort study found that over half of the individuals detained under Section 136 (S136) of the Mental Health Act 1983 were discharged home after assessment, and nearly half were intoxicated.
Aims
To investigate whether the cohort was followed up by psychiatric services, characterise those repeatedly detained and assess whether substance use was related to these outcomes.
Method
Data were retrospectively collected from the notes of 242 individuals who presented to a place of safety after S136 detention over a 6-month period and were followed up for 1 year.
Results
After 1 year, 48% were in secondary care. Those with psychosis were the most likely to be admitted. Diagnoses of personality disorder or substance use were associated with multiple detentions; however, few were in contact with secondary services.
Conclusions
Crisis and long-term care pathways for these groups need to be developed to reduce repeated and unnecessary police detention.
The primary aim of this study was to determine the characteristics of low-acuity emergency department (ED) use by patients followed by a family health team (FHT) and to develop a predictive model describing these users. The secondary aim was to contrast this information with the characteristics of high-acuity users. We also sought to determine what factors were predictive of leaving without being seen (LWBS).
Methods
This retrospective descriptive correlational study explored characteristics and factors predictive of low-acuity ED utilization. The sample included all FHT patients with ED visits in 2011. The last ED record was chosen for review. Sex, age, Canadian Triage and Acuity Scale (CTAS) level, presenting complaint(s), time of day, day of week, number of visits, and diagnosis were recorded.
Results
Of 1580 patients who visited the ED in 2011, 56% were CTAS 1–3 visits, 24% CTAS 4–5, and 20% had no CTAS recorded. Patients older than age 65 were approximately half as likely to have a CTAS level of 4–5 compared to younger patients (OR = 0.605; 95% CI, 0.441–0.829). Patients older than age 65 were 1.75 times more likely to be CTAS level 1–2 (OR = 1.745; 95% CI, 1.277–2.383). Patients who went to the ED during the day were less likely to LWBS compared to night visits (OR = 0.697; 95% CI, 0.532–0.912).
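Odds ratios of this form come from logistic regression, the method named above. The following sketch uses simulated toy data with illustrative column names (not the study's variables) to show how exponentiating fitted coefficients yields ORs and CIs of the same form as those reported:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated toy data; column names are illustrative, not the study's
rng = np.random.default_rng(0)
visits = pd.DataFrame({"over_65": rng.integers(0, 2, 500)})

# Give patients over 65 lower odds of a low-acuity (CTAS 4-5) visit
logit_p = -0.2 - 0.5 * visits["over_65"]
visits["ctas_4_5"] = (rng.random(500) < 1 / (1 + np.exp(-logit_p))).astype(int)

model = smf.logit("ctas_4_5 ~ over_65", data=visits).fit(disp=0)
print(np.exp(model.params))      # exponentiated coefficients = odds ratios
print(np.exp(model.conf_int()))  # 95% CIs, analogous to 0.441-0.829 above
```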
Interpretation
Most low-acuity ED utilization is by patients under the age of 65, while high-acuity ED utilization is more common among patients older than 65. Patients are more likely to LWBS during late evening and overnight periods (9 pm–7 am).
Depression is a common and costly comorbidity in dementia. There are very few data on the cost-effectiveness of antidepressants for depression in dementia and their effects on carer outcomes.
Aims
To evaluate the cost-effectiveness of sertraline and mirtazapine compared with placebo for depression in dementia.
Method
A pragmatic, multicentre, randomised placebo-controlled trial with a parallel cost-effectiveness analysis (trial registration: ISRCTN88882979 and EudraCT 2006-000105-38). The primary cost-effectiveness analysis compared differences in treatment costs for patients receiving sertraline, mirtazapine or placebo with differences in effectiveness measured by the primary outcome, total Cornell Scale for Depression in Dementia (CSDD) score, over two time periods: 0–13 weeks and 0–39 weeks. The secondary evaluation was a cost-utility analysis using quality-adjusted life years (QALYs) computed from the EuroQol (EQ-5D) and societal weights over the same periods.
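The comparison described here (differences in cost set against differences in effectiveness) is, in its simplest form, an incremental cost-effectiveness ratio (ICER). A minimal sketch with purely illustrative numbers, not trial data:

```python
def icer(cost_new, cost_comparator, effect_new, effect_comparator):
    """Incremental cost-effectiveness ratio: extra cost per extra unit
    of effect (here, a CSDD point or a QALY)."""
    return (cost_new - cost_comparator) / (effect_new - effect_comparator)

# Illustrative numbers only (not trial data): an extra 500 pounds of
# treatment cost buying an extra 0.02 QALYs
print(icer(2500, 2000, 0.52, 0.50))  # 25000.0 pounds per QALY gained
```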
Results
There were 339 participants randomised and 326 with costs data (111 placebo, 107 sertraline, 108 mirtazapine). For the primary outcome, decrease in depression, mirtazapine and sertraline were not cost-effective compared with placebo. However, examining secondary outcomes, the time spent by unpaid carers caring for participants in the mirtazapine group was almost half that for patients receiving placebo (6.74 v. 12.27 hours per week) or sertraline (6.74 v. 12.32 hours per week). Informal care costs over 39 weeks were £1510 and £1522 less for the mirtazapine group compared with placebo and sertraline respectively.
Conclusions
In terms of reducing depression, mirtazapine and sertraline were not cost-effective for treating depression in dementia. However, mirtazapine appears likely to have been cost-effective when costs include the impact on unpaid carers and quality of life is included in the outcome. Unpaid (family) carer costs were lower with mirtazapine than with sertraline or placebo. This may have been mediated via the putative ability of mirtazapine to ameliorate sleep disturbances and anxiety. Given the priority and potential value of supporting family carers of people with dementia, further research is warranted to investigate the potential of mirtazapine to help with behavioural and psychological symptoms in dementia and to support carers.
Efficient milking systems, in terms of labour demand, capital investment and cow udder health, are critical to successful dairy herd expansion. The objective of this study was to establish the effect of two primary influencing factors on efficient milking performance, i.e. parlour size (number of milking units) and pre-milking routine (full and nil) of spring-calved cows, in a single-operator side-by-side, swing-over milking parlour. Efficiency parameters investigated in a 5×2 factorial design included milk flow and yield, row time, over-milking duration and operator idle time. Five combinations of parlour size (14, 18, 22, 26 and 30 milking units), each with two different pre-milking routines (full: spray, strip, wipe, attach clusters; nil: attach clusters), were examined with one milking operator. The trial was carried out over 40 milking sessions, and cows (up to 120) were randomly assigned to groups (n = 14, 18, 22, 26 or 30) before each milking session. Row within a milking session was the experimental unit. The experiment was carried out at both peak and late lactation. The data were analysed with a mixed model using GenStat 13.2. The full pre-milking routine reduced time to milk let-down and milking time and increased average flow rate, but did not affect milk yield. As milking unit number increased, the duration of over-milking (defined as time at milk flow rate <0·2 kg/min) increased more with a full than with a nil routine. Thus, the use of pre-milking preparation decreased milking time per cow, but as parlour size increased, milking row times, as well as the proportion of cows that were over-milked, also increased, thereby reducing overall efficiency. These results have implications for milking management in single-operator swing-over, tandem and rotary parlours with seasonally calved herds.
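For readers wishing to replicate the modelling approach, a rough Python analogue of the mixed model described above is sketched below (the original analysis used GenStat 13.2); the file name and column names are hypothetical placeholders.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical long-format data: one parlour row per record, with parlour
# size and pre-milking routine as fixed effects and milking session as the
# random grouping factor (file name and columns are illustrative).
df = pd.read_csv("milking_rows.csv")

model = smf.mixedlm(
    "row_time ~ C(units) * C(routine)",  # 5x2 factorial fixed effects
    data=df,
    groups=df["session"],                # session as random intercept
).fit()
print(model.summary())
```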
Poor weight gain is common in infants after Stage I Norwood operation and can negatively impact outcomes.
Objectives
The purpose of this study was to examine the impact of feeding strategy on interstage weight gain.
Methods
In a multi-centre study, 158 infants discharged following the Norwood operation were enrolled prospectively. Weight and feeding data were obtained at 2-week intervals. Differences between feeding regimens in average daily weight gain and change in weight-for-age z-score between Stage I discharge and Stage II surgery were examined.
Results
Discharge feeding regimens were oral only in 52%, oral with tube supplementation in 33%, and nasogastric/gastrostomy tube only in 15%. There were significant differences in average daily interstage weight gain among the feeding groups – oral only 25.0 grams per day, oral/tube 21.4 grams per day, and tube only 22.3 grams per day – p = 0.019. Tube-only-fed infants were significantly older at Stage II (p = 0.004) and had a significantly greater change in weight-for-age z-score (p = 0.007). The overall rate of weight gain was 16–32 grams per day, similar to infant norms. The rate of weight gain declined over time, with an earlier decline observed for oral- and oral/tube-fed infants (less than 15 grams per day at 5.4 months) than for tube-only-fed infants (less than 15 grams per day at 8.6 months).
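Since change in weight-for-age z-score is a key endpoint here, a simplified sketch of what such a z-score measures may help; the reference values below are hypothetical, and real growth standards use LMS parameters rather than a plain median/SD.

```python
def weight_for_age_z(weight_kg, ref_median_kg, ref_sd_kg):
    """Simplified weight-for-age z-score: the infant's distance from the
    age-specific reference median, in reference standard deviations.
    (Growth standards actually use LMS parameters; this is a sketch.)"""
    return (weight_kg - ref_median_kg) / ref_sd_kg

# Illustrative reference values only (not from a growth standard)
print(weight_for_age_z(6.2, 7.5, 0.9))  # ~ -1.44
```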
Conclusion
Following Stage I Norwood, infants discharged on oral feeding had better average daily weight gain than infants with tube-assisted feeding. The overall weight gain was within the normal limits in all feeding groups, but the rate of weight gain decreased over time, with an earlier decline in infants fed orally.