This chapter offers an overview of the various spaces that have led to the racialisation of rap music in France, using tools from cultural sociology. It draws on several extensive case studies conducted over the past ten years on French rap music, its production, its consumption, and its media treatment. With the help of the “production of culture” perspective, the chapter describes how the music industry seized the opportunity to exploit a commercial niche that would later become a racialised professional segment central to its business. Focusing on the consumption of music, we then contest the representation of rap audiences as exclusively or initially male, non-White, and working-class, and demonstrate that these audiences have been socially diverse from the outset. These empirical findings do not contradict rap's capacity to serve as a formative medium for racial self-understandings in contemporary France. Finally, the sociology of cultural legitimacy offers a framework for examining the political, legal, and media racialisation processes that have fuelled moral panics around rap and rappers, including lawsuits and attempts to censor their work.
In this study, we conducted an electrical analysis of the effects of cold plasma on the properties of distilled water, using a corona discharge in a tip–plane configuration. The discharge was initiated by applying a voltage of 7.17 kV across a 2 mm gap between the tip and the water surface. We investigated the impact of plasma treatment on the total dissolved solids (TDS) and conductivity of 20 mL of distilled water, with exposure times ranging from 2 to 12 min. The results show that plasma treatment leads to a significant increase in conductivity and TDS, both rising in proportion to exposure time. In addition to these measurements, we performed a detailed electrical analysis to evaluate the energy efficiency of the plasma treatment. This involved calculating the useful power and energy efficiency using an equivalent electrical model of the corona discharge reactor, as direct measurement of these parameters is challenging in this context. The model allowed us to calculate energy consumption and analyse the electrical behaviour of the system throughout the treatment process. The study thus also enables monitoring, control, and optimisation of energy use during plasma treatment, providing insight into the energy dynamics involved. The findings have potential applications in improving energy efficiency in industrial and environmental processes.
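As a rough illustration of the energy bookkeeping that such an equivalent-circuit analysis enables, the sketch below treats the corona reactor as a discharge gap in series with a ballast resistance. Only the applied voltage is taken from the abstract; the current and resistance values are hypothetical placeholders, not the paper's fitted model parameters.

```python
# Hedged sketch of equivalent-circuit power accounting for a corona reactor.
# Assumption: the reactor behaves as a discharge gap in series with a ballast
# resistance; the paper's actual equivalent model may differ.

V_applied = 7.17e3      # applied voltage (V), from the abstract
I_discharge = 0.2e-3    # assumed mean discharge current (A), hypothetical
R_series = 1.0e6        # assumed equivalent series (ballast) resistance (ohm)

P_input = V_applied * I_discharge    # total electrical input power (W)
P_loss = R_series * I_discharge**2   # power dissipated in the series resistance (W)
P_useful = P_input - P_loss          # power delivered to the discharge gap (W)
efficiency = P_useful / P_input      # dimensionless energy efficiency

t_treatment = 12 * 60                # longest exposure time (s), from the abstract
E_consumed = P_input * t_treatment   # total energy consumed over the treatment (J)
```

With these placeholder values the useful power and efficiency follow directly from the two subtractions above; in practice the current waveform would be measured and the equivalent-circuit parameters fitted to it.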
The production of knowledge in public health involves a systematic approach that combines imagination, science, and social justice, grounded in context, rigorous data collection, analysis, and interpretation to improve health outcomes and save lives. Research priorities are established on the basis of a comprehensive understanding of health trends and risk factors in populations. Rigorous study design and analysis are critical to establishing causal relationships, ensuring that robust evidence-based interventions guide beneficial health policies and practices. Communication through peer-reviewed publications, community outreach, and stakeholder engagement ensures that insights are co-owned by potential beneficiaries. Continuous monitoring and feedback loops are vital to adapt strategies based on emerging outcomes. This dynamic process advances public health knowledge and enables effective interventions. The process of addressing the complex challenge of preventing HIV infection in young women in sub-Saharan Africa, a demographic with the least social power but the highest HIV risk, highlights the importance of inclusion in knowledge generation, enabling social change through impactful science.
We conducted interviews with state epidemiologists involved in the state-level COVID-19 response to understand the challenges and opportunities that state epidemiologists and state health departments faced during COVID-19 and consider the implications for future pandemic responses.
Methods
As part of a broader study on policymaking during COVID-19, we analyzed 12 qualitative interviews with state epidemiologists from 11 US states regarding the challenges and opportunities they experienced during the COVID-19 response.
Results
Interviewees described the unprecedented demands COVID-19 placed on them, including increased workloads as well as political and public scrutiny. Decades of underfunding and resource constraints made these demands particularly hard to meet and compromised state responses. Emergency funding helped ameliorate some challenges. However, state health departments were unable to absorb the funds quickly, which created added pressure for employees. The emergency funding also did not resolve longstanding resource deficits.
Conclusions
State health departments were not equipped to meet the demands of a comprehensive COVID-19 response, and increased funding failed to address shortfalls. Effective future pandemic responses will require sustained investment and adequate support to manage ongoing and surge capacity needs. Increased public interest and skepticism complicated the COVID-19 response, and additional measures are needed to address these factors.
Self-injurious behaviors (SIB) are common in autistic people. SIB is mainly studied as a broad category, rather than by specific SIB types. We aimed to determine associations of distinct SIB types with common psychiatric, emotional, medical, and socio-demographic factors.
Methods
Participants included 323 autistic youth (~50% non−/minimally-speaking) with high-confidence autism diagnoses ages 4–21 years. Data were collected by the Autism Inpatient Collection during admission to a specialized psychiatric inpatient unit (www.sfari.org/resource/autism-inpatient-collection/). Caregivers completed questionnaires about their child, including SIB type and severity. The youth completed assessments with clinicians. Elastic net regressions identified associations between SIB types and factors.
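A hedged sketch of the elastic net step described above, run on synthetic data: the variable names, outcome coding, and use of scikit-learn are assumptions for illustration, not the study's actual pipeline. Elastic net combines L1 and L2 penalties, so predictors with no real association tend to be shrunk exactly to zero.

```python
# Illustrative elastic net regression on synthetic data standing in for the
# psychiatric, emotional, medical, and socio-demographic factors described above.
import numpy as np
from sklearn.linear_model import ElasticNetCV

rng = np.random.default_rng(0)
n, p = 300, 20
X = rng.normal(size=(n, p))                 # stand-in predictor matrix
beta = np.zeros(p)
beta[[0, 3, 7]] = [1.5, -1.0, 0.8]          # only a few true associations
y = X @ beta + rng.normal(scale=0.5, size=n)  # stand-in continuous SIB severity score

# ElasticNetCV chooses the L1/L2 mixing ratio and penalty strength by
# cross-validation; coefficients shrunk to zero indicate "no association".
model = ElasticNetCV(l1_ratio=[0.2, 0.5, 0.8], cv=5).fit(X, y)
selected = np.flatnonzero(model.coef_)      # indices of retained predictors
```

The retained indices then form the "unique set of associations" for that outcome; repeating the fit per SIB type yields type-specific association sets.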
Results
No single factor relates to all SIB types. SIB types have unique sets of associations. Consistent with previous work, more repetitive motor movements and lower adaptive skills are associated with most types of SIB; female sex is associated with hair/skin pulling and self-rubbing/scratching. More attention-deficit/hyperactivity disorder symptoms are associated with self-rubbing/scratching, skin picking, hair/skin pulling, and inserts finger/object. Inserts finger/object has the most medical condition associations. Self-hitting against surface/object has the most emotion dysregulation associations.
Conclusions
Specific SIB types have unique sets of associations. Future work can develop clinical likelihood scores for specific SIB types in inpatient settings, which can be tested with large community samples. Current approaches to SIB focus on the functions of the behavior, but there is an opportunity to further develop interventions by considering the specific SIB type in assessment and treatment. Identifying factors associated with specific SIB types may aid with screening, prevention, and treatment of these often-impairing behaviors.
Adolescence is the peak life stage for the development of mental illness. Whole-school approaches to mental health and well-being, modelled on the World Health Organization’s Health-Promoting Schools Framework, hold vast potential in this developmentally sensitive period. However, the evidence base for these interventions is inconclusive.
Aims
Our study examines the effectiveness of The Resilience Project School Partnership Program, a whole-school intervention involving students, teachers and parents, centred around concepts of gratitude, empathy, emotional literacy and mindfulness.
Methods
A quasi-experimental study with an intervention and a control arm was used to evaluate the programme in 40 149 students across 102 schools in 2023. Data collected included sociodemographic information and outcomes derived from validated scales, comprising life satisfaction, hope, coping skills, anxiety and depression. Intervention schools were stratified by the number of years they had implemented the programme, and mixed-effects regression models were used to evaluate the programme.
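The mixed-effects setup described above can be sketched as below, with a random intercept for school to capture clustering of students within schools. The column names, simulated effect sizes, and use of statsmodels are assumptions for illustration, not the study's actual data or software.

```python
# Hedged sketch of a mixed-effects regression with school-level clustering,
# on synthetic data standing in for the student outcomes described above.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n_schools, n_per = 30, 50
school = np.repeat(np.arange(n_schools), n_per)
years = rng.integers(0, 7, size=n_schools)[school]           # years of programme implementation
school_effect = rng.normal(scale=0.5, size=n_schools)[school]  # shared school-level deviation
life_sat = 5 + 0.1 * years + school_effect + rng.normal(size=school.size)

df = pd.DataFrame({"school": school, "years": years, "life_sat": life_sat})
# Random intercept per school; fixed effect of implementation duration.
fit = smf.mixedlm("life_sat ~ years", df, groups=df["school"]).fit()
```

In the study itself, implementation duration was stratified into groups and confounders were adjusted for; the sketch keeps only the core random-intercept structure.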
Results
After adjusting for confounders, participants at schools who had been implementing the programme for 6 years or longer demonstrated significantly better outcomes across all five domains (life satisfaction: B = 0.627, 95% CI 0.465–0.789; hope: B = 2.135, 95% CI 0.895–3.347; coping skills: B = 0.438, 95% CI 0.250–0.625; anxiety: odds ratio = 0.658, 95% CI 0.559–0.774; depression: odds ratio = 0.534, 95% CI 0.459–0.620). Only depression was significantly lower among participants at schools in their fourth or fifth year of implementing the programme (odds ratio = 0.941, 95% CI 0.935–0.948).
Conclusions
Our findings indicate that whole-school interventions may require long-term investment to realise their potential and highlight implementation duration as an important consideration for future evaluations of whole-school interventions.
At coastal archaeological sites, measuring erosion rates and assessing artifact loss are vital to understanding the timescale(s) and spatial magnitude of past and future site loss. We describe a straightforward low-tech methodology for documenting shoreline erosion developed by professionals and volunteers over seven years at Calusa Island Midden (8LL45), one of the few remaining sites with an Archaic component in the Pine Island Sound region of coastal Southwest Florida. We outline the evolution of the methodology since its launch in 2016 and describe issues encountered and solutions implemented. We also describe the use of the data to guide archaeological research and document the impacts of major storms at the site. The response to Hurricane Ian in 2022 is one example of how simply collected data can inform site management. This methodology can be implemented easily at other coastal sites at low cost and in collaboration with communities, volunteers, and heritage site managers.
This article critically examines the inequities in access to COVID-19 vaccines and the lessons for global health law. Despite the rapid development and approval of COVID-19 vaccines, the rollout exposed severe systemic failures rooted in preexisting economic distortions and market inefficiencies. The article argues that addressing vaccine inequity requires more than improved distribution and solidarity: it demands an effective reinvention of the global vaccine supply chain through evidence-based and meaningful market-shaping measures. It calls for a transformative approach to global health governance, emphasising the need for a comprehensive, human rights-compliant policy framework to correct structural problems in international markets, moving beyond superficial exhortations to equity.
Human rights offer to ground global health law in equity and justice. Human rights norms, advocacy, and strategies have proven successes in challenging private and public inequities and in realizing more equitable domestic and global health governance. However, mobilizing human rights within global health law faces enormous political, economic, technological, and epidemiological challenges, including from the corrosive health impacts of power, politics, and commerce. This article focuses on what human rights could bring to three major global health law challenges — health systems strengthening and universal health coverage, the commercial and economic determinants of health, and pandemic disease threats. We argue that human rights offer potentially powerful norms and strategies for achieving equity and justice in these and other key global health domains. The challenge for those working in human rights and global health law is to work nimbly, creatively, and courageously to strengthen the contribution of these instruments to health justice.
The current study is an attempt to explore under-five child malnutrition in a low-income population setting using the Extended Composite Index of Anthropometric Failure (ECIAF).
Design:
Data from the Bangladesh Demographic and Health Survey 2017–2018 were analysed. Malnutrition according to the ECIAF was estimated using stunting, wasting, underweight, and overweight. Multilevel logistic regression models identified factors associated with malnutrition. Geospatial analysis was conducted in R.
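The ECIAF classification described above can be sketched as a simple rule over WHO-style anthropometric z-scores: a child is counted as experiencing anthropometric failure if any of stunting, wasting, underweight, or (the "extended" component) overweight applies. The conventional ±2 z-score cut-offs are assumed here; the survey's exact operationalisation may differ.

```python
# Minimal sketch of an ECIAF-style classification, assuming conventional
# WHO z-score cut-offs (an assumption, not the paper's exact definition).
def eciaf_failure(haz, whz, waz):
    """Return True if the child shows any anthropometric failure.
    haz: height-for-age z-score, whz: weight-for-height z-score,
    waz: weight-for-age z-score."""
    stunted = haz < -2
    wasted = whz < -2
    underweight = waz < -2
    overweight = whz > 2   # the component the 'extended' index adds
    return stunted or wasted or underweight or overweight

# A stunted but otherwise normal child counts as malnourished:
eciaf_failure(haz=-2.5, whz=0.0, waz=-1.0)   # True
```

Aggregating this indicator over the sample yields the composite prevalence reported in the Results; the separate overweight prevalence uses only the last condition.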
Setting:
Bangladesh.
Participants:
Children under 5 years of age.
Results:
In Bangladesh, as indicated by the ECIAF, approximately 40·8 % (95 % CI: 39·7, 41·9) of children under five experience malnutrition, whereas about 3·3 % (95 % CI: 2·9, 3·7) were overweight. Children of parents with no formal education (56·3 %, 95 % CI: 50·8, 61·8), of underweight mothers (53·4 %, 95 % CI: 50·4, 56·3), belonging to the lowest socio-economic strata (50·6 %, 95 % CI: 48·3, 53·0), residing in rural areas (43·3 %, 95 % CI: 41·9, 44·6) and aged below 3 years (47·7 %, 95 % CI: 45·2, 50·2) demonstrated a greater age- and sex-adjusted prevalence of malnutrition. The Sylhet division (Eastern region) exhibited a higher prevalence of malnutrition (> 55·0 %). Mothers with no formal education (adjusted OR (AOR): 1·51, 95 % CI: 1·08, 2·10), underweight mothers (AOR: 1·54, 95 % CI: 1·03, 1·83), poorest socio-economic status (AOR: 2·14, 95 % CI: 1·64, 2·81), children aged 24–35 months (AOR: 2·37, 95 % CI: 1·97, 2·85) and fourth and higher birth order (AOR: 1·41, 95 % CI: 1·16, 1·72) were identified as key factors associated with childhood malnutrition while adjusting for community- and household-level variations.
Conclusions:
In Bangladesh, two out of five children were malnourished, and one in thirty-five children was overweight. Continuous monitoring of the ECIAF over time would facilitate tracking changes in the prevalence of different forms of malnutrition, helping to plan interventions and assess the effectiveness of interventions aimed at addressing both undernutrition and overweight.
This article examines how, why, and with what limitations judges have adopted a gendered perspective (perspectiva de género) in Chile. It addresses why the Supreme Court’s Secretariat of Gender and Nondiscrimination advocates for a particular understanding of the concept, how judges understand and apply it, and the barriers they perceive to its implementation. Drawing on interviews, ethnographic fieldwork, and analysis of court rulings, the study identifies four ways in which judges understand a “gendered perspective”: as a method to detect stereotypes, a tool to analyze context, an instrument to reach a fair result, and a rejection of the notion of loosening evidentiary standards. The article argues that in contemporary Chile, different legal cultures shape disparate understandings about a gendered perspective. There is significant contestation between understandings endorsed by the dominant textualist legal culture and those favored by the emerging interpretive legal culture. By illuminating the limitations Chilean judges face in this evolving area of the law, the study contributes insights of relevance for our understanding of the factors that affect gender and judging in Latin America and beyond.
Describing the evolution of a wind turbine's wake from a top-hat profile near the turbine to a Gaussian profile in the far wake is a central feature of many engineering wake models. Existing approaches, such as super-Gaussian wake models, rely on a set of tuning parameters that are typically obtained by fitting high-fidelity data. In the current study, we present a new engineering wake model that leverages the similarity between the shape of a turbine's wake normal to the streamwise direction and the diffusion of a passive scalar from a disk source. This new wake model provides an analytical expression for a streamwise scaling function that ensures the conservation of linear momentum in the wake region downstream of a turbine. The model also accounts for the different rates of wake expansion known to occur in the near- and far-wake regions. Validation is presented against high-fidelity numerical data and experimental measurements from the literature, confirming consistently good agreement across a wide range of turbine operating conditions. A comparison is also drawn with several existing engineering wake models, indicating that the diffusion-based model consistently provides more accurate wake predictions. This new unified framework allows for extensions to more complex wake profiles through adjustments to the diffusion equation. The derivation of the proposed model included the evaluation of analytical solutions to several mathematical integrals that may be useful in other physical applications.
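For context, the classical Gaussian wake model that models of this family are benchmarked against can be sketched as below: a linearly expanding Gaussian profile whose centreline deficit follows from a streamwise momentum balance. The growth rate `k` and initial width `eps` are illustrative tuning values of exactly the kind the diffusion-based model aims to replace with analytical expressions; this sketch is not the paper's new model.

```python
# Hedged sketch of a standard Gaussian engineering wake model
# (illustrative tuning parameters; not the diffusion-based model itself).
import numpy as np

def gaussian_wake_deficit(x, r, d0, ct, k=0.035, eps=0.2):
    """Normalised velocity deficit dU/U_inf at streamwise distance x and
    radial distance r behind a turbine of diameter d0 and thrust coefficient ct.
    Valid only in the far wake, where ct / (8 (sigma/d0)^2) < 1."""
    sigma = k * x + eps * d0                                   # linear wake expansion
    c = 1.0 - np.sqrt(1.0 - ct / (8.0 * (sigma / d0) ** 2))   # centreline deficit from momentum balance
    return c * np.exp(-r**2 / (2.0 * sigma**2))               # Gaussian radial profile
```

The deficit decays with downstream distance and off-axis radius, as expected; the square-root factor is what enforces conservation of linear momentum for the assumed profile shape.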
Increased intestinal leakiness and associated systemic inflammation are potential contributors to osteoarthritis (OA) and postural imbalance in the geriatric population. To date, no successful treatment to correct postural imbalance in OA is known. We aimed to explore the effects of a multistrain probiotic on postural imbalance in OA-affected patients. In this randomised, double-blind, placebo-controlled trial, 147 patients suffering from knee OA (age range 64–75 years) were divided into placebo (n 75) and probiotics (n 72) study groups. Vivomix 112 billion, a multistrain probiotic, was given once a day for 12 weeks. The study variables were measured first at baseline and again after 12 weeks of intervention. These were the Western Ontario and McMaster Universities Osteoarthritis Index (WOMAC), knee flexion range of motion (ROM), pain intensity by visual analogue scale, handgrip strength (HGS), gait speed, and balance control assessed in standing, semi-tandem and tandem stances. We measured plasma zonulin to assess intestinal leak, along with C-reactive protein and 8-isoprostane levels. A total of 136 OA patients taking placebo (n 71) and probiotics (n 65) were analysed. The probiotics group exhibited a reduction in pain intensity, disease severity and WOMAC scores, along with improvement in balance scores, HGS and walking speed (P < 0·05 for all), with no change in ROM, resting pain or 8-isoprostane levels. The correlation analysis revealed a robust association of balance scores with plasma markers of intestinal leakiness and inflammation in the probiotics but not the placebo group. Probiotics reduce postural imbalance in OA patients, partly owing to a reduction in intestinal leakiness.
Endomyocardial biopsy remains the gold standard for cardiac cellular rejection surveillance after heart transplantation. We studied a novel non-invasive index of left ventricular relaxation to detect cardiac cellular rejection in paediatric heart transplant patients.
Methods:
This is a single-centre retrospective study of paediatric heart transplant patients who underwent endomyocardial biopsy from June 2014 to September 2021. Left ventricular relaxation index was calculated as the sum of diastolic tissue Doppler imaging velocities (E) of the left ventricular lateral, septal, and posterior walls divided by the percentage of the left ventricular posterior wall thinning by M-mode. Statistical analysis included t-tests and Mann-Whitney tests to compare means and medians between treatment and non-treatment groups. We used the cut-off with the maximum Youden index to compare the sensitivity and specificity of left ventricular relaxation index to detect rejection.
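The index definition and the cut-off evaluation described above reduce to two one-line computations, sketched below. The velocity and wall-thinning values in the usage example are hypothetical, not patient data; only the 0.73 cut-off and its sensitivity/specificity come from the Results.

```python
# Sketch of the two computations described in the Methods (illustrative values).
def lv_relaxation_index(e_lateral, e_septal, e_posterior, pw_thinning_pct):
    """Sum of diastolic tissue Doppler velocities (E) of the lateral, septal,
    and posterior walls, divided by the percentage of posterior wall thinning."""
    return (e_lateral + e_septal + e_posterior) / pw_thinning_pct

def youden_j(sensitivity, specificity):
    """Youden's J statistic; the cut-off maximising J is selected."""
    return sensitivity + specificity - 1.0

# Hypothetical velocities (cm/s) and posterior wall thinning (%):
lvri = lv_relaxation_index(10.0, 8.0, 9.0, 40.0)   # 0.675
# At the reported cut-off of 0.73 (sensitivity 100%, specificity 23%):
j = youden_j(1.00, 0.23)                           # ≈ 0.23
```

Scanning candidate cut-offs and keeping the one with the largest J is the standard way the "maximum Youden index" threshold in the Results would be obtained.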
Results:
The study included 65 patients who underwent 246 cardiac catheterizations and endomyocardial biopsies. Out of 246, 192 procedures were included and 54 were excluded due to recent transplants or lack of echocardiographic data. A total of 114 demonstrated Grade 0R, 68 Grade 1R, 8 Grade 2R, and 2 Grade 3R allograft rejection. The difference in mean left ventricular relaxation index between treatment versus non-treatment groups (2R, 3R vs. 0R, 1R) was not statistically significant (p = 0.917). A left ventricular relaxation index cut-off of 0.73 had the highest Youden index with good sensitivity (100%) and poor specificity (23%) for detecting rejections with grades 2R and 3R.
Conclusion:
Left ventricular relaxation index, a novel index of left ventricular relaxation, was not a sensitive or specific predictor of cardiac cellular rejection in paediatric heart transplants.
This pilot study assessed the feasibility of measuring time to perform pre-identified lifesaving interventions (LSIs) used during mass-casualty incidents (MCIs).
Methods
An observational simulation study involving pre-hospital providers (PHPs) was conducted at London's Air Ambulance training center. PHPs performed 16 basic-to-advanced LSIs and were video-recorded to capture each LSI's time interval (TT), measured from picking up the equipment to completing the intervention. TTs are reported in seconds as medians with interquartile ranges (IQR). Ethical approval was obtained from Queen Mary University.
Results
Seven PHPs (five paramedics and two physicians) performed 92 LSIs, with paramedics limited to 11 LSIs by their scope of practice. Physician-only LSIs had the longest TTs; rapid-sequence intubation took 175.00 seconds (IQR 162.50–187.50). Among LSIs performed by both groups, the longest TTs related to circulation support, with fluid resuscitation taking 99 seconds (IQR 88–101) for paramedics and 80 seconds (IQR 74.5–85.5) for physicians. LSIs with a median time exceeding 30 seconds were generally characterized by substantial variability, as indicated by wide IQRs.
Conclusion
This pilot study demonstrated the feasibility of recording timings for LSIs. Physician-only performed LSIs had the longest TT but were more complex interventions. Further investigation within a simulated environment is planned.
This paper explores the (de-)routinisation of the employment structure in developing countries through the case of Morocco. We investigate employment (de-)routinisation from an often-overlooked perspective, aiming to elucidate the interplay between the dynamics of occupational employment composition by level of routine-task intensity and two structural aspects: premature deindustrialisation and the prevalence of informal labour.
Our findings, based on tertile analysis and regressions, do not fully support the hypothesis of employment structure de-routinisation. At the same time, we could not identify a clear process of routinisation similar to that observed in developing countries undergoing the first stage of the traditional structural transformation process. Rather, we identified an inverted U-shaped pattern in the dynamics of occupational employment, indicative of a rise in intermediate routine-intensive occupations.
We emphasise two key factors, with opposing effects, that have contributed to this atypical pattern. The first is premature deindustrialisation, which, according to our shift-share decomposition, has adversely affected highly routine-intensive jobs, contrasting with the routinisation trend observed in countries that have experienced a more traditional process of structural transformation. The de-routinising influence of premature deindustrialisation is somewhat mitigated by the increasing prevalence of occupations demanding intermediate routine tasks, particularly within the services and construction sectors. Regarding the second structural aspect, the prevalence of informal labour, our three-way interaction model indicates a lower susceptibility of informal jobs to de-routinisation compared with their formal counterparts within the same industry. Consequently, the prevalence of informal employment has slowed the de-routinisation of the employment structure.
Background: Infection is one of the most common complications of cancer and cancer treatment. Most patients admitted for fever or infection come through the Emergency Department (ED), which is a primary site for blood culture collection. Contamination of blood cultures complicates diagnoses, compromises quality of care, leads to unnecessary antibiotic exposure, and increases financial burdens. It may also lead to unnecessary removal of central venous access devices or delay of critical therapy or procedures. At our institution, the contamination rate of blood cultures drawn in the ED was over twice that of the remainder of the hospital (2.8 versus 0.8 percent), prompting this quality improvement project. Unlike on hospital floors, where phlebotomists draw most blood cultures, in the ED nurses perform most draws because of the urgency of managing suspected sepsis. Our aim was to decrease the ED contamination rate by 20 percent after the first PDSA cycle, and ultimately bring it on par with the remainder of the hospital. Methods: First, we compared ED contamination rates with those of other hospital inpatient floors and outpatient centers over a three-month period. We then evaluated the contamination rates of ED nurses versus ED phlebotomists and of peripheral versus central line blood draws. Process mapping and fishbone analysis helped identify practices contributing to higher contamination rates. Key drivers of these practices were diagrammed, and potential interventions were ranked on a prioritization matrix. Results: We identified the use of alcohol rather than chlorhexidine swabs for peripheral disinfection and inconsistent blood draw techniques among nurses as critical contributors to increased contamination rates in the ED. Our intervention was creating premade blood culture kits promoting the use of chlorhexidine swabs through availability and easy access in the fast-paced ED environment.
Ten cubic centimeter (cc) syringes in the kits encouraged withdrawal of adequate blood samples in compliance with the 7–10 cc guideline. Designated nursing team leaders checked off ED nurses at the bedside, reinforcing education on, and adherence to, the blood culture collection kits. The average number of blood cultures in the emergency department was 1,400. A reduction in blood culture contamination from 2.46 percent to 1.89 percent was seen after two months. Conclusions: A guideline-driven, standardized blood culture collection process followed by ED nurses is vital to reducing blood culture contamination. Chlorhexidine is necessary to maintain the lowest contamination rates. Readily available premade blood culture kits improve compliance with the materials and techniques associated with best practices.
The formalisation of informal security-providers has important consequences for citizenship, the rule of law, and human rights. We examine these policies in Burkina Faso, where formalisation has led to concerns about vigilante justice and ethnic targeting. Although African governments' reliance on informal security provision is well-documented, less is known about the origins of formalisation policies. To advance theory-building in this domain, this paper examines the political logic of empowering self-defence groups through the study of Burkina Faso's 2022 junta government, with comparisons to two prior regimes. We argue that formalisation is not only a mechanism for overcoming vexing security challenges, but is a tool used by leaders to build legitimacy and strengthen the regime's grip on power. In doing so, the article contributes insights into the origins of governmental policies towards self-defence groups, with implications for the study of political legitimacy, security provision and citizen–state relations.
The primary aim was to analyze three months of admissions to Rowan Ward PICU (February to April 2022) against NAPICU's 2014 criteria, followed by implementing recommendations and conducting a re-audit (November 2022 to January 2023) to assess their impact. Secondary objectives included examining the link between prior PICU admissions and higher readmission rates, even when readmission was not clinically necessary.
Methods
Methods involved assessing each admission against NAPICU's criteria and reviewing the reason for admission (RFA) for appropriateness. Data collection utilized various sources, including SystmOne, Mental Health Act assessments, and referral documents. Collaborative analysis with the PICU consultant was conducted due to the subjective nature of RFA interpretation.
Results
Results from the initial audit revealed that 12 out of 36 patients (33%) were deemed unsuitable for PICU admission, with 8 having prior PICU admissions (67%). Only 22% had documented multidisciplinary team (MDT) discussions. In the subsequent audit, 9 out of 38 patients (24%) were deemed unsuitable for PICU admission, with 2 having prior admissions (22%). Only 3% had documented MDT discussions.
Conclusion
There was a reduction in inappropriate admissions from 33% to 24% in the subsequent cycle. This improvement was linked to the implementation of recommendations from the first audit, such as introducing a standardized referral form, enhancing consultant-to-consultant communications, and forming a PICU outreach team. While the initial findings indicated higher readmission rates for patients with prior PICU admissions, this trend lessened in the subsequent evaluation. However, there is still insufficient documentation of Multidisciplinary Team (MDT) discussions, highlighting the need for a re-audit to accurately assess any changes.