Does it make sense to refer to a Chinese “tributary system”? A number of influential Western scholars, including John Wills Jr., James Hevia and Laura Hostetler, have argued that it does not—largely on the grounds that previous China scholars, John K. Fairbank “and his followers” in particular, have overgeneralized its historical significance. The result, however, has been that much of Fairbank's painstaking and valuable research on the structure and functions of the tributary system has been ignored. Although Fairbank may well have overestimated the degree to which Chinese assumptions about tribute shaped Qing policy toward foreigners, it seems unhelpful and misleading to suggest that they were of no consequence whatsoever.
Data from an RCT of IAPT Norway (“Prompt Mental Health Care” [PMHC]) were linked to several administrative registers up to five years following the intervention. The aims were to (1) examine the effects of PMHC compared to treatment-as-usual (TAU) on work-related outcomes and health care use, (2) estimate the cost–benefit of PMHC, and (3) examine whether clinical outcomes at six-month follow-up explained the effects of PMHC on work-related and cost–benefit-related outcomes.
Methods
RCTs with parallel assignment were conducted at two PMHC sites (N = 738) during 2016/2017. Eligible participants were considered for admission due to anxiety and/or depression. We used Bayesian estimation with 90% credibility intervals (CI) and posterior probabilities (PP) of effects in favor of PMHC. Primary outcome years were 2018–2022. The cost–benefit analysis estimated the overall economic gain expressed in terms of a benefit–cost ratio and the differences in overall public sector spending.
Results
The PMHC group was more likely than the TAU group to be in regular work without receiving welfare benefits in 2019–2022 (1.27 ≤ OR ≤ 1.43). Some evidence was found that the PMHC group spent less on health care. The benefit–cost ratio in terms of economic gain relative to intervention costs was estimated at 5.26 (90% CI −1.28 to 11.8). The PP of PMHC being cost-beneficial for the economy as a whole was 85.9%. The estimated difference in public sector spending was small. PMHC effects on work participation and cost–benefit were largely explained by PMHC effects on mental health.
Conclusions
The results support the societal economic benefit of investing in IAPT-like services.
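As a rough illustration of how benefit–cost summaries like those reported above can be derived from a Bayesian analysis, the sketch below computes a benefit–cost ratio, a 90% credible interval, and a posterior probability of the intervention being cost-beneficial from posterior draws. All numbers, variable names, and the threshold used to define "cost-beneficial" (gains exceeding intervention costs) are assumptions for illustration; this is not the PMHC study's actual analysis or data.

# Illustrative only: summarizing a Bayesian cost-benefit analysis from posterior draws.
# The draws and costs below are simulated placeholders, not PMHC study output.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical posterior draws of the incremental economic gain (benefit) per
# participant, and an assumed fixed intervention cost per participant.
benefit_draws = rng.normal(loc=5000.0, scale=3000.0, size=10_000)  # currency units
intervention_cost = 950.0

# Benefit-cost ratio for each posterior draw.
bcr_draws = benefit_draws / intervention_cost

# Point estimate, 90% credible interval, and posterior probability that the
# intervention is cost-beneficial under one common definition (BCR > 1,
# i.e. economic gains exceed intervention costs).
bcr_mean = bcr_draws.mean()
ci_low, ci_high = np.percentile(bcr_draws, [5, 95])
pp_cost_beneficial = (bcr_draws > 1.0).mean()

print(f"BCR = {bcr_mean:.2f} (90% CI {ci_low:.2f} to {ci_high:.2f})")
print(f"P(cost-beneficial) = {pp_cost_beneficial:.1%}")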
Co-occurring autism and attention-deficit/hyperactivity disorder (ADHD) have been associated with poorer social skills. Most studies examining the association of ADHD symptoms and social skills in autism employ categorical and cross-sectional designs, which provide a narrow view of the development of ADHD symptoms. Using group-based trajectory modeling, we identified five trajectories of caregiver-reported attention problems in an inception cohort of autistic children (N = 393) followed from age 2–5 years (T1) to age 10.5–11 years (T8): Low-Stable (LS; 15.5% of participants), Low-Decreasing (LD; 25.2%), Low-Increasing (LI; 19.2%), Moderate-Decreasing (MD; 32.9%), and High-Stable (HS; 7.2%). Child FSIQ and caregiver age at baseline were lower and caregiver depression at baseline was higher for participants in the MD group than the LS group. Psychotropic medication use was associated with higher attention problems. The MD and HS groups had similar mean Vineland Adaptive Behavior Scales, Second Edition (VABS-II) Socialization standard scores at T8, which were lower than other groups. The LI group had lower Socialization scores than the LS group. Results support that a decline in caregiver-reported attention problems is common but not universal in autistic children and that even moderate/subclinical attention problems may relate to social skills outcomes in autism.
Artificial intelligence (AI) requires new ways of evaluating national technology use and strategy for African nations. We conduct a survey of existing “readiness” assessments both for general digital adoption and AI policy in particular. We conclude that existing global readiness assessments do not fully capture African states’ progress in AI readiness and lay the groundwork for how assessments can be better used for the African context. We consider the extent to which these indicators map to the African context and what these indicators miss in capturing African states’ on-the-ground work in meeting AI capability. Through case studies of four African nations of diverse geographic and economic dimensions, we identify nuances missed by global assessments and offer high-level policy considerations for how states can best improve their AI readiness standards and prepare their societies to capture the benefits of AI.
Cognitive therapy for PTSD (CT-PTSD) is an efficacious treatment for children and adolescents with post-traumatic stress disorder (PTSD) following single incident trauma, but there is a lack of evidence relating to this approach for youth with PTSD following exposure to multiple traumatic experiences.
Aims:
To assess the safety, acceptability and feasibility of CT-PTSD for youth following multiple trauma, and obtain a preliminary estimate of its pre–post effect size.
Method:
Nine children and adolescents (aged 8–17 years) with multiple-trauma PTSD were recruited to a case series of CT-PTSD. Participants completed a structured interview and mental health questionnaires at baseline, post-treatment and 6-month follow-up, and measures of treatment credibility, therapeutic alliance, and mechanisms proposed to underpin treatment response. A developmentally adjusted algorithm for diagnosing PTSD was used.
Results:
No safety concerns or adverse effects were recorded. Suicidal ideation decreased following treatment. No participants withdrew from treatment or from the study. CT-PTSD was rated as highly credible. Participants reported strong working alliances with their therapists. Data completion was good at post-treatment (n=8), but modest at 6-month follow-up (n=6). Only two participants met criteria for PTSD (developmentally adjusted algorithm) at post-treatment. A large within-subjects treatment effect was observed post-treatment and at follow-up for PTSD severity (using self-report questionnaire measures; ds>1.65) and general functioning (CGAS; ds<1.23). Participants showed reduced anxiety and depression symptoms at post-treatment and follow-up (RCADS-C; ds>.57).
Conclusions:
These findings suggest that CT-PTSD is a safe, acceptable and feasible treatment for children with multiple-trauma PTSD, which warrants further evaluation.
Objectives: People with dementia live with unmet needs due to dementia and other conditions. The EMBED-Care Framework is a co-designed app-delivered intervention involving holistic assessment, evidence-based decision-support tools and resources to support its use. Its intention is to empower people with dementia, family and practitioners to assess, monitor and manage needs. We aimed to explore the feasibility and acceptability of the EMBED-Care Framework and develop its underpinning programme theory.
Methods: A six-month single-arm mixed-methods feasibility and process evaluation, underpinned by an initial programme theory which was iteratively developed from previous studies. The settings were two community teams and two long-term care facilities (LTCFs). People with dementia and family were recruited to receive the intervention for 12 weeks. Practitioners were recruited to deliver the intervention for six months. Quantitative data included candidate process and outcome measures. Qualitative data comprised interviews, focus groups and observations with people with dementia, family and practitioners. Qualitative and quantitative data were analysed separately and triangulated at the interpretation phase.
Results: Twenty-six people with dementia, 25 family members and 40 practitioners were recruited. Practitioners in both settings recognized the potential benefit of the intervention for improving care and outcomes for people with dementia, and for themselves in supporting care provision. Family in both settings perceived a role in informing assessment and decisions about care. Family was integral to the intervention in community teams but had limited involvement in LTCFs. In both settings, embedding the intervention into routine care processes was essential to support its use. In community teams, this required aligning app functionality with care processes, establishing processes to monitor alerts, and clarifying team responsibilities. In LTCFs, duplication of care processes and limited time to integrate the intervention into routine care affected its acceptability.
Conclusions: A theoretically informed co-designed digital intervention has potential to improve care processes and outcomes for people with dementia and family, and is acceptable to practitioners in community teams. Further work is required to strengthen the intervention in LTCFs to support integration into care processes and support family involvement. The programme theory detailing key mechanisms and likely outcomes of the EMBED-Care Framework is presented.
In contrast with other works on the history of language learning and teaching, this book is innovative in assigning a much more important role to practice and to the reciprocal relationship of policies and practice (rather than investigating top-down processes from policies to practice). The fourteen contributions highlight various contexts of language education in the twentieth century, combining inside-out ('emic') perspectives, drawing on teachers'/learners' experience within the classroom, and outside-in ('etic') perspectives, looking at external factors such as the curriculum or education policies and considering how teachers and learners respond to these. Each chapter starts from one perspective, yet at the same time takes into account the reciprocal effects between the two directions of movement (inside out / outside in). This volume asks how the practice of language learning and teaching has been influenced by policies and context, and vice versa.
Computerized clinical decision support (CDS) assists healthcare professionals in making decisions to improve patient care. In the realms of antimicrobial stewardship (ASP) and infection prevention (IP) programs, CDS interventions can play a crucial role in optimizing antibiotic prescribing practices, reducing healthcare-associated infections, and promoting diagnostic stewardship when optimally designed. This primer article aims to provide ASP and IP professionals with a practical framework for the development, design, and evaluation of CDS interventions.
Setting:
Large academic medical center
Design:
Established frameworks of CDS evaluation, the “Five Rights” of CDS and the “Ten Commandments of Effective Clinical Decision Support”, were applied to two real-world examples of CDS tools, a Vancomycin Best Practice Advisory and a Clostridioides difficile order panel, to demonstrate a structured approach to developing and enhancing the functionality of ASP/IP CDS interventions to promote efficacy and reduce unintended consequences of CDS.
Conclusions:
By outlining a structured approach to the development and evaluation of CDS interventions, with a focus on end-user engagement, efficiency, and feasibility, this primer aims to help ASP and IP professionals leverage CDS to enhance quality improvement initiatives targeting antibiotic utilization, diagnostic stewardship, and adherence to IP protocols.
Archaeological sites in Northwest Africa are rich in human fossils and artefacts providing proxies for behavioural and evolutionary studies. However, these records are difficult to underpin with a precise chronology, which can prevent robust assessment of the drivers of cultural/behavioural transitions. Past investigations have revealed that numerous volcanic ash (tephra) layers are interbedded within the Palaeolithic sequences and likely originate from large volcanic eruptions in the North Atlantic (e.g. the Azores, Canary Islands, Cape Verde). Critically, these ash layers offer a unique opportunity to provide new relative and absolute dating constraints (via tephrochronology) to synchronise key archaeological and palaeoenvironmental records in this region. Here, we provide an overview of the known eruptive histories of the potential source volcanoes capable of widespread ashfall in the region during the last ~300,000 years, and discuss the diagnostic glass compositions essential for robust tephra correlations. To investigate the eruption source parameters and weather patterns required for ash dispersal towards NW Africa, we simulate plausible ashfall distributions using the Ash3D model. This work constitutes the first step in developing a more robust tephrostratigraphic framework for distal ash layers in NW Africa and highlights how tephrochronology may be used to reliably synchronise and date key climatic and cultural transitions during the Palaeolithic.
This study examined the power of theory-derived models to account for the development of PTSD, Complex PTSD (CPTSD), depression, and anxiety in children and adolescents who had experienced a single-event trauma.
Methods
Children (n = 234, aged 8–17 years) recruited from local Emergency Departments were assessed at two and nine weeks post-trauma. Data obtained from self-report questionnaires completed by the child, telephone interviews with parents, and hospital data were used to develop four predictive models of risk factors for PTSD, CPTSD, depression, and Generalized Anxiety Disorder (GAD). ICD-11 proposed diagnostic criteria were used to generate measures for CPTSD and PTSD to assess for risk factors and identify the sample prevalence of these disorders.
Results
At nine weeks post-trauma, 64% did not meet criteria for any disorder, 23.5% met criteria for PTSD, and 5.2% met criteria for CPTSD. Clinically significant symptoms of depression and GAD had developed in 23.9% and 10.7%, respectively. A cognitive model was the most powerful predictive model, a psychosocial model was weak, and subjective markers of event severity were more powerful than objective measures.
Conclusions
Youth exposed to single-incident trauma may develop different forms of psychopathology, and PTSD and CPTSD are frequently experienced alongside other conditions. The cognitive model of PTSD shows utility in identifying predictors of PTSD, CPTSD, depression, and GAD, particularly the role of trauma-related negative appraisals. This supports the application of cognitive interventions which focus upon re-appraising trauma-related beliefs in youth.
Declining labor force participation of older men throughout the 20th century and recent increases in participation have generated substantial interest in understanding the effect of public pensions on retirement. The National Bureau of Economic Research's International Social Security (ISS) Project, a long-term collaboration among researchers in a dozen developed countries, has explored this and related questions. The project employs a harmonized approach to conduct within-country analyses that are combined for meaningful cross-country comparisons. The key lesson is that the choices of policy makers affect the incentive to work at older ages and these incentives have important effects on retirement behavior.
Severe weather events exacerbate existing health disparities due to poorly managed non-communicable diseases (NCDs). Our objective is to understand the experiences of staff, providers, and administrators (employees) of Federally Qualified Health Centers (FQHCs) in Puerto Rico and the US Virgin Islands (USVI) in providing care to patients living with NCDs in the setting of recent climate-related extreme events.
Methods
We used a convergent mixed-methods study design. A quantitative survey was distributed to employees at 2 FQHCs in Puerto Rico and the USVI, assessing experience with disasters, knowledge of disaster preparedness, the relevance of NCDs, and perceived gaps. Qualitative in-depth interviews explored their experience providing care for NCDs during recent disasters. Quantitative and qualitative data were merged using a narrative approach.
Results
Through the integration of quantitative and qualitative data, we identified: (1) significant gaps in employees' confidence and preparedness, with a need for more training; (2) challenges faced by persons with multiple NCDs, especially cardiovascular and mental health disorders; and (3) that most clinicians do not discuss disaster preparedness with patients, although they recognize their important role in community resilience.
Conclusion
With these results, we recommend strengthening the capacity of FQHCs to address the needs of their patients with NCDs in disasters.
Background: Multidrug-resistant Gram-negative bacteria are a major cause of sepsis among hospitalized neonates globally. Aqueous chlorhexidine gluconate (CHG) skin antisepsis has been shown to be safe for use in infants; however, its sustained effectiveness in preventing Gram-negative pathogen colonization, bloodstream infection (BSI), and mortality is unclear. Methods: We conducted a period prevalence survey, with 26 sampling events over 12 months (18 October 2022 – 31 October 2023), at a 33-bed neonatal unit in a tertiary public hospital in Botswana where ESBL-producing Klebsiella pneumoniae and carbapenem-resistant Acinetobacter baumannii are leading causes of BSI. Perirectal and periumbilical skin swabs were collected every two weeks from all inpatients. Swabs were inoculated onto chromogenic media selective and differential for extended-spectrum beta-lactamase-producing Enterobacterales (ESBL-E) and Acinetobacter spp. (CHROMagar™ ESBL, Acinetobacter). Colonization status was determined based on culture growth and colony morphology. Contemporaneous data on all-cause mortality and BSI were abstracted from routine surveillance records. Pre- and post-CHG prevalences were compared using a simple chi-square test. During the surveillance period, an outbreak of K. pneumoniae linked to contaminated multi-use vials was detected, so BSIs and deaths during the outbreak period (2 February–6 April 2023) were excluded. In February 2023, the hospital infection prevention and control (IPC) team introduced twice-weekly whole-body cleansing with commercially available 2% aqueous CHG, performed by caregivers and healthcare workers on neonates >24 hours old and weighing ≥1 kg until discharge. Results: There were significant decreases in ESBL-E and Acinetobacter skin and perirectal colonization following the CHG intervention (Table 1; Figure 1). After the CHG intervention, the incidence of Acinetobacter BSIs declined significantly, and there was a trend toward a decline in other BSIs and mortality. No adverse events associated with CHG were reported. Conclusions: Twice-weekly CHG application was temporally associated with significant reductions in neonatal ESBL-E and Acinetobacter skin and perirectal colonization and in Acinetobacter BSI. This analysis was limited by a short pre-intervention surveillance period and thus may have been influenced by confounders such as seasonality and intensified IPC efforts following the outbreak. Analyses of routine CHG use in other settings and over longer surveillance periods are needed to better understand its effectiveness as an IPC strategy in settings where neonatal sepsis incidence is high. Table 1. Colonization prevalence, BSI incidence, and mortality surrounding introduction of CHG skin cleansing in a neonatal unit, 18 October 2022 – 31 October 2023.
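For readers unfamiliar with the pre/post comparison described in the Methods above, the snippet below shows one way a simple chi-square test of colonization prevalence before versus after an intervention might be run. The counts are invented for illustration and are not the surveillance data reported in this abstract.

# Illustrative chi-square comparison of colonization prevalence before vs after
# an intervention. Counts are hypothetical, not the data reported above.
from scipy.stats import chi2_contingency

# Rows: pre-intervention, post-intervention; columns: colonized, not colonized.
table = [
    [60, 40],   # pre:  60 of 100 swab events colonized
    [45, 75],   # post: 45 of 120 swab events colonized
]

chi2, p_value, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, df = {dof}, p = {p_value:.3f}")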
Candida auris is an emerging multidrug-resistant pathogenic yeast capable of causing severe illness in the healthcare environment. It spreads easily amongst patient populations, is often resistant to antifungal treatments, and can survive on surfaces for prolonged periods. In the current study, 85 sites within hospital settings were screened for surface-contaminated Candida species and C. auris. Surface swab samples were transferred to chromogenic agar media designed to isolate and identify Candida species and were incubated at 35°C for 48 hr. Samples were confirmed using molecular techniques designed to specifically distinguish C. auris from other Candida species. Data were compiled to show the prevalence of key Candida species (C. albicans, C. auris, C. glabrata, C. krusei and C. tropicalis). Surface survivability was assessed using the CDC B11903 C. auris strain. Plastic, metal and fabric surfaces used were purchased from a medical supply store. Once inoculated with 500 CFU/ml in sterile distilled water, the surfaces were kept in a Class II hood with minimal airflow and ambient conditions (21°C, 60% RH) and sampled daily. Results showed 25 of the 85 (29.4%) tested sites were positive for Candida species, with 3 of those sites positive for C. auris. Antifungal resistance testing of the three isolates (using concentration gradient test strips) showed notable resistance to fluconazole, but not to amphotericin B or micafungin. C. auris survivability was dependent upon surface type, with the C. auris test strain surviving for 39 days on three different types of hospital curtains, and ≥10 days on a variety of non-porous plastic or metal surfaces. With demonstrated survivability of C. auris for long periods of time on hospital surfaces, it becomes critical for healthcare facilities to consider C. auris when developing infection prevention programs.
Background: A vital role of hospital employee health is the management, characterization, and targeted prevention of bloodborne pathogen exposures (BBPE) among healthcare workers. A comprehensive review of a health center’s BBPE was conducted to identify areas for improvement and to target education and training, given changes in BBPE standard operating procedures (SOPs) over time. Methods: A retrospective descriptive analysis was conducted on deidentified BBPE cases reported to employee health at VA Connecticut Healthcare System from 1995 to 2023 (N=296) using R statistical software. Results: The highest number of BBPE occurred among trainee physicians (N=103, 34.8%, especially surgery and internal medicine), registered nurses (N=60, 20.3%), and non-trainee physicians (N=45, 15.2%). The most frequently implicated devices were hollow-bore (N=103, 34.8%) and suture needles (N=60, 20.3%). BBPE occurred most often during surgical procedures (N=114, 38.5%) or medication administration (N=52, 17.6%). Over half of BBPE occurred during afternoons/nights (N=172, 58.1%). Over half occurred with use of personal protective equipment (PPE) (N=181, 61.1%). The majority of BBPE involved finger injuries (N=220, 74.3%). Blood was the most frequently reported exposure (N=127, 42.9%); a similar proportion of records did not specifically name a body fluid type (N=121) or indicate whether PPE was used (N=110). In most cases, the source patient was identified (N=282, 95.3%) and tested (N=272, 91.9%). Forty-three sources (14.5%) had positive BBP testing, which included HIV (N=14, 4.7%), hepatitis C (N=23, 7.8%), and hepatitis B (N=6, 2.0%). Most employees presented to employee health for initial evaluation (N=231, 78%) and underwent post-exposure testing (N=266, 89.9%); most had evidence of immunity to hepatitis B (N=246, 83.1%). Eighty-three employees (28%) received HIV PEP (average=1.9 days). Most records did not indicate whether this was a first-time BBPE (N=250, 84.5%). No employee records indicated seroconversion for a bloodborne pathogen. Conclusions: Physicians and RNs, those performing surgical procedures and administering medications, and those on second and third shifts are at highest risk and may benefit from additional interventions such as exposure assessment or education. Required recordkeeping has been variable over time. Updated national SOPs have been adapted to employee health, though additional details could be considered for quality improvement purposes, such as duration of employment, level of training, and prior BBPE prevention education. It is unclear whether some information, such as history of BBPE or PPE use, was elicited but not documented; this information could be helpful in the management of BBPEs.
The arts were loosely defined by a plethora of ‘-isms’ in the second half of the nineteenth and the early twentieth centuries. None is more often associated with Debussy than Impressionism. Even recent scholarship is still disposed to position him as an Impressionist composer. Whilst much work has been done to disentangle Debussy from the tag and align him in relation to, among others, Hellenistic paintings (around the time of the Prélude à l’Après-midi d’un faune), Symbolist painting, and the English Pre-Raphaelites, it is important to understand what has been intended by the term ‘musical Impressionism’, how it came to be associated with Debussy, and his usually hostile response to being thus categorised.
The formation of iddingsite by the oxidative weathering of Fo80 olivine begins by solution of Mg from planar fissures, 20 Å wide and spaced 200 Å apart, parallel to (001). Oxidation of Fe within the remaining olivine provides nuclei for the topotactic growth of goethite. Cleavage cracks < 50 Å in diameter allow Na, Al, and Ca from adjacent minerals, particularly plagioclase, to enter the altering olivine while Mg and Si diffuse away. In the early stages of weathering, strips of Fe-rich smectite (saponite), 20–50 Å wide and 1–7 layers thick, form bridges 50–100 Å long across the planar fissures. Dioctahedral smectite crystallizes on the margins of wider cleavage-controlled fissures; with further weathering halloysite is formed away from the fissure walls. In the ultimate stages of alteration, the saponite and dioctahedral smectite are lost, leaving a porous, oriented aggregate of goethite crystals each measuring about 50 × 100 × 200 Å (X, Y, Z, respectively), with sporadic veins of halloysite crossing the pseudomorph.