To improve early intervention and personalise treatment for individuals early on the psychosis continuum, a greater understanding of symptom dynamics is required. We address this by identifying and evaluating the movement between empirically derived attenuated psychotic symptomatic substates—clusters of symptoms that occur within individuals over time.
Methods
Data came from a 90-day daily diary study evaluating attenuated psychotic and affective symptoms. The sample included 96 individuals aged 18–35 on the psychosis continuum, divided into four subgroups of increasing severity based on their psychometric risk of psychosis, with the fourth meeting ultra-high risk (UHR) criteria. A multilevel hidden Markov modelling (HMM) approach was used to characterise and determine the probability of switching between symptomatic substates. Individual substate trajectories and time spent in each substate were subsequently assessed.
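To illustrate the general modelling approach in code (not the authors' implementation), the sketch below fits a simple, single-level Gaussian hidden Markov model to pooled daily symptom scores using the hmmlearn package and inspects the estimated substate transition probabilities; the data, the four-state choice, and all variable names are assumptions, and the multilevel (person-specific) extension used in the study is omitted.

```python
# Illustrative only: a single-level Gaussian HMM on pooled daily diary scores.
# The study used a multilevel HMM; this sketch ignores person-level random effects.
import numpy as np
from hmmlearn.hmm import GaussianHMM

rng = np.random.default_rng(0)

# Hypothetical data: 96 participants x 90 days x 5 symptom ratings (0-100).
n_people, n_days, n_symptoms = 96, 90, 5
daily_scores = rng.random((n_people * n_days, n_symptoms)) * 100
lengths = [n_days] * n_people  # one observation sequence per participant

# Four latent substates, matching the reported solution.
model = GaussianHMM(n_components=4, covariance_type="diag", n_iter=200, random_state=0)
model.fit(daily_scores, lengths)

print("Estimated substate switching probabilities:")
print(np.round(model.transmat_, 2))

# Most likely substate sequence for the first participant (Viterbi decoding),
# and the number of days spent in each substate.
states = model.predict(daily_scores[:n_days])
print("Days per substate:", np.bincount(states, minlength=4))
```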
Results
Four substates of increasing psychopathological severity were identified: (1) low-grade affective symptoms with negligible psychotic symptoms; (2) low levels of nonbizarre ideas with moderate affective symptoms; (3) low levels of nonbizarre ideas and unusual thought content, with moderate affective symptoms; and (4) moderate levels of nonbizarre ideas, unusual thought content, and affective symptoms. Perceptual disturbances predominantly occurred within the third and fourth substates. UHR individuals had a reduced probability of switching out of the two most severe substates.
Conclusions
Findings suggest that individuals reporting unusual thought content, rather than nonbizarre ideas in isolation, may exhibit symptom dynamics with greater psychopathological severity. Individuals at a higher risk of psychosis exhibited persistently severe symptom dynamics, indicating a potential reduction in psychological flexibility.
Attention-deficit/hyperactivity disorder (ADHD) remains underdiagnosed and undertreated in girls. One important contributor is the predominance of inattentive symptoms in girls relative to boys. Though less “visible,” inattentive symptoms represent a key driver of impairment, often persisting into adulthood. EndeavorRx (AKL-T01) is a game-based, FDA-authorized digital therapeutic directly targeting inattention. This analysis sought to examine potential sex differences in the efficacy of AKL-T01.
Methods
We conducted a secondary analysis of clinical outcomes by sex in 326 children and adolescents from two trials of AKL-T01 (n1 = 180 children; 30.6% female, M age = 9.71; n2 = 146 adolescents; 41.1% female, M age = 14.34). All participants had high inattention per a baseline score ≤ -1.8 on the Test of Variables of Attention (TOVA), a computerized, FDA-cleared continuous performance task that objectively measures attention. Participants used AKL-T01 for 25 minutes/day over 4 weeks. Primary outcomes included change in attention on the TOVA Attention Comparison Score (ACS) and sub-metrics, and change in symptoms on the clinician-rated ADHD Rating Scale (ADHD-RS). To evaluate study hypotheses, we conducted a series of t-tests of TOVA and ADHD-RS change scores by sex.
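As a minimal sketch of this analytic step (a Welch's t-test of change scores by sex with a Cohen's d estimate), the snippet below uses fabricated placeholder change scores, not the trial data.

```python
# Minimal sketch: Welch's t-test of TOVA ACS change scores by sex, plus Cohen's d.
# The change scores below are random placeholders, not trial data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
girls_change = rng.normal(loc=2.4, scale=3.5, size=110)  # hypothetical values
boys_change = rng.normal(loc=1.3, scale=3.5, size=216)   # hypothetical values

t_res = stats.ttest_ind(girls_change, boys_change, equal_var=False)  # Welch correction

# Cohen's d using a pooled standard deviation.
pooled_sd = np.sqrt((girls_change.var(ddof=1) + boys_change.var(ddof=1)) / 2)
d = (girls_change.mean() - boys_change.mean()) / pooled_sd

print(f"t = {t_res.statistic:.2f}, p = {t_res.pvalue:.3f}, d = {d:.2f}")
```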
Results
Across the pooled sample, girls using AKL-T01 demonstrated significantly greater improvements in attention on the TOVA ACS (MΔ = 2.44) compared to boys (MΔ = 1.32; t[211.77] = 2.62, d = .31, p = .009), as well as on the TOVA reaction time standard score (girls’ MΔ = 13.22; boys’ MΔ = 3.54; t[229.12] = 3.93, d = .46, p < .001). We did not observe sex differences in the two other TOVA sub-metrics, nor in the ADHD-RS (ps > .05). There were sex differences in compliance (t[207.99] = 2.17, d = .26, p = .031), with girls completing more sessions on average (M = 90.22) than boys (M = 80.19).
Conclusions
Results suggest that AKL-T01 may be associated with particularly strong improvements in attentional functioning in girls relative to boys. The absence of significant sex differences in ADHD symptom change over the course of treatment underscores the specificity of these effects to inattention processes rather than broad ADHD symptoms. Limitations include categorization based on binary sex, which may not capture nuances of gender identity.
This chapter presents a broad overview of the measurement of hormones, spanning collection in different biospecimens, laboratory strategies for assaying hormones, and a brief overview of the statistical treatment and analysis used to extract the hormonal signal of interest. We organize each section into a description of measurement tools followed by an agnostic analysis of their strengths, weaknesses, prospects, and pitfalls. We do not view any single approach as “best” or “optimal.” This view is commensurate with the production and cellular conversion of hormones – adaptive physiological processes that are not “best” or “optimal” but rather constantly changing biobehavioral markers that shift according to the demands of the environment. Measuring the hormone is just the beginning of exploring the multifaceted ways that hormones can inform health, development, morbidity, and mortality.
Understanding post-stroke spasticity (PSS) treatment in everyday clinical practice may guide improvements in patient care.
Methods:
This was a retrospective cohort study that used population-level administrative data. Adults (aged ≥18 years) who initiated PSS treatment (defined by the first PSS clinic visit, focal botulinum toxin injection, or anti-spasticity medication dispensation [baclofen, dantrolene and tizanidine] with none of these treatments occurring during the 2 years before the stroke) were identified between 2012 and 2019 in Alberta, Canada. Spasticity treatment use, time to treatment start and type of prescribing/treating physician were measured. Descriptive statistics were performed.
Results:
Within the cohort (n = 1,079), the most common PSS treatment was oral baclofen (initial treatment: 60.9%; received on/after the initial treatment date up to March 31, 2020: 69.0%), largely prescribed by primary care physicians (77.6%) and started a median of 348 (IQR 741) days after the stroke. Focal botulinum toxin (23.3%; 37.7%) was largely prescribed by physiatrists (72.2%) and started 311 (IQR 446) days after the stroke; spasticity clinic visits (18.6%; 23.8%) were also common.
Conclusions:
We found evidence of gaps in the provision of spasticity management for persons with PSS, including overuse of systemic oral baclofen (which has common adverse effects and lacks evidence of effectiveness in PSS) and potential underuse of focal botulinum toxin injections. Further investigation and strategies should be pursued to improve alignment of PSS treatment with guideline recommendations, which in turn will support better outcomes for those with PSS.
Simulation-based education (SBE) is widespread in both undergraduate and postgraduate medical education, but is used less frequently in psychiatry. Despite this, the relatively small evidence base suggests high levels of participant satisfaction and educational benefit from SBE in psychiatry. Bringing SBE into the virtual environment presents another set of challenges, which we identified through both current medical education research and our own experience. Our poster will demonstrate our current model of virtual simulation, the evidence base we used to develop it, and the feedback we have received from this new venture.
Methods
Background – As part of our undergraduate CAMHS teaching, in which students spend 1 week within our service during a 3-week psychiatry clinical placement, we provide a single session of CAMHS SBE. This is delivered by 2 facilitators and a professional medical actor playing the role of the adolescent patient. Our virtual simulation teaching session has now been integrated into our teaching programme. We developed this session in line with current medical education research, presented it at the Annual Medical Education Conference, and integrated feedback on the session into the current model.
Results
We have successfully adapted this session to be delivered remotely and have received overwhelmingly positive feedback from our students, who cited improvements in their confidence and learning after the session. Alongside addressing the challenges that remote teaching poses to engagement, participation, and patient involvement, we further adapted the session to accommodate increased numbers of students attending – a national trend. However, from current research and our experience, there are also benefits to both educators and students from virtual SBE.
Conclusion
Our results show that simulation can be used effectively in psychiatry through virtual media to expand student clinical experience and provide excellent educational opportunities. We present our model for virtual SBE and the evidence base we have used to develop this session, along with the feedback we have had from students, staff, and teams across the country.
Edited by
William J. Brady, University of Virginia; Mark R. Sochor, University of Virginia; Paul E. Pepe, Metropolitan EMS Medical Directors Global Alliance, Florida; John C. Maino II, Michigan International Speedway, Brooklyn; K. Sophia Dyer, Boston University Chobanian and Avedisian School of Medicine, Massachusetts
For large entertainment tours composed of 100 to 200 personnel moving from one city (or country) to another every few days over several months’ time, the odds of numerous untoward health events occurring, some very serious, become reasonably high. Beyond rigorous schedules and living/dining in close quarters, understandable reticence to abandon one’s post can occasionally delay timely care. Accordingly, having veteran medical specialists as part of the touring team has been found to be invaluable, not only for pre-emptive minor interventions and continuity of care, but also for immediate, expert handling of serious emergencies. Experienced, well-connected touring medical specialists also provide prospective contingency plans for each destination city and venue. These medical advance plans detail the most-knowledgeable local physicians or facilities for best managing any respective medical condition. They also identify the local “point-persons” to contact for coordination of true emergencies and especially if there is a need for multi-casualty incident management at the venue. They anticipate health risks such as air quality, altitude sickness, endemic disease vectors and other concerning threats at each destination. They also train touring staff in basic life support, bleeding control and emergency equipment readiness. Touring specialists should also be well-integrated into security team functions.
Observations of radiocarbon (14C) in Earth’s atmosphere and other carbon reservoirs are important to quantify exchanges of CO2 between reservoirs. The amount of 14C is commonly reported in the so-called Delta notation, i.e., Δ14C, the decay- and fractionation-corrected departure of the ratio of 14C to total C from that ratio in an absolute international standard; this Delta notation permits direct comparison of 14C/C ratios across the several reservoirs. However, because Δ14C of atmospheric CO2, denoted Δ14CO2, is based on the ratio of 14CO2 to total atmospheric CO2, its value can and does change not just because of change in the amount of atmospheric 14CO2 but also because of change in the amount of total atmospheric CO2, complicating ascription of change in Δ14CO2 to change in one or the other quantity. Here we suggest that presentation of the atmospheric 14CO2 amount as a mole fraction relative to dry air (moles of 14CO2 per mole of dry air in Earth’s atmosphere), or as moles or molecules of 14CO2 in Earth’s atmosphere, all readily calculated from Δ14CO2 and the amount of atmospheric CO2 (with slight dependence on δ13CO2), complements presentation as Δ14CO2 alone and can provide valuable insight into the evolving budget and distribution of atmospheric 14CO2.
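As a worked illustration of the conversion described above (not the paper's own code), the sketch below turns a Δ14CO2 value and a CO2 dry-air mole fraction into a 14CO2 mole fraction and a global atmospheric burden using the Stuiver–Polach convention; the absolute standard ratio, δ13C value, and moles of dry air are commonly cited approximate figures included purely as assumptions for the example.

```python
# Sketch: convert Delta14CO2 (permil) and the CO2 mole fraction (ppm) into a
# 14CO2 dry-air mole fraction and a global burden. Constants are approximate,
# commonly cited values used only for illustration.
R_ABS = 1.176e-12        # absolute standard 14C/C ratio (approximate)
MOLES_DRY_AIR = 1.77e20  # moles of dry air in Earth's atmosphere (approximate)

def c14_mole_fraction(delta14c_permil, co2_ppm, delta13c_permil=-8.0):
    """Return the 14CO2 mole fraction (mol 14CO2 per mol of dry air)."""
    # Undo the fractionation normalization in the Stuiver-Polach Delta definition,
    # which references delta13C = -25 permil, to recover the sample 14C/C ratio.
    ratio = R_ABS * (1.0 + delta14c_permil / 1000.0) / (
        1.0 - 2.0 * (delta13c_permil + 25.0) / 1000.0
    )
    return co2_ppm * 1e-6 * ratio

# Example with roughly present-day, illustrative inputs.
x14 = c14_mole_fraction(delta14c_permil=0.0, co2_ppm=420.0)
print(f"14CO2 mole fraction ~ {x14:.3e} mol/mol dry air")
print(f"Global burden ~ {x14 * MOLES_DRY_AIR:.3e} mol of 14CO2")
```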
Limited evidence exists regarding care pathways for stroke survivors who do and do not receive poststroke spasticity (PSS) treatment.
Methods:
Administrative data were used to identify adults who experienced a stroke and sought acute care between 2012 and 2017 in Alberta, Canada. Pathways of stroke care within the health care system were determined among those who initiated PSS treatment (PSS treatment group: outpatient pharmacy dispensation of an anti-spastic medication, focal chemo-denervation injection, or a spasticity tertiary clinic visit) and those who did not (non-PSS treatment group). Time from the stroke event until spasticity treatment initiation, and the setting in which treatment was initiated, were reported. Descriptive statistics were performed.
Results:
Health care settings within the pathways of stroke care that the PSS (n = 1,079) and non-PSS (n = 22,922) treatment groups encountered were the emergency department (86 and 84%), acute inpatient care (80 and 69%), inpatient rehabilitation (40 and 12%), and long-term care (19 and 13%), respectively. PSS treatment was initiated a median of 291 (interquartile range 625) days after the stroke event, and most often in the community when patients were residing at home (45%), followed by “other” settings (22%), inpatient rehabilitation (18%), long-term care (11%), and acute inpatient care (4%).
Conclusions:
To our knowledge, this is the first population-based cohort study describing pathways of care among adults with stroke who subsequently did or did not initiate spasticity treatment. Areas for improvement in care may include strategies for earlier identification and treatment of PSS.
Loss of control eating is more likely to occur in the evening and is uniquely associated with distress. No studies have examined the effect of treatment on within-day timing of loss of control eating severity. We examined whether time of day differentially predicted loss of control eating severity at baseline (i.e. pretreatment), end-of-treatment, and 6-month follow-up for individuals with binge-eating disorder (BED), hypothesizing that loss of control eating severity would increase throughout the day pretreatment and that this pattern would be less pronounced following treatment. We explored differential treatment effects of cognitive-behavioral guided self-help (CBTgsh) and Integrative Cognitive-Affective Therapy (ICAT).
Methods
Individuals with BED (N = 112) were randomized to receive CBTgsh or ICAT and completed a 1-week ecological momentary assessment protocol at baseline, end-of-treatment, and 6-month follow-up to assess loss of control eating severity. We used multilevel models to assess within-day slope trajectories of loss of control eating severity across assessment periods and treatment type.
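As a rough sketch of this kind of multilevel model of within-day trajectories (an illustrative specification, not the authors' exact model), the snippet below fits a random-intercept, random-slope model of momentary loss-of-control severity on time since waking, interacted with assessment period, using statsmodels; all data are random placeholders.

```python
# Rough sketch of a multilevel model of within-day loss-of-control (LOC) severity.
# Data are random placeholders; the formula is an illustrative specification only.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n_obs = 112 * 7 * 5  # participants x days x prompts (illustrative)
data = pd.DataFrame({
    "participant": np.repeat(np.arange(112), 7 * 5),
    "hours_since_waking": rng.uniform(0, 16, n_obs),
    "period": rng.choice(["baseline", "end_of_treatment", "followup"], n_obs),
    "loc_severity": rng.uniform(0, 5, n_obs),
})

# Random intercept and random slope of time of day for each participant;
# the time-by-period interaction captures change in within-day slopes.
model = smf.mixedlm(
    "loc_severity ~ hours_since_waking * period",
    data,
    groups=data["participant"],
    re_formula="~hours_since_waking",
)
print(model.fit().summary())
```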
Results
Within-day increases in loss of control eating severity were reduced at end-of-treatment and 6-month follow-up relative to baseline. Evening acceleration of loss of control eating severity was greater at 6-month follow-up relative to end-of-treatment. Within-day increases in loss of control severity did not differ between treatments at end-of-treatment; however, evening loss of control severity intensified for individuals who received CBTgsh relative to those who received ICAT at 6-month follow-up.
Conclusions
Findings suggest that treatment reduces evening-shifted loss of control eating severity, and that this effect may be more durable following ICAT relative to CBTgsh.
Early detection of ST-segment elevation myocardial infarction (STEMI) on the prehospital electrocardiogram (ECG) improves patient outcomes. Current software algorithms optimize sensitivity but have a high false-positive rate. The authors propose an algorithm to improve the specificity of STEMI diagnosis in the prehospital setting.
Methods:
A dataset of prehospital ECGs with verified outcomes was used to validate an algorithm to identify true- and false-positive software interpretations of STEMI. Four criteria identified in prior research as differentiating true-positive STEMIs were applied: heart rate <130, QRS <100, verification of ST-segment elevation, and absence of artifact. Test characteristics were calculated, and regression analysis was used to examine the association between the number of criteria included and the test characteristics.
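To make the screening rule concrete, the sketch below applies the four criteria to software-flagged ECGs and computes sensitivity, specificity, and the positive likelihood ratio; the field names and example records are hypothetical assumptions, not the study's dataset or software.

```python
# Illustrative sketch: apply the four screening criteria to software-flagged STEMI
# ECGs and compute test characteristics. Field names and records are hypothetical.

def meets_criteria(ecg: dict) -> bool:
    """True when all four criteria for a likely true-positive STEMI are met."""
    return (
        ecg["heart_rate"] < 130            # criterion 1: heart rate < 130
        and ecg["qrs_duration"] < 100      # criterion 2: QRS < 100
        and ecg["st_elevation_verified"]   # criterion 3: ST elevation verified
        and not ecg["artifact_present"]    # criterion 4: no artifact
    )

def test_characteristics(records):
    """records: iterable of (ecg_features, adjudicated_true_stemi) pairs."""
    tp = fp = fn = tn = 0
    for ecg, true_stemi in records:
        flagged = meets_criteria(ecg)
        if flagged and true_stemi:
            tp += 1
        elif flagged:
            fp += 1
        elif true_stemi:
            fn += 1
        else:
            tn += 1
    sensitivity = tp / (tp + fn) if (tp + fn) else float("nan")
    specificity = tn / (tn + fp) if (tn + fp) else float("nan")
    lr_positive = sensitivity / (1 - specificity) if specificity < 1 else float("inf")
    return sensitivity, specificity, lr_positive

# Hypothetical example records: (ECG features, was it a true STEMI?)
records = [
    ({"heart_rate": 88, "qrs_duration": 92, "st_elevation_verified": True,
      "artifact_present": False}, True),
    ({"heart_rate": 142, "qrs_duration": 88, "st_elevation_verified": True,
      "artifact_present": False}, False),
]
print(test_characteristics(records))
```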
Results:
There were 44,611 cases available. Of these, 1,193 were identified as STEMI by the software interpretation. Applying all four criteria yielded the highest positive likelihood ratio of 353 (95% CI, 201-595) and specificity of 99.96% (95% CI, 99.93-99.98), but the lowest sensitivity (14%; 95% CI, 11-17) and worst negative likelihood ratio (0.86; 95% CI, 0.84-0.89). Both the positive likelihood ratio (r2 = 0.90) and specificity (r2 = 0.85) increased strongly with the number of criteria applied.
Conclusions:
Prehospital ECGs with a high probability of true STEMI can be accurately identified using these four criteria: heart rate <130, QRS <100, verification of ST-segment elevation, and absence of artifact. Applying these criteria to prehospital ECGs with software interpretations of STEMI could decrease false-positive field activations, while also reducing the need to rely on transmission for physician over-read. This can have significant clinical and quality implications for Emergency Medical Services (EMS) systems.
The U.S. Department of Agriculture–Agricultural Research Service (USDA-ARS) has been a leader in weed science research covering topics ranging from the development and use of integrated weed management (IWM) tactics to basic mechanistic studies, including biotic resistance of desirable plant communities and herbicide resistance. ARS weed scientists have worked in agricultural and natural ecosystems, including agronomic and horticultural crops, pastures, forests, wild lands, aquatic habitats, wetlands, and riparian areas. Through strong partnerships with academia, state agencies, private industry, and numerous federal programs, ARS weed scientists have made contributions to discoveries in the newest fields of robotics and genetics, as well as the traditional and fundamental subjects of weed–crop competition and physiology and integration of weed control tactics and practices. Weed science at ARS is often overshadowed by other research topics; thus, few are aware of the long history of ARS weed science and its important contributions. This review is the result of a symposium held at the Weed Science Society of America’s 62nd Annual Meeting in 2022 that included 10 separate presentations in a virtual Weed Science Webinar Series. The overarching themes of management tactics (IWM, biological control, and automation), basic mechanisms (competition, invasive plant genetics, and herbicide resistance), and ecosystem impacts (invasive plant spread, climate change, conservation, and restoration) represent core ARS weed science research that is dynamic and efficacious and has been a significant component of the agency’s national and international efforts. This review highlights current studies and future directions that exemplify the science and collaborative relationships both within and outside ARS. Given the constraints of weeds and invasive plants on all aspects of food, feed, and fiber systems, there is an acknowledged need to face new challenges, including agriculture and natural resources sustainability, economic resilience and reliability, and societal health and well-being.
Incidents of civil unrest are increasing in frequency across the United States and may involve protests or demonstrations in response to political or economic activities, concerts or sporting events, or controversial actions by law enforcement. During these incidents, injuries may occur either in isolation or during interactions with law enforcement personnel. There are specific patterns of injury that are unique to protests and encounters with law enforcement, and it is important for the urban emergency medicine practitioner to have a fundamental understanding of the management of common injuries associated with law enforcement interactions. This chapter will highlight the general principles of emergency department care during periods of civil unrest, including best practices for interacting with both public safety personnel and private citizens. Patterns of injury will be explored, including injuries from kinetic impact projectiles, conducted electrical weapons, and crowd dispersal agents, in addition to other less common injury patterns encountered during periods of civil unrest.
The next generation of high-power lasers enables repetition of experiments at orders of magnitude higher frequency than was possible with the prior generation. Facilities requiring human intervention between laser repetitions need to adapt in order to keep pace with the new laser technology. A distributed networked control system can enable laboratory-wide automation and feedback control loops. These higher-repetition-rate experiments will create enormous quantities of data. A consistent approach to managing data can increase data accessibility, reduce repetitive data-software development and mitigate poorly organized metadata. An opportunity arises to share knowledge of improvements to control and data infrastructure currently being undertaken. We compare platforms and approaches to state-of-the-art control systems and data management at high-power laser facilities, and we illustrate these topics with case studies from our community.
To support school foods programmes by evaluating the relationship between nutritional quality, cost, student consumption and the environmental impacts of menus.
Design:
Using linear programming and data from previously served menu items, the relationships between the nutritional quality, cost, student consumption and the environmental impacts of lunch menus were investigated. Optimised lunch menus with the maximum potential student consumption and nutritional quality and lowest costs and environmental impacts were developed and compared with previously served menus (baseline).
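As a schematic of the linear programming approach (not the study's actual model, menu items, or coefficients), the sketch below selects weekly servings of a few hypothetical entrées by minimizing a weighted sum of cost and environmental impact subject to floor constraints on expected consumption and nutritional quality; all numbers are invented for illustration.

```python
# Schematic multi-objective menu optimization via weighted-sum scalarization.
# Items, coefficients, weights, and targets are invented, illustrative numbers.
import numpy as np
from scipy.optimize import linprog

items = ["beef entree", "cheese entree", "fish entree", "legume entree"]
cost = np.array([1.80, 1.50, 2.10, 1.20])         # $ per serving
impact = np.array([9.0, 5.0, 3.0, 1.0])           # kg CO2e per serving
consumption = np.array([310, 290, 260, 240])      # expected kcal eaten per serving
quality = np.array([45, 50, 72, 78])              # nutritional quality score

# Objective: minimize a weighted sum of cost and environmental impact.
w_cost, w_impact = 1.0, 0.5
c = w_cost * cost + w_impact * impact

# linprog expects A_ub @ x <= b_ub, so ">=" constraints are negated.
A_ub = np.array([-consumption, -quality])
b_ub = np.array([-1400, -300])        # weekly floors on consumption and quality
A_eq = np.array([np.ones(4)])
b_eq = np.array([5.0])                # exactly five entree servings per week

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
              bounds=[(0, 5)] * 4, method="highs")
for name, servings in zip(items, res.x):
    print(f"{name}: {servings:.1f} servings/week")
```

Sweeping the objective weights, or moving each interest between the objective and the constraints, traces out the kinds of trade-offs between nutrition, cost, consumption and environmental impact that the study examines.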
Setting:
Boston Public Schools (BPS), Boston Massachusetts, USA.
Participants:
Menu items served on the 2018–2019 BPS lunch menu (n 142).
Results:
Using single-objective models, trade-offs were observed between most interests, but the use of multi-objective models minimised these trade-offs. Compared with the current weekly menus offered, multi-objective models increased potential caloric intake by up to 27 % and Healthy Eating Index scores by up to 19 % and reduced costs and environmental impacts by up to 13 % and 71 %, respectively. Improvements were made by reducing the frequency of beef and cheese entrées and increasing the frequency of fish and legume entrées on weekly menus.
Conclusions:
This work can be extrapolated to monthly menus to provide further direction for school districts, and the methods can be employed with different recipes and constraints. Future research should test the implementation of optimised menus in schools and consider the broader implications of implementation.
Black and Minority Ethnic (BME) communities have been disproportionately affected by COVID-19: death rates are higher and survival rates are lower, with statistics varying across different BME communities (Public Health England 2020b). BME communities are at risk of higher infection and mortality rates due to certain pre-disposed health conditions and living in poorer, overcrowded housing (Meer et al. 2020). These higher infection and mortality rates, together with the fear of spreading the virus or catching it from others, have caused further distress. Ethnic minorities in Britain have experienced a disproportionate impact of COVID-19; for these groups the pandemic acted as a syndemic (Bambra et al. 2020), layered on pre-pandemic inequalities across the social determinants of health, such as unhealthy dietary practices, poor housing and working conditions, unemployment, poor access to healthcare, high levels of inactivity and the discrimination that ethnic minorities and the majority of British Muslims live with.
This chapter highlights the disproportionate impact of COVID-19 on British Muslims and how the pandemic exposed prevalent health inequalities in the UK. We critically analyse the discussions around faith in relation to COVID-19, victim blaming, its impacts and the socioeconomic consequences of COVID-19 lockdowns on marginalised British Muslims. We evaluate the vulnerabilities of British Muslims working in the NHS and healthcare and the responses by professional Muslim organisations providing healthcare awareness. We explore the interplay of ethnicity, religion and deprivation in negotiating the particular challenges of living through COVID-19. We critically evaluate and problematise the notions around ‘vaccine hesitancy’, and question the emphasis on national religious organisations of British Muslims for responses to COVID-19 instead of professional medical organisations or small-scale community-based organisations. We assess the impact of COVID-19 on British Muslim families, children, charity and voluntary organisations, physical activity, mental health and wellbeing, and how British Muslims living in deprived neighbourhoods responded to the pandemic through engaging with community groups. We highlight the work of neighbourhood and community-based organisations and services for healthcare awareness by professional Muslim groups. This chapter also includes multidisciplinary perspectives of academics and practitioners on the pandemic, lockdown, vaccination and subsequent socioeconomic implications of COVID-19 with regard to British Muslims’ lived experience.
The majority of antimicrobials that are produced are administered to animals, particularly food animals. While the overall impact of antimicrobial use in animals on antimicrobial resistance in humans and the environment is unclear, it undeniably plays a role. Yet some degree of antimicrobial use in animals is necessary for animal health and welfare purposes. Balancing the benefits and risks of antimicrobial use in animals is challenging because of the complexity of the problem and limitations in available data. However, a range of measures can be implemented to reduce, refine and optimize antimicrobial use in animals, with the goal of minimizing the impact on human and environmental health while maintaining necessary therapeutic use in animals. A pandemic instrument can provide the necessary foundation for the whole-of-society and whole-of-government One Health approach that is required to strengthen surveillance, communication, collaboration, and action.
Looked-after children are at risk of suboptimal attachment patterns and reactive attachment disorder (RAD). However, access to interventions varies widely, and there are no evidence-based interventions for RAD.
Aims
To modify an existing parenting intervention for children with RAD in the UK foster care setting, and test the feasibility of conducting a randomised controlled trial (RCT) of the modified intervention.
Method
The intervention was modified with expert input and tested on a case series. A feasibility and pilot RCT compared the new intervention with usual care. Foster carers and children in their care aged ≤6 years were recruited across nine local authorities, with 1:1 allocation and blind post-treatment assessments. The modified intervention was delivered in-home by trained mental health professionals over 4–6 months. Children were assessed for RAD symptoms, attachment quality and emotional/behavioural difficulties, and foster carers were assessed for sensitivity and stress.
Results
Minimal changes to the intervention programme were necessary, and these focused on improving its suitability for the UK foster care context. Recruitment was challenging and remained below target despite modifications to the protocol and the inclusion of additional sites. Thirty families were recruited to the RCT; 15 were allocated to each group. Most other feasibility outcomes were favourable, particularly high rates of data and treatment completeness. The revised intervention was positively received by practitioners and foster carers.
Conclusions
A large-scale trial may be feasible, but only if recruitment barriers can be overcome. Dedicated resources to support recruitment within local authorities and wider inclusion criteria are recommended.
Disordered eating behaviors (DEB) affect health and wellbeing worldwide. This study aimed to examine sociodemographic trends in the prevalence of DEB over 20 years in the Australian general population.
Methods
Data were derived from five sequential cross-sectional surveys (1998, 2008, 2009, 2016 and 2017) with population-representative samples of adults and adolescents residing in South Australia (N = 15 075). DEBs investigated were objective binge eating (OBE), strict dieting/fasting, and purging. Sociodemographic data included gender, age, educational level, work and marital status, and residence.
Results
OBE prevalence increased significantly. Strict dieting/fasting also increased from 1998 to 2008/9 but remained stable between 2008/9 and 2016/7. Purging prevalence did not change significantly over time. All survey years were associated with significantly higher odds of OBE and strict dieting/fasting compared with 1998. Lower age, a higher Accessibility/Remoteness Index of Australia (ARIA) score, higher body mass index (BMI), higher educational attainment, and not being in a married or de facto relationship were independently associated with greater adjusted odds of endorsing OBE. Younger age, female gender, and higher BMI were also independently associated with greater adjusted odds of endorsing strict dieting/fasting.
Conclusions
The increased prevalence of DEBs in various strata of Australian society has both public health and clinical implications. The results refute the stereotype that eating disorders (EDs) predominantly affect young women. They build impetus for future research on EDs among men and older individuals, with a view to developing tailored public health and clinical interventions for these populations.