Embedding climate resilient development principles in planning, urban design, and architecture means ensuring that transformation of the built environment helps achieve carbon neutrality, effective adaptation, and well-being for people and nature. Planners, urban designers, and architects are called to bridge the domains of research and practice and evolve their agency and capacity, developing methods and tools consistent across spatial scales to ensure the convergence of outcomes towards targets. Shaping change necessitates an innovative action-driven framework with multi-scale analysis of urban climate factors and co-mapping, co-design, and co-evaluation with city stakeholders and communities. This Element provides analysis on how urban climate factors, system efficiency, form and layout, building envelope and surface materials, and green/blue infrastructure affect key metrics and indicators related to complementary aspects like greenhouse gas emissions, impacts of extreme weather events, spatial and environmental justice, and human comfort. This title is also available as open access on Cambridge Core.
Haemolysis is gaining prominence as increasingly complex children with heart failure are supported with ventricular assist devices. The goal of this study is to better characterise haemolysis and its implications in children supported with pulsatile ventricular assist devices.
Methods:
This is a single-centre retrospective review of 44 children who were supported by the Berlin Heart EXCOR between January 2006 and June 2020. Patients were divided into major haemolysers and non-major haemolysers. Major haemolysers were defined as patients with lactate dehydrogenase > 500 U/L (2.5× the upper limit of normal) with either total bilirubin > 2 mg/dL (with predominantly indirect hyperbilirubinemia) or anaemia out of proportion to the clinical scenario more than three days following implantation of the ventricular assist device(s). Patient demographics, ventricular assist device factors, and outcomes, including end-organ function and mortality, were compared between major haemolysers and non-major haemolysers.
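As a quick aid to reading the criteria above, the following is a minimal sketch of the stated definition of a major haemolyser. The field names, the clinician-judged anaemia flag, and the calling convention are illustrative assumptions, not the study's actual data model.

```python
# Sketch of the study's stated definition of a "major haemolyser".
# Field names and the anaemia flag are illustrative placeholders.

LDH_THRESHOLD_U_PER_L = 500        # 2.5x the upper limit of normal, per the definition above
TBILI_THRESHOLD_MG_PER_DL = 2.0

def is_major_haemolyser(ldh_u_per_l: float,
                        total_bilirubin_mg_per_dl: float,
                        predominantly_indirect: bool,
                        anaemia_out_of_proportion: bool,
                        days_since_vad_implant: int) -> bool:
    """Apply the abstract's criteria: LDH > 500 U/L plus either total bilirubin
    > 2 mg/dL (predominantly indirect) or anaemia out of proportion to the
    clinical scenario, more than three days after VAD implantation."""
    if days_since_vad_implant <= 3:
        return False
    if ldh_u_per_l <= LDH_THRESHOLD_U_PER_L:
        return False
    bilirubin_criterion = (total_bilirubin_mg_per_dl > TBILI_THRESHOLD_MG_PER_DL
                           and predominantly_indirect)
    return bilirubin_criterion or anaemia_out_of_proportion

# Example: LDH 620 U/L with indirect-predominant bilirubin 2.4 mg/dL on post-implant day 10
print(is_major_haemolyser(620, 2.4, True, False, 10))  # True
```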
Main results:
Forty-four patients supported by the Berlin EXCOR were included in the analytic cohort of the study: 27 major haemolysers and 17 non-major haemolysers. Major haemolysis was more common in those supported with a single-ventricle ventricular assist device (i.e., VAD in the context of functionally univentricular anatomy) compared to those with biventricular hearts (p = 0.01). There were no patients with an isolated left ventricular assist device or isolated right ventricular assist device in our analytic cohort of 44 patients. Of the 19 patients with a single-ventricle ventricular assist device, 84% (16/19) were major haemolysers. Of the 25 patients with a biventricular assist device, 44% (11/25) were major haemolysers. Major haemolysers and non-major haemolysers had body surface areas of 0.28 m² and 0.40 m², respectively (p = 0.01). Overall, survival to discharge from the hospital was 66% (n = 29/44). Survival to discharge from the hospital was 52% (14/27) in major haemolysers versus 88% (15/17) in non-major haemolysers (p = 0.02). Only 3 of the 27 with major haemolysis had severe haemolysis, that is, lactate dehydrogenase > 2000 U/L and bilirubin above 10 mg/dL. Non-major haemolysers had a better improvement in creatinine clearance during ventricular assist device support (p < 0.0001). (During the same era of this study, 22 patients who were supported with the Berlin Heart were excluded from the analytic cohort because they did not have any recorded measurement of lactate dehydrogenase. Seventeen of these 22 patients had no clinical evidence of haemolysis. Survival to discharge from the hospital in this excluded cohort was 86% [19/22].)
Conclusions:
Major haemolysis in patients with pulsatile ventricular assist device is more likely with single-ventricle ventricular assist device support and smaller body surface area.
For many researchers, the ethical approval process can appear confusing, overwhelming, or irrelevant. Common sources of confusion include knowing which types of ethics approvals are required, how to get the approval, and understanding the language surrounding the review process. This editorial discusses the importance of ethics in creating and reporting quality research and provides a practical guide to help navigate the ethical approval process.
The stars of the Milky Way carry the chemical history of our Galaxy in their atmospheres as they journey through its vast expanse. Like barcodes, we can extract the chemical fingerprints of stars from high-resolution spectroscopy. The fourth data release (DR4) of the Galactic Archaeology with HERMES (GALAH) Survey, based on a decade of observations, provides the chemical abundances of up to 32 elements for 917 588 stars that also have exquisite astrometric data from the Gaia satellite. For the first time, these elements include life-essential nitrogen to complement carbon and oxygen, as well as more measurements of rare-earth elements critical to modern electronics, offering unparalleled insights into the chemical composition of the Milky Way. For this release, we use neural networks to simultaneously fit stellar parameters and abundances across the whole wavelength range, leveraging synthetic grids computed with Spectroscopy Made Easy. These grids account for atomic line formation in non-local thermodynamic equilibrium for 14 elements. In a two-iteration process, we first fit stellar labels to all 1 085 520 spectra, then co-add repeated observations and refine these labels using astrometric data from Gaia and 2MASS photometry, improving the accuracy and precision of stellar parameters and abundances. Our validation thoroughly assesses the reliability of spectroscopic measurements and highlights key caveats. GALAH DR4 represents yet another milestone in Galactic archaeology, combining detailed chemical compositions from multiple nucleosynthetic channels with kinematic information and age estimates. The resulting dataset, covering nearly a million stars, opens new avenues for understanding not only the chemical and dynamical history of the Milky Way but also the broader questions of the origin of elements and the evolution of planets, stars, and galaxies.
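The abstract mentions co-adding repeated observations before the second fitting iteration. As a hedged illustration of that general step, the sketch below performs a standard inverse-variance weighted co-add with numpy; the actual GALAH DR4 pipeline (common wavelength grid, outlier rejection, normalisation) is more involved and is not reproduced here.

```python
import numpy as np

def coadd_spectra(fluxes: np.ndarray, flux_errors: np.ndarray):
    """Inverse-variance weighted co-add of repeat observations.

    fluxes, flux_errors: arrays of shape (n_visits, n_pixels), assumed to be
    already resampled onto a common wavelength grid (an assumption; the real
    pipeline steps are not described here). Returns the co-added flux and its
    propagated per-pixel uncertainty.
    """
    weights = 1.0 / flux_errors**2
    coadded_flux = np.sum(weights * fluxes, axis=0) / np.sum(weights, axis=0)
    coadded_error = np.sqrt(1.0 / np.sum(weights, axis=0))
    return coadded_flux, coadded_error

# Toy example: three visits of a four-pixel "spectrum"
fluxes = np.array([[1.00, 0.95, 0.80, 1.02],
                   [1.02, 0.97, 0.78, 1.00],
                   [0.98, 0.96, 0.82, 1.01]])
errors = np.full_like(fluxes, 0.02)
print(coadd_spectra(fluxes, errors))
```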
A common and unfortunate error in statistical analysis is the failure to account for dependencies in the data. In many studies, there is a set of individual participants or experimental objects where two observations are made on each individual or object. This leads to a natural pairing of data. This editorial discusses common situations where paired data arises and gives guidance on selecting the correct analysis plan to avoid statistical errors.
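As a concrete illustration of the editorial's point, the sketch below analyses the same toy pre/post measurements with a paired test and, incorrectly, with an independent-samples test. The numbers are invented solely to show how ignoring the pairing changes the result.

```python
import numpy as np
from scipy import stats

# Hypothetical pre/post measurements on the same eight participants (illustrative only).
pre  = np.array([12.1, 14.3, 11.8, 15.0, 13.2, 12.9, 14.8, 13.5])
post = np.array([11.4, 13.6, 11.1, 14.2, 12.7, 12.0, 14.1, 12.8])

# Correct for this design: the paired test uses the within-participant differences.
t_paired, p_paired = stats.ttest_rel(pre, post)

# Incorrect for this design: treating the two columns as independent samples
# ignores the pairing and misstates the uncertainty.
t_indep, p_indep = stats.ttest_ind(pre, post)

print(f"paired:      t = {t_paired:.2f}, p = {p_paired:.4f}")
print(f"independent: t = {t_indep:.2f}, p = {p_indep:.4f}")
```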
The scientific manuscript review process can often seem daunting and mysterious to authors. Frequently, medical journals do not describe the peer-review process in detail, which can further lead to frustration for authors, peer reviewers, and readers. This editorial describes the updated manuscript review process for Prehospital and Disaster Medicine. It is hoped that this editorial will lead to increased clarity and transparency in the review process.
The delivery of paediatric cardiac care across the world occurs in settings with significant variability in available resources. Irrespective of the resources locally available, we must always strive to improve the quality of care we provide to our patients and simultaneously deliver such care in the most efficient and cost-effective manner. The development of cardiac networks is a widely used approach to achieving these aims.
Methods:
This paper reports three talks presented during the 56th meeting of the Association for European Paediatric and Congenital Cardiology held in Dublin in April 2023.
Results:
The three talks describe how centres of congenital cardiac excellence can be developed in low-income countries, middle-income countries, and well-resourced environments, and also report how centres across different countries can come together to collaborate and deliver high-quality care. Barriers to creating effective networks may arise from competition among programmes in unregulated and especially privatised health care environments. Nevertheless, reflecting on the creation of networks has important implications, because collaboration between different centres can facilitate the maintenance of sustainable programmes of paediatric and congenital cardiac care.
Conclusion:
This article examines the delivery of paediatric and congenital cardiac care in resource-limited environments, well-resourced environments, and within collaborative networks, with the hope that the lessons learned from these examples can be helpful to other institutions across the world. It is important to emphasise that irrespective of the differences in resources across different continents, the critical principles underlying provision of excellent care in different environments remain the same.
In June of 2024, Becton Dickinson experienced a blood culture bottle shortage for its BACTEC system, forcing health systems to reduce usage or risk exhausting their supply. Virginia Commonwealth University Health System (VCUHS) in Richmond, VA, decided that it was necessary to implement austerity measures to preserve the blood culture bottle supply.
Setting:
VCUHS includes a main campus in Richmond, VA, as well as two affiliate hospitals: Community Memorial Hospital (CMH) in South Hill, VA, and Tappahannock Hospital in Tappahannock, VA. It also includes a free-standing Emergency Department in New Kent, VA.
Patients:
Blood cultures from both pediatric and adult patients were included in this study.
Interventions:
VCUHS intervened to decrease blood culture utilization across the entire health system. Interventions included communication of blood culture guidance as well as an electronic health record order designed to guide providers and discourage wasteful ordering.
Results:
Post-implementation analyses showed that interventions reduced overall usage by 35.6% (P < .0001) and by greater than 40% in the Emergency Departments. The impact of these changes in utilization on positivity was analyzed; the overall positivity rate increased post-intervention from 8.8% to 12.1% (P = .0115), and in the ED specifically from 10.2% to 19.5% (P < .0001).
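To illustrate the kind of comparison behind the reported change in positivity, the sketch below runs a chi-square test on a 2×2 table. The study's actual denominators are not given in the abstract, so the counts are placeholders chosen only to approximate the reported percentages.

```python
from scipy.stats import chi2_contingency

# Hypothetical counts chosen only to illustrate the comparison.
#                     positive  negative
pre_intervention  = [88, 912]   # ~8.8% positivity
post_intervention = [78, 566]   # ~12.1% positivity

chi2, p, dof, expected = chi2_contingency([pre_intervention, post_intervention])
print(f"chi2 = {chi2:.2f}, p = {p:.4f}")
```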
Conclusions:
These findings strongly suggest that some basic stewardship interventions can significantly change blood culture practice in a manner that minimizes the impact on patient care.
Mitochondrial trifunctional protein deficiency is a long-chain fatty acid oxidation disorder that may include manifestations of severe cardiomyopathy and arrhythmias. The pathophysiology of the severe presentation is unclear, but such a presentation is an indicator of worse outcomes. Triheptanoin, a synthetic medium-chain triglyceride, has been reported to reverse cardiomyopathy in some individuals, but there is limited literature in severe cases. We describe a patient with neonatal onset of severe disease whose clinical course did not improve despite mechanical support and triheptanoin.
Nuclear strategy as a concept defies easy characterisation. It is a contradiction in terms: such is the destructive power of many nuclear weapons that to employ them would not bring any tangible benefit, especially if the adversary could threaten nuclear retaliation. They hold so little political appeal that since 1945 nuclear weapons have not been used in conflict and are therefore effectively unusable as weapons. Moreover, the idea of a war in which only nuclear weapons are used might exist in theory but would be a remote possibility. Instead, the practice of nuclear strategy has been dominated by ideas and plans in which nuclear weapons might be used in the course of a war alongside conventional weapons. Thus, rather than a concept of ‘nuclear strategy’, a more accurate formulation would be a ‘strategy with a nuclear component’. Yet there remained utility in thinking in terms of ‘nuclear strategy’, particularly in relation to deterrence. This chapter will explore these complex dynamics in several ways. First, it will examine the strategic ideas underpinning use of the atomic bomb at the end of the Second World War. Second, it will discuss the different strategies nuclear states have devised in relation to other nuclear states. Finally, the strategies of non-nuclear states and nuclear aspirants when confronted with nuclear adversaries will be analysed.
A version of the discrete proportional hazards model is developed for psychometrical applications. In such applications, a primary covariate that influences failure times is a latent variable representing a psychological construct. The Metropolis-Hastings algorithm is studied as a method for performing marginal likelihood inference on the item parameters. The model is illustrated with a real data example that relates the age at which teenagers first experience various substances to the latent ability to avoid the onset of such behaviors.
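Since the abstract names the Metropolis-Hastings algorithm, here is a minimal, generic random-walk sampler for a scalar parameter. It is a sketch of the algorithm only and does not implement the paper's discrete proportional hazards likelihood or its marginal-likelihood setup.

```python
import numpy as np

def metropolis_hastings(log_post, theta0, n_iter=5000, proposal_sd=0.5, seed=0):
    """Generic random-walk Metropolis-Hastings for a scalar parameter.
    `log_post` is any function returning the log posterior (up to a constant)."""
    rng = np.random.default_rng(seed)
    theta = theta0
    samples = np.empty(n_iter)
    for i in range(n_iter):
        proposal = theta + rng.normal(scale=proposal_sd)
        log_accept = log_post(proposal) - log_post(theta)
        if np.log(rng.uniform()) < log_accept:
            theta = proposal          # accept the proposal
        samples[i] = theta            # otherwise keep the current value
    return samples

# Toy target: standard normal log-density (illustration only)
draws = metropolis_hastings(lambda t: -0.5 * t**2, theta0=0.0)
print(draws.mean(), draws.std())
```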
OpenMx is free, full-featured, open-source structural equation modeling (SEM) software. OpenMx runs within the R statistical programming environment on Windows, Mac OS X, and Linux computers. The rationale for developing OpenMx is discussed along with the philosophy behind the user interface. The OpenMx data structures are introduced—these novel structures define the user interface framework and provide new opportunities for model specification. Two short example scripts for the specification and fitting of a confirmatory factor model are presented next. We end with an abbreviated list of modeling applications available in OpenMx 1.0 and a discussion of directions for future development.
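OpenMx itself is R software, so the following is deliberately not OpenMx code or its API. It is a language-neutral numpy/scipy sketch of the kind of task the example scripts address: fitting a one-factor confirmatory model by minimizing the maximum-likelihood discrepancy between the model-implied and sample covariance matrices.

```python
import numpy as np
from scipy.optimize import minimize

def fit_one_factor(S, n_obs):
    """Fit a one-factor model Sigma = lambda lambda' + diag(psi) to a sample
    covariance matrix S by minimizing the ML discrepancy
    F = log|Sigma| + tr(S Sigma^{-1}) - log|S| - p.
    Generic illustration only, not OpenMx's interface."""
    p = S.shape[0]

    def discrepancy(params):
        lam = params[:p]
        psi = np.exp(params[p:])               # keep residual variances positive
        Sigma = np.outer(lam, lam) + np.diag(psi)
        logdet_sigma = np.linalg.slogdet(Sigma)[1]
        logdet_s = np.linalg.slogdet(S)[1]
        return logdet_sigma + np.trace(S @ np.linalg.inv(Sigma)) - logdet_s - p

    start = np.concatenate([np.full(p, 0.5), np.log(np.diag(S) / 2)])
    res = minimize(discrepancy, start, method="L-BFGS-B")
    loadings = res.x[:p]
    residual_vars = np.exp(res.x[p:])
    chi_square = res.fun * (n_obs - 1)          # standard ML chi-square statistic
    return loadings, residual_vars, chi_square

# Toy four-variable covariance matrix (illustrative only)
S = np.array([[1.0, 0.5, 0.4, 0.4],
              [0.5, 1.0, 0.5, 0.4],
              [0.4, 0.5, 1.0, 0.5],
              [0.4, 0.4, 0.5, 1.0]])
print(fit_one_factor(S, n_obs=500))
```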
Racial and ethnic variations in antibiotic utilization are well-reported in outpatient settings but little is known about inpatient settings. Our objective was to describe national inpatient antibiotic utilization among children by race and ethnicity.
Methods:
This study included hospital visit data from the Pediatric Health Information System between 01/01/2022 and 12/31/2022 for patients <20 years. Primary outcomes were the percentage of hospitalization encounters that received an antibiotic and antibiotic days of therapy (DOT) per 1000 patient days. Mixed-effect regression models were used to determine the association of race-ethnicity with outcomes, adjusting for covariates.
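For readers unfamiliar with the second outcome, the sketch below shows how days of therapy per 1000 patient days is conventionally computed. The variable names and numbers are illustrative and are not drawn from the Pediatric Health Information System.

```python
def dot_per_1000_patient_days(total_days_of_therapy: float, total_patient_days: float) -> float:
    """Antibiotic days of therapy (DOT) per 1000 patient days.
    Each calendar day a patient receives a given antibiotic counts as one DOT,
    summed over all antibiotics and all patients."""
    return 1000.0 * total_days_of_therapy / total_patient_days

# Illustrative numbers only: 4500 total DOT over 9000 patient days
print(dot_per_1000_patient_days(4500, 9000))  # 500.0 DOT per 1000 patient days
```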
Results:
There were 846,530 hospitalizations. Of these, 45.2% of children were Non-Hispanic (NH) White, 27.1% were Hispanic, 19.2% were NH Black, 4.5% were NH Other, 3.5% were NH Asian, 0.3% were NH Native Hawaiian/Other Pacific Islander (NHPI), and 0.2% were NH American Indian. Adjusting for covariates, NH Black children had lower odds of receiving antibiotics compared to NH White children (aOR 0.96, 95% CI 0.94–0.97), while NH NHPI children had higher odds of receiving antibiotics (aOR 1.16, 95% CI 1.05–1.29). Children who were Hispanic, NH Asian, NH American Indian, or NH Other received fewer antibiotic DOT compared to NH White children, while NH NHPI children received more antibiotic DOT.
Conclusions:
Antibiotic utilization in children’s hospitals differs by race and ethnicity. Hospitals should assess policies and practices that may contribute to disparities in treatment; antibiotic stewardship programs may play an important role in promoting inpatient pharmacoequity. Additional research is needed to examine individual diagnoses, clinical outcomes, and drivers of variation.
Nuclear deterrence strategies are predicated on nuclear use scenarios. However, as nuclear weapons haven’t been used since 1945, why does use occur in scenarios but not in practice? If scenarios incorporated the political challenges of crossing the nuclear threshold, how would this change the utility of the deterrence strategies they support? To address these questions, this article examines Cold War-era American debates about a Soviet ‘first strike’, discusses the limits of technical critiques of nuclear use scenarios, and argues for an alternative approach to scenario design and criticism that includes political factors observable in crises and wars involving nuclear states.
Congress directed the Secretary of Defense to conduct a Pilot program to increase the National Disaster Medical System's (NDMS) surge capacity, capabilities, and interoperability to support patient movement during a large-scale overseas contingency operation.
Methods
The Pilot conducted a mixed methods exploratory study, the Military-Civilian NDMS Interoperability Study (MCNIS), identifying 55 solution areas for NDMS innovation that align with interagency stakeholder interests. Priorities were determined via facilitated discussions and were refined and validated by all five Pilot sites.
Results
As the Department of Defense (DoD) provides essential support for the patient movement component within NDMS, the results highlighted areas for improvement between receiving patients at an airfield and moving them to NDMS definitive care partners during a large medical surge event. These include patient tracking capabilities, transportation processes, and patient placement.
Conclusions
In collaboration with the Departments of Health & Human Services, Homeland Security, and Transportation, and with the Veterans Health Administration, the Pilot is addressing these areas for improvement by executing site-specific projects that will be validated and identified for export across the system. Leaders across the Pilot site healthcare networks are working to enhance patient movement and tracking. Ultimately, the Pilot will deliver dozens of proven solutions to enhance the NDMS's patient movement capabilities.
Artificial intelligence (AI) in emergency medicine has been increasingly studied over the past decade. However, the implementation of AI requires significant buy-in from end-users. This study explored desired clinical applications of AI by emergency physicians.
Methods
A 3-round Delphi process was undertaken using STAT59 software. An international expert panel was assembled through purposeful sampling to reflect a diversity in geography, age, time in practice, practice setting, role, and expertise. Items generated in Round 1 were collated by the study team and ranked in Rounds 2 and 3 on a 7-point linear numeric scale of importance. Consensus was defined as a standard deviation of 1.0 or less.
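The consensus rule described above is simple to operationalise. The sketch below applies it to hypothetical panel ratings and assumes the sample standard deviation is used, which the abstract does not specify.

```python
import statistics

def reaches_consensus(ratings, sd_threshold=1.0):
    """Consensus per the Methods: standard deviation of the panel's
    7-point importance ratings is 1.0 or less (sample SD assumed here)."""
    return statistics.stdev(ratings) <= sd_threshold

# Hypothetical panel ratings for two statements (illustrative only)
print(reaches_consensus([6, 7, 6, 6, 7, 6, 7, 6]))  # tight ratings -> consensus
print(reaches_consensus([2, 7, 4, 6, 3, 7, 5, 2]))  # spread ratings -> no consensus
```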
Results
Of 66 invited experts, 29 completed Round 1, 25 completed Round 2, and 23 completed Round 3. Three statements reached consensus in Round 2 and four statements reached consensus in Round 3, including safe prescribing, guiding choice of drug, adjusting drug doses, identifying risk or prognosis, and reporting/interpreting investigation results.
Conclusions
Many desired clinical applications of AI in emergency medicine have not yet been explored. Clinical and technological experts should co-create new applications to ensure buy-in from all stakeholders. Specialty organizations can lead the way by establishing local clinical priorities.
The release of ChatGPT in November 2022 drastically lowered the barrier to artificial intelligence with an intuitive web-based interface to a large language model. This study addressed the research problem: “Can ChatGPT adequately triage simulated disaster patients using the Simple Triage and Rapid Treatment (START) tool?”
Methods
Five trained disaster medicine physicians developed nine prompts. A Python script queried ChatGPT Version 4 with each prompt combined with 391 validated patient vignettes. Ten repetitions of each combination were performed, for a total of 35,190 simulated triages.
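The abstract describes a scripted querying workflow; the sketch below shows how such a loop could be driven with the OpenAI Python SDK. The prompts, vignettes, model name, and output handling are placeholders, not the study's actual script.

```python
from itertools import product
from openai import OpenAI  # OpenAI Python SDK (v1.x)

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

prompts = ["<prompt 1>", "<prompt 2>"]       # the study used nine prompts (not reproduced here)
vignettes = ["<patient vignette 1>"]         # the study used 391 validated vignettes
N_REPEATS = 10                               # ten repetitions per prompt/vignette combination

results = []
for prompt, vignette in product(prompts, vignettes):
    for _ in range(N_REPEATS):
        response = client.chat.completions.create(
            model="gpt-4",                   # placeholder for "ChatGPT Version 4"
            messages=[{"role": "user", "content": f"{prompt}\n\n{vignette}"}],
        )
        results.append(response.choices[0].message.content)
```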
Results
A valid START score was returned in 35,102 queries (99.7%). There was considerable variability in the results. Repeatability (use of the same prompt repeatedly) was responsible for 14.0% of overall variation. Reproducibility (use of different prompts) was responsible for 4.1% of overall variation. Accuracy of ChatGPT for START was 61.4%, with a 5.0% under-triage rate and a 33.6% over-triage rate. Accuracy varied by prompt between 45.8% and 68.6%.
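As a companion to the reported metrics, the sketch below shows one way accuracy and under-/over-triage rates can be tallied against reference START categories. The acuity ordering is an assumption for illustration, and how Expectant/Deceased assignments are scored is study-specific and not addressed here.

```python
# Acuity ordering is an illustrative assumption, not the study's scoring convention.
ACUITY = {"Minor": 0, "Delayed": 1, "Immediate": 2}

def triage_metrics(predicted, reference):
    correct = under = over = 0
    for pred, ref in zip(predicted, reference):
        if pred == ref:
            correct += 1
        elif ACUITY[pred] < ACUITY[ref]:
            under += 1   # assigned lower acuity than the reference
        else:
            over += 1    # assigned higher acuity than the reference
    n = len(reference)
    return correct / n, under / n, over / n

# Toy example with illustrative labels only
acc, under, over = triage_metrics(
    ["Minor", "Immediate", "Delayed", "Delayed"],
    ["Minor", "Delayed", "Delayed", "Immediate"],
)
print(f"accuracy {acc:.0%}, under-triage {under:.0%}, over-triage {over:.0%}")
```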
Conclusions
This study suggests that the current ChatGPT large language model is not sufficient for triage of simulated patients using START due to poor repeatability and accuracy. Medical practitioners should be aware that while ChatGPT can be a valuable tool, it may lack consistency and may provide false information.
While many medical practitioners value the interactive nature of in-person conferences, results of these interactions are often poorly documented. The objective of this study was to pilot the Delphi method for developing consensus following a national conference and to compare the results between experts who did and did not attend.
Methods:
A 3-round Delphi included experts attending the 2023 Society of Disaster Medicine and Health Preparedness Annual Meeting and experts who were members of the society but did not attend. Conference speakers provided statements related to their presentations. Experts rated the statements on a 1–7 scale for agreement using STAT59 software (STAT59 Services Ltd, Edmonton, Alberta, Canada). Consensus was defined as a standard deviation of ≤ 1.0.
Results:
Seventy-five statements were rated by 27 experts who attended and 10 who did not: 2634 ratings in total. There was no difference in the number of statements reaching consensus in the attending group (26/75) versus that of the nonattending group (27/75) (P = 0.89). However, which statements reached consensus differed between the groups.
Conclusion:
The Delphi method is a viable method to document consensus from a conference. Advantages include the ability to involve large groups of experts, statistical measurement of the degree of consensus, and prioritization of the results.
We conducted a series of experiments that revealed the formation of mm-scale penitente structures in ice illuminated by broadband light under moderate vacuum conditions between 50 and 2000 Pa. The experimental apparatus consists of a 0.3 m diameter cylindrical vacuum chamber with a cooling jacket surrounding the outer radius and bottom surface. Light shines in through an optical window at the top to illuminate most of the ice surface. We observe penitente-like structures at temperatures between $-15^\circ$C and $-2^\circ$C and pressures close to the equilibrium vapor pressure at the ice surface temperature. The formation of these structures is very sensitive to slight changes in background pressure, and the structures tend to vanish with significant deviations away from the equilibrium curve, resulting in a smooth sublimated crater formation instead of penitentes. Application of the physical model by Claudin and others (2015, doi: 10.1103/PhysRevE.92.033015) at experimental conditions generally agrees with observations for penitente spacing.
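To connect the stated 50–2000 Pa operating range with "pressures close to the equilibrium vapor pressure", the sketch below evaluates a Magnus-type approximation for saturation vapor pressure over ice at the reported temperatures. The experiment may rely on a different parameterization, so these numbers are indicative only.

```python
import numpy as np

def saturation_vapor_pressure_ice(t_celsius):
    """Magnus-type approximation for saturation vapor pressure over ice (Pa).
    Constants follow a common WMO-style fit; treat the result as indicative."""
    return 611.2 * np.exp(22.46 * t_celsius / (272.62 + t_celsius))

for t in (-15, -10, -5, -2):
    print(f"{t:>4} degC : {saturation_vapor_pressure_ice(t):6.0f} Pa")
# Roughly 165 Pa at -15 degC and about 520 Pa at -2 degC, i.e. within the
# 50-2000 Pa range over which the penitente-like structures were observed.
```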
Bicuspid aortic valve is the most common CHD and is commonly associated with activity restrictions that may lead to a sedentary lifestyle known to increase obesity risk. It is unknown whether obesity is associated with changes in aortic dimensions or aortic valve function in young people with bicuspid aortic valve. This study investigates whether overweight and obese children with bicuspid aortic valve have worse aortic valve function or increased aortic dimensions compared to healthy weight children with bicuspid aortic valve.
Methods:
This was a single-centre retrospective cohort study comprising patients 5 to 25 years old with a diagnosis of bicuspid aortic valve between 1 January 2019 and 31 December 2020. Patients were classified as healthy weight or overweight/obese. Values for aortic dimensions as well as peak and mean aortic valve gradients were obtained from echocardiogram reports.
Results:
A total of 251 patients were analysed. Demographics were similar between groups. When indexed to height, the aortic valve annulus (1.28 ± 0.14 vs. 1.34 ± 0.15, p = 0.001) and sinotubular junction (1.44 ± 0.21 vs. 1.49 ± 0.24, p = 0.038) were larger in the overweight/obese group, with no differences in aortic root or ascending aorta sizes. The overweight/obese group had a higher peak aortic valve gradient (23.03 ± 1.64 mmHg vs. 16.17 ± 1.55 mmHg, p = 0.003) compared to the healthy weight group.
Conclusion:
Healthy weight patients did not have larger aortic dimensions compared to the overweight/obese patients. There was evidence of worsening aortic valve stenosis in overweight/obese patients compared to those at a healthy weight.