Hand, foot and mouth disease (HFMD) is a highly contagious disease with a high incidence in children aged under 10 years. It is mainly self-limiting but can cause serious neurological or cardiopulmonary complications in some cases, which can lead to death. Little is known about the burden of HFMD on primary care services in the UK. The aim of this work was to describe trends in general practitioner (GP) consultations for HFMD in England from January 2017 to December 2022 using a syndromic surveillance network of GPs. Daily GP consultations for HFMD in England were extracted from 1 January 2017 to 31 December 2022. Mean weekly consultation rates per 100,000 population and 95% confidence intervals (CI) were calculated. Consultation rates and rate ratios (RR) were calculated by age group and sex. During the study period, the mean weekly consultation rate for HFMD (per 100,000 registered GP patients) was 1.53 (range 0.27 to 2.47). In England, children aged 1–4 years accounted for the largest affected population, followed by children aged <1 year. We observed a seasonal pattern of HFMD incidence during the non-COVID years, with a peak in mean weekly rates between September and December. HFMD is typically diagnosed clinically rather than through laboratory sampling; the ability to examine daily HFMD consultation rates therefore provides an excellent epidemiological overview of disease trends. The use of a novel GP in-hours surveillance system allowed a unique epidemiological insight into recent trends in general practitioner consultations for HFMD. We demonstrate a male predominance of cases, the impact of non-pharmaceutical interventions during the COVID-19 pandemic, and a post-pandemic shift in the week in which the peak number of cases occurs.
To maximize its value, the design, development and implementation of structural health monitoring (SHM) should focus on its role in facilitating decision support. In this position paper, we offer perspectives on the synergy between SHM and decision-making. We propose a classification of SHM use cases aligning with various dimensions that are closely linked to the respective decision contexts. The types of decisions that have to be supported by the SHM system within these settings are discussed along with the corresponding challenges. We provide an overview of different classes of models that are required for integrating SHM in the decision-making process to support the operation and maintenance of structures and infrastructure systems. Fundamental decision-theoretic principles and state-of-the-art methods for optimizing maintenance and operational decision-making under uncertainty are briefly discussed. Finally, we offer a viewpoint on the appropriate course of action for quantifying, validating, and maximizing the added value generated by SHM. This work aspires to synthesize the different perspectives of the SHM, Prognostic Health Management, and reliability communities, and provide directions to researchers and practitioners working towards more pervasive monitoring-based decision-support.
The Accelerating COVID-19 Therapeutic Interventions and Vaccines Therapeutic-Clinical Working Group members gathered critical recommendations in follow-up to lessons learned manuscripts released earlier in the COVID-19 pandemic. Lessons around agent prioritization, preclinical therapeutics testing, master protocol design and implementation, drug manufacturing and supply, data sharing, and public–private partnership value are shared to inform responses to future pandemics.
Desorption processes of low-molecular-weight compounds from the surface of smectites into the gas phase determine a number of processes, e.g. those involved in drug delivery and the release of herbicides. Desorption has not been investigated thoroughly and is not well understood. The present study was undertaken to better understand the factors influencing these desorption mechanisms. Starting with a very pure standard (Na+-rich) montmorillonite (Kunipia-F), which was exchanged with cations of differing hydration properties (Ca2+, Li+, phenyltrimethylammonium, hexyltrimethylammonium), the experiments explored the rate of desorption of volatiles with different chemical functionalities (water, ethanol, ethyl acetate, and toluene). Desorption was monitored by thermogravimetry and differential scanning calorimetry under isothermal conditions and by ramping the temperature at a constant rate. The experiments were compared with numerical calculations based on finite-element methods and with analytical models. These data point to a two-step mechanism in which desorption follows the curve of the equilibrium desorption isotherms of those molecules on the montmorillonite. The bulk-like volatiles (i.e. volatiles with release kinetics close to those of the bulk liquids) were desorbed in a first step. As the degree of coverage of the volatile on the montmorillonite decreased, desorption was increasingly dominated by the strength of interaction between the volatile and the interlayer cations of the montmorillonite.
We recently reported on the radio-frequency attenuation length of cold polar ice at Summit Station, Greenland, based on bi-static radar measurements of radio-frequency bedrock echo strengths taken during the summer of 2021. Those data also allow studies of (a) the relative contributions of coherent (such as discrete internal conducting layers with sub-centimeter transverse scale) vs incoherent (e.g. bulk volumetric) scattering, (b) the magnitude of internal layer reflection coefficients, (c) limits on signal propagation velocity asymmetries (‘birefringence’) and (d) limits on signal dispersion in ice over a bandwidth of ~100 MHz. We find that (1) attenuation lengths approach 1 km in our band, (2) after averaging 10 000 echo triggers, reflected signals observable over the thermal floor (to depths of ~1500 m) are consistent with being entirely coherent, (3) internal layer reflectivities are ≈ −60 to −70 dB, (4) birefringent effects for vertically propagating signals are smaller by an order of magnitude relative to South Pole and (5) within our experimental limits, glacial ice is non-dispersive over the frequency band relevant for neutrino detection experiments.
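The role of averaging 10 000 triggers can be illustrated with a generic sketch (this is not the experiment's actual analysis code): coherently averaging N waveforms suppresses incoherent thermal noise by roughly sqrt(N), which is how a weak, phase-stable layer echo can be pulled out of the thermal floor. The signal shape and amplitudes below are arbitrary illustrative values.

```python
import numpy as np

# Coherent averaging demo: a weak, repeatable echo buried in unit-RMS
# thermal noise. Averaging N triggers suppresses the noise by ~sqrt(N).
rng = np.random.default_rng(0)
n_triggers, n_samples = 10_000, 256
signal = 0.05 * np.sin(2 * np.pi * np.arange(n_samples) / 32)  # weak coherent echo

waveforms = signal + rng.normal(0.0, 1.0, size=(n_triggers, n_samples))
averaged = waveforms.mean(axis=0)

noise_rms_single = 1.0                       # RMS of a single trigger's noise
noise_rms_avg = (averaged - signal).std()    # residual noise after averaging
print(f"noise suppression: {noise_rms_single / noise_rms_avg:.0f}x "
      f"(sqrt(N) = {n_triggers ** 0.5:.0f})")
```

With N = 10 000 the suppression is about 100x (40 dB in power), consistent with coherent echoes emerging well above the single-trigger thermal floor.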
In June 2019 the Health Protection Team in Yorkshire and Humber, England, was notified of cases of hepatitis A virus (HAV) infection in staff at a secondary school. Investigation revealed that an earlier case worked as a food handler in the school kitchen. Indirect transmission through food from the canteen was considered the most likely route of transmission. Cases were described according to setting of exposure. Oral fluid was obtained from students for serological testing. Environmental investigations were undertaken at settings where food handling was considered a potential transmission risk. Thirty-three confirmed cases were linked to the outbreak. All of those tested (n = 31) shared the same HAV genotype IB sequence. The first three cases were a household cluster and included the index case for the school. A further 19 cases (16 students, 3 staff) were associated with the school and consistent with indirect exposure to the food handler. One late onset case could not be ruled out as a secondary case within the school and resulted in vaccination of the school population. Five cases were linked to a bakery where a case from the initial household cluster worked as a food server. No concerns about hygiene standards were noted at either the school or the bakery. Oral fluid samples taken at the time of vaccination from asymptomatic students (n = 219, 11–16 years old) showed no evidence of recent or current infection. This outbreak included household and foodborne transmission but limited (and possibly zero) person-to-person transmission among secondary school students. Where adequate hygiene exists, secondary transmission among older students may not occur.
Laws regulating patient care are an essential component of protecting patients and doctors alike. No studies have previously examined what laws exist regarding pelvic examinations in the United States (US). This study systematically reviews and compares regulation and legislation of pelvic examinations in the US and provides a comprehensive resource to educate clinicians, patients, and lawmakers. Each of the fifty US states was included. The primary outcome was the existence of any pelvic or rectal exam laws. Data were obtained on the type of examination defined within the law, exceptions to the law, to whom the law applied, the type of consent required, and to whom the consent applied. Laws were identified from each of the individual state legislative websites. All sections of each law pertaining to pelvic examination were reviewed and organized by state. Descriptive statistics were performed for each of the variables, including frequencies of each amongst the fifty states. State regulation of pelvic examinations varied from no law or regulation to laws pertaining to pelvic, rectal, prostate, and breast examinations performed in any context. As of November 22, 2022, twenty states (40%) have pelvic examination laws applying to anesthetized or unconscious patients. Thirteen additional states (26%) have proposed pelvic exam laws. Seventeen states (34%) do not have any laws regarding pelvic examinations. Regulation of pelvic examinations has become an increasingly important issue over the past few years in response to growing concerns about patient autonomy and the ethical issues raised by such sensitive examinations.
While pelvic examination laws that balance protection for patient autonomy and the needs of caregivers and educators exist in much of the U.S., more work needs to continue in consultation with physicians and health care providers to ensure that all states have reasonable laws protecting the autonomy of patients while also maintaining quality of care.
Over the last 25 years, radiowave detection of neutrino-generated signals, using cold polar ice as the neutrino target, has emerged as perhaps the most promising technique for detection of extragalactic ultra-high energy neutrinos (corresponding to neutrino energies in excess of 0.01 Joules, or $10^{17}$ electron volts). During the summer of 2021 and in tandem with the initial deployment of the Radio Neutrino Observatory in Greenland (RNO-G), we conducted radioglaciological measurements at Summit Station, Greenland to refine our understanding of the ice target. We report the result of one such measurement, the radio-frequency electric field attenuation length $L_\alpha$. We find an approximately linear dependence of $L_\alpha$ on frequency with the best fit of the average field attenuation for the upper 1500 m of ice: $\langle L_\alpha \rangle = ( ( 1154 \pm 121) - ( 0.81 \pm 0.14) \, ( \nu /{\rm MHz}) ) \,{\rm m}$ for frequencies ν ∈ [145, 350] MHz.
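The fitted central values can be evaluated directly to see how the attenuation length varies across the quoted band; a minimal sketch, using only the central fit values and ignoring the quoted uncertainties:

```python
# Evaluate the reported linear fit for the depth-averaged field attenuation
# length at Summit Station: <L_alpha>(nu) = 1154 m - 0.81 (m/MHz) * nu,
# central values only, valid for nu in [145, 350] MHz.

def attenuation_length_m(freq_mhz: float) -> float:
    """Central-value attenuation length in metres from the reported fit."""
    if not 145 <= freq_mhz <= 350:
        raise ValueError("fit reported only for 145-350 MHz")
    return 1154.0 - 0.81 * freq_mhz

for nu in (145, 200, 350):
    print(f"{nu} MHz: {attenuation_length_m(nu):.1f} m")
```

Across the band the central values run from roughly 1037 m at 145 MHz down to about 870 m at 350 MHz, consistent with the statement that attenuation lengths approach 1 km.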
Herbicide-resistant (HR) crops are widely grown throughout the United States and Canada. These crop-trait technologies can enhance weed management and therefore can be an important component of integrated weed management (IWM) programs. Concomitantly, evolution of HR weed populations has become ubiquitous in agricultural areas where HR crops are grown. Nevertheless, crop cultivars with new or combined (stacked) HR traits continue to be developed and commercialized. This review, based on a symposium held at the Western Society of Weed Science annual meeting in 2021, examines the impact of HR crops on HR weed management in the U.S. Great Plains, U.S. Pacific Northwest, and the Canadian Prairies over the past 25 yr and their past and future contributions to IWM. We also provide an industry perspective on the future of HR crop development and the role of HR crops in resistance management. Expanded options for HR traits in both major and minor crops are expected. With proper stewardship, HR crops can reduce herbicide-use intensity and help reduce selection pressure on weed populations. However, their proper deployment in cropping systems must be carefully planned by considering a diverse crop rotation sequence with multiple HR and non-HR crops and maximizing crop competition to effectively manage HR weed populations. Based on past experiences in the cultivation of HR crops and associated herbicide use in the western United States and Canada, HR crops have been important determinants of both the selection and management of HR weeds.
Excited delirium, which has been defined as combativeness, agitation, and altered sensorium, requires immediate treatment in prehospital or emergency department (ED) settings for the safety of both patients and caregivers. Prehospital ketamine use is prevalent, although the evidence on safety and efficacy is limited. Many patients with excited delirium are intoxicated with illicit substances. This investigation explores whether patients treated with prehospital ketamine for excited delirium with concomitant substance intoxication have higher rates of subsequent intubation in the ED compared to those without confirmed substance usage.
Methods:
Over 28 months at two large community hospitals, medical records were retrospectively searched for all patients aged 18 years or older who received prehospital intramuscular (IM) ketamine for excited delirium, and illicit and prescription substance co-ingestions were identified. Trained abstractors collected demographic characteristics, history of present illness (HPI), urine drug screens (UDS), alcohol levels, and additional sedative administrations. Substance intoxication was determined by UDS and alcohol positivity or negativity, as well as physician HPI. Patients without toxicological testing or documentation of substance intoxication, or who may have tested positive due to ED sedation, were excluded from relevant analyses. Subsequent ED intubation was the primary pre-specified outcome. Odds ratios (OR) and 95% confidence intervals (CI) were calculated to compare variables.
Results:
Among 86 patients given prehospital ketamine IM for excited delirium, baseline characteristics including age, ketamine dose, and body mass index were similar between those who did or did not undergo intubation. Men had higher intubation rates. Patients testing positive for alcohol, amphetamines, barbiturates, benzodiazepines, ecstasy, marijuana, opiates, and synthetic cathinones, both bath salts and flakka, had similar rates of intubation compared to those negative for these substances. Of 27 patients with excited delirium and concomitant cocaine intoxication, nine (33%) were intubated compared with four of 50 (8%) without cocaine intoxication, yielding an OR of 5.75 (95% CI, 1.57 to 21.05; P = .009).
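The reported odds ratio can be reproduced from the 2×2 counts stated above. The sketch below assumes a standard Wald interval on log(OR); the abstract does not specify which interval method the authors used.

```python
import math

# 2x2 counts from the abstract:
#   cocaine-positive: 9 intubated of 27; cocaine-negative: 4 intubated of 50.
a, b = 9, 27 - 9      # intubated / not intubated, cocaine-positive
c, d = 4, 50 - 4      # intubated / not intubated, cocaine-negative

odds_ratio = (a * d) / (b * c)
# Wald 95% CI on the log odds ratio (assumed method, not stated in abstract)
se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)
lo = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
hi = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)
print(f"OR = {odds_ratio:.2f} (95% CI {lo:.2f} to {hi:.2f})")
```

This reproduces the reported OR of 5.75 with a 95% CI of 1.57 to 21.05, suggesting the authors used the same log-OR construction.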
Conclusion:
Patients treated with IM ketamine for excited delirium with concomitant cocaine intoxication had a statistically significant 5.75-fold increased rate of subsequent intubation in the ED. No such trend with intubation was noted for the other substances examined, but further study is warranted.
Substantial progress has been made in the standardization of nomenclature for paediatric and congenital cardiac care. In 1936, Maude Abbott published her Atlas of Congenital Cardiac Disease, which was the first formal attempt to classify congenital heart disease. The International Paediatric and Congenital Cardiac Code (IPCCC) is now utilized worldwide and has most recently become the paediatric and congenital cardiac component of the Eleventh Revision of the International Classification of Diseases (ICD-11). The most recent publication of the IPCCC was in 2017. This manuscript provides an updated 2021 version of the IPCCC.
The International Society for Nomenclature of Paediatric and Congenital Heart Disease (ISNPCHD), in collaboration with the World Health Organization (WHO), developed the paediatric and congenital cardiac nomenclature that is now within the eleventh revision of the International Classification of Diseases (ICD-11). This unification of the IPCCC and ICD-11 is the IPCCC ICD-11 Nomenclature and marks the first time that the clinical and administrative nomenclatures for paediatric and congenital cardiac care have been harmonized. The congenital cardiac component of ICD-11 grew from 29 congenital cardiac codes in ICD-9 and 73 in ICD-10 to 318 codes submitted by ISNPCHD through 2018 for incorporation into ICD-11. After these 318 terms were incorporated into ICD-11 in 2018, the WHO ICD-11 team added a further 49 terms, some of which are acceptable legacy terms from ICD-10, while others provide greater granularity than the ISNPCHD originally thought acceptable. Thus, the total number of paediatric and congenital cardiac terms in ICD-11 is 367. In this manuscript, we describe and review the terminology, hierarchy, and definitions of the IPCCC ICD-11 Nomenclature. This article therefore presents a global system of nomenclature for paediatric and congenital cardiac care that unifies clinical and administrative nomenclature.
The members of ISNPCHD realize that the nomenclature published in this manuscript will continue to evolve. The version of the IPCCC that was published in 2017 has evolved and changed, and it is now replaced by this 2021 version. In the future, ISNPCHD will again publish updated versions of IPCCC, as IPCCC continues to evolve.
The SPARC tokamak is a critical next step towards commercial fusion energy. SPARC is designed as a high-field ($B_0 = 12.2$ T), compact ($R_0 = 1.85$ m, $a = 0.57$ m), superconducting, D-T tokamak with the goal of producing fusion gain $Q>2$ from a magnetically confined fusion plasma for the first time. Currently under design, SPARC will continue the high-field path of the Alcator series of tokamaks, utilizing new magnets based on rare earth barium copper oxide high-temperature superconductors to achieve high performance in a compact device. The goal of $Q>2$ is achievable with conservative physics assumptions ($H_{98,y2} = 0.7$) and, with the nominal assumption of $H_{98,y2} = 1$, SPARC is projected to attain $Q \approx 11$ and $P_{\textrm {fusion}} \approx 140$ MW. SPARC will therefore constitute a unique platform for burning plasma physics research with high density ($\langle n_{e} \rangle \approx 3 \times 10^{20}\ \textrm {m}^{-3}$), high temperature ($\langle T_e \rangle \approx 7$ keV) and high power density ($P_{\textrm {fusion}}/V_{\textrm {plasma}} \approx 7\ \textrm {MW}\,\textrm {m}^{-3}$) relevant to fusion power plants. SPARC's place in the path to commercial fusion energy, its parameters and the current status of SPARC design work are presented. This work also describes the basis for global performance projections and summarizes some of the physics analysis that is presented in greater detail in the companion articles of this collection.
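The quoted power density can be cross-checked against the machine parameters with a back-of-envelope torus-volume estimate. The elongation value below (kappa = 1.7) is an assumed illustrative number, not stated in the abstract; for an elongated plasma, the volume is roughly V ≈ 2π²R₀a²κ.

```python
import math

# Rough consistency check of P_fusion / V_plasma ~ 7 MW/m^3 using the
# abstract's R0 = 1.85 m, a = 0.57 m, P_fusion ~ 140 MW.
# kappa (plasma elongation) is an ASSUMED illustrative value.
R0, a, kappa = 1.85, 0.57, 1.7
P_fusion_MW = 140.0

volume_m3 = 2 * math.pi**2 * R0 * a**2 * kappa  # elongated-torus approximation
power_density = P_fusion_MW / volume_m3
print(f"V ~ {volume_m3:.1f} m^3, P/V ~ {power_density:.1f} MW/m^3")
```

Under this assumption the plasma volume comes out near 20 m³ and the power density near 7 MW/m³, consistent with the figures quoted in the abstract.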
Owing to its high magnetic field, high power, and compact size, the SPARC experiment will operate with divertor conditions at or above those expected in reactor-class tokamaks. Power exhaust at this scale remains one of the key challenges for practical fusion energy. Based on empirical scalings, the peak unmitigated divertor parallel heat flux is projected to be greater than 10 GW m$^{-2}$. This is nearly an order of magnitude higher than has been demonstrated to date. Furthermore, the divertor parallel Edge-Localized Mode (ELM) energy fluence projections (~11–34 MJ m$^{-2}$) are comparable with those for ITER. However, the relatively short pulse length (~25 s pulse, with a ~10 s flat top) provides the opportunity to consider mitigation schemes unsuited to long-pulse devices including ITER and reactors. The baseline scenario for SPARC employs a ~1 Hz strike point sweep to spread the heat flux over a large divertor target surface area to keep tile surface temperatures within tolerable levels without the use of active divertor cooling systems. In addition, SPARC operation presents a unique opportunity to study divertor heat exhaust mitigation at reactor-level plasma densities and power fluxes. Not only will SPARC test the limits of current experimental scalings and serve for benchmarking theoretical models in reactor regimes, it is also being designed to enable the assessment of long-legged and X-point target advanced divertor magnetic configurations. Experimental results from SPARC will be crucial to reducing risk for a fusion pilot plant divertor design.
Prehospital intramuscular (IM) ketamine is increasingly used for chemical restraint of agitated patients. However, few studies have assessed emergency department (ED) follow-up of patients receiving prehospital ketamine for this indication, with previous reports suggesting a high rate of post-administration intubation. This study examines the rate of and reasons for intubation and other airway interventions in agitated patients who received ketamine by Emergency Medical Services (EMS).
Methods:
This retrospective cohort study included patients who received prehospital ketamine for agitation and were transported to two community hospital EDs. Charts were reviewed for demographics, ketamine dose, and airway intervention by EMS or in the ED. Characteristics of patients who were intubated versus those who did not receive airway intervention were analyzed.
Results:
Over 28 months, 86 patients received ketamine for agitation. Fourteen (16.3%) underwent endotracheal intubation. Patients with a higher temperature and a lower Glasgow Coma Score (GCS) were more likely to require intubation. There was no age- or dose-dependent association with intubation rate. Intubated patients averaged 39 years old versus 44 for patients not intubated (difference −5 years; 95% CI, −16 to 6). The mean ketamine dose was 339.3 mg in intubated patients versus 350.7 mg in those not intubated (difference −11.4 mg; 95% CI, −72.4 to 49.6). The mean weight-based ketamine dose was 4.44 mg/kg in intubated patients versus 4.96 mg/kg in those not intubated (difference −0.53 mg/kg; 95% CI, −1.49 to 0.43).
Conclusions:
The observed rate of intubation in patients receiving prehospital ketamine for agitation was 16.3%. Study data did not reveal an age- or dose-dependent effect on the rate of intubation. Further research should compare airway intervention rates in agitated patients receiving ketamine versus other sedatives in a controlled fashion.
We report key learning from the public health management of the first two confirmed cases of COVID-19 identified in the UK. The first case was imported, and the second was associated with probable person-to-person transmission within the UK. Contact tracing was complex and fast-moving. Potential exposures for both cases were reviewed, and 52 contacts were identified. No further confirmed COVID-19 cases have been linked epidemiologically to these two cases. As steps are made to enhance contact tracing across the UK, the lessons learned from earlier contact tracing during the country's containment phase are particularly important and timely.
Integration of depression treatment into primary care could improve patient outcomes in low-resource settings. Losses along the depression care cascade limit integrated service effectiveness. This study identified patient-level factors that predicted detection of depressive symptoms by nurses, referral for depression treatment, and uptake of counseling, as part of integrated care in KwaZulu-Natal, South Africa.
Methods
This was an analysis of baseline data from a prospective cohort. Participants were adult patients with at least moderate depressive symptoms at primary care facilities in Amajuba, KwaZulu-Natal, South Africa. Participants were screened for depressive symptoms prior to routine assessment by a nurse. Generalized linear mixed-effects models were used to estimate associations between patient characteristics and service delivery outcomes.
Results
Data from 412 participants were analyzed. Nurses successfully detected depressive symptoms in 208 [50.5%, 95% confidence interval (CI) 38.9–62.0] participants; of these, they referred 76 (36.5%, 95% CI 20.3–56.5) for depression treatment; of these, 18 (23.7%, 95% CI 10.7–44.6) attended at least one session of depression counseling. Depressive symptom severity, alcohol use severity, and perceived stress were associated with detection. Similar factors did not drive referral or counseling uptake.
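The care-cascade percentages above can be reproduced from the counts, with each stage's percentage conditional on the previous stage (detection among all screened; referral among detected; uptake among referred). The confidence intervals quoted in the abstract are not recomputed here, as the authors' interval method (likely cluster-adjusted) is not stated.

```python
# Reproduce the depression care-cascade proportions from the abstract's counts.
# Each percentage is conditional on the denominator of the prior stage.
screened, detected, referred, attended = 412, 208, 76, 18

stages = [
    ("detected by nurse", detected, screened),
    ("referred for treatment", referred, detected),
    ("attended counseling", attended, referred),
]
for name, num, denom in stages:
    print(f"{name}: {num}/{denom} = {100 * num / denom:.1f}%")
```

This yields 50.5%, 36.5%, and 23.7%, matching the point estimates reported above.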
Conclusions
Nurses detected patients with depressive symptoms at rates comparable to primary care providers in high-resource settings, though gaps in referral and uptake persist. Nurses were more likely to detect symptoms among patients in more severe mental distress. Implementation strategies for integrated mental health care in low-resource settings should target improved rates of detection, referral, and uptake.
Clients presenting for mental health assessment may have medical conditions that contribute to the presentation, require emergent treatment, or affect the choice of therapy that follows any admission to hospital for a mental illness.
Screening for pathology such as substance abuse, trauma and metabolic or electrolyte imbalances must be carried out before the diagnosis of a mental illness may be confidently made.
The consequences of not detecting these conditions are particularly significant, as most mental health inpatient units are typically not well equipped to monitor or care for these pathologies.
A retrospective study of 100 consecutive mental health admissions to Dubbo Base Hospital was conducted, and data concerning the type of medical screening performed and the results were compiled and analysed. The screening included physical examination, radiological imaging and general pathology testing.
The findings indicated a lack of uniformity in the approach to medical assessment of mental health patients that may have resulted in relevant organic pathologies not being appropriately detected. The findings also indicated that, in a significant number of cases, organic pathology played an important role in both the diagnosis and subsequent treatment of these patients.
It was concluded that a standard set of routine investigations should be carried out on all mental health admissions, and that the results of the investigations performed considerably influenced either the diagnosis or the treatment of a significant number of the patients in the study group.
Despite a sizeable evidence base for the risk of campylobacteriosis associated with eating chicken liver pâté, such outbreaks continue to occur. In January 2017, six cases of campylobacteriosis reported having eaten a Christmas set-menu meal at the same hotel in North Yorkshire, England on the same day. A retrospective cohort study was undertaken to test the null hypothesis that consumption of individual food items was not associated with an increased risk of illness. There were 19 cases of campylobacteriosis linked to the outbreak; seven confirmed and 12 probable cases. Chicken liver pâté was the food item most strongly associated with illness (P < 0.001), with a correspondingly high crude relative risk (12.95). This relationship was supported by multivariable analysis, sensitivity analyses and a clear dose–response relationship. Three cases reported an incubation period of <24 h, consistent with other outbreaks of campylobacteriosis associated with consumption of poultry liver. The findings were suggestive of a single point source exposure with a strong association between the consumption of chicken liver pâté and campylobacteriosis. This outbreak highlights that, despite evidence that simple cooking techniques can ensure all Campylobacter are killed during cooking, outbreaks continue to occur. Public and professional awareness needs to be raised through a strategic communication plan to reduce the risk of further outbreaks of campylobacteriosis linked to incorrectly cooked chicken liver dishes.