Patients with opioid use disorder (OUD) are prone to multidrug-resistant organism (MDRO) colonization and infection, and are thus at risk of worse outcomes during critical illness. Understanding the prevalence and predictors of MDRO infections is essential to optimizing interventions and treatments.
Design:
Retrospective cohort study.
Methods:
The study evaluated the prevalence of MDRO isolation among adults with OUD admitted to an intensive care unit (ICU) between January 1, 2018, and July 31, 2023. It included adults admitted to an ICU with bacterial infections and positive cultures obtained within 48 hours of admission. Demographics, clinical traits, and MDRO isolation rates were analyzed using descriptive statistics, univariate methods, and Least Absolute Shrinkage and Selection Operator (LASSO) regression.
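As an illustration of the selection step described above, the following is a minimal sketch of LASSO-penalized logistic regression in scikit-learn. The data are synthetic and the predictor names are assumptions mirroring the abstract, not the study's dataset or code.

```python
# Sketch: LASSO-penalized logistic regression for predictor selection.
# Synthetic data; predictor names mirror the abstract but are assumptions.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n = 790  # cohort size from the abstract
predictors = ["housing_insecurity", "no_moud", "hcv_positive", "iv_abx_hours_90d"]
X = np.column_stack([
    rng.integers(0, 2, n),      # housing insecurity (binary)
    rng.integers(0, 2, n),      # no medications for OUD treatment (binary)
    rng.integers(0, 2, n),      # HCV positive (binary)
    rng.exponential(24.0, n),   # prior IV antibiotic exposure, hours
])
y = rng.integers(0, 2, n)       # MDRO isolated (placeholder outcome)

# Standardize so the L1 penalty shrinks predictors on a common scale.
Xs = StandardScaler().fit_transform(X)

# penalty="l1" gives the LASSO; C (inverse penalty strength) would normally
# be chosen by cross-validation rather than fixed as it is here.
model = LogisticRegression(penalty="l1", solver="liblinear", C=0.1).fit(Xs, y)

# Predictors whose coefficients survive the shrinkage are "selected";
# exponentiating a coefficient gives an odds ratio per SD of that predictor.
for name, coef in zip(predictors, model.coef_[0]):
    print(f"{name}: coef = {coef:+.3f}, OR per SD = {np.exp(coef):.2f}")
```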
Results:
MDRO isolation occurred in 178 of 790 patients (22.5%), with methicillin-resistant Staphylococcus aureus as the most frequently isolated organism. LASSO regression identified housing insecurity (OR: 1.79, 95% CI 1.09–2.93, P = .022), no receipt of medications for OUD treatment (OR: 1.56, 95% CI 1.06–2.29, P = .023), positive hepatitis C virus (HCV) status (OR: 2.19, 95% CI 1.19–4.03, P = .012), and intravenous antibiotic use in the prior 90 days (OR: 1.04 per 24 h, 95% CI 1.01–1.07, P = .007) as significant predictors of MDRO isolation.
Conclusions:
The study highlights a high prevalence of MDRO isolation in critically ill patients with OUD admitted for infection-related illness with positive cultures obtained within 48 hours of admission, influenced by factors such as housing insecurity, no receipt of medications for OUD treatment, HCV status, and prior antibiotic use.
Knowsley Safari offers visitors a close-up encounter with captive olive baboons from the safety of their vehicles. As exiting vehicles may be contaminated with baboon stool, a comprehensive coprological inspection was conducted to address public health concerns. Baboon stools were obtained from vehicles and sleeping areas, supplemented by video analysis of baboon–vehicle interactions. A purposely selected 4-day sampling period enabled comparative inspections of 2662 vehicles, with a total of 669 baboon stools examined (371 from vehicles and 298 from sleeping areas). As informed by our pilot study, front-line diagnostic methods were: QUIK-CHEK rapid diagnostic test (RDT) (Giardia and Cryptosporidium), Kato–Katz coproscopy (Trichuris) and charcoal culture (Strongyloides). Some 13.9% of vehicles were contaminated with baboon stool. Prevalence of giardiasis was 37.4% while cryptosporidiosis was <0.01%; however, an absence of faecal cysts by quality-control coproscopy, alongside lower than expected levels of Giardia-specific DNA, showed the RDT results to be misleading, grossly overestimating prevalence. Prevalence of trichuriasis was 48.0% and strongyloidiasis was 13.7%, the first report of Strongyloides fuelleborni in the UK. We advise regular blanket administration of anthelminthics to the colony, exploring pour-on formulations; thereafter, smaller-scale indicator surveys would be adequate.
An alternative surgical approach for hypoplastic left heart syndrome is the Hybrid pathway, which defers the risk of acute kidney injury beyond the newborn period. We sought to determine the incidence and associated morbidity of acute kidney injury after the comprehensive stage 2 procedure, and the cumulative incidence after the first two operations in the Hybrid pathway.
Design:
A single-centre, retrospective study was conducted of patients with hypoplastic left heart syndrome completing second-stage palliation in the Hybrid pathway from 2009 to 2018. Acute kidney injury was defined using the Kidney Disease: Improving Global Outcomes (KDIGO) criteria. Perioperative and post-operative characteristics were analysed.
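For concreteness, here is a minimal sketch of the creatinine arm of KDIGO staging (urine-output and dialysis criteria omitted). The thresholds are the published KDIGO cut-offs; the example values are invented, and the dichotomous definition used in studies like this corresponds to any stage ≥ 1.

```python
# Sketch: staging AKI from serum creatinine under the KDIGO criteria.
# Simplified to the creatinine arm only; urine-output and dialysis
# criteria are omitted. Example inputs below are invented.
def kdigo_stage(baseline_scr: float, peak_scr: float,
                rise_48h: float = 0.0) -> int:
    """Return KDIGO stage 0-3 from baseline/peak creatinine (mg/dL)."""
    ratio = peak_scr / baseline_scr
    if ratio >= 3.0 or peak_scr >= 4.0:
        return 3                       # stage 3: >=3.0x baseline or SCr >= 4.0
    if ratio >= 2.0:
        return 2                       # stage 2: 2.0-2.9x baseline
    if ratio >= 1.5 or rise_48h >= 0.3:
        return 1                       # stage 1: 1.5-1.9x, or +0.3 within 48 h
    return 0

# Dichotomous definition: any stage >= 1 counts as acute kidney injury.
print(kdigo_stage(0.4, 0.9))   # ratio 2.25 -> stage 2 (counts as severe)
print(kdigo_stage(0.4, 0.5))   # ratio 1.25, no 48 h rise -> stage 0
```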
Results:
Sixty-one patients were included in the study cohort. The incidence of acute kidney injury was 63.9%, with 36.1% developing severe injury. Cumulatively, after the Hybrid Stage 1 and comprehensive stage 2 procedures, 69% developed acute kidney injury, with 36% developing severe injury. The presence of post-operative acute kidney injury was not associated with an increase in 30-day mortality (acute kidney injury 7.7% versus no injury 9.1%; p > 0.9). There was a significantly longer median duration of intubation among those with acute kidney injury (32 (8, 155) hours versus 9 (0, 94) hours without injury; p = 0.018).
Conclusions:
Acute kidney injury after the comprehensive stage 2 procedure is common and accounts for most of the kidney injury in the first two operations of the Hybrid pathway. No difference in mortality was detected between those with acute kidney injury and those without, although there may be an increase in morbidity.
Agitated behaviors are frequently encountered in the prehospital setting and require emergent treatment to prevent harm to patients and prehospital personnel. Chemical sedation with ketamine works faster than traditional pharmacologic agents, though it has a higher incidence of adverse events, including intubation. Outcomes following varying initial doses of prehospital intramuscular (IM) ketamine use have been incompletely described.
Objective:
To determine whether using a lower dose IM ketamine protocol for agitation is associated with more favorable outcomes.
Methods:
This study was a pre-/post-intervention retrospective chart review of prehospital care reports (PCRs). Adult patients who received chemical sedation in the form of IM ketamine for agitated behaviors were included. Patients were divided into two cohorts based on the standard IM ketamine dose of 4 mg/kg and the lower IM dose of 3 mg/kg, with the option for an additional 1 mg/kg if required. Primary outcomes included intubation and hospital admission. Secondary outcomes included emergency department (ED) length of stay, additional chemical or physical restraints, assaults on prehospital or ED employees, and documented adverse events.
Results:
The standard dose cohort consisted of 211 patients. The lower dose cohort consisted of 81 patients, 17 of whom received supplemental ketamine administration. Demographics did not significantly differ between the cohorts (mean age 35.14 versus 35.65 years; P = .484; and 67.8% versus 65.4% male; P = .89). Lower dose subjects were administered a lower ketamine dose (mean 3.24 mg/kg) compared to the standard dose cohort (mean 3.51 mg/kg). There was no statistically significant difference between the cohorts in intubation rate (14.2% versus 18.5%; P = .455), ED length of stay (14.31 versus 14.88 hours; P = .118), need for additional restraint and sedation (P = .787), or admission rate (26.1% versus 25.9%; P = .677). In the lower dose cohort, 41.2% (7/17) of patients who received supplemental ketamine doses were intubated, a higher rate than among patients in this cohort who did not receive supplemental ketamine (8/64, 12.5%; P < .01).
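As a check on the supplemental-dose comparison, the 7/17 versus 8/64 intubation rates can be put through a Fisher exact test. The counts are taken from the abstract; the study's own test may have differed.

```python
# Sketch: Fisher exact test on the supplemental-ketamine intubation counts.
from scipy.stats import fisher_exact

# Rows: [intubated, not intubated]
supplemental  = [7, 10]   # 17 patients who received a second ketamine dose
no_supplement = [8, 56]   # 64 patients who did not

odds_ratio, p_value = fisher_exact([supplemental, no_supplement])
print(f"OR = {odds_ratio:.2f}, p = {p_value:.4f}")  # expect p < .01, as reported
```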
Conclusion:
Access to effective, fast-acting chemical sedation is paramount for prehospital providers. No significant differences in outcomes were found when a lower dose IM ketamine protocol was implemented for prehospital chemical sedation. Patients who received a second dose of ketamine had a significantly higher intubation rate. A lower dose regimen may be considered for agitation protocols to limit the amount of medication administered to this high-risk population.
Acute kidney injury leads to worse outcomes following paediatric cardiac surgery. There is a lack of literature focusing on acute kidney injury after the Hybrid stage 1 palliation for single ventricle physiology. Patients undergoing the Hybrid Stage 1, as a primary option, may have a lower incidence of kidney injury than previously reported. When present, kidney injury may increase the risk of post-operative morbidity and mortality.
Methods:
A retrospective, single-centre review was conducted in patients with hypoplastic left heart syndrome who underwent Hybrid Stage 1 from 2008 to 2018. Acute kidney injury was defined dichotomously as yes (meeting any injury criteria) or no (no injury), using two different criteria applied in paediatrics. The impact of kidney injury on perioperative characteristics and 30-day mortality was analysed.
Results:
The incidence of acute kidney injury was 13.4–20.7%, with a severe injury rate of 2.4%. Patients without a prenatal diagnosis of hypoplastic left heart syndrome had a higher incidence of kidney injury than those prenatally diagnosed (40% versus 14.5%; p = 0.024). Patients with acute kidney injury had significantly higher 30-day mortality than those without (27.3% versus 5.6%; p = 0.047).
Discussion:
The incidence of severe acute kidney injury after the Hybrid Stage 1 palliation is low. A prenatal diagnosis may be associated with a lower incidence of kidney injury following the Hybrid Stage 1. Though uncommon, severe acute kidney injury following Hybrid Stage 1 may be associated with higher 30-day mortality.
This article presents a detailed account (provenance, codicology and contents) of Surrey History Centre, Woking, MS LM/1083191/35, a late Restoration manuscript of lyra-viol and keyboard music. Originally from the papers of the More-Molyneux family of Loseley Park, LM/1083191/35 is a source of otherwise unknown music by John Moss and Gerhard Diesener. Two of the lyra-viol pieces in particular demonstrate that the Woking manuscript dates to at least 1687 or 1688, making it the latest known English source of viol music in tablature. The primary purpose of the manuscript seems to have been didactic. It was copied by a single scribe, who was evidently a musician actively engaged with the popular music and current political events of mid- to late-1680s London. LM/1083191/35 allows us a rare glimpse into the amateur musical world of 1680s London.
Disease surveillance in wildlife populations presents a logistical challenge, yet it is critical for gaining a deeper understanding of the presence and impact of wildlife pathogens. Erinaceus coronavirus (EriCoV), a clade C Betacoronavirus, was first described in Western European hedgehogs (Erinaceus europaeus) in Germany. Here, our objective was to determine whether EriCoV is present, and if it is associated with disease, in Great Britain (GB). An EriCoV-specific BRYT-Green® real-time reverse transcription PCR assay was used to test 351 samples of faeces or distal large intestinal tract contents collected from casualty or dead hedgehogs from a wide area across GB. Viral RNA was detected in 38 of the 351 samples (10.8%); however, the virus was not detected in any of the 61 samples tested from Scotland. The full genome sequence of the British EriCoV strain was determined using next-generation sequencing; it shared 94% identity with a German EriCoV sequence. Multivariate statistical models using hedgehog case history data, faecal specimen descriptions and post-mortem examination findings found no significant associations indicative of disease associated with EriCoV in hedgehogs. These findings indicate that the Western European hedgehog is a reservoir host of EriCoV in the absence of apparent disease.
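As a worked example of the detection rate reported above, the following sketch computes the prevalence and a 95% Wilson score confidence interval. The interval is our addition for illustration; the abstract itself reports none.

```python
# Sketch: 95% CI for the 38/351 (10.8%) EriCoV detection rate,
# using the Wilson score interval. The CI is illustrative, not reported.
from statsmodels.stats.proportion import proportion_confint

detected, tested = 38, 351
low, high = proportion_confint(detected, tested, alpha=0.05, method="wilson")
print(f"prevalence = {detected / tested:.1%}, 95% CI {low:.1%}-{high:.1%}")
```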
The climatic record from Greenland boreholes is likely to extend well beyond the last interglacial only if the basal ice near the drilling sites has never reached its pressure melting point (−2°C). A simplified one-dimensional analysis (Paterson and Waddington, 1986) suggested that this would be true at Crête, Greenland, if the geothermal flux was less than 48 mW m⁻². In that study, the vertical velocity pattern for an isothermal ice sheet was used. We have repeated the Crête calculations using the vertical velocity pattern derived by a finite element analysis. Using this temperature-dependent velocity pattern lowered the basal temperature by about 3°C.
We have carried out a similar analysis for the Summit coring site further north on the Greenland ice divide. Here we find that the basal ice does not melt if the geothermal flux is less than 54 mW m⁻², using the same mass balance and surface temperature histories as the previous study. We are repeating these one-dimensional calculations with more recently compiled histories and plan to present results from a full two-dimensional temperature model that includes processes only parameterized in the one-dimensional models. Using two dimensions, we will more realistically incorporate the special ice-flow patterns found at divides (e.g. Raymond, 1983; Dahl-Jensen, 1989). In steady-state flow models these patterns lead to significant horizontal temperature gradients and a “hot spot” beneath an ice divide (Paterson and Waddington, 1986). In addition, we will more accurately determine the transient effects on basal temperature resulting from the interaction of these flow patterns and the changing climate. Our discussion will include sensitivity to geothermal heat flux, ice thickness and paleoenvironmental history.
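For intuition about the sensitivity to geothermal flux discussed above, the following is a minimal sketch of the classic one-dimensional steady-state basal-temperature solution (Robin, 1955) that analyses of this kind build on. The parameter values are illustrative assumptions, not the study's inputs, and the steady-state form omits the glacial-cycle histories the work described here incorporates, so it tends to run colder than the transient models.

```python
# Sketch: Robin (1955) 1-D steady-state basal temperature under an ice divide.
# Illustrative parameter values only; not the study's model or inputs.
import math

def basal_temperature(T_surf, H, accum, G, k=2.1, kappa=1.09e-6):
    """Robin solution: T_bed = T_surf + (G/k) * (l*sqrt(pi)/2) * erf(H/l),
    with l = sqrt(2*kappa*H/accum).
    T_surf [degC], H ice thickness [m], accum accumulation [m/s ice eq.],
    G geothermal flux [W m-2], k conductivity [W m-1 K-1],
    kappa diffusivity [m2 s-1]."""
    l = math.sqrt(2.0 * kappa * H / accum)
    return T_surf + (G / k) * (l * math.sqrt(math.pi) / 2.0) * math.erf(H / l)

# Summit-like numbers (assumed): 3000 m of ice, -32 degC surface temperature,
# 0.23 m/yr ice-equivalent accumulation; scan the geothermal flux.
year = 365.25 * 24 * 3600.0
for G_mW in (42, 48, 54, 60):
    Tb = basal_temperature(-32.0, 3000.0, 0.23 / year, G_mW * 1e-3)
    print(f"G = {G_mW} mW/m2 -> basal T ~ {Tb:.1f} degC")
```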
Radiocarbon accelerator mass spectrometric (AMS) dates on the acid-insoluble fraction from 38 core tops from the western Ross Sea, Antarctica, are used to address these questions: (1) What are the apparent ages of sediments at or close to the present sediment/water interface? (2) Is there a statistically significant pattern to the spatial distribution of core top ages? and (3) Is there a “correction factor” that can be applied to these age determinations to obtain the best possible Holocene (downcore) chronologies? Ages of core top sediments range from 2000 to 21,000 ¹⁴C yr B.P. Some “old” core top dates are from piston cores and probably represent the loss of sediment during the coring process, but some core top samples >6000 ¹⁴C yr B.P. may represent little or no Holocene deposition. Four possible sources of variability in dates ≤6000 ¹⁴C yr B.P. (n = 28) are associated with (1) different sample preparation methods, (2) different sediment recovery systems, (3) different geographic regions, and (4) within-sample lateral age variability. Statistical analysis on an a posteriori design indicates that geographic area is the major cause of variability; there is a difference in mean surface sediment age of nearly 2000 yr between sites in the western Ross Sea and sites east of Ross Bank in south-central Ross Sea. The systematic variability in surface age between areas may be attributed to: (a) variable sediment accumulation rates (SAR) (surface age is inversely related to SAR), (b) differences in the percentage of reworked (dead) carbon between each area, and/or (c) differences in the CO2 exchange between the ocean and the atmosphere.
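As a sketch of the kind of test behind the geographic-area result, the following compares core-top ages across two areas with a one-way ANOVA. The grouped ages are invented placeholders, not the study's data, and the study's a posteriori design was more involved than this.

```python
# Sketch: one-way ANOVA of core-top age by geographic area.
# The ages below are invented placeholders, not the study's data.
from scipy.stats import f_oneway

western_ross  = [2100, 2800, 3500, 4100, 2900]   # hypothetical 14C yr B.P.
south_central = [4300, 4900, 5600, 5100, 4700]   # hypothetical 14C yr B.P.

stat, p = f_oneway(western_ross, south_central)
print(f"F = {stat:.2f}, p = {p:.4f}")
# A significant result would support an area-specific "correction factor",
# e.g. subtracting each area's mean surface age from its downcore dates.
```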
The deep-drilling projects at the Summit ice divide will require thermal models to help interpret the paleoclimatic signals in their cores. An analytic, steady-state model predicts basal temperatures within 1°C of the ice melting-point and basal ice no older than 100–400 kyear should melting occur. A two-dimensional, time-dependent temperature model includes the effects of realistic two-dimensional ice flow and the temperature and mass-balance patterns of the last two glacial cycles. The model relaxes some assumptions made in one-dimensional studies and produces lower basal temperatures. The basal temperatures are most sensitive to the value of the geothermal heat flux and the mass-balance pattern. If the flux is less than 56 mW m⁻², the bed has likely been frozen throughout the last glacial cycle. The decoupling of the energy and mass-conservation equations is a significant source of error which can be eliminated only by a fully coupled ice-flow/heat-flow model.
Recent commentary has suggested that performance management (PM) is fundamentally “broken,” with negative feelings from managers and employees toward the process at an all-time high (Pulakos, Hanson, Arad, & Moye, 2015; Pulakos & O'Leary, 2011). In response, some high-profile organizations have decided to eliminate performance ratings altogether as a solution to the growing disenchantment. Adler et al. (2016) offer arguments both in support of and against eliminating performance ratings in organizations. Although both sides of the debate in the focal article make strong arguments, we believe there continue to be misunderstandings, mischaracterizations, and misinformation with respect to some of the measurement issues in PM. We offer the following commentary not to persuade readers to adopt one particular side over another but as a call to critically reconsider and reevaluate some of the assumptions underlying measurement issues in PM and to dispel some of the pervasive beliefs throughout the performance rating literature.
To aid in the preparation of military medic trainers for a possible new curriculum in teaching junctional tourniquet use, the investigators studied the time to control hemorrhage and the blood volume lost in order to provide evidence on ease of use.
Hypothesis:
Models of junctional tourniquet could perform differentially by blood loss, time to hemostasis, and user preference.
Methods:
In a laboratory experiment, 30 users controlled simulated hemorrhage from a manikin (Combat Ready Clamp [CRoC] Trainer) with three iterations each of three junctional tourniquets, for a total of 270 tests. Outcomes included hemorrhage control (yes/no), time to hemostasis, and blood volume lost. Users also subjectively ranked tourniquet performance. Models included the CRoC, the Junctional Emergency Treatment Tool (JETT), and the SAM Junctional Tourniquet (SJT). Time to hemostasis and total blood loss were log-transformed and analyzed using a mixed model analysis of variance (ANOVA), with users represented as random effects and the tourniquet model used as the treatment effect. Preference scores were analyzed with ANOVA, and Tukey’s honest significant difference test was used for all post-hoc pairwise comparisons.
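A minimal sketch of the analysis pipeline these Methods describe, using statsmodels on synthetic long-format data; the column names and generated values are assumptions, not the investigators' code or data.

```python
# Sketch: mixed-model ANOVA (user as random effect) plus Tukey's HSD,
# on synthetic data shaped like the study design. Names are assumptions.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.multicomp import pairwise_tukeyhsd

rng = np.random.default_rng(1)
models = ["CRoC", "JETT", "SJT"]
# 30 users x 3 tourniquets x 3 iterations = 270 tests, as in the study.
df = pd.DataFrame({
    "user_id": np.repeat(np.arange(30), 9),
    "tourniquet": np.tile(np.repeat(models, 3), 30),
})
df["blood_loss_ml"] = rng.lognormal(mean=4.0, sigma=0.5, size=len(df))
df["preference_score"] = rng.integers(1, 4, size=len(df))
df["log_blood_loss"] = np.log(df["blood_loss_ml"])

# Mixed-model ANOVA: tourniquet model as the fixed (treatment) effect,
# user as a random effect, log-transformed blood loss as the response.
fit = smf.mixedlm("log_blood_loss ~ tourniquet", df, groups=df["user_id"]).fit()
print(fit.summary())

# Tukey's HSD for all post-hoc pairwise comparisons of preference scores.
print(pairwise_tukeyhsd(df["preference_score"], df["tourniquet"]))
```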
Results:
All tourniquet uses were 100% effective for hemorrhage control. The CRoC and SJT performed best, with the least blood loss, and were significantly better than the JETT; in pairwise comparison, the mean differences for CRoC-JETT (P < .0001) and SJT-JETT (P = .0085) were statistically significant, while CRoC-SJT (P = .35) was not. For time to hemostasis in pairwise comparison, the CRoC had a significantly shorter time than the JETT and SJT (P < .0001, both comparisons); SJT-JETT was also significant (P = .0087). In responding to the directive, “Rank the performance of the models from best to worst,” users did not prefer any junctional tourniquet model over the others (P > .5, all models).
Conclusion:
The CRoC and SJT performed best in having the least blood loss, the CRoC performed best in having the shortest time to hemostasis, and users did not differ in model preference. Models of junctional tourniquet performed differentially by blood loss and time to hemostasis.
Kragh JF Jr, Lunati MP, Kharod CU, Cunningham CW, Bailey JA, Stockinger ZT, Cap AP, Chen J, Aden JK 3d, Cancio LC. Assessment of Groin Application of Junctional Tourniquets in a Manikin Model. Prehosp Disaster Med. 2016;31(4):358–363.
Stomatopods, or mantis shrimps, are malacostracan crustaceans. Known as “lean, mean, killing machines” (Watling et al., 2000, p. 1), modern stomatopods are obligate carnivores, feeding exclusively on live prey (Schram, 1986). Characteristically, their second thoracic appendages are enlarged to form powerful, raptorial claws. Modern stomatopods are divided into two broad functional groups based on the shape and usage of their raptorial claws: ‘spearing’ and ‘smashing’ stomatopods (Caldwell and Dingle, 1976).