Tenecteplase has been shown to be non-inferior to alteplase for the treatment of acute ischemic stroke within 4.5 hours of stroke onset. Although tenecteplase is not formally approved for this indication by regulatory authorities, many jurisdictions have transitioned to it for routine stroke treatment because it is simpler to administer and less costly.
Methods:
We report a three-phase time-series analysis spanning 2.5 years, describing the transition from alteplase to tenecteplase for the routine treatment of acute ischemic stroke from a system-wide perspective involving an entire province. The transition was planned and implemented centrally. Data were collected during routine clinical care, arising from both administrative sources and a prospective stroke registry, and represent real-world outcomes. Data are reported using standard descriptive statistics.
Results:
A total of 1211 patients were treated with intravenous thrombolysis (477 pre-transition using alteplase, 180 transition period using both drugs, 554 post-transition using tenecteplase). Baseline characteristics, adverse events and outcomes were similar between epochs. There were four dosing errors with tenecteplase, including providing the cardiac dose to two patients. There were no instances of major hemorrhage associated with dosing errors.
Discussion:
The transition to intravenous tenecteplase for stroke treatment was seamless and yielded outcomes comparable to those with intravenous alteplase.
Recent changes to US research funding are having far-reaching consequences that imperil the integrity of science and the provision of care to vulnerable populations. Resisting these changes, the BJPsych Portfolio reaffirms its commitment to publishing mental science and advancing psychiatric knowledge that improves the mental health of one and all.
To compare rates of Clostridioides difficile infection (CDI) recurrence after an initial episode treated with tapered enteral vancomycin versus standard vancomycin.
Design:
Retrospective cohort study.
Setting:
Community health system.
Patients:
Adults ≥18 years of age hospitalized with positive C. difficile polymerase chain reaction or toxin enzyme immunoassay who were prescribed either standard 10–14 days of enteral vancomycin four times daily or a 12-week tapered vancomycin regimen.
Methods:
Retrospective, propensity score-matched pair cohort study. Groups were matched on age (< or ≥ 65 years) and receipt of non-C. difficile antibiotics during hospitalization or within 6 months post-discharge. Recurrence rates were analyzed via logistic regression conditioned on matched pairs and reported as conditional odds ratios. The primary outcome was the CDI recurrence rate, compared between standard and tapered vancomycin for treatment of initial CDI.
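For 1:1 matched binary outcomes, the conditional-logistic estimate of the odds ratio depends only on the discordant pairs. The sketch below uses hypothetical pair counts chosen to reproduce the abstract's margins (4/75 and 21/75 recurrences); the concordant/discordant split is our assumption, and the abstract does not report pair-level data, so the resulting number differs from the study's cOR of 0.19.

```python
# Hypothetical matched-pair data: each tuple is (taper_recurred, standard_recurred)
# for one propensity-matched pair. Counts are invented for illustration; only the
# marginal totals (4/75 taper, 21/75 standard) match the abstract.
pairs = [(1, 1)] * 2 + [(1, 0)] * 2 + [(0, 1)] * 19 + [(0, 0)] * 52

n10 = sum(1 for t, s in pairs if t == 1 and s == 0)  # taper-only recurrences
n01 = sum(1 for t, s in pairs if t == 0 and s == 1)  # standard-only recurrences

# For 1:1 matched binary data, the conditional-logistic MLE of the odds ratio
# is simply the ratio of discordant pair counts.
cOR = n10 / n01
print(round(cOR, 3))  # → 0.105 under these invented counts
```

The concordant pairs drop out of the conditional likelihood, which is why only `n10` and `n01` matter here.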
Results:
The CDI recurrence rate at 6 months was 5.3% (4/75) in the taper cohort versus 28% (21/75) in the standard vancomycin cohort. The median time to CDI recurrence was 115 days versus 20 days in the taper and standard vancomycin cohorts, respectively. When adjusted for matching, patients in the taper arm were less likely to experience CDI recurrence at 6 months when compared to standard vancomycin (cOR = 0.19, 95% CI 0.07–0.56, p < 0.002).
Conclusions:
Larger prospective trials are needed to elucidate the clinical utility of tapered oral vancomycin as a treatment option to achieve sustained clinical cure in first occurrences of CDI.
Individuals with single ventricle physiology who are palliated with superior cavopulmonary anastomosis (Glenn surgery) may develop pulmonary arteriovenous malformations. The traditional tools for pulmonary arteriovenous malformation diagnosis are often of limited diagnostic utility in this patient population. We sought to measure the pulmonary capillary transit time to determine its value as a tool to identify pulmonary arteriovenous malformations in patients with single ventricle physiology.
Methods:
We defined the angiographic pulmonary capillary transit time as the number of cardiac cycles required for transit of contrast from the distal pulmonary arteries to the pulmonary veins. Patients were retrospectively recruited from a single quaternary North American paediatric centre, and angiographic and clinical data were reviewed. Pulmonary capillary transit time was calculated in 20 control patients and compared to 20 single ventricle patients at the pre-Glenn, Glenn, and Fontan surgical stages (which were compared with a linear-mixed model). Correlation (Pearson) between pulmonary capillary transit time and haemodynamic and injection parameters was assessed using angiograms from 84 Glenn patients. Five independent observers calculated pulmonary capillary transit time to measure reproducibility (intraclass correlation coefficient).
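The cycle-count definition above amounts to simple arithmetic once the contrast arrival times and heart rate are known; a minimal sketch (function and variable names are ours, not the paper's):

```python
# Sketch of the transit-time arithmetic described above. Angiographic frame
# times are in seconds; heart rate converts elapsed time into cardiac cycles.
def capillary_transit_cycles(t_artery_s, t_vein_s, heart_rate_bpm):
    """Cardiac cycles for contrast to pass from distal pulmonary arteries to veins."""
    cycle_length_s = 60.0 / heart_rate_bpm        # duration of one cardiac cycle
    return (t_vein_s - t_artery_s) / cycle_length_s

# Example: contrast reaches the veins 2.4 s after the distal arteries at 80 bpm
print(capillary_transit_cycles(1.0, 3.4, 80))  # → 3.2 cycles
```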
Results:
Mean pulmonary capillary transit time was 3.3 cardiac cycles in the control population, and 3.5, 2.4, and 3.5 in the pre-Glenn, Glenn, and Fontan stages, respectively. Pulmonary capillary transit time in the Glenn population did not correlate with injection conditions. Intraclass correlation coefficient was 0.87.
Conclusions:
Pulmonary angiography can be used to calculate the pulmonary capillary transit time, which is reproducible between observers. Pulmonary capillary transit is faster in the Glenn stage, correlating with the absence of direct hepatopulmonary venous flow.
Edited by
Nevena V. Radonjić, State University of New York Upstate Medical University; Thomas L. Schwartz, State University of New York Upstate Medical University; Stephen M. Stahl, University of California, San Diego
This new volume in Stahl's Case Studies series presents the continuation of Dr. Schwartz's previous successful collection of psychopharmacology cases from Volume 2, this time in collaboration with Dr. Radonjić and with editing from Dr. Stahl. Here they illustrate common questions and dilemmas routinely encountered in day-to-day psychopharmacologic practice. Following a consistent, user-friendly layout, each case opens with a pre-case self-assessment question, features icons, tips, and questions about diagnosis and management as the case progresses over time, and closes with the correct answers. Formatted in alignment with the American Board of Psychiatry and Neurology's maintenance of psychiatry specialty certification, the cases address issues in a relevant and understandable way. Covering a wide-ranging and representative selection of clinical scenarios, each case is followed through the complete clinical encounter, from start to resolution, acknowledging the complications, issues, decisions, twists, and turns along the way. This is psychiatry in real life.
Aviation passenger screening has been used worldwide to mitigate the translocation risk of SARS-CoV-2. We present a model that evaluates factors in screening strategies used in air travel and assess their relative sensitivity and importance in identifying infectious passengers. We use adapted Monte Carlo simulations to produce hypothetical disease timelines for the Omicron variant of SARS-CoV-2 for travelling passengers. Screening strategy factors assessed include having one or two RT-PCR and/or antigen tests prior to departure and/or post-arrival, and quarantine length and compliance upon arrival. One or more post-arrival tests and high quarantine compliance were the most important factors in reducing pathogen translocation. Screening that combines quarantine and post-arrival testing can shorten the required quarantine for travellers; as the interval between the first and second post-arrival tests grows, the variability of post-arrival RT-PCR and antigen test sensitivity decreases while mean sensitivity increases. This study provides insight into the role various screening strategy factors have in preventing the translocation of infectious diseases and a flexible framework adaptable to other existing or emerging diseases. Such findings may help in public health policy and decision-making in present and future evidence-based practices for passenger screening and pandemic preparedness.
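The Monte Carlo approach described above can be sketched in miniature: draw a hypothetical disease timeline per traveler, place a screening test on that timeline, and count interceptions. All parameters below (incubation range, infectious window, test sensitivity, timing) are illustrative assumptions, not the study's Omicron-calibrated distributions.

```python
import random

random.seed(0)

# Toy Monte Carlo sketch of pre-departure screening. Each traveler gets a
# random exposure-to-infectiousness timeline; a single antigen test is taken
# one day before the flight. Parameters are invented for illustration.
def simulate_traveler():
    incubation = random.uniform(1, 4)           # days from exposure to infectiousness
    infectious_days = random.uniform(3, 8)      # length of infectious window
    exposure_to_flight = random.uniform(0, 10)  # flight time, days after exposure
    return incubation, infectious_days, exposure_to_flight

def caught_by_predeparture_test(sensitivity=0.8):
    inc, dur, flight = simulate_traveler()
    test_time = flight - 1.0                    # test one day before departure
    detectable = inc <= test_time <= inc + dur  # test falls in infectious window
    return detectable and random.random() < sensitivity

trials = 100_000
caught = sum(caught_by_predeparture_test() for _ in range(trials))
print(f"fraction intercepted: {caught / trials:.3f}")
```

Extending this to two tests, post-arrival testing, and quarantine compliance is a matter of adding more events to each simulated timeline, which is the essence of the factor comparison in the study.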
Deployment of law enforcement operational canines (OpK9s) risks injuries to the animals. This study’s aim was to assess the current status of states’ OpK9 laws and veterinary Emergency Medical Services (VEMS) care protocols within the United States.
Methods:
Cross-sectional standardized review of state laws/regulations and OpK9 VEMS treatment protocols was undertaken. For each state and for the District of Columbia (DC), the presence of OpK9 legislation and/or care protocols was ascertained. Information was obtained through governmental records and from stakeholders (eg, state EMS medical directors and state veterinary boards).
The main endpoints were proportions of states with OpK9 laws and/or treatment protocols. Proportions are reported with 95% confidence intervals (CIs). Fisher’s exact test (P <.05) assessed whether presence of an OpK9 law in a given jurisdiction was associated with presence of an OpK9 care protocol, and whether there was geographic variation (based on United States Census Bureau regions) in presence of OpK9 laws or protocols.
Results:
Of 51 jurisdictions, 20 (39.2%) had OpK9 legislation and 23 (45.1%) had state-wide protocols for EMS treatment of OpK9s. There was no association (P = .991) between presence of legislation and presence of protocols. There was no association (P = .144) between presence of legislation and region: Northeast 66.7% (95% CI, 29.9-92.5%), Midwest 50.0% (95% CI, 21.1-78.9%), South 29.4% (95% CI, 10.3-56.0%), and West 23.1% (95% CI, 5.0-53.8%). There was significant (P = .001) regional variation in presence of state-wide OpK9 treatment protocols: Northeast 100.0% (95% CI, 66.4-100.0%), Midwest 16.7% (95% CI, 2.1-48.4%), South 47.1% (95% CI, 23.0-72.2%), and West 30.8% (95% CI, 9.1-61.4%).
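The regional percentages above read as exact (Clopper-Pearson) binomial intervals; for example, 66.7% with 95% CI 29.9-92.5% is consistent with 6 of 9 Northeast jurisdictions having OpK9 legislation. The 6/9 split is our inference from the reported figures, not stated in the abstract; a quick check with SciPy:

```python
from scipy.stats import binomtest

# Reproduce the Northeast legislation interval assuming 6 successes out of 9
# jurisdictions (our inference from the reported 66.7% and its CI).
ci = binomtest(k=6, n=9).proportion_ci(confidence_level=0.95, method="exact")
print(f"{6 / 9:.1%} (95% CI, {ci.low:.1%}-{ci.high:.1%})")  # → 66.7% (95% CI, 29.9%-92.5%)
```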
Conclusion:
There is substantial disparity with regard to presence of OpK9 legal and/or clinical guidance. National collaborative guidelines development is advisable to optimize and standardize care of OpK9s. Additional attention should be paid to educational and training programs to best utilize the limited available training budgets.
Evaluation of adult antibiotic order sets (AOSs) on antibiotic stewardship metrics has been limited. The primary outcome was to evaluate the standardized antimicrobial administration ratio (SAAR). Secondary outcomes included antibiotic days of therapy (DOT) per 1,000 patient days (PD); selected antibiotic use; AOS utilization; Clostridioides difficile infection (CDI) cases; and clinicians’ perceptions of the AOS via a survey following the final study phase.
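The two headline metrics are simple ratios: SAAR divides observed antimicrobial days by a risk-adjusted predicted value, and DOT per 1,000 PD normalizes total days of therapy by patient days. A back-of-envelope sketch with invented numbers (the definitions follow standard NHSN convention; none of the figures are the study's data):

```python
# Stewardship metric arithmetic; all inputs are invented illustrations.
def saar(observed_antimicrobial_days, predicted_antimicrobial_days):
    """Standardized antimicrobial administration ratio: observed / predicted use."""
    return observed_antimicrobial_days / predicted_antimicrobial_days

def dot_per_1000_pd(days_of_therapy, patient_days):
    """Antibiotic days of therapy normalized per 1,000 patient days."""
    return days_of_therapy * 1000 / patient_days

print(saar(900, 1000))                  # → 0.9 (below 1.0 = less use than predicted)
print(dot_per_1000_pd(48_840, 10_000))  # → 4884.0
```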
Design:
This 5-year, single-center, quasi-experimental study comprised 5 phases from 2017 to 2022 over 10-month periods between August 1 and May 31.
Setting:
The study was conducted in a 752-bed tertiary care, academic medical center.
Intervention:
Our institution implemented AOSs in the electronic medical record (EMR) for common infections among hospitalized adults.
Results:
For the primary outcome, a statistically significant decrease in SAAR was detected from phase 1 to phase 5 (1.0 vs 0.90; P < .001). Statistically significant decreases were also detected in DOT per 1,000 PD (4,884 vs 3,939; P = .001), fluoroquinolone orders (407 vs 175; P < .001), carbapenem orders (147 vs 106; P = .024), and clindamycin orders (113 vs 73; P = .01). No statistically significant change in mean vancomycin orders was detected (991 vs 902; P = .221). A statistically significant decrease in CDI cases was also detected (7.8 vs 2.4; P = .002) but may have been attributable to changes in CDI case diagnosis. Clinicians indicated that the AOSs were easy to use overall and that they helped them select the appropriate antibiotics.
Conclusions:
Implementing AOS into the EMR was associated with a statistically significant reduction in SAAR, antibiotic DOT per 1,000 PD, selected antibiotic orders, and CDI cases.
Anterior temporal lobectomy is a common surgical approach for medication-resistant temporal lobe epilepsy (TLE). Prior studies have shown inconsistent findings regarding the utility of presurgical intracarotid sodium amobarbital testing (IAT; also known as Wada test) and neuroimaging in predicting postoperative seizure control. In the present study, we evaluated the predictive utility of IAT, as well as structural magnetic resonance imaging (MRI) and positron emission tomography (PET), on long-term (3-years) seizure outcome following surgery for TLE.
Participants and Methods:
Patients consisted of 107 adults (mean age=38.6, SD=12.2; mean education=13.3 years, SD=2.0; female=47.7%; White=100%) with TLE (mean epilepsy duration =23.0 years, SD=15.7; left TLE surgery=50.5%). We examined whether demographic, clinical (side of resection, resection type [selective vs. non-selective], hemisphere of language dominance, epilepsy duration), and presurgical studies (normal vs. abnormal MRI, normal vs. abnormal PET, correctly lateralizing vs. incorrectly lateralizing IAT) were associated with absolute (cross-sectional) seizure outcome (i.e., freedom vs. recurrence) with a series of chi-squared and t-tests. Additionally, we determined whether presurgical evaluations predicted time to seizure recurrence (longitudinal outcome) over a three-year period with univariate Cox regression models, and we compared survival curves with Mantel-Cox (log rank) tests.
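The longitudinal analysis above rests on survival curves estimated from right-censored recurrence times. A minimal pure-Python Kaplan-Meier sketch illustrates the framing; the durations and events below are invented toy data, not the study's patient records.

```python
# Minimal Kaplan-Meier estimator for right-censored time-to-event data.
def kaplan_meier(durations, events):
    """Return [(time, survival_probability)] at each event time.
    durations: follow-up in months; events: 1 = seizure recurrence, 0 = censored."""
    data = list(zip(durations, events))
    surv = 1.0
    curve = []
    for t in sorted({d for d, e in data if e == 1}):        # unique event times
        at_risk = sum(1 for d, _ in data if d >= t)          # still under follow-up
        n_events = sum(1 for d, e in data if d == t and e == 1)
        surv *= 1 - n_events / at_risk                       # product-limit update
        curve.append((t, surv))
    return curve

months  = [3, 7, 12, 24, 36, 36, 36, 36]   # toy follow-up times
relapse = [1, 1, 1, 0, 0, 0, 0, 0]         # toy recurrence indicators
print(kaplan_meier(months, relapse))  # → [(3, 0.875), (7, 0.75), (12, 0.625)]
```

A log-rank test then compares two such curves (e.g., normal vs. abnormal PET) by contrasting observed and expected event counts at each event time.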
Results:
Demographic and clinical variables (including type [selective vs. whole lobectomy] and side of resection) were not associated with seizure outcome. No associations were found among the presurgical variables. Presurgical MRI was not associated with cross-sectional (OR=1.5, p=.557, 95% CI=0.4-5.7) or longitudinal (HR=1.2, p=.641, 95% CI=0.4-3.9) seizure outcome. Normal PET scan (OR=4.8, p=.045, 95% CI=1.0-24.3) and IAT incorrectly lateralizing to the seizure focus (OR=3.9, p=.018, 95% CI=1.2-12.9) were associated with higher odds of seizure recurrence. Furthermore, normal PET scan (HR=3.6, p=.028, 95% CI=1.0-13.5) and incorrectly lateralized IAT (HR=2.8, p=.012, 95% CI=1.2-7.0) were presurgical predictors of earlier seizure recurrence within three years of TLE surgery. Log rank tests indicated that survival functions differed significantly between patients with normal vs. abnormal PET and incorrectly vs. correctly lateralizing IAT, such that these groups relapsed five and seven months earlier on average, respectively.
Conclusions:
Presurgical normal PET scan and incorrectly lateralizing IAT were associated with increased risk of post-surgical seizure recurrence and shorter time-to-seizure relapse.
Agricultural workers are immersed in environments associated with increased risk for adverse psychiatric and neurological outcomes. Agricultural work-related risks to brain health include exposure to pesticides, heavy metals, and organic dust. Despite this, there is a gap in our understanding of the underlying brain systems impacted by these risks. This study explores clinical and cognitive domains, and functional brain activity in agricultural workers. We hypothesized that a history of agricultural work-related risks would be associated with poorer clinical and cognitive outcomes as well as changes in functional brain activity within cortico-striatal regions.
Participants and Methods:
The sample comprised 17 agricultural workers and a comparison group of 45 non-agricultural workers recruited in the Northern Colorado area. All participants identified as White and non-Hispanic. The mean age of participants was 51.7 years (SD = 21.4, range 18-77), 60% identified as female, and 37% identified as male. Participants completed the National Institute of Health Toolbox (NIH Toolbox) and Montreal Cognitive Assessment (MoCA) on their first visit. During the second visit, they completed NIH Patient-Reported Outcomes Measurement Information System (PROMIS) measures and underwent functional magnetic resonance imaging (fMRI; N = 15 agriculture and N = 35 non-agriculture) while completing a working memory task (Sternberg). Blood oxygen-level dependent (BOLD) response was compared between participants. Given the small sample size, the whole brain voxel-wise group comparison threshold was set at alpha = .05, but not otherwise corrected for multiple comparisons. Cohen’s d effect sizes were estimated for all voxels.
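Estimating a Cohen's d per voxel, as described above, is a vectorized two-sample effect-size computation. A sketch with synthetic data (group sizes mirror the study's N = 15 vs N = 35, but the BOLD values are random numbers, not real imaging data):

```python
import numpy as np

rng = np.random.default_rng(0)

def cohens_d(group_a, group_b):
    """Pooled-SD Cohen's d per voxel; inputs are (subjects, voxels) arrays."""
    n_a, n_b = len(group_a), len(group_b)
    var_a = group_a.var(axis=0, ddof=1)
    var_b = group_b.var(axis=0, ddof=1)
    pooled_sd = np.sqrt(((n_a - 1) * var_a + (n_b - 1) * var_b) / (n_a + n_b - 2))
    return (group_a.mean(axis=0) - group_b.mean(axis=0)) / pooled_sd

# Synthetic BOLD values: agricultural group mean shifted by 0.5 SD at every voxel.
ag     = rng.normal(0.5, 1.0, size=(15, 1000))
non_ag = rng.normal(0.0, 1.0, size=(35, 1000))
d = cohens_d(ag, non_ag)
print(d.shape)  # → (1000,) one effect size per voxel
```

With an uncorrected alpha = .05 threshold, reporting the per-voxel effect sizes alongside the p-values (as the study does) helps readers judge which clusters are likely to replicate.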
Results:
Analyses of cognitive scores showed significant deficits in episodic memory for the agricultural work group. Additionally, the agricultural work group scored higher on measures of self-reported anger, cognitive concerns, and social participation. Analyses of fMRI data showed increased BOLD activity around the orbitofrontal cortex (medium to large effects) and bilaterally in the entorhinal cortex (large effects) for the agricultural work group. The agricultural work group also showed decreased BOLD activity in the cerebellum and basal ganglia (medium to large effects).
Conclusions:
To our knowledge, this study provides the first-ever evidence showing differences in brain activity associated with a history of working in agriculture. These findings of poorer memory, concerns about cognitive functioning, and increased anger suggest clinical relevance. Social participation associated with agricultural work should be explored as a potential protective factor for cognition and brain health. Brain imaging data analyses showed increased activation in areas associated with motor functioning, cognitive control, and emotion. These findings are limited by small sample size, lack of diversity in our sample, and coarsely defined risk. Despite these limitations, the results are consistent with an overall concern that risks associated with agricultural work can lead to cognitive and psychiatric harm via changes in brain health. Replications and future studies with larger sample sizes, more diverse participants, and more accurately defined risks (e.g., pesticide exposure) are needed.
In July 2021, Liverpool was removed from the prestigious List of World Heritage Sites, sending shockwaves around the global heritage community. More recently, the spotlight has shifted to another world-famous site also located in the United Kingdom. During the same 44th Session of the World Heritage Committee, UNESCO threatened to place Stonehenge on the List of World Heritage in Danger if the required changes to a significant billion-pound road enhancement project were not implemented. Given what happened in Liverpool, there are fears that Stonehenge is at risk of delisting. An interesting critical line of inquiry to emerge from Liverpool, and other World Heritage Sites, concerns the local, national, and international ‘politics at the site’. This article develops this debate by analysing the role of different scalar actors involved in the Stonehenge World Heritage Site. More specifically, our article examines how the Stonehenge Alliance sought to engage in what we define as scalar manoeuvres, evidenced by scale jumping and scalar alignments with more powerful players further up the heritage hierarchy in order to exert leverage over the future status of the World Heritage Site.
The anesthesia workstation, commonly referred to as the “anesthesia machine,” is a complex and highly specialized piece of equipment that has few parallels in medical practice. It is, in essence, a device to control the delivery of medical gases to patients, including oxygen, air, nitrous oxide, and volatile anesthetics, along with a specialized ventilator adapted to operating room conditions. The safe use of the anesthesia workstation requires proper training, preuse checkout, and continuous monitoring of its function. The medical literature is replete with examples of patient harm from inappropriate use of the anesthesia workstation and from mechanical or electrical failure of its components. Additionally, volatile anesthetics, while valuable in medical practice, have a very low therapeutic index and can produce severe, even fatal, side effects when administered improperly. Finally, many patients under general anesthesia are paralyzed for surgery and ventilated through an endotracheal tube. Their safety depends entirely on the anesthesia professional’s use of the anesthesia workstation to deliver breathing gases, remove carbon dioxide from exhaled gas, and precisely administer volatile anesthetics.