Background: Currently there are no disease-modifying treatments for synucleinopathies, including Parkinson’s disease dementia (PDD). Carrying a mutation in the GBA gene (beta-glucocerebrosidase/GCase) is a leading risk factor for synucleinopathies. Raising GCase activity lowers α-synuclein levels in cells and animal models. Ambroxol is a pharmacological chaperone for GCase and can raise GCase levels. Our goal is to test Ambroxol as a disease-modifying treatment in PDD. Methods: We randomized fifty-five individuals with PDD to Ambroxol 1050 mg/day, 525 mg/day, or placebo for 52 weeks. Primary outcome measures included safety, the Alzheimer’s Disease Assessment Scale-cognitive subscale (ADAS-Cog) and the Clinician’s Global Impression of Change (CGIC). Secondary outcomes included pharmacokinetics, cognitive and motor outcomes, and plasma and CSF biomarkers. Results: Ambroxol was well tolerated. There were 7 serious adverse events (SAEs), none deemed related to Ambroxol. GCase activity was increased in white blood cells by ~1.5-fold. There were no differences between groups on primary outcome measures. Patients receiving high-dose Ambroxol appeared better on the Neuropsychiatric Inventory. GBA carriers appeared to improve on some cognitive tests. pTau 181 was reduced in CSF. Conclusions: Ambroxol was safe and well tolerated in PDD. Ambroxol may improve biomarkers and cognitive outcomes in GBA1 mutation carriers. Ambroxol improved some biomarkers. ClinicalTrials.gov NCT02914366
This work examines the ability of commercial zeolite Y to act as a slow-release agent for a number of anthelmintic drugs. Administration of pyrantel and/or fenbendazole loaded onto zeolite Y to rats dosed with Nippostrongylus brasiliensis, and of dichlorvos (DDVP) loaded onto zeolite Y to pigs dosed with Ascaris and Oesophagostomum, was more successful in killing adult worms than administration of the pure drug alone. The zeolite Y was used as supplied for initial studies and then dealuminated for further studies. The drug loadings were monitored by thermal analysis and the loaded zeolites were used in several field trials. The results indicate that zeolite Y is a suitable vehicle for the slow release of some anthelmintics. The slow release of drug from the zeolite matrix improved its efficacy.
Animal rescue shelters provide temporary housing for thousands of stray and abandoned dogs every year. Many of these animals fail to find new homes and are forced to spend long periods of time in kennels. This study examined the influence of the length of time spent in a rescue shelter (<1 month, 2-12 months, 1-5 years, >5 years) on the behaviour of 97 dogs. The dogs’ position in their kennels (front, back), their activity (moving, standing, sitting, resting, sleeping), and their vocalisation (barking, quiet, other) were recorded over a 4 h period at 10 min intervals. The dogs’ behaviour was significantly related to the length of time the animals had spent in the rescue shelter. Dogs housed in the shelter for over five years spent more of their time at the back of their kennels, more time resting, and less time barking than dogs housed in the shelter for shorter periods of time. The age of the dog could not account for the significant results found, suggesting that environmental factors were responsible for the change in the dogs’ behaviour. The findings suggest that lengthy periods of time spent in a captive environment may encourage dogs to behave in a manner that is generally considered unattractive by potential buyers. This may decrease the chances of such dogs being adopted, resulting in longer periods of time spent in the kennel environment and the possible development of further undesirable behaviours.
This study explored the influence of five types of auditory stimulation (human conversation, classical music, heavy metal music, pop music, and a control) on the behaviour of 50 dogs housed in a rescue shelter. The dogs were exposed to each type of auditory stimulation for 4 h, with an intervening period of one day between conditions. The dogs’ position in their kennels (front, back), their activity (moving, standing, sitting, resting, sleeping), and their vocalisation (barking, quiet, other) were recorded over 4 h at 10 min intervals during each condition of auditory stimulation. The dogs’ activity and vocalisation were significantly related to auditory stimulation. Dogs spent more time resting and less time standing when classical music was played than when any of the other stimuli were played. Exposure to heavy metal music encouraged dogs to spend significantly more of their time barking than did other types of auditory stimulation. Classical music resulted in dogs spending significantly more of their time quiet than did other types of auditory stimulation. It is suggested that the welfare of sheltered dogs may be enhanced through exposure to appropriate forms of auditory stimulation. Classical music appears particularly beneficial, resulting in activities suggestive of relaxation and behaviours that are considered desirable by potential buyers. This form of music may also appeal to visitors, resulting in enhanced perceptions of the rescue shelter's environment and an increased desire to adopt a dog from such a source.
Every year sees an increase in the number of dogs admitted to rescue shelters. However well these dogs are cared for in the shelter, it cannot be ignored that being in such a situation is stressful, and the time spent in the shelter may change the dogs’ behaviour, which may in turn influence their chances of being bought from the shelter. This research examined the behaviour of stray and unwanted dogs on their first, third and fifth days in an Ulster Society for the Prevention of Cruelty to Animals (USPCA) shelter. A questionnaire was also distributed to members of the public to determine how popular the USPCA was as a place from where to purchase a dog, and what factors about a dog's physical characteristics, behaviour and environment influenced potential buyers. Results revealed no significant difference between the behaviour of stray and unwanted dogs, although the public viewed stray dogs as much less desirable than unwanted dogs. Time in the shelter had no adverse effects on the dogs’ behaviour. Indeed, those changes which did occur during captivity, dogs being more relaxed in the presence of people and eating food more quickly, may be considered positive changes. The USPCA was viewed as a popular place from which to buy a dog. Of factors influencing the public's choice, the dog's environment and behaviour appeared more important than its physical characteristics. The presence of a toy in the dog's cage greatly increased the public's preference for the dog, although the toy was ignored by the dog. The welfare implications of sheltering dogs are discussed.
HIV and severe wasting are associated with post-discharge mortality and hospital readmission among children with complicated severe acute malnutrition (SAM); however, the reasons remain unclear. We assessed body composition at hospital discharge, stratified by HIV and oedema status, in a cohort of children with complicated SAM in three hospitals in Zambia and Zimbabwe. We measured skinfold thicknesses and bioelectrical impedance analysis (BIA) to investigate whether fat and lean mass were independent predictors of time to death or readmission. Cox proportional hazards models were used to estimate the association between death/readmission and discharge body composition. Mixed effects models were fitted to compare longitudinal changes in body composition over 1 year. At discharge, 284 and 546 children had complete BIA and skinfold measurements, respectively. Low discharge lean and peripheral fat mass were independently associated with death/hospital readmission. Each unit Z-score increase in impedance index and triceps skinfolds was associated with 48 % (adjusted hazard ratio 0·52, 95 % CI (0·30, 0·90)) and 17 % (adjusted hazard ratio 0·83, 95 % CI (0·71, 0·96)) lower hazard of death/readmission, respectively. HIV-positive v. HIV-negative children had lower gains in sum of skinfolds (mean difference −1·49, 95 % CI (−2·01, −0·97)) and impedance index Z-scores (–0·13, 95 % CI (−0·24, −0·01)) over 52 weeks. Children with non-oedematous v. oedematous SAM had lower mean changes in the sum of skinfolds (–1·47, 95 % CI (−1·97, −0·97)) and impedance index Z-scores (–0·23, 95 % CI (−0·36, −0·09)). Risk stratification to identify children at risk for mortality or readmission, and interventions to increase lean and peripheral fat mass, should be considered in the post-discharge care of these children.
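The percentage reductions quoted above follow directly from the adjusted hazard ratios (a hazard ratio of 0·52 corresponds to a 1 − 0·52 = 48 % lower hazard). The short sketch below is illustrative only, using the point estimates copied from the abstract; it is not part of the original analysis.

```python
# Illustrative only: converting the adjusted hazard ratios reported in the
# abstract into the corresponding percentage reductions in hazard of
# death/readmission (per unit Z-score increase in each predictor).
adjusted_hazard_ratios = {
    "impedance index (per Z-score)": 0.52,   # reported as ~48 % lower hazard
    "triceps skinfold (per Z-score)": 0.83,  # reported as ~17 % lower hazard
}

for predictor, ahr in adjusted_hazard_ratios.items():
    reduction_pct = (1.0 - ahr) * 100.0
    print(f"{predictor}: aHR {ahr:.2f} -> ~{reduction_pct:.0f}% lower hazard")
```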
Over the last 25 years, radiowave detection of neutrino-generated signals, using cold polar ice as the neutrino target, has emerged as perhaps the most promising technique for detection of extragalactic ultra-high energy neutrinos (corresponding to neutrino energies in excess of 0.01 joules, or $10^{17}$ electron volts). During the summer of 2021 and in tandem with the initial deployment of the Radio Neutrino Observatory in Greenland (RNO-G), we conducted radioglaciological measurements at Summit Station, Greenland to refine our understanding of the ice target. We report the result of one such measurement, the radio-frequency electric field attenuation length $L_\alpha$. We find an approximately linear dependence of $L_\alpha$ on frequency with the best fit of the average field attenuation for the upper 1500 m of ice: $\langle L_\alpha \rangle = ( ( 1154 \pm 121) - ( 0.81 \pm 0.14) \, ( \nu /{\rm MHz}) ) \,{\rm m}$ for frequencies $\nu \in [145, 350]$ MHz.
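For a sense of scale, the linear fit quoted above can be evaluated directly. The sketch below simply plugs illustrative frequencies within the reported 145-350 MHz range into the best-fit coefficients from the abstract; the quoted uncertainties are ignored here and the sample frequencies are not from the paper.

```python
# Sketch: evaluating the reported best-fit average attenuation length
# <L_alpha> = (1154 - 0.81 * (nu / MHz)) m for the upper 1500 m of ice.
# Coefficients are taken from the abstract; frequencies are illustrative.

def mean_attenuation_length_m(freq_mhz: float) -> float:
    """Best-fit <L_alpha> in metres for a frequency given in MHz."""
    return 1154.0 - 0.81 * freq_mhz

for f_mhz in (145, 250, 350):  # within the reported fit range
    print(f"{f_mhz} MHz: <L_alpha> ~ {mean_attenuation_length_m(f_mhz):.0f} m")
```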
Poor transition planning contributes to discontinuity of care at the child–adult mental health service boundary (SB), adversely affecting mental health outcomes in young people (YP). The aim of the study was to determine whether managed transition (MT) improves mental health outcomes of YP reaching the child/adolescent mental health service (CAMHS) boundary compared with usual care (UC).
Methods
A two-arm cluster-randomised trial (ISRCTN83240263 and NCT03013595) with clusters allocated 1:2 between MT and UC. Recruitment took place in 40 CAMHS (eight European countries) between October 2015 and December 2016. Eligible participants were CAMHS service users who were receiving treatment or had a diagnosed mental disorder, had an IQ ⩾ 70 and were within 1 year of reaching the SB. MT was a multi-component intervention that included CAMHS training, systematic identification of YP approaching the SB, a structured assessment (Transition Readiness and Appropriateness Measure) and sharing of information between CAMHS and adult mental health services. The primary outcome was the HoNOSCA (Health of the Nation Outcome Scale for Children and Adolescents) score 15 months post-entry to the trial.
Results
The mean difference in HoNOSCA scores between the MT and UC arms at 15 months was −1.11 points (95% confidence interval −2.07 to −0.14, p = 0.03). The cost of delivering the intervention was relatively modest (€17–€65 per service user).
Conclusions
MT led to improved mental health of YP after the SB but the magnitude of the effect was small. The intervention can be implemented at low cost and form part of planned and purposeful transitional care.
Functional benefits of the morphologies described by Bergmann's and Allen's rules in human males have recently been reported. However, the functional implications of ecogeographical patterning in females remain poorly understood. Here, we report the findings of preliminary work analysing the association between body shape and performance in female ultramarathon runners (n = 36) competing in hot and cold environments. The body shapes differed between finishers of hot and cold races, and also between hot race finishers and non-finishers. Variability in race performance across different settings supports the notion that the human phenotype is adapted to different thermal environments, as previously reported ecogeographical patterns suggest. This report provides support for the recent hypothesis that the heightened thermal strain associated with prolonged physical activity in hot/cold environments may have driven the emergence of thermally adaptive phenotypes in our evolutionary past. These results also tentatively suggest that the relationship between morphology and performance may be stronger in female vs. male athletes. This potential sex difference is discussed with reference to the unique evolved energetic context of human female reproduction. Further work, with a larger sample size, is required to investigate the observed potential sex differences in the strength of the relationship between phenotype and performance.
An impairment in recognizing distress is implicated in the development and severity of antisocial behavior. It has been hypothesized that a lack of attention to the eyes plays a role, but supporting evidence is limited. We developed a computerized training to improve emotion recognition in children and examined the role of eye gaze before and after training. Children referred into an intervention program to prevent antisocial outcomes completed an emotion recognition task with concurrent eye tracking. Those with emotion recognition impairments (n = 54, mean age: 8.72 years, 78% male) completed the training, while others (n = 38, mean age: 8.95 years, 84% male) continued with their usual interventions. Emotion recognition and eye gaze were reassessed in all children 8 weeks later. Impaired negative emotion recognition was significantly related to severity of behavioral problems at pretest. Children who completed the training significantly improved in emotion recognition; eye gaze did not contribute to impairment or improvement in emotion recognition. This study confirms the role of emotion recognition in severity of disruptive behavior and shows that a targeted intervention can quickly improve emotion impairments. The training works by improving children's ability to appraise emotional stimuli rather than by influencing their visual attention.
One of the problems in the study of mammary carcinomas is the cellular heterogeneity which they present. The use of a specific histochemical marker can help to distinguish the two principal cellular types found in the mammary gland: the epithelial and myoepithelial cells.
Mammary glands of female Balb/C mice of different ages were removed and fixed in 1.5% glutaraldehyde in 0.1M cacodylate buffer pH 7.2 with 1% sucrose. After two hours of fixation at 4°C, the material was washed in 0.1M cacodylate buffer, sectioned with a Smith-Farquar microtome set at 70-100 µm, and incubated for the histochemical detection of ATPases. The tissue was then washed again with 0.1M cacodylate buffer, postfixed in 1% osmium tetroxide, dehydrated, and embedded in an Epon-Araldite mixture.
For the Mg++ dependent ATPase the conventional method of Wachstein and Meisel was followed. The Mg++ dependent ATPase is localized in the plasma membranes of the epithelial and myoepithelial cells (Fig. 1).
Introduction: Selecting appropriate patients for hospitalization following emergency department (ED) evaluation of syncope is critical for serious adverse event (SAE) identification. The primary objective of this study was to determine the association of hospitalization and SAE detection using propensity score (PS) matching. The secondary objective was to determine if SAE identification with hospitalization varied by the Canadian Syncope Risk Score (CSRS) risk category. Methods: This was a secondary analysis of two large prospective cohort studies that enrolled adults (age ≥ 16 years) with syncope at 11 Canadian EDs. Patients with a serious condition identified during the index ED evaluation were excluded. The outcome was a 30-day SAE identified either in-hospital for hospitalized patients or after ED disposition for discharged patients, and included death, ventricular arrhythmia, non-lethal arrhythmia and non-arrhythmic SAE (myocardial infarction, structural heart disease, pulmonary embolism, hemorrhage). Patients were propensity matched using age, sex, blood pressure, prodrome, presumed ED diagnosis, ECG abnormalities, troponin, heart disease, hypertension, diabetes, arrival by ambulance and hospital site. Multivariable logistic regression assessed the interaction between CSRS and SAE detection, and we report odds ratios (OR). Results: Of the 8183 patients enrolled, 743 (9.0%) patients were hospitalized and 658 (88.6%) were PS matched. The OR for SAE detection for hospitalized patients in comparison to those discharged from the ED was 5.0 (95%CI 3.3, 7.4), for non-lethal arrhythmia 5.4 (95%CI 3.1, 9.6) and for non-arrhythmic SAE 6.3 (95%CI 2.9, 13.5). Overall, the odds of any SAE identification, and specifically of non-lethal arrhythmia and non-arrhythmic SAE, were significantly higher in-hospital among hospitalized patients than among those discharged from the ED (p < 0.001). There were no significant differences in 30-day mortality (p = 1.00) or ventricular arrhythmia detection (p = 0.21). The interaction between ED disposition and CSRS was significant (p = 0.04) and the probability of 30-day SAEs while in-hospital was greater for medium- and high-risk CSRS patients. Conclusion: In this multicenter prospective cohort, 30-day SAE detection was greater for hospitalized compared with discharged patients. CSRS low-risk patients are least likely to have SAEs identified in-hospital; out-patient monitoring for moderate-risk patients requires further study.
Introduction: An increasing number of Canadian paramedic services are creating Community Paramedic programs targeting treatment of long-term care (LTC) patients on-site. We explored the characteristics, clinical course and disposition of LTC patients cared for by paramedics during an emergency call, and the possible impact of Community Paramedic programs. Methods: We completed a health records review of paramedic call reports and emergency department (ED) records between April 1, 2016 and March 31, 2017. We utilized paramedic dispatch data to identify emergency calls originating from LTC centers resulting in transport to one of the two EDs of the Ottawa Hospital. We excluded patients with absent vital signs, a Canadian Triage and Acuity Scale (CTAS) score of 1, and those whose transfer to hospital was deferrable or scheduled. We stratified the remaining cases by month and selected cases using a random number generator to meet our a priori sample size. We collected data using a piloted standardized form. We used descriptive statistics and categorized patients into groups based on the ED care received and whether the treatment received fit into current paramedic medical directives. Results: Characteristics of the 381 included patients were mean age 82.5 years, 58.5% female, 59.7% hypertension, 52.6% dementia and 52.1% cardiovascular disease. On arrival at hospital, 57.7% of patients waited in offload delay for a median time of 45 minutes (IQR 33.5-78.0). We identified 4 groups: 1) patients requiring no treatment or diagnostics in the ED (7.9%); 2) patients receiving ED treatment within current paramedic medical directives and no diagnostics (3.2%); 3) patients requiring diagnostics or ED care outside current paramedic directives (54.9%); and 4) patients requiring admission (34.1%). Most patients were discharged from the ED (65.6%), and 1.1% died. The main ED diagnoses were infection (18.6%) and musculoskeletal injury (17.9%). Of the patients that required ED care but were discharged, 64.1% required x-rays, 42.1% CT, and 3.4% ultrasound. ED care included intravenous fluids (35.7%), medication (67.5%), antibiotics (29.4%), non-opioid analgesics (29.4%) and opioids (20.7%). Overall, 11.1% of patients did not need management beyond current paramedic capabilities. Conclusion: Many LTC patients could receive care by paramedics on-site within current medical directives and avoid a transfer to the ED. This group could potentially grow using Community Paramedics with an expanded scope of practice.
Introduction: Emergency department (ED) crowding, long waits for care, and paramedic offload delay are of increasing concern. Older adults living in long-term care (LTC) are more likely to utilize the ED and are vulnerable to adverse events. We sought to identify existing programs that seek to avoid ED visits from LTC facilities where allied health professionals are the primary providers of the intervention, and to evaluate their efficacy and safety. Methods: We completed this systematic review based on a protocol we published a priori and following the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) statement. We systematically searched Medline, CINAHL and EMBASE with terms relating to long-term care, emergency services, hospitalization and allied health personnel. Two investigators independently selected studies, extracted data using a piloted standardized form and evaluated the risk of bias of included studies. We report a narrative synthesis grouped by intervention categories. Results: We reviewed 11,176 abstracts and included 22 studies. Most studies were observational and few assessed patient safety. We found five categories of interventions: 1) use of advanced practice nursing; 2) a program called Interventions to Reduce Acute Care Transfers (INTERACT); 3) end-of-life care; 4) condition-specific interventions; and 5) use of extended care paramedics. Of the 13 studies that reported ED visits, all (100%) reported a decrease, and of the 17 that reported hospitalization, 16 (94.1%) reported a decrease. Patient adverse events such as functional status and relapse were seldom reported (6/22), as were measures of emergency system function such as crowding/inability of paramedics to transfer care to the ED (1/22). Only 4/22 studies evaluated patient mortality and 3/4 found a non-statistically significant worsening. When measured, studies reported decreased hospital length of stay, more time spent with patients by allied health professionals and cost savings. Conclusion: We found five types of programs/interventions which all demonstrated a decrease in ED visits or hospitalization. Many identified programs focused on improved primary care for patients. Interventions addressing acute care issues such as those provided by community paramedics, patient preferences, and quality of life indicators all deserve more study.
As consumer-directed care programmes become increasingly common in aged care provision, there is a heightened requirement for literature summarising the experience and perspectives of recipients. We conducted rapid evidence reviews on two components of consumer experience of home- and community-based aged care: (a) drivers of choice when looking for a service (Question 1 (Q1)); and (b) perceptions of quality of services (Question 2 (Q2)). We systematically searched MEDLINE and EMBASE databases, and conducted manual (non-systematic) searches of primary and grey literature (e.g. government reports) across the CINAHL, Scopus, PsycINFO, Web of Science, Trove and OpenGrey databases. Articles deemed eligible after abstract/full-text screening subsequently underwent risk-of-bias assessment to ensure their quality. The final included studies (Q1: N = 21; Q2: N = 19) comprised both quantitative and qualitative articles, which highlighted that consumer choices of services are driven by a combination of: desire for flexibility in service provision; optimising mobility; need for personal assistance, security and safety, interaction, and social/leisure activities; and the desire to target and address previously unmet needs. Similarly, consumer perspectives of quality include control and autonomy, interpersonal interactions, flexibility of choice, and safety and affordability. Our reviews suggest that future model development should take into account consumers’ freedom to choose services in a flexible manner, and the value they place on interpersonal relationships and social interaction.
Both extinct and extant hominin populations display morphological features consistent with Bergmann's and Allen's Rules. However, the functional implications of the morphologies described by these ecological laws are poorly understood. We examined this through the lens of endurance running. Previous research concerning endurance running has focused on locomotor energetic economy. We considered a less-studied dimension of functionality, thermoregulation. The performance of male ultra-marathon runners (n = 88) competing in hot and cold environments was analysed with reference to expected thermoregulatory energy costs and the optimal morphologies predicted by Bergmann's and Allen's Rules. Ecogeographical patterning supporting both principles was observed in thermally challenging environments. Finishers of hot-condition events had significantly longer legs than finishers of cold-condition events. Furthermore, hot-condition finishers had significantly longer legs than those failing to complete hot-condition events. A degree of niche-picking was evident; athletes may have tailored their event entry choices in accordance with their previous race experiences. We propose that the interaction between prolonged physical exertion and hot or cold climates may induce powerful selective pressures driving morphological adaptation. The resulting phenotypes reduce thermoregulatory energetic expenditure, allowing diversion of energy to other functional outcomes such as faster running.
Introduction: In-hospital cardiac arrest (IHCA) most commonly occurs in non-monitored areas, where we observed a 10 min delay before defibrillation (Phase I). Nurses (RNs) and respiratory therapists (RTs) cannot legally use Automated External Defibrillators (AEDs) during IHCA without a medical directive. We sought to evaluate IHCA outcomes following usual implementation (Phase II) vs. a Theory-Based educational program (Phase III) allowing RNs and RTs to use AEDs during IHCA. Methods: We completed a pragmatic before-after study of consecutive IHCA. We used ICD-10 codes to identify potentially eligible cases and included IHCA cases for which resuscitation was attempted. We obtained consensus on all data definitions before initiation of standardized-piloted data extraction by trained investigators. Phase I (Jan. 2012-Aug. 2013) consisted of baseline data. We implemented the AED medical directive in Phase II (Sept. 2013-Aug. 2016) using usual implementation strategies. In Phase III (Sept. 2016-Dec. 2017) we added an educational video informed by key constructs from a Theory of Planned Behavior survey. We report univariate comparisons of Utstein IHCA outcomes using 95% confidence intervals (CI). Results: There were 753 IHCA for which resuscitation was attempted, with the following similar characteristics (Phase I n = 195; II n = 372; III n = 186): median age 68, 60.0% male, 79.3% witnessed, 29.7% non-monitored medical ward, 23.9% cardiac cause, 47.9% initial rhythm of pulseless electrical activity and 27.2% ventricular fibrillation/tachycardia (VF/VT). Comparing Phases I, II and III: an AED was used 0 times (0.0%), 21 times (5.6%), 15 times (8.1%); time to 1st rhythm analysis was 6 min, 3 min, 1 min; and time to 1st shock was 10 min, 10 min and 7 min. Comparing Phases I and III: time to 1st shock decreased by 3 min (95%CI -7; 1), sustained ROSC increased from 29.7% to 33.3% (AD 3.6%; 95%CI -10.8; 17.8), and survival to discharge increased from 24.6% to 25.8% (AD 1.2%; 95%CI -7.5; 9.9). In the VF/VT subgroup, time to first shock decreased from 9 to 3 min (AD -6 min; 95%CI -12; 0) and survival increased from 23.1% to 38.7% (AD 15.6%; 95%CI -4.3; 35.4). Conclusion: The implementation of a medical directive allowing for AED use by RNs and RTs successfully improved key outcomes for IHCA victims, particularly following the Theory-Based education video. The expansion of this project to other hospitals and health care professionals could significantly impact survival for VF/VT patients.
Introduction: Patients with major bleeding (e.g. gastrointestinal bleeding and intracranial hemorrhage [ICH]) are commonly encountered in the Emergency Department (ED). A growing number of patients are on either oral or parenteral anticoagulation (AC), but the impact of AC on outcomes of patients with major bleeding is unknown. With regard to oral anticoagulation (OAC), we particularly sought to analyze differences between patients on Warfarin or Direct Oral Anticoagulants (DOACs). Methods: We analyzed a prospectively collected registry (2011-2016) of patients who presented to the ED with major bleeding at two academic hospitals. “Major bleeding” was defined by the International Society on Thrombosis and Haemostasis criteria. The primary outcome, in-hospital mortality, was analyzed using a multivariable logistic regression model. Secondary outcomes included discharge to long-term care among survivors, total hospital length of stay (LOS) among survivors, and total hospital costs. Results: 1,477 patients with major bleeding were included. AC use was found among 215 patients (14.6%). Among OAC patients (n = 181), 141 (77.9%) had used Warfarin, and 40 (22.1%) had used a DOAC. 484 patients (32.8%) died in-hospital. AC use was associated with higher in-hospital mortality (adjusted odds ratio [OR]: 1.50 [1.17-1.93]). Among survivors to discharge, AC use was associated with higher discharge to long-term care (adjusted OR: 1.73 [1.18-2.57]), prolonged median LOS (19 days vs. 16 days, P = 0.03), and higher mean costs ($69,273 vs. $58,156, P = 0.02). With regard to OAC, a higher proportion of ICH was seen among patients on Warfarin (39.0% vs. 32.5%), as compared to DOACs. No difference in mortality was seen between DOACs and Warfarin (adjusted OR: 0.84 [0.40-1.72]). Patients with major bleeding on Warfarin had longer median LOS (11 days vs. 6 days, P = 0.03) and higher total costs ($51,524 vs. $35,176, P < 0.01) than patients on DOACs. Conclusion: AC use was associated with higher mortality among ED patients with major bleeding. Among survivors, AC use was associated with increased LOS, costs, and discharge to long-term care. Among OAC patients, no difference in mortality was found. Warfarin was associated with prolonged LOS and costs, likely secondary to a higher incidence of ICH, as compared to DOACs.
BACKGROUND: IGTS is a rare phenomenon of paradoxical germ cell tumor (GCT) growth during or following treatment despite normalization of tumor markers. We sought to evaluate the frequency, clinical characteristics and outcome of IGTS in patients in 21 North-American and Australian institutions. METHODS: Patients with IGTS diagnosed from 2000-2017 were retrospectively evaluated. RESULTS: Out of 739 GCT diagnoses, IGTS was identified in 33 patients (4.5%). IGTS occurred in 9/191 (4.7%) mixed-malignant GCTs, 4/22 (18.2%) immature teratomas (ITs), 3/472 (0.6%) germinomas/germinomas with mature teratoma, and in 17 secreting non-biopsied tumours. Median age at GCT diagnosis was 10.9 years (range 1.8-19.4). Male gender (84%) and pineal location (88%) predominated. Of 27 patients with elevated markers, median serum AFP and Beta-HCG were 70 ng/mL (range 9.2-932) and 44 IU/L (range 4.2-493), respectively. IGTS occurred at a median time of 2 months (range 0.5-32) from diagnosis, during chemotherapy in 85%, radiation in 3%, and after treatment completion in 12%. Surgical resection was attempted in all, leading to gross total resection in 76%. Most patients (79%) resumed GCT chemotherapy/radiation after surgery. At a median follow-up of 5.3 years (range 0.3-12), all but 2 patients are alive (1 succumbed to progressive disease, 1 to malignant transformation of GCT). CONCLUSION: IGTS occurred in less than 5% of patients with GCT and most commonly after initiation of chemotherapy. IGTS was more common in patients with IT-only on biopsy than with mixed-malignant GCT. Surgical resection is a principal treatment modality. Survival outcomes for patients who developed IGTS are favourable.
Introduction: The Canadian Syncope Risk Score (CSRS) was developed to identify patients at risk for serious adverse events (SAE) within 30 days of an Emergency Department (ED) visit for syncope. We sought to validate the score in a new cohort of ED patients. Methods: We conducted a multicenter prospective cohort study at 8 large academic tertiary-care EDs across Canada from March 2014 to Dec 2016. We enrolled adults (age ≥ 16 years) who presented within 24 hours of syncope, after excluding those with persistent altered mentation, witnessed seizure, intoxication, and major trauma requiring hospitalization. Treating ED physicians collected the nine CSRS predictors at the index visit. Adjudicated SAE included death, arrhythmias and non-arrhythmic SAE (myocardial infarction, serious structural heart disease, pulmonary embolism, severe hemorrhage and procedural interventions within 30 days). We assessed the area under the Receiver Operating Characteristic (ROC) curve, score calibration, and the classification performance for the various risk categories. Results: Of the 2547 patients enrolled, 146 (5.7%) were lost to follow-up and 111 (4.3%) had a serious condition identified during the index ED visit and were excluded. Among the 2290 analyzed, 79 patients (3.4%; 0.4% death, 1.4% arrhythmia) suffered 30-day serious outcomes after ED disposition. The accuracy of the CSRS remained high, with the area under the ROC curve at 0.87 (95%CI 0.82-0.92), similar to the derivation phase (0.87; 95%CI 0.84-0.89). The score showed excellent calibration at the prespecified risk strata. For the very-low-risk category (0.3% SAE, of which 0.2% were arrhythmia and no deaths) the sensitivity was 97.5% and the negative predictive value was 99.7% (95%CI 98.7-99.9). For the very-high-risk category (61.5% SAE, of which 26.9% were arrhythmia and 11.5% death) the specificity was 99.4% and the positive predictive value was 61.5% (95%CI 43.0-77.2). Conclusion: In this multicenter validation study, the CSRS accurately risk-stratified ED patients with syncope for short-term serious outcomes after ED disposition. The score should aid in minimizing investigation and observation of very-low-risk patients, and prioritization of inpatient vs outpatient investigations or follow-up of the rest. The CSRS is ready for implementation studies examining ED management decisions, patient safety and health care resource utilization.