This study is the first to attempt to isolate a relationship between cognitive activity and equilibration to a Nash equilibrium. Subjects, while undergoing fMRI scans of brain activity, participated in second-price auctions against a single competitor following a predetermined strategy that was unknown to the subject. For this auction there is a unique strategy that will maximize the subjects' earnings, which is also a Nash equilibrium of the associated game-theoretic model of the auction. As is the case with all games, the bidding strategies of subjects participating in second-price auctions most often do not reflect the equilibrium bidding strategy at first but, with experience, typically exhibit a process of equilibration, or convergence toward the equilibrium. This research is focused on the process of convergence.
In the data reported here, subjects participated in sixteen auctions, after which all subjects were told the strategy that would maximize their revenues, the theoretical equilibrium. Following that announcement, sixteen more auctions were conducted. The question posed by the research concerns the mental activity that might accompany equilibration as it is observed in the bidding behavior. Does brain activation differ between equilibrated and non-equilibrated bidding strategies? If so, are there differences in the location of activation during and after equilibration? We found significant activation in the frontal pole, especially in Brodmann's area 10, the anterior cingulate cortex, the amygdala and the basal forebrain. There was significantly more activation in the basal forebrain and the anterior cingulate cortex during the first sixteen auctions than in the second sixteen. The activity in the amygdala shifted from the right side to the left after the solution was given.
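For illustration, the minimal simulation below (not part of the study) shows why truthful bidding maximizes earnings in a second-price auction: the winner pays the opponent's bid, so deviating from one's value either forgoes profitable wins or incurs losses. The opponent's predetermined bids and the value distribution are assumptions chosen only for this sketch.

```python
import random

def payoff(value, bid, opponent_bid):
    """Payoff to the bidder: value minus the price paid if the bid wins, else 0."""
    return value - opponent_bid if bid > opponent_bid else 0.0

def expected_payoff(strategy, trials=100_000):
    total = 0.0
    for _ in range(trials):
        value = random.uniform(0, 100)         # bidder's private value (assumed distribution)
        opponent_bid = random.uniform(0, 100)  # opponent's predetermined bid (assumed distribution)
        total += payoff(value, strategy(value), opponent_bid)
    return total / trials

print("truthful :", expected_payoff(lambda v: v))
print("overbid  :", expected_payoff(lambda v: 1.25 * v))
print("underbid :", expected_payoff(lambda v: 0.75 * v))
```

Running the sketch shows the truthful strategy earning at least as much as either deviation, which is the dominant-strategy (Nash equilibrium) property the abstract refers to.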
Management of primary headache (PHA) varies across emergency departments (ED), yet there is widespread agreement that computed tomography (CT) scans are overused. This study assessed emergency physicians’ (EPs) PHA management and their attitudes towards head CT ordering.
Methods:
A cross-sectional study was undertaken with EPs from one Canadian center. Drivers of physicians’ perceptions regarding the appropriateness of CT ordering for patients with PHA were explored.
Results:
A total of 73 EPs (70% males; 48% with <10 years of practice) participated in the study. Most EPs (88%) did not order investigations for moderate-severe primary headaches; however, CT was the most common investigation (47%) for headaches that did not improve. Computed tomography ordering was frequently motivated by the need for specialist consultation (64%) or admission (64%). A small proportion (27%) believed patients usually/frequently expected a scan. Nearly half of EPs (48%) identified patient imaging expectations/requests as a barrier to reducing CT ordering. Emergency physicians with CCFP (EM) certification were less likely to perceive CT ordering for patients with PHA as appropriate. Conversely, those who identified the possibility of missing a condition as a major barrier to limiting their CT use were more likely to perceive CT ordering for patients with PHA as appropriate.
Conclusions:
Emergency physicians reported consistent, evidence-based medical management. They highlighted the complexities of limiting CT ordering, and both their level of training and their perceived barriers to limiting CT ordering appear to influence their attitudes. Further studies could elucidate these and other factors influencing their practice.
Descriptions of the aetiology of neurodegenerative disorders often focus on the specific molecular and genetic basis of archetypal phenotypes. Clinical trials usually focus on unequivocal and unconfounded cases, through strict inclusion and exclusion criteria. This is understandable in terms of clarity of didactic teaching and clinical trial efficiency. However, comorbidity is the norm, not the exception.
The advent of biomarkers highlighted the pre-symptomatic stage of neurodegenerative disease, among apparently healthy adults with normal cognition (Figure 10.3.1A). This preclinical stage comes before the period of mild early symptoms and signs, which is called the prodromal phase. Where there is a genetic aetiology, one can study healthy pre-symptomatic adult mutation carriers many years before symptoms develop. This is underway through international collaborations such as the Dominantly Inherited Alzheimer Network, the Genetic Frontotemporal Dementia Initiative and the Parkinson’s Progression Markers Initiative genetic cohort. These international collaborations have much in common: large cohorts (500–1,500) of patients and healthy first-degree relatives, followed longitudinally with ‘deep phenotyping’ of multiple blood, cerebrospinal fluid and imaging biomarkers and neuropsychological evaluation.
We have seen that there is a long period of neurodegeneration in the absence of symptoms. This asymptomatic period highlights the issues of functional reserve, resilience and tolerance – that function (cognition, activities and roles etc.) can be maintained despite pathology (molecular pathology, neuroinflammation, synaptic loss, atrophy).
The burden of dementia on health, social and economic well-being is enormous, whether viewed in terms of the 40 million people living with dementia (predicted 75 million by 2030) or the trillion-dollar cost per annum (predicted $2 trillion by 2030) [1]. In many parts of the world, mental health services provide the backbone to dementia diagnosis and management. The sections in this chapter focus on neurodegenerative disorders although several non-degenerative causes of dementia are considered alongside. Neurodegenerative disorders commonly present with changes in personality and behaviour that lead to referral for psychiatric assessment. Other common psychiatric disorders can mimic dementia, or complicate its management.
The syndromes caused by Alzheimer’s disease pathology, dementia with Lewy bodies, frontotemporal lobar degeneration and progressive supranuclear palsy pathology are each diverse. For example, Alzheimer’s disease commonly presents with amnesia (and hippocampal atrophy), but it can also present with visuospatial deficits (posterior cortical atrophy), logopenic progressive aphasia (temporoparietal atrophy), corticobasal syndrome (parkinsonism, dystonia, apraxia and higher cognitive deficits) or a behavioural/dysexecutive syndrome [52].
The dichotomy of dementia diagnosis (healthy adult versus patient) conflicts with the gradually progressive nature of neurodegeneration. There may be a memorable first event – such as a fall, or getting lost – but usually symptoms emerge against an individual’s normal ability and behaviour. In other words, they start mild.
In the seminal report of the Lancet Commission on dementia [54], the identification of contributory risk factors, their effect size and their modifiability led to a startling conclusion: as much as 40% of dementia could be prevented by immediate application of known interventions. Not by a novel disease-modifying treatment, but by a systematic approach to resolving and reducing risk factors across the lifespan. Early-life education; mid-life obesity, hypertension and hearing loss; and later-life exercise, smoking, social isolation, diabetes, depression and pollution contribute to this potential for dementia prevention.
The Swan Point site in interior Alaska contains a significant multi-component archaeological record dating back to 14,200 cal BP. The site’s radiocarbon (14C) chronology has been presented in scattered publications that mostly focus on specific archaeological periods in Alaska, in particular its terminal Pleistocene components associated with the East Beringian tradition. This paper synthesizes the site’s 14C data and provides sequential Bayesian models for its cultural zones and subzones. The 14C and archaeological record at Swan Point attests that the location was persistently used over the last 14,000 years, even though major changes are evident within regional vegetation and local faunal communities, reflecting long-term trends culminating in Dene-Athabascan history.
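As a hedged illustration (not the paper's models), the toy sketch below shows what a sequential, stratigraphically ordered constraint does to age estimates: it rejection-samples normally approximated ages and keeps only draws consistent with the ordering. The subzone ages and errors are hypothetical, and real analyses calibrate against a radiocarbon calibration curve in dedicated software.

```python
import numpy as np

# Toy "sequential" (ordered) constraint on three closely spaced, hypothetical
# subzone ages; deeper subzones must be older than shallower ones.
rng = np.random.default_rng(1)
means = np.array([14100.0, 13900.0, 13700.0])  # mean age (cal BP), deepest subzone first (assumed)
sds = np.array([200.0, 200.0, 200.0])          # 1-sigma errors (assumed)

draws = rng.normal(means, sds, size=(200_000, 3))
# Keep only draws consistent with stratigraphic order (deeper = older).
ordered = draws[(draws[:, 0] > draws[:, 1]) & (draws[:, 1] > draws[:, 2])]
print(np.percentile(ordered, [2.5, 97.5], axis=0))  # constrained 95% ranges per subzone
```

Compared with the unconstrained normal intervals, the constrained ranges are narrower and never overlap out of order, which is the basic benefit a sequential Bayesian model brings to a stratified site chronology.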
Children from low-socioeconomic backgrounds exhibit more behavioural difficulties than those from more affluent families. Influential theoretical models specify family stress and child characteristics as mediating this effect. These accounts, however, have often been based on cross-sectional data or longitudinal analyses that do not capture all potential pathways, and therefore may not provide good policy guidance.
Methods
In a UK representative sample of 2399 children aged 5–15, we tested mediation of the effect of household income on parent and teacher reports of conduct problems (CP) via unhealthy family functioning, poor parental mental health, stressful life events, child physical health and reading ability. We applied cross-lagged longitudinal mediation models which allowed for testing of reciprocal effects whereby the hypothesised mediators were modelled as outcomes as well as predictors of CP.
Results
We found the predicted significant longitudinal effect of income on CP, but no evidence that it was mediated by the child and family factors included in the study. Instead, we found significant indirect paths from income to parental mental health, child physical health and stressful life events that were transmitted via child CP.
Conclusion
The results confirm that income is associated with change in CP but do not support models that suggest this effect is transmitted via unhealthy family functioning, parental mental health, child physical health, stressful life events or reading difficulties. Instead, the results highlight that child CP may be a mediator of social inequalities in family psychosocial functioning.
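The cross-lagged mediation logic described in the Methods above can be sketched in a simplified, piecewise-regression form (the study used full structural equation models). All variable names and the simulated data below are illustrative assumptions, not the study's dataset.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulate two waves of an income -> mediator -> conduct-problems (CP) system.
rng = np.random.default_rng(0)
n = 2399
income = rng.normal(size=n)
m1 = -0.3 * income + rng.normal(size=n)               # e.g. parental mental health, wave 1
cp1 = -0.2 * income + 0.3 * m1 + rng.normal(size=n)   # conduct problems, wave 1
m2 = 0.5 * m1 + 0.2 * cp1 + rng.normal(size=n)        # mediator, wave 2
cp2 = 0.5 * cp1 + 0.2 * m2 - 0.1 * income + rng.normal(size=n)
df = pd.DataFrame(dict(income=income, m1=m1, m2=m2, cp1=cp1, cp2=cp2))

# Each wave-2 variable is regressed on both wave-1 variables, so the hypothesised
# mediator is modelled as an outcome of CP as well as a predictor of it.
cp2_model = smf.ols("cp2 ~ cp1 + m1 + income", data=df).fit()
m2_model = smf.ols("m2 ~ m1 + cp1 + income", data=df).fit()
m1_model = smf.ols("m1 ~ income", data=df).fit()
cp1_model = smf.ols("cp1 ~ income", data=df).fit()

# Product-of-coefficients indirect effects in the two competing directions.
income_to_cp_via_mediator = m1_model.params["income"] * cp2_model.params["m1"]
income_to_mediator_via_cp = cp1_model.params["income"] * m2_model.params["cp1"]
print(income_to_cp_via_mediator, income_to_mediator_via_cp)
```

Comparing the two products is the essence of allowing reciprocal effects: the same data can support income acting on CP through the family factors, or on the family factors through CP, as the abstract's results suggest.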
Antibiotics are among the most common medications prescribed in nursing homes. The annual prevalence of antibiotic use in residents of nursing homes ranges from 47% to 79%, and more than half of antibiotic courses initiated in nursing-home settings are unnecessary or prescribed inappropriately (wrong drug, dose, or duration). Inappropriate antibiotic use is associated with a variety of negative consequences including Clostridioides difficile infection (CDI), adverse drug effects, drug–drug interactions, and antimicrobial resistance. In response to this problem, public health authorities have called for efforts to improve the quality of antibiotic prescribing in nursing homes.
Background: Pneumonia (PNA) is an important cause of morbidity and mortality among nursing home (NH) residents. The McGeer surveillance definitions were revised in 2012 to help NHs better monitor infections for quality improvement purposes. However, the concordance between surveillance definitions and clinically diagnosed PNA has not been well studied. Our objectives were to identify NH residents who met the revised McGeer PNA definition, to compare them with residents with clinician-documented PNA, and to determine whether modifications to the surveillance criteria could increase concordance. Methods: We analyzed respiratory tract infection (RTI) data from 161 nursing homes in 10 states that participated in a 1-day healthcare-associated infection point-prevalence survey in 2017. Trained surveillance officers from the CDC Emerging Infections Program collected data on residents with clinician documentation, signs, symptoms, and diagnostic testing potentially indicating an RTI. Clinician-documented pneumonia was defined as any resident with a diagnosis of pneumonia identified in the medical chart. We identified the proportion of residents with clinician-documented PNA who met the revised McGeer PNA definition. We evaluated the reported criteria to develop 3 modified PNA surveillance definitions (Box) and compared them to residents with clinician-documented PNA.
Results: Among the 15,296 NH residents surveyed, 353 (2%) had ≥1 sign and/or symptom potentially indicating RTI. Among the 353 residents, the average age was 76 years, 105 (30%) were admitted to postacute care or rehabilitation, and 108 (31%) had clinician-documented PNA. Among those with PNA, 28 (26%) met the revised McGeer definition. Among 81 residents who did not meet the definition, 39 (48%) were missing the chest x-ray requirement, and among the remaining 42, only 3 (7%) met the constitutional criteria requirement (Fig. 1). Modification of the constitutional criteria requirement increased the detection of clinician-documented PNA from 28 (26%) to 36 (33%) using modified definition 1; to 51 (47%) for modified definition 2; and to 55 (51%) for modified definition 3. Conclusions: Tracking PNA among nursing home residents using a standard definition is essential to improving detection and, therefore, informing prevention efforts. Modifying the PNA criteria increased the identification of clinically diagnosed PNA. Better concordance with clinically diagnosed PNA may improve provider acceptance and adoption of the surveillance definition, but additional research is needed to test its validity.
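To illustrate why relaxing the constitutional-criteria requirement increases case capture, the sketch below gives a simplified, hypothetical rendering of the definition's structure; the actual revised McGeer criteria and the study's three modified definitions are more detailed than this.

```python
# Simplified, hypothetical surveillance-definition structure (illustration only).
def meets_pna_definition(xray_positive, n_respiratory_criteria,
                         n_constitutional_criteria, require_constitutional=True):
    """Return True if a resident meets this simplified pneumonia definition."""
    if not xray_positive:
        return False                     # imaging requirement
    if n_respiratory_criteria < 1:
        return False                     # at least one respiratory sign/symptom
    if require_constitutional and n_constitutional_criteria < 1:
        return False                     # the requirement the modifications relax
    return True

# Relaxing the constitutional requirement can only add cases, never remove them,
# which is why the modified definitions capture more clinician-documented PNA.
print(meets_pna_definition(True, 2, 0))                                 # False
print(meets_pna_definition(True, 2, 0, require_constitutional=False))   # True
```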
Northern Australia is a region where limited information exists on environments at the last glacial maximum (LGM). Girraween Lagoon is located on the central northern coast of Australia and is a site representative of regional tropical savanna woodlands. Girraween Lagoon remained a perennial waterbody throughout the LGM, and as a result retains a complete proxy record of last-glacial climate, vegetation and fire. This study combines independent palynological and geochemical analyses to demonstrate a dramatic reduction in both tree cover and woody richness, and an expansion of grassland, relative to current vegetation at the site. The process of tree decline was primarily controlled by the cool-dry glacial climate and CO2 effects, though more localised site characteristics restricted wetland-associated vegetation. Fire processes played less of a role in determining vegetation than during the Holocene and modern day, with reduced fire activity consistent with significantly lower biomass available to burn. Girraween Lagoon's unique and detailed palaeoecological record provides the opportunity to explore and assess modelling studies of vegetation distribution during the LGM, particularly where a number of different global vegetation and/or climate simulations are inconsistent for northern Australia, and at a range of resolutions.
Dilophosaurus wetherilli was the largest animal known to have lived on land in North America during the Early Jurassic. Despite its charismatic presence in pop culture and dinosaurian phylogenetic analyses, major aspects of the skeletal anatomy, taxonomy, ontogeny, and evolutionary relationships of this dinosaur remain unknown. Skeletons of this species were collected from the middle and lower part of the Kayenta Formation in the Navajo Nation in northern Arizona. Redescription of the holotype, referred, and previously undescribed specimens of Dilophosaurus wetherilli supports the existence of a single species of crested, large-bodied theropod in the Kayenta Formation. The parasagittal nasolacrimal crests are uniquely constructed by a small ridge on the nasal process of the premaxilla, dorsoventrally expanded nasal, and tall lacrimal that includes a posterior process behind the eye. The cervical vertebrae exhibit serial variation within the posterior centrodiapophyseal lamina, which bifurcates and reunites down the neck. Iterative specimen-based phylogenetic analyses result in each of the additional specimens recovered as the sister taxon to the holotype. When all five specimens are included in an analysis, they form a monophyletic clade that supports the monotypy of the genus. Dilophosaurus wetherilli is not recovered as a ceratosaur or coelophysoid, but is instead a non-averostran neotheropod in a grade with other stem-averostrans such as Cryolophosaurus ellioti and Zupaysaurus rougieri. We did not recover a monophyletic ‘Dilophosauridae.’ Instead of being apomorphic for a small clade of early theropods, it is more likely that elaboration of the nasals and lacrimals of stem-averostrans is plesiomorphically present in early ceratosaurs and tetanurans that share those features. Many characters of the axial skeleton of Dilophosaurus wetherilli are derived compared to Late Triassic theropods and may be associated with macropredation and an increase in body size in Theropoda across the Triassic-Jurassic boundary.
Introduction: Selecting appropriate patients for hospitalization following emergency department (ED) evaluation of syncope is critical for serious adverse event (SAE) identification. The primary objective of this study was to determine the association between hospitalization and SAE detection using propensity score (PS) matching. The secondary objective was to determine whether SAE identification with hospitalization varied by Canadian Syncope Risk Score (CSRS) risk category. Methods: This was a secondary analysis of two large prospective cohort studies that enrolled adults (age ≥ 16 years) with syncope at 11 Canadian EDs. Patients with a serious condition identified during the index ED evaluation were excluded. The outcome was a 30-day SAE identified either in-hospital for hospitalized patients or after ED disposition for discharged patients, and included death, ventricular arrhythmia, non-lethal arrhythmia and non-arrhythmic SAE (myocardial infarction, structural heart disease, pulmonary embolism, hemorrhage). Patients were propensity matched using age, sex, blood pressure, prodrome, presumed ED diagnosis, ECG abnormalities, troponin, heart disease, hypertension, diabetes, arrival by ambulance and hospital site. Multivariable logistic regression assessed the interaction between CSRS and SAE detection, and we report odds ratios (ORs). Results: Of the 8183 patients enrolled, 743 (9.0%) patients were hospitalized and 658 (88.6%) were PS matched. The OR for SAE detection for hospitalized patients in comparison to those discharged from the ED was 5.0 (95% CI 3.3, 7.4); for non-lethal arrhythmia, 5.4 (95% CI 3.1, 9.6); and for non-arrhythmic SAE, 6.3 (95% CI 2.9, 13.5). Overall, the odds of any SAE identification, and specifically of non-lethal arrhythmia and non-arrhythmic SAE, were significantly higher in-hospital among hospitalized patients than among those discharged from the ED (p < 0.001). There were no significant differences in 30-day mortality (p = 1.00) or ventricular arrhythmia detection (p = 0.21). The interaction between ED disposition and CSRS was significant (p = 0.04), and the probability of 30-day SAEs while in-hospital was greater for medium- and high-risk CSRS patients. Conclusion: In this multicenter prospective cohort, 30-day SAE detection was greater for hospitalized compared with discharged patients. CSRS low-risk patients are the least likely to have SAEs identified in-hospital; outpatient monitoring for moderate-risk patients requires further study.
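As a rough sketch of the propensity-score matching step (not the study's code; the column names, covariates and caliper below are illustrative assumptions), 1:1 nearest-neighbour matching without replacement could look like this:

```python
from sklearn.linear_model import LogisticRegression

def propensity_match(df, treatment_col, covariate_cols, caliper=0.05):
    """Match treated (hospitalized) to control (discharged) rows 1:1 on propensity score."""
    model = LogisticRegression(max_iter=1000)
    model.fit(df[covariate_cols], df[treatment_col])
    df = df.assign(ps=model.predict_proba(df[covariate_cols])[:, 1])

    treated = df[df[treatment_col] == 1]
    controls = df[df[treatment_col] == 0].copy()
    pairs = []
    for idx, row in treated.iterrows():
        if controls.empty:
            break
        diffs = (controls["ps"] - row["ps"]).abs()
        best = diffs.idxmin()
        if diffs[best] <= caliper:          # accept the match only within the caliper
            pairs.append((idx, best))
            controls = controls.drop(best)  # match without replacement
    return pairs

# Hypothetical usage with assumed column names:
# pairs = propensity_match(cohort, "hospitalized",
#                          ["age", "sex", "sbp", "troponin", "heart_disease"])
```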
Introduction: Inhaled toxins from tobacco smoking and cannabis smoking, as well as from the use of vaping/e-cigarette products, are known causes of cardio-respiratory injury. While tobacco smoking has decreased among Canadian adults, several other forms of legal inhalant products are now available. While legal, the evidence of benefit and safety of vaping is limited. Of concern, cases of e-cigarette, or vaping, product use-associated lung injury (EVALI) have been accumulating in the U.S. and now in Canada. Despite this, very little is known about the inhalation exposures of emergency department (ED) patients; this study was designed to explore lung health in the ED. Methods: We investigated the prevalence of exposure to vaping, tobacco and cannabis among patients presenting to a Canadian ED from July to November 2019. Ambulatory (CTAS 2 to 5), stable, adult (≥ 17 years) patients were prospectively identified and invited to complete a survey addressing factors related to lung health (previous diagnosis of respiratory conditions and respiratory symptoms at the ED presentation) and information on current exposure to vaping, tobacco and cannabis smoking. Categorical variables are reported as frequencies and percentages; continuous variables are reported as medians with interquartile range (IQR). The study was approved by the Health Research Ethics Board. Results: Overall, 1024 (71%) of 1433 eligible patients completed the survey. The median age was 43.5 (IQR: 29, 60), and 51% were female. A total of 351 (31%) participants reported having been previously diagnosed with ≥1 respiratory conditions, and 177 (17%) were visiting the ED as a result of ≥1 respiratory symptoms (e.g., cough, shortness of breath, wheezing). Daily tobacco smoking was reported by 190 (19%), and 83 (8%) reported using vaping/e-cigarette products. Cannabis use within 30 days was described by 80 (15%) respondents. Exposure to both tobacco and vaping products was reported by 39 (4%) participants, 63 (6%) reported using tobacco in combination with cannabis smoking, and 3% reported combining vaping and cannabis use. Conclusion: Patients seeking care in the ED are exposed to a variety of inhaled toxins. Vaping products, considered the cause of the most recent epidemic of severe lung injury, are used in isolation and in combination with other smoking products in Canada. These exposures should be documented and may increase the risk of lung injury and exacerbations of chronic respiratory conditions.
Introduction: In 2010, Alberta Health Services (AHS) introduced Transition Coordinators (TCs), a unique nursing role focused on assessment of elderly patients to support safe discharge home. The objective of this study was to describe patient characteristics that predict safe discharge for seniors (≥65 years of age) and to identify barriers, with the aim of improving ED outcomes for these patients. Methods: Two trained research assistants conducted a chart review of the TC referral form and the ED Information System (EDIS) for patients seen by TCs between April and June 2017. Information on patient characteristics, existing home care and community services, the index ED visit and subsequent revisits was extracted. Data were entered into a purpose-built database in REDCap. A descriptive analysis was conducted; results are reported as mean ± standard deviation (SD), median (interquartile range [IQR]), or proportions, as appropriate. Results: A total of 1411 patients with TC referral forms were included (779 [55%] female). The majority of these patients were ≥65 years (1350 [96%]), with a mean age of 82 ± 9.6 years. The majority of patients were triaged as CTAS level 3 (835 [59%]), with the most common reasons for presentation including shortness of breath (128 [9%]), abdominal pain (94 [6.7%]), and general weakness (81 [5.7%]). Nearly one third of patients (391 [30%]) were already receiving home care services, and 96 (7%) received a new home care referral as a result of their ED visit. Of all the patients, 1111 (79%) had comorbidities (median: 3 [IQR: 1 to 5]). Overall, 38% (n = 536) of patients had visited the ED in the 12 months prior to the index visit, with a median of 2 (IQR: 1 to 4) visits. On average, patients' length of stay for their index visit was 12 ± 0.35 hours. Admissions occurred for 599 (42%) patients, with delays being common; the mean time between the decision to admit and the patient leaving the ED was 6 ± 0.23 hours. Conclusion: Seniors in the ED are complex patients who experience long lengths of stay and frequent delays in decision-making. Upon discharge, few patients receive referrals to community supports, potentially increasing the likelihood of revisits and readmissions. Future studies should assess whether the presence of TCs is associated with better outcomes in the community.
Introduction: For rhythm control of acute atrial flutter (AAFL) in the emergency department (ED), choices include initial drug therapy or initial electrical cardioversion (ECV). We compared the strategies of pharmacological cardioversion followed by ECV if necessary (Drug-Shock) and ECV alone (Shock Only). Methods: We conducted a randomized, blinded, placebo-controlled trial (1:1 allocation) comparing two rhythm control strategies at 11 academic EDs. We included stable adult patients with AAFL whose symptom onset was <48 hours earlier. Patients underwent central web-based randomization stratified by site. The Drug-Shock group received an infusion of procainamide (15 mg/kg over 30 minutes) followed 30 minutes later, if necessary, by ECV at 200 joules x 3 shocks. The Shock Only group received an infusion of saline followed, if necessary, by ECV x 3 shocks. The primary outcome was conversion to sinus rhythm for ≥30 minutes at any time following onset of infusion. Patients were followed for 14 days. The primary outcome was evaluated on an intention-to-treat basis. Statistical significance was assessed using chi-squared tests and multivariable logistic regression. Results: We randomized 76 patients, and none was lost to follow-up. The Drug-Shock (N = 33) and Shock Only (N = 43) groups were similar for all characteristics including mean age (66.3 vs 63.4 yrs), duration of AAFL (30.1 vs 24.5 hrs), previous AAFL (72.7% vs 69.8%), median CHADS2 score (1 vs 1), and mean initial heart rate (128.9 vs 126.0 bpm). The Drug-Shock and Shock Only groups were similar for the primary outcome of conversion (100% vs 93%; absolute difference 7.0%; 95% CI -0.6 to 14.6; P = 0.25). The multivariable analyses confirmed the similarity of the two strategies (P = 0.19). In the Drug-Shock group, 21.2% of patients converted with the infusion. There were no statistically significant differences in time to conversion (84.2 vs 97.6 minutes), total ED length of stay (9.4 vs 7.5 hours), disposition home (100% vs 95.3%), or stroke within 14 days (0 vs 0). Premature discontinuation of the infusion (usually for transient hypotension) was more common in the Drug-Shock group (9.1% vs 0.0%), but there were no serious adverse events. Conclusion: Both the Drug-Shock and Shock Only strategies were highly effective and safe in allowing AAFL patients to go home in sinus rhythm. IV procainamide alone was effective in only one-fifth of patients, much lower than for acute AF.
Introduction: Management of acute atrial fibrillation or flutter (AFF) in the emergency department (ED) can be performed with chemical or electrical cardioversion. Procainamide is the most common chemical agent used in Canada; however, there is substantial practice variation. The objective of this systematic review was to provide comparative evidence on return to normal sinus rhythm (NSR) and adverse events to better support clinical decisions. Methods: Systematic search of five electronic databases and grey literature. Randomized controlled trials (RCTs) and prospective controlled cohort studies including adults (≥17 years) with recent-onset AFF comparing intravenous procainamide with other cardioversion strategies (e.g., electrical cardioversion, placebo or other antiarrhythmic drugs) were eligible. Two independent reviewers performed study selection and data extraction. Relative risks (RRs) with 95% confidence intervals (CIs) were calculated using a random-effects model. The protocol was registered with PROSPERO (CRD42019142080). Results: From 4060 potentially relevant citations, 7 studies were considered eligible, and 3 RCTs and 2 cohort studies were included in the analysis. Procainamide was less effective in promoting return to NSR at the 1st attempt compared to other chemical (RR 0.76; 95% CI: 0.65 to 0.90) and electrical (RR 0.58; 95% CI: 0.53 to 0.64) options. Electrical cardioversion was more effective in restoring NSR compared to procainamide when used as a 2nd attempt in one RCT (RR 0.46; 95% CI: 0.23 to 0.92). Pre-specified serious adverse events were assessed and reported by two studies, showing that hypotension was more common in patients receiving procainamide than in those receiving electrical cardioversion (RR 20.57; 95% CI: 1.59 to 265.63). Treatment discontinuation due to adverse events was infrequently reported, with only two studies reporting that no patients withdrew from the study following treatment with procainamide. The remaining studies provided incomplete reporting of adverse events. Conclusion: Shared decision-making for patients with acute AFF in the ED requires knowledge of the effectiveness and safety of comparative interventions. Overall, procainamide is less effective than other chemical options and electrical cardioversion strategies in restoring NSR. Evidence shows that hypotension is a concern when procainamide is administered; however, the overall adverse event information provided by the studies is suboptimal.
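The random-effects pooling of relative risks mentioned in the Methods can be sketched as follows; this is a generic DerSimonian–Laird implementation with made-up event counts, not the review's data or code.

```python
import numpy as np

def pooled_rr(events_a, total_a, events_b, total_b):
    """DerSimonian-Laird random-effects pooled relative risk with a 95% CI."""
    events_a, total_a = np.asarray(events_a, float), np.asarray(total_a, float)
    events_b, total_b = np.asarray(events_b, float), np.asarray(total_b, float)
    log_rr = np.log((events_a / total_a) / (events_b / total_b))
    var = 1/events_a - 1/total_a + 1/events_b - 1/total_b        # variance of each log RR
    w = 1 / var                                                   # fixed-effect weights
    fixed = np.sum(w * log_rr) / np.sum(w)
    q = np.sum(w * (log_rr - fixed) ** 2)                         # Cochran's Q
    tau2 = max(0.0, (q - (len(w) - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
    w_star = 1 / (var + tau2)                                     # random-effects weights
    mu = np.sum(w_star * log_rr) / np.sum(w_star)
    se = np.sqrt(1 / np.sum(w_star))
    return np.exp(mu), np.exp(mu - 1.96 * se), np.exp(mu + 1.96 * se)

# Hypothetical NSR counts for three studies (procainamide vs comparator).
print(pooled_rr([20, 35, 50], [60, 80, 100], [30, 45, 70], [60, 80, 100]))
```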