The global population and status of Snowy Owls Bubo scandiacus are particularly challenging to assess because individuals are irruptive and nomadic, and the breeding range is restricted to the remote circumpolar Arctic tundra. The International Union for Conservation of Nature (IUCN) uplisted the Snowy Owl to “Vulnerable” in 2017 because the suggested population estimates appeared considerably lower than historical estimates, and it recommended actions to clarify the population size, structure, and trends. Here we present a broad review and status assessment, an effort led by the International Snowy Owl Working Group (ISOWG) and researchers from around the world, to estimate population trends and the current global status of the Snowy Owl. We use long-term breeding data, genetic studies, satellite-GPS tracking, and survival estimates to assess current population trends at several monitoring sites in the Arctic, and we review the ecology and threats throughout the Snowy Owl range. An assessment of the available data suggests that current estimates of a worldwide population of 14,000–28,000 breeding adults are plausible. Our assessment of population trends at five long-term monitoring sites suggests that breeding populations of Snowy Owls in the Arctic have decreased by more than 30% over the past three generations, and that the species should continue to be categorised as Vulnerable under IUCN Red List Criterion A2. We offer research recommendations to improve our understanding of Snowy Owl biology and future population assessments in a changing world.
Mobile health has been shown to improve quality, access, and efficiency of health care in select populations. We sought to evaluate the benefits of mobile health monitoring using the KidsHeart app in an infant CHD population.
Methods:
We reviewed data submitted to KidsHeart from parents of infants discharged following intervention for high-risk CHD lesions, including subjects status post stage 1 single ventricle palliation, ductal stent or surgical shunt, pulmonary artery band, or right ventricular outflow tract stent. We report on the benefits of a novel mobile health red flag scoring system, mobile health growth/feed tracking, and longitudinal neurodevelopmental outcomes tracking.
Results:
A total of 69 CHD subjects (63% male, 41% non-white, median age 28 days [interquartile range 20, 75 days]) were included with median mobile health follow-up of 137 days (56, 190). During the analytic window, subjects submitted 5700 mobile health red flag notifications including 245 violations (mean [standard deviation] 3 ± 3.96 per participant) with 80% (55/69) of subjects submitting at least one violation. Violations precipitated 116 interventions including hospital admission in 34 (29%) with trans-catheter evaluation in 15 (13%) of those. Growth data (n = 2543 daily weights) were submitted by 63/69 (91%) subjects and precipitated 31 feed changes in 23 participants. Sixty-eight percent of subjects with age >2 months submitted at least one complete neurodevelopment questionnaire.
Conclusion:
In our initial experience, mobile health monitoring using the KidsHeart app enhanced interstage monitoring, permitting earlier intervention; allowed for remote tracking of growth and feeding; and provided a means for tracking longitudinal neurodevelopmental outcomes.
Plant growth requires the integration of internal and external cues, perceived and transduced into a developmental programme of cell division, elongation and wall thickening. Mechanical forces contribute to this regulation, and thigmomorphogenesis typically includes reducing stem height, increasing stem diameter, and a canonical transcriptomic response. We present data on a bZIP transcription factor involved in this process in grasses. Brachypodium distachyon SECONDARY WALL INTERACTING bZIP (SWIZ) protein translocated into the nucleus following mechanostimulation. Classical touch-responsive genes were upregulated in B. distachyon roots following touch, including significant induction of the glycoside hydrolase 17 family, which may be unique to grass thigmomorphogenesis. SWIZ protein binding to an E-box variant in exons and introns was associated with immediate activation followed by repression of gene expression. SWIZ overexpression resulted in plants with reduced stem and root elongation. These data further define plant touch-responsive transcriptomics and physiology, offering insights into grass mechanotransduction dynamics.
Background: Exercise is commonly recommended to patients following a lumbar microdiscectomy, although controversy remains as to the timing and protocols for exercise intervention (early vs late intervention). Our study aimed to evaluate low back pain level, fear avoidance, neurodynamic mobility, and function after early versus later exercise intervention following a unilateral lumbar microdiscectomy. Methods: Forty patients who underwent unilateral lumbar microdiscectomy were randomly allocated to an early (Group-1) or a later (Group-2) exercise intervention group. Low back pain and fear avoidance were evaluated using the Oswestry Low Back Pain Disability Questionnaire, the Numeric Pain Rating Scale, and the Fear-Avoidance Beliefs Questionnaire. Neurodynamic mobility and function were recorded with the Dualer Pro IQ Inclinometer, the 50-foot walk test, and the Patient-Specific Functional Scale. Measurements were performed before surgery and post-surgery (at 1-2, 4-6, and 8-10 weeks) after exercise intervention. Results: Both groups showed a significant decrease in low back pain levels and fear avoidance, as well as a significant improvement in neurodynamic mobility and function, at 4 and 8 weeks post-surgery. No significant difference was detected between the two groups. Conclusions: These findings show that early exercise intervention after lumbar microdiscectomy is safe and may reduce low back pain, decrease fear avoidance, and improve neurodynamic mobility and function.
Introduction: Blood transfusions continue to be a critical intervention in patients presenting to emergency departments (ED). Improved understanding of the adverse events associated with transfusions has led to new research to inform and delineate transfusion guidelines. The Nova Scotia Guideline for Blood Component Utilization in Adults and Pediatrics was implemented in June 2017 to reflect current best practice in transfusion medicine. The guideline includes a lowering of the hemoglobin threshold for transfusion initiation from 80 g/L to 70 g/L, to be used in conjunction with the patient's hemodynamic assessment before and after transfusions. Our study aims to augment understanding of transfusion guideline adherence and ED physician transfusing practices at the Halifax Infirmary Emergency Department in Nova Scotia. Methods: A retrospective chart review was conducted on one third of all ED visits involving red-cell transfusions for one year prior to and one year following the guideline implementation. A total of 350 charts were reviewed. The primary data abstracted from each reviewed chart, for the initial transfusion and any subsequent transfusion, included clinical and laboratory data reflective of the transfusion guideline. Based on these data, each transfusion event was classified in one of three ways: indicated based on hemoglobin level, indicated based on the patient's symptomatic presentation, or unable to determine whether transfusion was indicated based on charting. Results: In the year before guideline implementation, 31 of 146 total transfusions were initiated at a hemoglobin between 71 and 80 g/L. This proportion dropped by 23.6% to 22 of 136 in the year following guideline implementation. The number of single-unit transfusions increased by 28.0%, from 47 of 146 in the year prior to 56 of 136 in the year after guideline implementation.
The proportion of initial transfusions whose indication could not be determined from the charting provided increased by 120%. For subsequent transfusions, the proportion with an indeterminable indication increased by 1500% (P < 0.05). Conclusion: These data suggest that implementing transfusion guidelines effectively reduced the number of transfusions given in the ED setting and increased the number of single-unit transfusions administered. However, the data also suggest the need for better education around transfusion indications and for documentation that clearly outlines the rationale behind the decision to transfuse.
Background: In Canada, injuries represent 21% of Emergency Department (ED) visits. Faced with occupational injuries, physicians may feel pressured to provide urgent imaging to facilitate expedited return to work. There is no body of literature to support this practice. Twenty percent of adult ED injuries involve workers' compensation. Aim Statement: Tacit pressures were felt to impact imaging rates for patients with workplace injuries, and our aim was to determine if this hypothesis was accurate. We conducted a quality review to assess imaging rates among injuries suffered at work and outside work. A secondary aim was to reduce the harm resulting from non-value-added testing. Measures & Design: Information was collected from the Emergency Department Information System on patients over the age of 16 years with acute injuries, including upper limb, lower limb, neck, back and head injuries. Data included both workplace and non-work-related presentations, Canadian Triage and Acuity Scale (CTAS) levels and age at presentation. Imaging included any of X-ray, CT, MRI, or ultrasound ordered in EDs across the central zone of Nova Scotia from July 1, 2009 to June 30, 2019. A total of 282,860 patient encounters were included for analysis. Comparison was made between patients presenting under the Workers’ Compensation Board of Nova Scotia (WCB) and those covered by the Department of Health and Wellness (DOHW). Imaging rates for all injuries were also trended over this ten-year period. Evaluation/Results: In patients between 16 and 65 years, the WCB group underwent more imaging (55.3% of visits) than did the DOHW group (43.1% of visits). In the same cohort, there was an overall decrease of over 10% in mean imaging rates for both WCB and DOHW between the first five-year period (2009-2013) and the second five-year period (2013-2018). Imaging rates for WCB and DOHW converged with each decade beyond 35 years of age.
No comparison was possible beyond 85 years, due to the absence of WCB presentations. Discussion/Impact: Patients presenting to the ED with workplace injuries are imaged at a higher rate than those covered by the DOHW. Campaigns promoting value-added care may have impacted imaging rates during the ten-year study period, explaining the decline in ED imaging for all injuries. While this 10% decrease in overall imaging is encouraging, these preliminary data indicate the need for further education on resource stewardship, especially for patients presenting to the ED with workplace injuries.
Introduction: Choosing Wisely Nova Scotia (CWNS), an affiliate of Choosing Wisely Canada™ (CWC), aims to address unnecessary care and testing through literature-informed lists developed by various disciplines. CWC has identified unnecessary head CTs among the top five interventions to question in the Emergency Department (ED). Zyluk (2015) identified the Canadian CT Head Rule (CCHR) as the most effective clinical decision rule for adults with minor head injuries. To better understand the current status of CCHR use in Nova Scotia, we conducted a retrospective audit of patient charts at the Charles V. Keating Emergency and Trauma Center in Halifax, Nova Scotia. Methods: Our mixed-methods design included a literature review, a retrospective chart audit, and a qualitative audit-feedback component with participating physicians. The chart audit applied the guidelines for adherence to the CCHR and reported on the level of compliance within the ED. Analysis of the qualitative data is included here in parallel, to contextualize findings from the audit. Results: 302 charts of patients who presented to the surveyed site were retrospectively reviewed. Of the 37 cases where a head CT was indicated as per the CCHR, a CT was ordered 32 (86.5%) times. Of the 176 cases where a head CT was not indicated, a CT was not ordered 155 (88.1%) times. Therefore, the CCHR was followed in 187 (87.8%) of the total 213 cases where the CCHR should be applied. Conclusion: Our study reveals adherence to the CCHR in 87.8% of cases at this ED. Identifying contextual factors that facilitate or hinder the application of the CCHR in practice is critical for reducing unnecessary CTs. This work has been presented to the physician group to gain physician engagement and to elucidate enablers of and barriers to guideline adherence. In light of the frequency of head CTs ordered in EDs, even a small reduction would be impactful.
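The adherence figure above combines two cells of a 2×2 decision table (CT indicated vs. CT ordered). A minimal sketch of that arithmetic, using only the counts reported in the abstract:

```python
# CCHR adherence arithmetic from the reported chart-audit counts.
# "Adherent" = CT ordered when indicated, or not ordered when not indicated.
ct_indicated_total = 37          # CCHR says a head CT was warranted
ct_indicated_ordered = 32        # ...and a CT was ordered
ct_not_indicated_total = 176     # CCHR says a head CT was not warranted
ct_not_indicated_skipped = 155   # ...and no CT was ordered

adherent = ct_indicated_ordered + ct_not_indicated_skipped   # 187
applicable = ct_indicated_total + ct_not_indicated_total     # 213
adherence_rate = adherent / applicable

print(f"{adherent}/{applicable} = {adherence_rate:.1%}")  # 187/213 = 87.8%
```

This reproduces the 87.8% adherence quoted in the results.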
A systematic review and network meta-analysis were conducted to assess the relative efficacy of internal or external teat sealants given at dry-off in dairy cattle. Controlled trials were eligible if they assessed the use of internal or external teat sealants, with or without concurrent antimicrobial therapy, compared to no treatment or an alternative treatment, and measured one or more of the following outcomes: incidence of intramammary infection (IMI) at calving, IMI during the first 30 days in milk (DIM), or clinical mastitis during the first 30 DIM. Risk of bias was based on the Cochrane Risk of Bias 2.0 tool with modified signaling questions. From 2280 initially identified records, 32 trials had data extracted for one or more outcomes. Network meta-analysis was conducted for IMI at calving. Use of an internal teat sealant (bismuth subnitrate) significantly reduced the risk of new IMI at calving compared to non-treated controls (RR = 0.36, 95% CI 0.25–0.72). For comparisons between antimicrobial and teat sealant groups, concerns regarding precision were seen. Synthesis of the primary research identified important challenges related to the comparability of outcomes, replication and connection of interventions, and quality of reporting of study conduct.
A systematic review and meta-analysis were conducted to determine the efficacy of selective dry-cow antimicrobial therapy compared to blanket therapy (all quarters/all cows). Controlled trials were eligible if any of the following were assessed: incidence of clinical mastitis during the first 30 DIM, frequency of intramammary infection (IMI) at calving, or frequency of IMI during the first 30 DIM. From 3480 identified records, nine trials had data extracted for IMI at calving. There was an insufficient number of trials to conduct meta-analysis for the other outcomes. Risk of IMI at calving in selectively treated cows was higher than with blanket therapy (RR = 1.34, 95% CI = 1.13, 1.16), but substantial heterogeneity was present (I2 = 58%). Subgroup analysis showed that, for trials using internal teat sealants, there was no difference in IMI risk at calving between groups, and no heterogeneity was present. For trials not using internal teat sealants, there was an increased risk in cows assigned to a selective dry-cow therapy protocol, compared to blanket treatment, with substantial heterogeneity in this subgroup. However, the small number of trials and the heterogeneity in the subgroup without internal teat sealants suggest that the relative risk between treatments may differ from the determined point estimates based on other unmeasured factors.
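The I² statistic quoted above expresses the share of between-trial variability attributable to heterogeneity rather than chance, and is derived from Cochran's Q. A minimal sketch of that computation; the per-trial log risk ratios and variances below are invented for illustration, not data from the review:

```python
def i_squared(log_rrs, variances):
    """Higgins' I^2 from Cochran's Q under inverse-variance weighting."""
    weights = [1 / v for v in variances]
    pooled = sum(w * y for w, y in zip(weights, log_rrs)) / sum(weights)
    q = sum(w * (y - pooled) ** 2 for w, y in zip(weights, log_rrs))
    df = len(log_rrs) - 1
    # I^2 = (Q - df) / Q, truncated at zero
    return max(0.0, (q - df) / q) if q > 0 else 0.0

# Hypothetical log risk ratios and variances from five trials
log_rrs = [0.10, 0.35, 0.05, 0.45, 0.20]
variances = [0.01, 0.02, 0.015, 0.01, 0.02]
print(f"I^2 = {i_squared(log_rrs, variances):.0%}")
```

Values near 0% indicate the trial estimates scatter no more than sampling error predicts; values above roughly 50% are conventionally read as substantial heterogeneity, as in the subgroup without teat sealants.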
A systematic review and network meta-analysis were conducted to assess the relative efficacy of antimicrobial therapy given to dairy cows at dry-off. Eligible studies were controlled trials assessing the use of antimicrobials compared to no treatment or an alternative treatment, and assessing one or more of the following outcomes: incidence of intramammary infection (IMI) at calving, incidence of IMI during the first 30 days in milk (DIM), or incidence of clinical mastitis during the first 30 DIM. Databases and conference proceedings were searched for relevant articles. The potential for bias was assessed using the Cochrane Risk of Bias 2.0 algorithm. From 3480 initially identified records, 45 trials had data extracted for one or more outcomes. Network meta-analysis was conducted for IMI at calving. The use of cephalosporins, cloxacillin, or penicillin with aminoglycoside significantly reduced the risk of new IMI at calving compared to non-treated controls (cephalosporins, RR = 0.37, 95% CI 0.23–0.65; cloxacillin, RR = 0.55, 95% CI 0.38–0.79; penicillin with aminoglycoside, RR = 0.42, 95% CI 0.26–0.72). Synthesis revealed challenges with the comparability of outcomes, replication of interventions, definitions of outcomes, and quality of reporting. The use of reporting guidelines, replication among interventions, and standardization of outcome definitions would increase the utility of primary research in this area.
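The risk ratios and confidence intervals quoted in these syntheses are conventionally computed from 2×2 event tables, with the interval constructed on the log scale. An illustrative sketch; the counts below are hypothetical and not drawn from any trial in the review:

```python
import math

def risk_ratio(events_trt, n_trt, events_ctl, n_ctl, z=1.96):
    """Risk ratio with a Wald-type 95% CI computed on the log scale."""
    rr = (events_trt / n_trt) / (events_ctl / n_ctl)
    # Standard error of log(RR) via the delta method
    se = math.sqrt(1 / events_trt - 1 / n_trt + 1 / events_ctl - 1 / n_ctl)
    lower = math.exp(math.log(rr) - z * se)
    upper = math.exp(math.log(rr) + z * se)
    return rr, lower, upper

# Hypothetical counts: 20/200 new IMI with treatment vs 55/200 untreated
rr, lower, upper = risk_ratio(20, 200, 55, 200)
print(f"RR = {rr:.2f}, 95% CI {lower:.2f}-{upper:.2f}")
```

An RR below 1 with an upper confidence limit below 1, as for the three drug classes above, indicates a statistically significant reduction in risk relative to untreated controls.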
Information retrieval (IR) aims at retrieving documents that are most relevant to a query provided by a user. Traditional techniques rely mostly on syntactic methods. In some cases, however, links at a deeper semantic level must be considered. In this paper, we explore a type of IR task in which documents describe sequences of events, and queries are about the state of the world after such events. In this context, successfully matching documents and queries requires considering the events’ possibly implicit uncertain effects and side effects. We begin by analyzing the problem, then propose an action language-based formalization, and finally automate the corresponding IR task using answer set programming.
Introduction: Stress has been shown to impair performance during acute events. The goal of this pilot study was to investigate the effects of two simulation-based training interventions and baseline demographics (gender, age) on stress responses to simulated trauma scenarios. Methods: Sixteen Emergency Medicine and Surgery residents were randomly assigned to one of two groups: Stress Inoculation Training (SIT) or Crisis Resource Management (CRM). Residents served as trauma team leaders in simulated trauma scenarios pre- and post-intervention. CRM training focused on the non-technical skills required for effective teamwork. SIT training focused on cognitive reappraisal, breathing and mental rehearsal. Training lasted 3 hours, involving brief didactic sessions and practice scenarios with debriefing focused on either CRM or SIT. Stress responses were measured with the State-Trait Anxiety Inventory (anxiety), cognitive appraisal (the degree to which a person interprets a situation as a threat or a challenge) and salivary cortisol levels. Results: Because the pre-intervention stress responses differed between the two groups, the results were analyzed with stepwise regression analyses. The only significant predictor of anxiety and cortisol responses was the residents' appraisal response to that scenario, explaining 31% of the variance in anxiety and cortisol. Appraisals of the post-intervention scenarios were predicted by appraisals of the pre-intervention scenario and by gender, explaining 73% of the variance. Men were more likely than women to appraise the scenarios as threatening. There were no differences in subjective anxiety, cognitive appraisal or salivary cortisol responses as a result of either intervention.
Conclusion: Male residents, as well as those who appraised an initial simulated trauma scenario as threatening, were more likely to interpret a subsequent scenario as threatening and to have larger subjective (anxiety) and physiological (cortisol) responses to it. Neither CRM nor SIT training was effective in overcoming initial appraisals of potentially stressful events.
Introduction: Screening for organ and tissue donation is an essential skill for emergency physicians. In 2015, 4564 individuals were on a waiting list for organ transplant and 242 died while waiting. As Canada's donation rates are less than half those of other comparable countries, it is crucial to ensure we are identifying all potential donors. Patients deceased from poisoning are a source that may not be considered for referral as often as those who die from other causes. This study aims to identify whether patients dying from poisoning represent an under-referred group and to determine what physician characteristics influence referral decisions. Methods: In this cross-sectional unidirectional survey study, physician members of the Canadian Association of Emergency Physicians were invited to participate. Participants were presented with 20 organ donation scenarios that included poisoned and non-poisoned deaths, as well as one ideal scenario for organ or tissue donation used for comparison. Participants were unaware of the objective to explore donation in the context of poisoning deaths. Following the organ donation scenarios, a range of follow-up questions and demographics were included to explore factors influencing the decision to refer or not refer for organ or tissue donation. Results were reported descriptively, and associations between physician characteristics and decisions to refer were assessed using odds ratios and 95% confidence intervals. Results: 208/2058 (10.1%) physicians participated. 25% did not refer in scenarios involving a drug overdose (n=71). Specific poisonings commonly triggering the decision not to refer included palliative care medications (n=34, 18%), acetaminophen (n=42, 22%), chemical exposure (n=48, 27%) and organophosphates (n=87, 48%).
Factors associated with an increased likelihood to refer potential donors following overdose included previous organ and tissue donation training (OR=2.6), having referred in the past (OR=4.3), available donation support (OR=3.9), greater than 10 years of service (OR=2.1), practising in a large urban center (OR=3.8), holding emergency medicine certification (OR=3.6), male gender (OR=2.2), and having indicated a desire to be a donor on government identification (OR=5.8). Conclusion: Scenarios involving drug overdoses were associated with under-referral for organ and tissue donation. As poisoning is not a contraindication to referral, this represents a potential source of donors. By examining the characteristics that put clinicians at risk of under-referring organ or tissue donors, becoming aware of potential biases, improving transplant knowledge, and implementing support and training programs for the organ and tissue donation process, we have the opportunity to improve these rates and reduce morbidity and mortality for Canadians requiring organ or tissue donation.
The purpose of the present study was to evaluate locomotor strategies during development in domestic chickens (Gallus gallus domesticus); we were motivated, in part, by current efforts to improve the design of housing systems for laying hens, which aim to reduce injury and over-exertion. Using four strains of laying hens (Lohmann Brown, Lohmann LSL lite, Dekalb White and Hyline Brown) throughout this longitudinal study, we investigated their locomotor style and climbing capacity in relation to the degree of incline (0 to 70°), age (2 to 36 weeks) and the surface substrate (sandpaper or wire grid). Chicks and adult fowl performed only walking behavior to climb inclines ⩽40° and performed a combination of wing-assisted incline running (WAIR) and aerial ascent on steeper inclines. Fewer birds used their wings to aid their hind limbs when climbing 50° inclines on the wire grid surface compared with sandpaper. The steepness of angle achieved during WAIR and the tendency to fly instead of using WAIR increased with increasing age and experience. White-feathered strains performed more wing-associated locomotor behavior than brown-feathered strains. A subset of birds was never able to climb incline angles >40°, even when using WAIR. We therefore suggest that inclines of up to 40°, which are easily negotiated (without wing use) by chicks and adult fowl, should be provided for hens in three-dimensional housing systems.
Constraint answer set programming (CASP) is an extension of answer set programming that allows for numerical constraints to be added in the rules. PDDL+ is an extension of the PDDL standard language of automated planning for modeling mixed discrete-continuous dynamics. In this paper, we present CASP solutions for dealing with PDDL+ problems, i.e., encoding from PDDL+ to CASP, and extensions to the algorithm of the ezcsp CASP solver in order to solve CASP programs arising from PDDL+ domains. An experimental analysis, performed on well-known linear and non-linear variants of PDDL+ domains, involving various configurations of the ezcsp solver, other CASP solvers, and PDDL+ planners, shows the viability of our solution.
Studies were conducted to calibrate and validate a mathematical model previously developed to predict common lambsquarters seedling emergence at different corn seedbed preparation times. The model was calibrated for different types of soil by adjusting the base temperature of common lambsquarters seedling emergence to the soil texture. A relationship was established with the sand mineral fraction of the soil and was integrated into the model. The calibrated model provided a good fit to the field data and was accurate in predicting cumulative weed emergence in different soil types. The validation was done using data collected independently at a site located 80 km from the original experimental area. There were no differences between observed and predicted values: common lambsquarters emergence was accurately predicted at the 95% probability level. This model is one of the first to take into consideration seedbed preparation time and soil texture. This common lambsquarters emergence model could be adapted to model other weed species whose emergence is limited by low spring temperatures.
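A model of the kind described, thermal-time-driven emergence with a base temperature adjusted by the soil's sand fraction, could be sketched as below. Every functional form and parameter value here is a hypothetical placeholder for illustration; the calibrated equations are those of the original study:

```python
def base_temperature(sand_fraction, t_min=2.0, slope=4.0):
    # Hypothetical linear adjustment: sandier soils are assumed here
    # to shift the base temperature for emergence upward.
    return t_min + slope * sand_fraction

def cumulative_emergence(daily_mean_temps, sand_fraction,
                         gdd_half=150.0, shape=5.0):
    """Fraction of seedlings emerged, modeled as a logistic function of
    growing degree days (GDD) above a texture-adjusted base temperature."""
    t_base = base_temperature(sand_fraction)
    gdd = sum(max(t - t_base, 0.0) for t in daily_mean_temps)
    if gdd <= 0:
        return 0.0
    return 1.0 / (1.0 + (gdd_half / gdd) ** shape)

# Example: 30 gradually warming spring days on a soil with 40% sand
temps = [12.0 + 0.2 * d for d in range(30)]
print(f"Emerged: {cumulative_emergence(temps, 0.40):.0%}")
```

The design choice this illustrates is the one the abstract highlights: soil texture enters the model only through the base temperature, so the same thermal-time machinery transfers across soil types once the sand fraction is known.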
A 3-yr study was conducted to assess cranberry bean susceptibility to mechanical weeding using a rotary hoe at preemergence, hook, cotyledon, unifoliate, and first to fourth trifoliate stages of bean development and at different combinations of stages. The experiment was conducted in a weed-free environment. Cultivation with the rotary hoe reduced bean yield only for the treatment that received four cultivations at four different bean growth stages. Three cultivations improved yield compared with no cultivation. Single cultivation done at any of the eight crop growth stages did not affect yield. Crop density at harvest was decreased 6% in the treatments receiving two cultivations and 9% in the treatments receiving four cultivations compared to no cultivation. The effects of the cultivations on grain moisture were not consistent and differed from year to year. Seed weight did not differ among treatments in either year. Because this study was conducted under weed-free conditions, the beneficial effects of cultivating with the rotary hoe are probably mostly related to breaking the soil crust, improving soil aeration, preserving soil moisture, or promoting mineralization of the nutrients required by the crop.
A 3-yr study was conducted to establish if the presence of corn had an effect on the emergence patterns and total weed seedling density under growing conditions in southwestern Québec. Weed seedling emergence was monitored in permanent quadrats throughout the growing season in the presence and absence of growing corn. Common lambsquarters and barnyardgrass were prevalent in most site-years. The presence of corn did not affect the patterns of common lambsquarters and barnyardgrass emergence nor their total weed seedling density except in 1994. Corn canopy was probably not sufficiently developed to affect light levels or soil temperature needed for weed germination and, consequently, seedling emergence. In 1994, in the absence of corn, some soil crusting was observed on a fine-textured soil, and the total number of seedlings was reduced. The results of these weed emergence studies in corn can be extended to other crops growing with wide row spacing and relatively slow canopy closure similar to those of grain corn.