The global population and status of Snowy Owls Bubo scandiacus are particularly challenging to assess because individuals are irruptive and nomadic, and the breeding range is restricted to the remote circumpolar Arctic tundra. The International Union for Conservation of Nature (IUCN) uplisted the Snowy Owl to “Vulnerable” in 2017 because the suggested population estimates appeared considerably lower than historical estimates, and it recommended actions to clarify the population size, structure, and trends. Here we present a broad review and status assessment, an effort led by the International Snowy Owl Working Group (ISOWG) and researchers from around the world, to estimate population trends and the current global status of the Snowy Owl. We use long-term breeding data, genetic studies, satellite-GPS tracking, and survival estimates to assess current population trends at several monitoring sites in the Arctic and we review the ecology and threats throughout the Snowy Owl range. An assessment of the available data suggests that current estimates of a worldwide population of 14,000–28,000 breeding adults are plausible. Our assessment of population trends at five long-term monitoring sites suggests that breeding populations of Snowy Owls in the Arctic have decreased by more than 30% over the past three generations and the species should continue to be categorised as Vulnerable under the IUCN Red List Criterion A2. We offer research recommendations to improve our understanding of Snowy Owl biology and future population assessments in a changing world.
Background: Parkinson’s disease (PD) varies widely across individuals in clinical manifestations and course of progression. We aimed to compare patterns of brain atrophy between PD clinical subtypes using longitudinally acquired brain MRIs. Methods: We used T1-weighted MRIs from the Parkinson’s Progression Markers Initiative (PPMI) for 134 individuals with PD and 60 healthy controls with at least two MRIs. Patients were classified into three clinical subtypes at the de novo stage using validated subtyping criteria based on major motor and non-motor classifiers (early cognitive impairment, REM sleep behaviour disorder, dysautonomia): mild-motor predominant (n=74), intermediate (n=44), and diffuse-malignant (n=16). Deformation-based morphometry (DBM) maps were calculated, and mixed-effects models were used to examine the interaction between PD subtype and rate of atrophy across brain regions over time, controlling for sex and age at baseline. Results: Individuals with diffuse-malignant PD showed a significantly higher rate of atrophy across multiple brain regions, including the lateral nucleus of the forebrain, precuneus, paracentral lobule, inferior temporal gyrus, fusiform gyrus, and lateral hemisphere of the cerebellum (FDR-corrected p<0.05). Conclusions: We demonstrated an accelerated atrophy pattern within several brain regions in the diffuse-malignant PD subtype. These findings suggest a more diffuse, multidomain neurodegenerative process in a subgroup of people with PD, favoring the existence of diverse underlying pathophysiologies.
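For readers who want the shape of such an analysis, a minimal sketch of the subtype-by-time interaction test with statsmodels follows; the long-format layout, input file name, and column names (dbm, years, subtype, sex, age_bl, subject_id) are illustrative assumptions, not PPMI specifics.

```python
# Hedged sketch: mixed-effects model with a random intercept per subject.
# The subtype:years interaction estimates subtype differences in the rate
# of regional atrophy, controlling for sex and baseline age. Column names
# and the input file are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("ppmi_dbm_long.csv")  # hypothetical long-format table

model = smf.mixedlm(
    "dbm ~ C(subtype) * years + C(sex) + age_bl",
    data=df,
    groups=df["subject_id"],
)
print(model.fit().summary())
```

In a study like this, such a test would be repeated per brain region, with FDR correction applied across regions.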
It has been posited that alcohol use may confound the association between greater concussion history and poorer neurobehavioral functioning. However, while greater alcohol use is positively correlated with neurobehavioral difficulties, the association between alcohol use and concussion history is not well understood. Therefore, this study investigated the cross-sectional and longitudinal associations of cumulative concussion history, years of contact sport participation, and health-related/psychological factors with alcohol use in former professional football players across multiple decades.
Participants and Methods:
Former professional American football players completed general health questionnaires in 2001 and 2019, including demographic information, football history, concussion/medical history, and health-related/psychological functioning. Alcohol use frequency and amount were reported for three timepoints: during the professional career (collected retrospectively in 2001), 2001, and 2019. For the during-career and 2001 timepoints, alcohol use frequency was reported as none, 1-2, 3-4, or 5-7 days/week, and amount as none, 1-2, 3-5, 6-7, or 8+ drinks/occasion. For 2019, frequency was reported as never, monthly or less, 2-4 times/month, 2-3 times/week, or >4 times/week, and amount as none, 1-2, 3-4, 5-6, 7-9, or 10+ drinks/occasion. Scores on a screening measure for Alcohol Use Disorder (CAGE) were also available for the during-career and 2001 timepoints. Concussion history was recorded in 2001 and binned into five groups: 0, 1-2, 3-5, 6-9, and 10+. Depression and pain interference were assessed via PROMIS measures at all timepoints. Sleep disturbance was assessed via a separate instrument in 2001 and with PROMIS Sleep Disturbance in 2019. Spearman’s rho correlations tested associations of concussion history and years of sport participation with alcohol use across timepoints, and whether poor health functioning (depression, pain interference, sleep disturbance) in 2001 and 2019 was associated with alcohol use both within and between timepoints.
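As an illustration of the Spearman’s rho analysis described in these methods, here is a minimal sketch with scipy; the ordinal codings mirror the reported binning, but the data values are invented.

```python
# Hedged sketch: rank correlation between ordinal concussion-history bins
# (0, 1-2, 3-5, 6-9, 10+ coded as 0-4) and an ordinal alcohol-frequency
# rating. Toy data, not study data.
from scipy.stats import spearmanr

concussion_bin = [0, 1, 1, 2, 3, 4, 2, 0, 1, 3]
alcohol_freq   = [0, 1, 0, 2, 1, 3, 2, 0, 1, 2]

rho, p = spearmanr(concussion_bin, alcohol_freq)
print(f"rho = {rho:.3f}, p = {p:.3f}")
```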
Results:
Among the 351 participants (Mage = 47.86 [SD = 10.18] in 2001), there were no significant associations of concussion history or years of contact sport participation with CAGE scores or alcohol use frequency/amount during the professional career, in 2001, or in 2019 (rhos = -.072 to .067, ps > .05). In 2001, greater depressive symptomology and sleep disturbance were related to higher CAGE scores (rho = .209, p < .001; rho = .176, p < .001, respectively), while greater depressive symptomology, pain interference, and sleep disturbance were related to higher alcohol use frequency (rho = .176, p = .002; rho = .109, p = .045; rho = .132, p = .013, respectively) and amount/occasion (rho = .215, p < .001; rho = .127, p = .020; rho = .153, p = .004, respectively). In 2019, depressive symptomology, pain interference, and sleep disturbance were not related to alcohol use (rhos = -.047 to .087, ps > .05). Between timepoints, more sleep disturbance in 2001 was associated with higher alcohol amount/occasion in 2019 (rho = .115, p = .036).
Conclusions:
Increased alcohol intake has been theorized to be a consequence of greater concussion history and, as such, is thought to confound associations between concussion history and neurobehavioral function later in life. Our findings indicate that concussion history and years of contact sport participation were not significantly associated with alcohol use cross-sectionally or longitudinally, regardless of how alcohol use was characterized. While higher levels of depression, pain interference, and sleep disturbance in 2001 were related to greater alcohol use in 2001, these associations were not present cross-sectionally in 2019. Results support the need to concurrently address health-related and psychological factors in the implementation of alcohol use interventions for former NFL players, particularly earlier in the sport discontinuation timeline.
To assess the relative risk of hospital-onset Clostridioides difficile infection (HO-CDI) during each month of the early coronavirus disease 2019 (COVID-19) pandemic and to compare it with historical expectation based on patient characteristics.
Design:
This study used a retrospective cohort design. We collected secondary data from the institution’s electronic health record (EHR).
Setting:
The Ohio State University Wexner Medical Center, Ohio, a large tertiary healthcare system in the Midwest.
Patients or participants:
All adult patients admitted to the inpatient setting between January 2018 and May 2021 were eligible for the study. Prisoners, children, individuals presenting with Clostridioides difficile on admission, and patients with <4 days of inpatient stay were excluded from the study.
Results:
After controlling for patient characteristics, the observed number of HO-CDI cases over the study period as a whole was not significantly different from expected. However, during 3 months of the pandemic period, the observed numbers of cases differed significantly from what would be expected based on patient characteristics: 2 of these months had more cases than expected and 1 had fewer.
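One plausible way to operationalise this observed-versus-expected comparison is sketched below: fit a count model to pre-pandemic months using patient-mix covariates, then score each pandemic month against its prediction. The Poisson specification, covariate names, and file name are assumptions for illustration, not the authors' documented method.

```python
# Hedged sketch: expected monthly HO-CDI counts from patient characteristics.
import pandas as pd
import statsmodels.formula.api as smf
from scipy.stats import poisson

monthly = pd.read_csv("hocdi_monthly.csv")            # hypothetical input
pre = monthly[monthly["period"] == "pre"]
pandemic = monthly[monthly["period"] == "pandemic"]

# Covariate names are invented stand-ins for the case-mix variables.
fit = smf.poisson("cases ~ mean_age + comorbidity_index + patient_days",
                  data=pre).fit()

for obs, exp in zip(pandemic["cases"], fit.predict(pandemic)):
    # two-sided tail probability of the observed count under Poisson(exp)
    p = 2 * min(poisson.cdf(obs, exp), poisson.sf(obs - 1, exp))
    print(f"observed={obs}, expected={exp:.1f}, p={min(p, 1.0):.3f}")
```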
Conclusions:
Variations in HO-CDI incidence seemed to trend with COVID-19 incidence but were not fully explained by our case mix. Other factors contributing to the variability in HO-CDI incidence beyond listed patient characteristics need to be explored.
In Paper I, we presented an overview of the Southern-sky MWA Rapid Two-metre (SMART) survey, including the survey design and search pipeline. While the combination of the MWA’s large field of view and the voltage capture system brings a survey speed of ${\sim} 450\, {\textrm{deg}}^{2}\,\textrm{h}^{-1}$, the progression of the survey relies on the availability of the compact configuration of the Phase II array. Over the past few years, by taking advantage of multiple windows of opportunity when the compact configuration was available, we have advanced the survey to 75% of the planned sky coverage. About 10% of the data collected thus far have been processed for a first-pass search, in which 10 min of each observation is processed for dispersion measures out to 250 ${\textrm{pc cm}}^{-3}$, realising a shallow survey that is largely sensitive to long-period pulsars. The ongoing analysis has led to two new pulsar discoveries, as well as an independent discovery and a rediscovery of a previously incorrectly characterised pulsar, all from the ${\sim} 3\%$ of the data for which candidate scrutiny is complete. In this sequel to Paper I, we describe the strategies for further detailed follow-up, including improved sky localisation and convergence to a timing solution, and illustrate them using example pulsar discoveries. The processing has also led to the re-detection of 120 pulsars in the SMART observing band, bringing the total number of pulsars detected to date with the MWA to 180; these are used to assess the search sensitivity of the current processing pipelines. The planned second-pass (deep survey) processing is expected to yield a three-fold increase in sensitivity for long-period pulsars, and a substantial improvement for millisecond pulsars by adopting optimal de-dispersion plans. The SMART survey will complement the highly successful Parkes High Time Resolution Universe survey at 1.2–1.5 GHz, and inform future large survey efforts such as those planned with the low-frequency Square Kilometre Array (SKA-Low).
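To make the de-dispersion numbers concrete: the standard cold-plasma dispersion relation gives the delay a pulse accrues across the band, $\Delta t \approx 4.149\,\textrm{ms} \times \textrm{DM} \times (\nu_{\textrm{lo}}^{-2} - \nu_{\textrm{hi}}^{-2})$ with $\nu$ in GHz, so the first-pass DM limit of 250 pc cm$^{-3}$ corresponds to a sweep of roughly 17 s across the 140–170 MHz SMART band. A small sketch:

```python
# Standard dispersive delay between two frequencies (f in GHz, DM in pc cm^-3).
def dispersion_delay_ms(dm, f_lo_ghz, f_hi_ghz):
    """Delay (ms) accrued across the band for a given dispersion measure."""
    return 4.149 * dm * (f_lo_ghz**-2 - f_hi_ghz**-2)

# Maximum first-pass DM across the 140-170 MHz SMART band:
print(dispersion_delay_ms(250, 0.140, 0.170))  # ~17000 ms, i.e. ~17 s
```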
We present an overview of the Southern-sky MWA Rapid Two-metre (SMART) pulsar survey that exploits the Murchison Widefield Array’s large field of view and voltage-capture system to survey the sky south of 30$^{\circ}$ in declination for pulsars and fast transients in the 140–170 MHz band. The survey is enabled by the advent of the Phase II MWA’s compact configuration, which offers an enormous efficiency in beam-forming and processing costs, thereby making an all-sky survey of this magnitude tractable with the MWA. Even with the long dwell times employed for the survey (4800 s), data collection can be completed in $<$100 h of telescope time, while still retaining the ability to reach a limiting sensitivity of $\sim$2–3 mJy (at 150 MHz, near zenith), which is effectively 3–5 times deeper than the previous-generation low-frequency southern-sky pulsar survey, completed in the 1990s. Each observation is processed to generate $\sim$5000–8000 tied-array beams that tessellate the full $\sim 610\, {\textrm{deg}^{2}}$ field of view (at 155 MHz), which are then processed to search for pulsars. The voltage-capture recording of the survey also allows a multitude of post hoc processing options including the reprocessing of data for higher time resolution and even exploring image-based techniques for pulsar candidate identification. Due to the substantial computational cost in pulsar searches at low frequencies, the survey data processing is undertaken in multiple passes: in the first pass, a shallow survey is performed, where 10 min of each observation is processed, reaching about one-third of the full-search sensitivity. Here we present the system overview including details of ongoing processing and initial results. Further details including first pulsar discoveries and a census of low-frequency detections are presented in a companion paper. Future plans include deeper searches to reach the full sensitivity and acceleration searches to target binary and millisecond pulsars. Our simulation analysis forecasts $\sim$300 new pulsars upon the completion of full processing. The SMART survey will also generate a complete digital record of the low-frequency sky, which will serve as a valuable reference for future pulsar searches planned with the low-frequency Square Kilometre Array.
Background: Eye movements reveal neurodegenerative disease processes because of the overlap between oculomotor circuitry and disease-affected areas. Characterizing oculomotor behaviour in the context of cognitive function may enhance disease diagnosis and monitoring. We therefore aimed to quantify cognitive impairment in neurodegenerative disease using saccade behaviour and neuropsychology. Methods: The Ontario Neurodegenerative Disease Research Initiative recruited individuals with one of the following: Alzheimer’s disease, mild cognitive impairment, amyotrophic lateral sclerosis, frontotemporal dementia, Parkinson’s disease, or cerebrovascular disease. Patients (n=450, age 40-87) and healthy controls (n=149, age 42-87) completed a randomly interleaved pro- and anti-saccade task (IPAST) while their eyes were tracked. We explored the relationships of saccade parameters (e.g. task errors, reaction times) to one another and to cognitive domain-specific neuropsychological test scores (e.g. executive function, memory). Results: Task performance worsened with cognitive impairment across multiple diseases. Subsets of saccade parameters were interrelated and also differentially related to neuropsychology-based cognitive domain scores (e.g. antisaccade errors and reaction times were associated with executive function). Conclusions: IPAST detects global cognitive impairment across neurodegenerative diseases. Subsets of parameters associate with one another, suggesting disparate underlying circuitry, and with different cognitive domains. This may have implications for the use of IPAST as a cognitive screening tool in neurodegenerative disease.
Grasshoppers are among the most predominant insects in the grasslands of the southern Pampas. In this region, Dichroplus elongatus, Dichroplus maculipennis, Dichroplus pratensis and Borellia bruneri are the most abundant species and have the greatest economic importance. This study aimed to assess the relationship between temporal changes in the density of these species and climate variables associated with temperature and rainfall over an 11-year study period. We monitored 22 sites in different areas of Laprida county from 2005 to 2016. A total of 25 grasshopper species were collected. The most abundant species were D. maculipennis and B. bruneri, which reached the highest densities from 2008–2009 to 2010–2011. The rainfall accumulated from September to the sampling date (RAS) and the number of rainy days (RD) largely explained the density variation of B. bruneri. Besides RD and RAS, winter rainfall, rainfall accumulated from October to the sampling date, and the thermal amplitude of October (TAO) influenced the density of D. maculipennis. Our results indicated that seasons with less rainfall and fewer rainy days favored the abundance of these two species. RD and TAO also contributed significantly to variations in the density of D. elongatus; in contrast to the other two species, we recorded D. elongatus in seasons with high rainfall and many rainy days. A better understanding of the influence of climate on the life cycle of these economically important insects may identify key factors in their population dynamics, which in turn may improve management options.
The radiocarbon (14C) calibration curve so far contains annually resolved data only for a short period of time. With accelerator mass spectrometry (AMS) matching the precision of decay counting, it is now possible to efficiently produce large datasets of annual resolution for calibration purposes using small amounts of wood. The radiocarbon intercomparison on single-year tree-ring samples presented here is the first to investigate specifically possible offsets between AMS laboratories at high precision. The results show that AMS laboratories are capable of measuring samples of Holocene age with an accuracy and precision comparable to, or even beyond, what is possible with decay counting, even though they require a thousand times less wood. They also show that not all AMS laboratories always produce results consistent with their stated uncertainties. The long-term benefit of studies of this kind is more accurate radiocarbon measurements with, in the future, better quantified uncertainties.
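The consistency question at the heart of such an intercomparison can be phrased simply: are differences between laboratories' measurements of the same rings compatible with their stated uncertainties? A minimal sketch, with invented values:

```python
# Hedged sketch: pairwise z-scores between two labs' ages for the same rings,
# and a reduced chi-square that should be ~1 if stated errors explain the
# scatter. All numbers are invented for illustration.
import numpy as np

age_a, err_a = np.array([4510, 4480, 4525]), np.array([15, 15, 16])
age_b, err_b = np.array([4540, 4470, 4500]), np.array([14, 16, 15])

z = (age_a - age_b) / np.sqrt(err_a**2 + err_b**2)
print(z, (z**2).mean())
```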
Introduction: Compared to other areas in Alberta Health Services (AHS), internal data show that emergency departments (EDs) and urgent care centres (UCCs) experience a high rate of workforce violence. As such, reducing violence in AHS EDs and UCCs is a key priority. This project explored staff's lived experience with patient violence, with the goal of better understanding its impact and what strategies and resources could be put in place. Methods: To obtain a representative sample, we recruited staff from EDs and a UCC (n = 6 sites) situated in urban and rural settings across Alberta. As the interviews had the potential to be upsetting, we conducted in-person interviews in a private space. Interviews were conducted with over 60 staff members, including RNs, LPNs, unit clerks, physicians, and protective services. Data collection and analysis occurred simultaneously and iteratively until saturation was reached. The analysis involved data reduction, category development, and synthesis. Key phrases and statements were first highlighted. Preliminary labels were then assigned to the data, which was then organized into meaningful clusters. Finally, we identified common themes of participants’ lived experience. Triangulation of sources, independent and team analysis, and frequent debriefing sessions were used to enhance the trustworthiness of the data. Results: Participants frequently noted the worry they carry with them when coming into work, but also said there was a high threshold of acceptance dominating ED culture. A recurring feature of this experience was the limited resources (e.g., no peace officers, scope of security staff) available to staff to respond when patients behave violently or are threatening. Education like non-violent crisis intervention training, although helpful, was insufficient to make staff feel safe. Participants voiced the need for more protective services, the addition of physical barriers like locking doors and glass partitions, more investment in addictions and mental health services (e.g., increased access to psychiatrists or addictions counsellors), and a greater shared understanding of AHS’ zero tolerance policy. Conclusion: ED and UCC staff describe being regularly exposed to violence from patients and visitors. Many of these incidents go unreported and unresolved, leaving the workforce feeling worried and unsupported. Beyond education, the ED and UCC workforce need additional resources to support them in feeling safe coming to work.
Introduction: Emergency Departments (EDs) are at high risk of workforce-directed violence (WDV). To address ED violence in Alberta Health Services (AHS), we conducted key informant interviews to identify successful strategies that could be adopted in AHS EDs. Methods: The project team identified potential participants through their ED network; additional contacts were identified through snowball sampling. We emailed 197 individuals from Alberta (123), Canada (46), and abroad (28). The interview guide was developed and reviewed in partnership with ED managers and Workplace Health and Safety. We conducted semi-structured phone interviews with 26 representatives from urban and rural EDs or similar settings from Canada, the United States, and Australia. This interview process received an ARECCI score of 2. Two researchers conducted a content analysis of the interview notes; rural and urban sites were analyzed separately. We extracted strategies, their impact, and implementation barriers and facilitators. Strategies identified were categorized into emergent themes. We aggregated similar strategies and highlighted key or unique findings. Results: Interview results showed that there is no single solution to address ED violence. Sites with effective violence prevention strategies used a comprehensive approach where multiple strategies were used to address the issue. For example, through a violence prevention working group, one site implemented weekly violence simulations, a peer mentorship support team, security rounding, and more. This multifaceted approach had positive results: a decrease in code whites, staff feeling more supported, and the site no longer being on union “concerned” lists. Another promising strategy included addressing the culture of violence by increasing reporting, clarifying policies (i.e., zero tolerance), and establishing flagging or alert systems for visitors with violent histories. Physician involvement and support was highly valued in responding to violence (e.g., support when refusing care, on the code white response team, flagging). Conclusion: Overall, one strategy is not enough to successfully address WDV in EDs. Strategies need to be comprehensive and context specific, especially when considering urban and rural sites with different resources available. We note that few strategies were formally evaluated, and recommend that future work focus on developing comprehensive metrics to evaluate the strategies and define success.
Introduction: Non-medical cannabis became legal for Canadian adults on October 18, 2018. The impact of legalization on Emergency Departments (EDs) has been identified as a major concern. The study objective was to identify changes in cannabis-related ED visits and changes in co-existing diagnoses associated with cannabis-related ED visits pre- and post-legalization for the entire urban population of Alberta. Urban Alberta was defined as Calgary and Edmonton (inclusive of Sherwood Park and St. Albert, given the proximity of some Edmontonians to their EDs), encompassing 12 adult EDs and 2 pediatric EDs. Methods: Retrospective data were collected from the National Ambulatory Care Reporting System and from the HealthLink and Alberta Poison and Drug Information Service (PADIS) public telehealth call databases. An interrupted time-series analysis was completed via segmented regression, in addition to incident rate and relative risk ratio calculations for the pre- and post-legalization periods, to identify differences both among the entire urban Alberta population and among individuals presenting to the ED. Data were collected from October 1, 2013 to July 31, 2019 for ED visits and were adjusted for natural population increase using quarterly reports from the Government of Alberta. Results: The sample included 11,770 pre-legalization cannabis-related visits and 2,962 post-legalization visits. Volumes of ED visits for cannabis-related harms increased post-legalization within urban EDs (IRR 1.45, 95% CI 1.39, 1.51; absolute level change: 43.48 visits per month in urban Alberta, 95% CI 26.52, 60.43), as did PADIS calls (IRR 1.87, 95% CI 1.55, 2.37; absolute level change: 4.02 calls per month in Alberta, 95% CI 0.11, 7.94). The increase in visits to EDs equates to an increase of 2.72 visits per month, per ED. Lastly, increases were observed for cannabinoid hyperemesis (RR 1.23, 95% CI 1.10, 1.36), unintentional ingestion (RR 1.48, 95% CI 1.34, 1.62), and individuals leaving the ED pre-treatment (RR 1.28, 95% CI 1.08, 1.49). Decreases were observed for coingestant use (RR 0.77, 95% CI 0.73, 0.81) and hospital admissions (RR 0.88, 95% CI 0.80, 0.96). Conclusion: Overall, national legalization of cannabis appears to be correlated with a small increase in cannabis-related ED visits and poison control calls. Post-legalization, fewer patients are being admitted, though cannabinoid hyperemesis appears to be on the rise.
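A minimal sketch of the segmented-regression form of interrupted time-series analysis used in such studies, with a level change and a slope change at legalization; the monthly aggregation, column names, and plain OLS specification are illustrative assumptions rather than this study's exact model:

```python
# Hedged sketch: interrupted time series via segmented regression.
import pandas as pd
import statsmodels.formula.api as smf

ts = pd.read_csv("cannabis_ed_monthly.csv")          # hypothetical input
ts["t"] = range(len(ts))                             # months since series start
# assumes 'month' holds ISO 'YYYY-MM' strings; 1 from legalization onward
ts["post"] = (ts["month"] >= "2018-10").astype(int)
ts["t_post"] = ts["post"] * (ts["t"] - ts.loc[ts["post"] == 1, "t"].min())

# 'post' estimates the immediate level change; 't_post' the slope change.
print(smf.ols("visits ~ t + post + t_post", data=ts).fit().params)
```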
Background: Since January 1, 2016, 2,358 people have died from opioid poisoning in Alberta. Buprenorphine/naloxone (bup/nal) is the recommended first-line treatment for opioid use disorder (OUD), and this treatment can be initiated in emergency departments and urgent care centres (EDs). Aim Statement: This project aims to spread a quality improvement intervention to all 107 adult EDs in Alberta by March 31, 2020. The intervention supports clinicians to initiate bup/nal for eligible individuals and provide rapid referrals to OUD treatment clinics. Measures & Design: Local ED teams were identified (administrators, clinical nurse educators, physicians and, where available, pharmacists and social workers). Local teams were supported by a provincial project team (project manager, consultant, and five physician leads) through a multi-faceted implementation process using provincial order sets, clinician education products, and patient-facing information. We used administrative ED and pharmacy data to track the number of visits where bup/nal was given in the ED, and whether discharged patients continued to fill any opioid agonist treatment (OAT) prescription 30 days after their index ED visit. OUD clinics reported the number of referrals received from EDs and the number attending their first appointment. Patient safety event reports were tracked to identify any unintended negative impacts. Evaluation/Results: We report data from May 15, 2018 (program start) to September 30, 2019. Forty-nine EDs (46% of 107) implemented the program and 22 (45% of 49) reported evaluation data. There were 5,385 opioid-related visits to reporting ED sites after program adoption. Bup/nal was given during 832 ED visits (663 unique patients): 7 visits in the 1st quarter of program operation, 55 in the 2nd, 74 in the 3rd, 143 in the 4th, 294 in the 5th, and 255 in the 6th. Among 505 unique discharged patients with 30-day follow-up data available, 319 (63%) continued to fill an OAT prescription after receiving bup/nal in the ED. Sixteen (70%) of 23 community clinics provided data. EDs referred patients to these clinics 440 times, and 236 referrals (54%) attended their first follow-up appointment. Available data may under-report program impact. Five patient safety events were reported, all with no harm or minimal harm to the patient. Discussion/Impact: Results demonstrate effective spread and uptake of a standardized, provincial, ED-based early medical intervention program for patients who live with OUD.
Introduction: The growth of the elderly population (aged 65 and over) in Canada is well documented, as is their disproportionate use of emergency departments (EDs) after minor injury. These patients require specific care, given a 16% risk of functional decline following an ED visit. To prevent functional decline, a multidimensional assessment of elderly patients is recommended in the emergency department. Objective: To determine whether grip strength measured in the ED can predict functional decline at 3 or 6 months post-injury. Methods: A multicentre prospective study was conducted in 5 EDs across Canada between 2013 and 2016. Patients 65 years and older, independent in activities of daily living and consulting the emergency department for minor trauma, were recruited 7 days a week. Clinical-demographic data, functional status, fear of falling, number of falls in the last month, and grip strength were collected in the ED. Functional decline (loss of at least points on functional status) was assessed at 3 and 6 months. Descriptive statistics and a linear regression model with repeated measurements were used to determine whether grip strength was predictive of functional decline at 3 or 6 months. Results: 387 patients were recruited. Mean age was 74 ± 7 years, and 52% were male. XXX experienced a fall in the last month. Initial maximum grip strength was 24 ± 10 in the intervention group vs. 28 ± 13 in the control group (p ≤ 0.05). Grip strength was associated with pre-injury functional status (p < 0.0001) and fear of falling (p = 0.0001) but did not predict functional decline at 3 or 6 months. Conclusion: Given the strong association with fear of falling and functional status at the initial ED evaluation, we recommend that grip strength measurement be included in a multidisciplinary geriatric emergency department assessment as needed.
In the Netherlands, the Depression Initiative was launched in 2006 as a nationwide attempt to implement the Multidisciplinary Guideline for Depression and to evaluate its cost-effectiveness. An evaluation of the selected strategy to implement the guidelines, a Breakthrough Collaborative, was conducted as a quasi-experimental trial. The intervention group consisted of around 530 patients from 10 multidisciplinary teams in primary care. The intervention teams received a set of implementation strategies as part of the Breakthrough Method, developed by the Institute for Healthcare Improvement (www.ihi.org). The participants in this implementation project aimed to implement guideline recommendations directed at reducing unnecessary antidepressant treatment for patients with mild depression, making better use of effective but minimally invasive treatment options, and improving antidepressant and psychotherapeutic treatment for patients with severe symptoms. Monitoring depression severity was also one of the goals. This guideline-derived, stepped-care approach was compared with care as usual in the primary care setting, as provided in a different study group (NESDA). Outcomes were measured in terms of quality of care provided by the general practitioner (antidepressant prescription rates) and clinical outcomes (BDI, IDS-SR, WHODAS). A process evaluation and a simple economic evaluation were part of the study design. Preliminary results will be presented.
Persons using the Internet generate large amounts of health-related data, which are increasingly used in modern health sciences.
Objectives/aims
We analysed the relation between annual prescription volumes (APV) of several antidepressants with marketing approval in Germany and corresponding web search query data generated in Google, to test whether web search query volume may serve as a proxy for medical prescription practice.
Methods
We obtained APVs of several antidepressants, covering prescriptions reimbursed by the statutory health insurance in Germany from 2004 to 2013. Web search query data generated in Germany and related to defined search terms (active substance or brand name) were obtained with Google Trends. We calculated correlations (Pearson's r) between the APVs of each substance and the respective annual “search share” values; coefficients of determination (R2) were computed to determine the amount of variability shared by the two variables.
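As a sketch of the correlation computed here, Pearson's r between a substance's annual prescription volume and its annual search share, with R2 obtained as the square of r; the ten annual values below are invented:

```python
# Hedged sketch: APV vs. annual Google search share for one substance.
from scipy.stats import pearsonr

apv          = [1.2, 1.5, 1.9, 2.4, 2.8, 3.5, 4.1, 4.6, 5.0, 5.3]  # toy values
search_share = [0.8, 1.0, 1.4, 1.7, 2.1, 2.6, 3.0, 3.3, 3.6, 3.9]  # toy values

r, p = pearsonr(apv, search_share)
print(f"r = {r:.3f}, R2 = {r**2:.3f}, p = {p:.4g}")
```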
Results
Significant and strong correlations between substance-specific APVs and corresponding annual query volume were found for each substance during the observational interval: agomelatine (r = 0.968; R2 = 0.932; P = 0.01), bupropion (r = 0.962; R2 = 0.925; P = 0.01), citalopram (r = 0.970; R2 = 0.941; P = 0.01), escitalopram (r = 0.824; R2 = 0.682; P = 0.01), fluoxetine (r = 0.885; R2 = 0.783; P = 0.01), paroxetine (r = 0.801; R2 = 0.641; P = 0.01), and sertraline (r = 0.880; R2 = 0.689; P = 0.01).
Conclusions
Although the data used did not allow an analysis with higher temporal resolution, our results suggest that web search query volume may be a proxy for corresponding prescription behaviour. However, further studies analysing other pharmacological agents and prescription data that allow an increased temporal resolution are needed to confirm this hypothesis.
Disclosure of interest
The authors have not supplied their declaration of competing interest.
There is a growing interest in low-grade inflammatory and metabolic alterations in patients with chronic schizophrenia (SCH).
Methods
Inflammatory markers (tumor necrosis factor-α [TNF-α], interferon-γ [IFN-γ], interleukins [IL-1α, IL-1β, IL-2, IL-4, IL-6, IL-8, IL-10], monocyte chemoattractant protein-1 [MCP-1]) and growth factors (vascular endothelial growth factor [VEGF], epidermal growth factor [EGF]) were measured in blood serum samples from 105 SCH patients and 148 control subjects (CS). Simultaneously, clinical biomarkers (C-reactive protein [CRP], triglycerides [TG], low-density lipoprotein [LDL-c] and high-density lipoprotein [HDL-c] cholesterol, glycated hemoglobin [HbA1c]) were measured, and body mass index (BMI) was calculated for patients.
Results
Several cyto-/chemokines (IFN-γ, MCP-1, IL-2, IL-6, IL-8 and IL-10) were significantly (P < 0.0000001) elevated in SCH patients compared to CS. Odds ratios, obtained from logistic regression analyses, were significantly elevated for IL-2, IL-6, IL-10, and IFN-γ, and decreased for TNF-α, in the SCH group. Among the patients, higher IL-2, IL-6, and IFN-γ and lower MCP-1 levels, together with male gender, were significant (P < 0.000001) predictors of higher HbA1c levels, and the TG/HDL-c parameter was associated with the IFN-γ/IL-10 (P = 0.004) and IFN-γ/IL-4 (P = 0.049) ratios, as well as with HbA1c (P = 0.005), IFN-γ (P = 0.009), and LDL-c (P = 0.02) levels.
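For readers wanting the shape of the odds-ratio computation, a minimal sketch follows: regress case/control status on cytokine levels and exponentiate the coefficients and their confidence bounds. The file and column names are hypothetical.

```python
# Hedged sketch: odds ratios for case (SCH) vs. control status from a
# logistic regression on serum markers. Hypothetical columns.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("cytokines.csv")   # 'case' = 1 for SCH patients, 0 for CS
fit = smf.logit("case ~ il2 + il6 + il10 + ifn_g + tnf_a", data=df).fit()

print(pd.concat([np.exp(fit.params).rename("OR"),
                 np.exp(fit.conf_int())], axis=1))
```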
Conclusions
IL-2, IL-6, IL-10 and IFN-γ were the most significant SCH-related markers among the measured cytokines in our patient group. Furthermore, significant associations between pro-/anti-inflammatory imbalance and HbA1c, as well as the cardio-metabolic risk marker TG/HDL-c, were observed, indicating higher risks of diabetes and cardiovascular disease among SCH patients.
Drug-induced liver injury is a major problem of pharmacotherapy and is also frequent with anti-depressive psychopharmacotherapy.
Objectives/aims
However, there are only a few studies that use a consistent methodological approach to examine the hepatotoxicity of a larger group of antidepressants.
Methods
We performed a quantitative signal detection analysis using pharmacovigilance data from the WHO Uppsala Monitoring Centre, which records adverse drug reaction data from worldwide sources. We calculated reporting odds ratios (RORs) as measures of disproportionality within a case/non-case approach for several frequently prescribed antidepressants.
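A minimal sketch of the reporting odds ratio itself: from a 2×2 case/non-case table, ROR = (a/b)/(c/d) with a Wald 95% confidence interval on the log scale, where a is the number of hepatotoxicity reports for the drug, b its other reports, and c, d the same counts for all other drugs. The counts below are invented:

```python
import math

def ror_with_ci(a, b, c, d):
    """Reporting odds ratio with a Wald 95% confidence interval."""
    ror = (a / b) / (c / d)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)
    return ror, ror * math.exp(-1.96 * se), ror * math.exp(1.96 * se)

print(ror_with_ci(120, 3500, 9000, 950000))  # invented counts, ROR ~3.6
```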
Results
Both positive controls, amineptine (ROR 38.4 [95% CI: 33.8–43.6]) and nefazodone (ROR 3.2 [95% CI: 3.0–3.5]), were statistically associated with hepatotoxicity. Following amineptine, agomelatine (ROR 6.4 [95% CI: 5.7–7.2]) was associated with the second highest ROR, followed by tianeptine (ROR 4.4 [95% CI: 3.6–5.3]), mianserin (ROR 3.6 [95% CI: 3.3–3.4]) and nefazodone.
Conclusions
In line with previous studies, our results support the hypothesis that agomelatine and several other antidepressants may be associated with relevant hepatotoxicity. However, the data and method used do not allow a quantitative evaluation of hepatotoxicity or an assessment of substance-specific differences in its extent.
Disclosure of interest
The authors have not supplied their declaration of competing interest.