OBJECTIVES/GOALS: Quantitative staff are an essential workforce for biomedical research. While faculty can engage with peers locally and through national organizations, similar opportunities are limited for staff and often do not meet their unique needs and interests. Creating a professional community is valuable for supporting and developing this workforce. METHODS/STUDY POPULATION: We established the Quantitative Scientific Staff National Network (QS2N2) with the mission to provide professional development and networking opportunities and to serve as an information resource and advocate by fostering community among staff quantitative analysts at any career stage. The initial membership outreach was to all Biostatistics, Epidemiology, and Research Design (BERD) programs through members of the ACTS BERD Special Interest Group (SIG). To govern the network, we created a Leadership Team and an Advisory Board consisting of staff and faculty biostatisticians with experience working as or managing staff. A Core Planning Committee of 15 members guides the planning, implementation, and execution of network activities, operationalized through subcommittees. RESULTS/ANTICIPATED RESULTS: The network currently has 131 members from over 30 health science institutions. Subcommittees focused on Education and Training, Membership, Communication and Web Development, and Mentoring have been created and are developing events, programs, and infrastructure to further the network's mission. Network events such as webinars will be offered quarterly, with the first event planned for November 3rd. QS2N2 will expand and mature through regular remote meetings where members can connect with peers at other institutions, engage in career development activities, and attend technical seminars. Additional membership outreach will seek to connect with staff in government and private sectors.
DISCUSSION/SIGNIFICANCE: Knowledgeable, highly skilled collaborative analysts (e.g., biostatisticians, data scientists) are an essential workforce in clinical and translational science and health research centers. The QS2N2 will support the professional development, engagement, and growth of this critical workforce, which is necessary to advance quality research.
The rapid growth of cultural evolutionary science, its expansion into numerous fields, its use of diverse methods, and several conceptual problems have outpaced corollary developments in theory and philosophy of science. This has led to concern, exemplified in results from a recent survey conducted with members of the Cultural Evolution Society, that the field lacks ‘knowledge synthesis’, is poorly supported by ‘theory’, has an ambiguous relation to biological evolution and uses key terms (e.g. ‘culture’, ‘social learning’, ‘cumulative culture’) in ways that hamper operationalization in models, experiments and field studies. Although numerous review papers in the field represent and categorize its empirical findings, the field's theoretical challenges receive less critical attention even though challenges of a theoretical or conceptual nature underlie most of the problems identified by Cultural Evolution Society members. Guided by the heterogeneous ‘grand challenges’ emergent in this survey, this paper restates those challenges and adopts an organizational style requisite to discussion of them. The paper's goal is to contribute to increasing conceptual clarity and theoretical discernment around the most pressing challenges facing the field of cultural evolutionary science. It will be of most interest to cultural evolutionary scientists, theoreticians, philosophers of science and interdisciplinary researchers.
There are numerous adverse health outcomes associated with dementia caregiving, including increased stress and depression. Caregivers often face time-related, socioeconomic, geographic, and pandemic-related barriers to treatment. Thus, implementing mobile health (mHealth) interventions is one way of increasing caregivers’ access to supportive care. The objective of the current study was to collect data from a 3-month feasibility trial of a multicomponent mHealth intervention for dementia caregivers.
Participants and Methods:
40 community-dwelling dementia caregivers were randomized to receive the CARE-Well (Caregiver Assessment, Resources, and Education) App or internet links connected to caregiver education, support, and resources. Caregivers were encouraged to use the App or links at least 4 times per week for 3 months. The App consisted of self-assessments, caregiver and stress reduction education, behavior problem management, calendar reminders, and online social support. Caregivers completed measures of burden, depression, and desire to institutionalize at baseline and post-intervention. Feasibility data included App usage, retention and adherence rates, and treatment satisfaction. Data were analyzed via descriptive statistics.
Results:
Caregivers were mostly white (95%), female (68%), in their mid-60s (M = 66.38, SD = 10.64), and well-educated (M = 15.52 years, SD = 2.26). Caregivers were mainly spouses (68%) or adult children (30%). Care recipients were diagnosed with mild (60%) or moderate (40%) dementia, with 80% diagnosed as having Alzheimer's disease. Overall, the study had an 85% retention rate (80% for the App group; 90% for the links group). In the App group, 58% of caregivers were considered high users, defined as using the App for more than 120 minutes over the 3 months (M = 362.42 minutes, SD = 432.68), across an average of 16.44 days (SD = 15.51). Fifteen percent of the sample were non-adherent due to time constraints, disinterest, and/or technology issues. Most participants (75%) using the App were mostly or very satisfied, about 87% would be likely or very likely to seek similar programs in the future, and 93% found the App mostly or very understandable. Groups did not significantly differ on clinical outcomes, although the study was not powered for an efficacy analysis. Within-group analyses revealed significant increases in depressive symptoms at post-treatment for caregivers in both groups.
Conclusions:
This study demonstrated initial feasibility of the CARE-Well App for dementia caregivers. App use was lower than expected; however, participants endorsed high satisfaction, ease of use, and willingness to use similar programs in the future. Some caregivers did not complete the intervention due to caregiving responsibilities, general disinterest, and/or technology issues. Although the study was not designed to assess clinical outcomes, we found that both groups reported higher depressive symptoms at post-treatment. This finding was unexpected and might reflect pandemic-related stress, which has been shown to particularly impact dementia caregivers. Future studies should address the efficacy of multicomponent mHealth interventions for dementia caregivers and the effects of increased dose on clinical outcomes. mHealth interventions should be refined to cater to varying levels of technology literacy among caregivers, and further research should aim to better integrate interventions into caregivers' routines to enhance treatment engagement.
Lithium is viewed as the first-line long-term treatment for prevention of relapse in people with bipolar disorder.
Aims
This study examined factors associated with the likelihood of maintaining serum lithium levels within the recommended range and explored whether the monitoring interval could be extended in some cases.
Method
We included 46 555 lithium test requests in 3371 individuals over 7 years from three UK centres. Using lithium results in four categories (<0.4 mmol/L; 0.40–0.79 mmol/L; 0.80–0.99 mmol/L; ≥1.0 mmol/L), we determined the proportion of instances where lithium results remained stable or switched category on subsequent testing, considering the effects of age, duration of lithium therapy and testing history.
Results
For tests within the recommended range (0.40–0.99 mmol/L categories), 84.5% of subsequent tests remained within this range. Overall, 3 monthly testing was associated with 90% of lithium results remaining within range, compared with 85% at 6 monthly intervals. In cases where the lithium level in the previous 12 months was on target (0.40–0.79 mmol/L; British National Formulary/National Institute for Health and Care Excellence criteria), 90% remained within the target range at 6 months. Neither age nor duration of lithium therapy had any significant effect on lithium level stability. Levels within the 0.80–0.99 mmol/L category were linked to a higher probability of moving to the ≥1.0 mmol/L category (10%) compared with those in the 0.40–0.79 mmol/L group (2%), irrespective of testing frequency.
Conclusion
We propose that for those who achieve 12 months of lithium tests within the 0.40–0.79 mmol/L range, the interval between tests could increase to 6 months, irrespective of age. Where lithium levels are 0.80–0.99 mmol/L, the test interval should remain at 3 months. This could reduce lithium test numbers by 15% and costs by ~$0.4 m p.a.
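The category-stability analysis described above can be sketched in code. This is an illustrative reconstruction, not the study's actual pipeline: the band thresholds follow the abstract, but the data, record layout, and function names are hypothetical.

```python
from collections import defaultdict
from itertools import groupby

# Illustrative sketch (not the study's code): assign serum lithium results
# to the four bands used in the abstract and estimate, for each band, how
# often the next test in the same patient stays in that band.

def categorise(level_mmol_l):
    """Map a serum lithium result (mmol/L) to the study's four bands."""
    if level_mmol_l < 0.40:
        return "<0.40"
    if level_mmol_l < 0.80:
        return "0.40-0.79"
    if level_mmol_l < 1.00:
        return "0.80-0.99"
    return ">=1.00"

def stability_by_band(tests):
    """tests: iterable of (patient_id, test_date, level), any order.
    Returns {band: proportion of consecutive test pairs staying in band}."""
    stays, totals = defaultdict(int), defaultdict(int)
    ordered = sorted(tests)  # sorts by patient, then date (ISO strings)
    for _, series in groupby(ordered, key=lambda t: t[0]):
        bands = [categorise(t[2]) for t in series]
        for current, nxt in zip(bands, bands[1:]):
            totals[current] += 1
            stays[current] += (current == nxt)
    return {band: stays[band] / totals[band] for band in totals}

# Toy example with hypothetical data:
tests = [
    (1, "2020-01-01", 0.55), (1, "2020-04-01", 0.62), (1, "2020-07-01", 0.85),
    (2, "2020-02-01", 0.30), (2, "2020-05-01", 0.45),
]
print(stability_by_band(tests))
```

On real data this per-band "stay" proportion would then be broken down further by testing interval, as the study does when comparing 3 monthly with 6 monthly testing.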
We present an overview of the Middle Ages Galaxy Properties with Integral Field Spectroscopy (MAGPI) survey, a Large Program on the European Southern Observatory Very Large Telescope. MAGPI is designed to study the physical drivers of galaxy transformation at a lookback time of 3–4 Gyr, during which the dynamical, morphological, and chemical properties of galaxies are predicted to evolve significantly. The survey uses new medium-deep adaptive optics aided Multi-Unit Spectroscopic Explorer (MUSE) observations of fields selected from the Galaxy and Mass Assembly (GAMA) survey, providing a wealth of publicly available ancillary multi-wavelength data. With these data, MAGPI will map the kinematic and chemical properties of stars and ionised gas for a sample of 60 massive (${>}7 \times 10^{10} {\mathrm{M}}_\odot$) central galaxies at $0.25 < z <0.35$ in a representative range of environments (isolated, groups and clusters). The spatial resolution delivered by MUSE with Ground Layer Adaptive Optics ($0.6-0.8$ arcsec FWHM) will facilitate a direct comparison with Integral Field Spectroscopy surveys of the nearby Universe, such as SAMI and MaNGA, and at higher redshifts using adaptive optics, for example, SINS. In addition to the primary (central) galaxy sample, MAGPI will deliver resolved and unresolved spectra for as many as 150 satellite galaxies at $0.25 < z <0.35$, as well as hundreds of emission-line sources at $z < 6$. This paper outlines the science goals, survey design, and observing strategy of MAGPI. We also present a first look at the MAGPI data, and the theoretical framework to which MAGPI data will be compared using the current generation of cosmological hydrodynamical simulations including EAGLE, Magneticum, HORIZON-AGN, and Illustris-TNG. Our results show that cosmological hydrodynamical simulations make discrepant predictions in the spatially resolved properties of galaxies at $z\approx 0.3$. 
MAGPI observations will place new constraints and allow for tangible improvements in galaxy formation theory.
Lithium was first found to have an acute antimanic effect in 1948, with further corroboration in the early 1950s. It took some time for lithium to become the standard treatment for relapse prevention in bipolar affective disorder. In this study, our aims were to examine the factors associated with the likelihood of maintaining lithium levels within the recommended therapeutic range and to look at the stability of lithium levels between blood tests. We examined this relation using clinical laboratory serum lithium test requesting data collected from three large UK centres, where the approach to managing patients with bipolar disorder and ordering lithium testing varied.
Method
46,555 lithium test requests in 3,371 individuals over 7 years were included from three UK centres. Using lithium results in four categories (<0.4 mmol/L; 0.40–0.79 mmol/L; 0.80–0.99 mmol/L; ≥1.0 mmol/L), we determined the proportion of instances where, on subsequent testing, lithium results remained in the same category or switched category. We then examined the association between testing interval and the proportion remaining within target, and the effects of age, duration of lithium therapy and testing history.
Result
For tests within the recommended range (0.40–0.99 mmol/L categories), 84.5% of subsequent tests remained within this range. Overall, 3-monthly testing was associated with 90% of lithium results remaining within range, compared with 85% at 6-monthly intervals. At all test intervals, lithium test result history in the previous 12 months was associated with the proportion of next test results on target (BNF/NICE criteria), with 90% remaining within the target range after 6 months if all tests in the previous 12 months were on target. Neither age nor duration of lithium therapy had a significant effect on lithium level stability. Levels within the 0.80–0.99 mmol/L category were linked to a higher probability of moving to the ≥1.0 mmol/L category (10%) than those in the 0.40–0.79 mmol/L group (2%), irrespective of testing frequency. Thus, stability of the lithium level over the previous 12 months is a predictor of future stability.
Conclusion
We propose that, for those who achieve 12 months of lithium tests within the 0.40–0.79 mmol/L range, it would be reasonable to increase the interval between tests to 6 months, irrespective of age, freeing up resources to focus on those less concordant with their lithium monitoring. Where the lithium level is 0.80–0.99 mmol/L, the test interval should remain at 3 months. This could reduce lithium test numbers by 15% and costs by ~$0.4 m p.a.
To examine the factors that relate to antipsychotic prescribing in general practices across England and how these relate to cost changes in recent years.
Background
Antipsychotic medications are the first-line pharmacological intervention for severe mental illnesses (SMI) such as schizophrenia and other psychoses, and are also used to relieve distress and treat neuropsychiatric symptoms in dementia.
Since 2014, many antipsychotic agents have moved to generic provision. In 2017/18, supplies of certain generic agents were affected by substantial price increases.
Method
The study examined prescribing volume and prices paid for antipsychotic medication by agent in primary care over time and considered whether price changes affected prescribers' choice of agent.
The NHS in England/Wales publishes each month the prescribing in general practice by BNF code. This was aggregated for the year 2018/19 using Defined Daily Doses (DDD) as published in the World Health Organization Anatomical Therapeutic Chemical (WHO/ATC) classification and analysed by delivery method and dose level. The year-on-year cost of each agent was determined.
Monthly prescribing in primary care was consolidated over 5 years (2013–2018), and the DDD amount from WHO/ATC for each agent was used to convert the prescribed amount to total DDD per practice.
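The DDD conversion step above can be sketched as follows. This is a hypothetical illustration of the general approach, not the study's code: the lookup values, agent names, and record layout are assumptions (official DDD values come from the WHO ATC/DDD index).

```python
# Illustrative sketch (not the study's code): convert prescribed quantities
# to Defined Daily Doses (DDD) and total them per general practice.

DDD_MG = {            # hypothetical mg-per-DDD lookup for two agents
    "olanzapine": 10,
    "risperidone": 5,
}

def total_ddd_per_practice(rows):
    """rows: (practice_id, agent, quantity_units, mg_per_unit) tuples.
    Returns {practice_id: total DDDs across all agents}."""
    totals = {}
    for practice_id, agent, quantity, mg_per_unit in rows:
        ddds = quantity * mg_per_unit / DDD_MG[agent]  # total mg / mg per DDD
        totals[practice_id] = totals.get(practice_id, 0.0) + ddds
    return totals

# Toy example with hypothetical prescribing records:
rows = [
    ("P1", "olanzapine", 28, 10),    # 28 x 10 mg tablets -> 28 DDD
    ("P1", "risperidone", 56, 2.5),  # 56 x 2.5 mg tablets -> 28 DDD
    ("P2", "olanzapine", 28, 5),     # 28 x 5 mg tablets -> 14 DDD
]
print(total_ddd_per_practice(rows))  # {'P1': 56.0, 'P2': 14.0}
```

Dividing each practice's total by its registered population would then give the DDD/population measure used in the regression.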
Result
Description
In 2018/19 there were 10,360,865 prescriptions containing 136 million DDD, with costs of £110 million at an average cost of £0.81/DDD issued in primary care. We included 5,750 GP practices with a practice population >3,000 and with >30 people on their SMI register.
Effect of price
In 2017/18 there was a sharp increase in overall prices, and prices had not returned to expected levels by the end of the 2018/19 evaluation year. There was a gradual increase in antipsychotic prescribing over 2013-2019 that was not perturbed by the increase in drug price in 2017/18.
Regression
Demographic factors
The strongest positive associations with increased antipsychotic prescribing were higher social disadvantage, higher population density (urban location), and comorbidities, e.g. chronic obstructive pulmonary disease (COPD). Higher proportions of younger and older populations, northerliness, and non-white (Black and Minority Ethnic (BME)) ethnicity were all independently associated with less antipsychotic prescribing.
Prescribing Factors
Higher DDD per general practice population was linked with higher %injectable, higher %liquid, higher doses per prescription, and higher %zuclopenthixol. Lower DDD per population was linked with general practices using higher %risperidone and higher spending per dose of antipsychotic.
Conclusion
Higher levels of antipsychotic prescribing are driven by social factors and comorbidities. The link with depot medication prescriptions points to the way that antipsychotics can induce receptor supersensitivity with consequent dose escalation.
Parkinson's psychosis can be very challenging to manage, with limited treatment options available. There is a good evidence base to support the use of clozapine, but practical obstacles often prevent its use. There is a drive nationally to set up services so that people with Parkinson's psychosis can access treatment with clozapine in a timely manner and, where possible, with initiation in the community. The authors describe their experiences in setting up clozapine services specifically for this patient group in England and offer a practical approach to the assessment of Parkinson's psychosis. They also outline the evidence base in relation to treatment options and share their experiences of prescribing clozapine for Parkinson's psychosis.
Haematopoietic stem cell transplantation is an important and effective treatment strategy for many malignancies, marrow failure syndromes, and immunodeficiencies in children, adolescents, and young adults. Despite advances in supportive care, patients undergoing transplant are at increased risk of developing cardiovascular co-morbidities.
Methods:
This study was performed as a feasibility study of a rapid cardiac MRI protocol to substitute for echocardiography in the assessment of left ventricular size and function, pericardial effusion, and right ventricular hypertension.
Results:
A total of 13 patients were enrolled for the study (age 17.5 ± 7.7 years, 77% male, 77% white). Mean study time was 13.2 ± 5.6 minutes for MRI and 18.8 ± 5.7 minutes for echocardiogram (p = 0.064). Correlation between left ventricular ejection fraction by MRI and echocardiogram was good (ICC 0.76; 95% CI 0.47, 0.92). None of the patients had documented right ventricular hypertension. Patients were given a survey regarding their experiences, with the majority both perceiving that the echocardiogram took longer (7/13) and indicating they would prefer the MRI if given a choice (10/13).
Conclusion:
A rapid cardiac MRI protocol was shown feasible to substitute for echocardiogram in the assessment of key factors prior to or in follow-up after haematopoietic stem cell transplantation.
Type 2 diabetes results mainly from weight gain in adult life and affects one in twelve people worldwide. In the Diabetes REmission Clinical Trial (DiRECT), the primary care-led Counterweight-Plus weight management program achieved remission of type 2 diabetes (for up to six years) for forty-six percent of patients after one year and thirty-six percent after two years. The objective of this study was to estimate the implementation costs of the program, as well as its two-year within-trial cost effectiveness and lifetime cost effectiveness.
Methods
Within-trial cost effectiveness included the Counterweight-Plus costs (including training, practitioner appointments, and low-energy diet), medications, and all routine healthcare contacts, combined with achieved remission rates. Lifetime cost per quality-adjusted life-year (QALY) was estimated according to projected durations of remissions, assuming continued relapse rates as seen in year two of DiRECT and the consequent life expectancy, quality of life and healthcare costs.
Results
The two-year intervention cost was EUR 1,580 per participant, with over eighty percent of the costs incurred in year one. Compared with the control group, medication savings were EUR 259 (95% confidence interval [CI]: 166–352) for anti-diabetes drugs and EUR 29 (95% CI: 12–47) for anti-hypertensive medications. The intervention was modeled with a lifetime horizon to achieve a mean 0.06 (95% CI: 0.04–0.09) gain in QALYs for the DiRECT population and a mean total lifetime cost saving per participant of EUR 1,497 (95% CI: 755–2,331), with the intervention becoming cost-saving within six years.
Conclusions
The intensive weight loss and maintenance program reduced the cost of anti-diabetes drugs through improved metabolic control, achieved diabetes remission in over one-third of participants, and reduced total healthcare contacts and costs over two years. A substantial lifetime healthcare cost saving is anticipated from periods of diabetes remission and delaying complications. Healthcare resources could be shifted cost effectively to establish diabetes remission services, using the existing DiRECT intervention, even if remissions are only maintained for limited durations. However, more research investment is needed to further improve weight-loss maintenance and extend remissions.
Background: Healthcare services are increasingly shifting from inpatient to outpatient settings. Outpatient settings such as emergency departments (EDs), oncology clinics, dialysis clinics, and day surgery often involve invasive procedures with the risk of acquiring healthcare-associated infections (HAIs). As a leading cause of HAI, Clostridioides difficile infection (CDI) in outpatient settings has not been sufficiently described in Canada. The Canadian Nosocomial Infection Surveillance Program (CNISP) aims to describe the epidemiology, molecular characterization, and antimicrobial susceptibility of outpatient CDI across Canada. Methods: Epidemiologic data were collected from patients diagnosed with CDI from a network of 47 adult and pediatric CNISP hospitals. Patients presenting to an outpatient setting such as the ED or outpatient clinics were considered as outpatient CDI. Cases were considered HAIs if the patient had had a healthcare intervention within the previous 4 weeks, and they were considered community-associated if there was no history of hospitalization within the previous 12 weeks. Clostridioides difficile isolates were submitted to the National Microbiology Laboratory for testing during an annual 2-month targeted surveillance period. National and regional rates of CDI were stratified by outpatient location. Results: Between January 1, 2015, and June 30, 2019, 2,691 cases of outpatient-CDI were reported, and 348 isolates were available for testing. Most cases (1,475 of 2,691, 54.8%) were identified in outpatient clinics, and 72.8% (1,960 of 2,691) were classified as community associated. CDI cases per 100,000 ED visits were highest in 2015, at 10.3, and decreased to 8.1 in 2018. Rates from outpatient clinics decreased from 3.5 in 2016 to 2.7 in 2018 (Fig. 1). Regionally, CDI rates in the ED declined in Central Canada and increased in the West after 2016. Rates in outpatient clinics were >2 times higher in the West compared to other regions. 
RT027 associated with NAP1 was most common among ED patients (26 of 195, 13.3%), whereas RT106 associated with NAP11 was predominant in outpatient clinics (22 of 189, 11.6%). Overall, 10.4% of isolates were resistant to moxifloxacin, 0.5% were resistant to rifampin, and 24.2% were resistant to clindamycin. No resistance was observed for metronidazole, vancomycin, or tigecycline. Compared to CNISP inpatient CDI data, outpatients with CDI were younger (51.8 ± 23.3 vs 64.2 ± 21.6; P < .001), included more females (56.4% vs 50.9%; P < .001), and were more often treated with metronidazole (63.0% vs 56.1%; P < .001). Conclusions: For the first time, CDI cases identified in outpatient settings were characterized in a Canadian context. Outpatient CDI rates are decreasing overall, but they vary by region. Predominant ribotypes vary based on outpatient location. Outpatients with CDI are younger and are more likely female than inpatients with CDI.
Funding: None
Disclosures: Susy Hota reports contract research for Finch Therapeutics.
Background: Carbapenemase-producing Enterobacterales (CPE) have rapidly become a global health concern and are associated with substantial morbidity and mortality due to limited treatment options. Travel to endemic areas, especially healthcare exposure in these areas, is an important risk factor for acquisition. We describe the evolving epidemiology, molecular features, and outcomes of CPE in Canada through surveillance by the Canadian Nosocomial Infection Surveillance Program (CNISP). Methods: CNISP has conducted surveillance for CPE among inpatients and outpatients of all ages since 2010. Participating acute-care facilities submit eligible specimens to the National Microbiology Laboratory for detection of carbapenemase production, and epidemiological data are collected. Incidence rates per 10,000 patient days are calculated based on inpatient data. Results: In total, 59 CNISP hospitals in 10 Canadian provinces representing 21,789 beds and 6,785,013 patient days participated in this surveillance. From 2010 to 2018, 118 (26%) CPE-infected and 547 (74%) CPE-colonized patients were identified. Few pediatric cases were identified (n = 18). Infection incidence rates remain low and stable (0.02 per 10,000 patient days in 2010 to 0.03 per 10,000 patient days in 2018), and colonization incidence rates have increased by 89% over the surveillance period. Overall, 92% of cases were acquired in a healthcare facility: 61% (n = 278) in a Canadian healthcare facility and 31% (n = 142) in a healthcare facility outside Canada. Of the 8% of cases not acquired in a healthcare facility, 50% (16 of 32) reported travel outside of Canada in the 12 months prior to positive culture. The distribution of carbapenemases varied by region; New Delhi metallo-β-lactamase (NDM) was dominant (59%) in western Canada and Klebsiella pneumoniae carbapenemase (KPC) (66%) in central Canada.
NDM and class D carbapenemase OXA-48 were more commonly identified among those who traveled outside of Canada, whereas KPC was more commonly identified among patients without travel. In addition, 30-day all-cause mortality was 14% (25 of 181) among CPE infected patients and 32% (14 of 44) among those with bacteremia. Conclusions: CPE rates remain low in Canada; however, national surveillance data suggest that the increase in CPE in Canada is now being driven by local nosocomial transmission as well as travel and healthcare within endemic areas. Changes in screening practices may have contributed to the increase in colonizations; however, these data are currently lacking and will be collected moving forward. These data highlight the need to intensify surveillance and coordinate infection control measures to prevent further spread of CPE in Canadian acute-care hospitals.
Funding: None
Disclosures: Susy Hota reports contracted research for Finch Therapeutics. Allison McGeer reports funds to her institution for projects for which she is the principal investigator from Pfizer and Merck, as well as consulting fees from the following companies: Sanofi-Pasteur, Sunovion, GSK, Pfizer, and Cidara.
OBJECTIVES/GOALS: Active surveillance (AS) is a recognized strategy to manage low-risk prostate cancer (PCa) in the absence of cancer progression. Little prospective data exists on the decisional factors associated with selecting and adhering to AS in the absence of cancer progression. We developed a survey instrument to predict AS uptake and adherence. METHODS/STUDY POPULATION: We utilized a three-step process to develop and refine a survey instrument designed to predict AS uptake and adherence among men with low-risk PCa: 1) We identified relevant conceptual domains based on prior research and a literature review. 2) We conducted 21 semi-structured concept elicitation interviews to identify patient-perceived barriers and facilitators to AS uptake and adherence among men with a low-risk PCa who had been on AS for ≥1 year. The identified concepts became the basis of our draft survey instrument. 3) We conducted two rounds of cognitive interviews with men with low-risk PCa (n = 12; n = 6) to refine and initially validate the instrument. RESULTS/ANTICIPATED RESULTS: Relevant concepts identified from the initial interviews included the importance of patient: knowledge of their PCa risk, value in delaying treatment, trust in urologist and the AS surveillance protocol, and perceived social support. Initially, the survey was drafted as a single instrument to be administered after a patient had selected AS comprising sections on patient health, AS selection, and AS adherence. Based on the first round of cognitive interviews, we revised the single instrument into two surveys to track shifts in patient preference and experience. The first, administered at diagnosis, focuses on selection, and the second, a 6-month follow up, focuses on adherence. Following revisions, participants indicated the revised 2-part instrument was clear and not burdensome to complete. 
DISCUSSION/SIGNIFICANCE OF IMPACT: The instrument’s content validity was evaluated through cognitive interviews, which supported that the survey items’ intended and understood meanings were isomorphic. In the next phase, we plan to conduct a large-scale prospective cohort study to evaluate the predictive validity, after which it will be available for public research use.
The social model of disability was implemented in the United States partially through the Americans with Disabilities Act (ADA), most notably through certain universal accommodations for physical disabilities. The social model has also been applied to mental health, but the ADA did not provide for universal accommodations in mental health. In this chapter, the authors conduct a systematic review of PubMed and PsycARTICLES to identify evidence for potential universal accommodations in mental health and discuss the policy and ethical considerations of implementing universal accommodations in mental health.
Performing an extended Focused Assessment with Sonography in Trauma (eFAST) exam is common practice in the initial assessment of trauma patients. The objective of this study was to systematically review the published literature on diagnostic accuracy of all components of the eFAST exam.
Methods
We searched Medline and Embase from inception through October 2018 for diagnostic studies examining the sensitivity and specificity of the eFAST exam. After removal of duplicates, 767 records remained for screening, of which 119 underwent full text review. Meta-DiSc™ software was used to create pooled sensitivities and specificities for included studies. Study quality was assessed using the Quality Assessment of Diagnostic Accuracy Studies (QUADAS-2) tool.
Results
Seventy-five studies representing 24,350 patients satisfied our selection criteria. Studies were published between 1989 and 2017. Pooled sensitivities and specificities were calculated for the detection of pneumothorax (69% and 99% respectively), pericardial effusion (91% and 94% respectively), and intra-abdominal free fluid (74% and 98% respectively). Sub-group analysis was completed for detection of intra-abdominal free fluid in hypotensive (sensitivity 74% and specificity 95%), adult normotensive (sensitivity 76% and specificity 98%) and pediatric patients (sensitivity 71% and specificity 95%).
Conclusions
Our systematic review and meta-analysis suggest that eFAST is a useful bedside tool for ruling in pneumothorax, pericardial effusion, and intra-abdominal free fluid in the trauma setting. Its usefulness as a rule-out tool is not supported by these results.
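For readers less familiar with the diagnostic-accuracy metrics pooled above, a minimal sketch of how sensitivity and specificity are computed from a single study’s 2×2 counts (the counts below are hypothetical illustrations, not data from any included study; the abstract’s pooled estimates combine such per-study values across 75 studies using Meta-DiSc):

```python
def sensitivity(tp: int, fn: int) -> float:
    """True-positive rate: proportion of disease-positive patients the test detects."""
    return tp / (tp + fn)

def specificity(tn: int, fp: int) -> float:
    """True-negative rate: proportion of disease-negative patients the test clears."""
    return tn / (tn + fp)

# Hypothetical 2x2 counts for one study of pneumothorax detection (illustration only).
tp, fn, tn, fp = 69, 31, 99, 1
print(f"sensitivity = {sensitivity(tp, fn):.2f}")  # 0.69
print(f"specificity = {specificity(tn, fp):.2f}")  # 0.99
```

A high specificity with modest sensitivity is exactly the pattern the authors describe: positive findings reliably rule the condition in, but negative findings cannot reliably rule it out.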
Cognitive-behavioural therapy (CBT) is an effective treatment for depressed adults. CBT interventions are complex, as they include multiple content components and can be delivered in different ways. We compared the effectiveness of different types of therapy, different components and combinations of components, and aspects of delivery used in CBT interventions for adult depression. We conducted a systematic review of randomised controlled trials in adults with a primary diagnosis of depression, which included a CBT intervention. Outcomes were pooled using a component-level network meta-analysis. Our primary analysis classified interventions according to the type of therapy and delivery mode. We also fitted more advanced models to examine the effectiveness of each content component or combination of components. We included 91 studies and found strong evidence that CBT interventions yielded a larger short-term decrease in depression scores compared to treatment-as-usual, with a standardised difference in mean change of −1.11 (95% credible interval −1.62 to −0.60) for face-to-face CBT, −1.06 (−2.05 to −0.08) for hybrid CBT, and −0.59 (−1.20 to 0.02) for multimedia CBT, whereas wait list control showed a detrimental effect of 0.72 (0.09 to 1.35). We found no evidence of specific effects of any content components or combinations of components. Technology is increasingly used in the context of CBT interventions for depression. Multimedia and hybrid CBT might be as effective as face-to-face CBT, although results need to be interpreted cautiously. The effectiveness of specific combinations of content components and delivery formats remains unclear. Wait list controls should be avoided if possible.
We sought to define the prevalence of echocardiographic abnormalities in long-term survivors of paediatric hematopoietic stem cell transplantation and to determine the utility of screening in asymptomatic patients. We analysed echocardiograms performed on survivors who underwent hematopoietic stem cell transplantation from 1982 to 2006. A total of 389 patients were alive in 2017, with 114 having an echocardiogram obtained ⩾5 years post-infusion. A total of 95 patients had an echocardiogram performed for routine surveillance. The mean time post-hematopoietic stem cell transplantation was 13 years. Of 95 patients, 77 (82.1%) had ejection fraction measured, and 10/77 (13.0%) had ejection fraction z-scores ⩽−2.0, which is abnormally low. Those patients with abnormal ejection fraction were significantly more likely to have been exposed to anthracyclines or total body irradiation. Among individuals who received neither anthracyclines nor total body irradiation, only 1/31 (3.2%) was found to have an abnormal ejection fraction of 51.4%, z-score −2.73. In the cohort of 77 patients, the negative predictive value of having a normal ejection fraction given no exposure to total body irradiation or anthracyclines was 96.7% (95% confidence interval 83.3–99.8%). Systolic dysfunction is relatively common in long-term survivors of paediatric hematopoietic stem cell transplantation who have received anthracyclines or total body irradiation. Survivors who are asymptomatic and did not receive radiation or anthracyclines likely do not require surveillance echocardiograms, unless otherwise indicated.
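The negative predictive value reported above is the proportion of patients without the exposure and with a negative screen who truly have a normal ejection fraction. A hedged sketch of the calculation, paired with a Wilson score interval for the proportion (the abstract does not state which interval method the authors used, and the counts below are hypothetical illustrations, not the study’s data):

```python
import math

def npv(tn: int, fn: int) -> float:
    """Negative predictive value: TN / (TN + FN)."""
    return tn / (tn + fn)

def wilson_ci(successes: int, n: int, z: float = 1.96) -> tuple[float, float]:
    """Wilson score 95% confidence interval for a binomial proportion."""
    p = successes / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return centre - half, centre + half

# Hypothetical counts: 30 of 31 unexposed survivors had a normal ejection fraction.
print(round(npv(30, 1), 3))   # 0.968
print(wilson_ci(30, 31))      # interval roughly (0.84, 0.99)
```

With small denominators like n = 31, the interval is wide, which matches the broad 83.3–99.8% range the abstract reports.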
Examination of rock samples from Job's Hill, St Mary, Jamaica, by chemical, X-ray, differential thermal, and infrared analyses showed predominant dickite with associated nacrite and kaolinite. The uniqueness of this deposit makes it possible that selected material could be used as a standard mineral.
Our primary aim in this paper is to sketch a cognitive evolutionary approach for developing explanations of social change that is anchored in the psychological mechanisms underlying normative cognition and the transmission of social norms. We throw the relevant features of this approach into relief by comparing it with the self-fulfilling social expectations account developed by Bicchieri and colleagues. After describing both accounts, we argue that the two approaches are largely compatible, but that the cognitive evolutionary approach is well suited to encompass much of the social expectations view, whose focus on a narrow range of norms comes at the expense of the breadth the cognitive evolutionary approach can provide.