Given the rate of advancement in predictive psychiatry, there is a risk that these tools will outpace public and professional willingness to use them in clinical care and public health. Prediction tools in psychiatry estimate the risk of future development of mental health conditions. Prediction tools used with young populations have the potential to reduce the worldwide burden of depression. However, little is known globally about adolescents’ and other stakeholders’ attitudes toward use of depression prediction tools. To address this, key informant interviews and focus group discussions were conducted in Brazil, Nepal, Nigeria and the United Kingdom with 23 adolescents, 45 parents, 47 teachers, 48 health-care practitioners and 78 other stakeholders (total sample = 241) to assess attitudes toward using a depression prediction risk calculator based on the Identifying Depression Early in Adolescence Risk Score. Three attributes were identified for an acceptable depression prediction tool: it should be understandable, confidential and actionable. Understandability requires depression literacy, including the ability to distinguish having a condition from being at risk of developing one. Confidentiality concerns centre on who learns an adolescent’s risk status and whether disclosure could impede educational and occupational opportunities. Prediction results must also be actionable through prevention services for high-risk adolescents. Six recommendations are provided to guide research on attitudes and preparedness for implementing prediction tools.
OBJECTIVES/GOALS: The COVID-19 pandemic disrupted established social support networks (faith-based, community, family, friends), resulting in unprecedented health-related, financial, and employment challenges among African Americans (AAs). This study explores the psychosocial influences of the pandemic on the health and wellness of AAs. METHODS/STUDY POPULATION: The FAITH! (Fostering African-American Improvement in Total Health!) Program, an academic-community partnership with AA churches, shifted focus to COVID-19 prevention in AA communities. Funded by the Mayo Clinic Center for Clinical and Translational Science, this cross-sectional study recruited AA adults from FAITH!-affiliated churches and social media to complete a survey exploring the personal impact of pandemic hardships (e.g., food and housing insecurity, difficulty paying utilities) on healthy lifestyle (HL). The primary outcome was difficulty maintaining a HL during the pandemic. Logistic regression (odds ratios and associated 95% confidence intervals (CIs)) was used to examine the associations between difficulty maintaining a HL and factors including COVID-19 hardships and mental health. RESULTS/ANTICIPATED RESULTS: Participants (N=169, 71.4% female, 41.4% essential workers) had a mean age [SD] of 49.4 [14.9] years. Over half (91/169, 54%) reported difficulty maintaining a HL. Those reporting unemployment (OR 2.3; 95% CI [1.2,4.4]; p=0.008), difficulty paying rent (OR 4.1; 95% CI [2.1,8.6]; p<0.001), or difficulty paying for food/utilities (OR 5.5; 95% CI [2.7,11.5]; p<0.001) all had greater odds of difficulty maintaining a HL. High stress (≥5 on a 1–10 scale) was associated with difficulty maintaining a HL (OR 4.1; 95% CI [2.1,8.5]; p<0.001) compared to low stress. Negative mental health (depression (OR 3.4; 95% CI [1.0,13.7]; p<0.001), anger (OR 2.5; 95% CI [0.5,18.9]; p=0.005), and nervousness (OR 4.1; 95% CI [1.1,19.5]; p=0.003)) was associated with difficulty maintaining a HL compared to positive mental health. DISCUSSION/SIGNIFICANCE: Our study findings revealed that COVID-19 hardships, stress, and negative mental health made it more difficult for AAs to maintain a HL. These issues should be considered in the design and implementation of community-based health programs to promote healthy living during future public health emergencies.
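For readers unfamiliar with how odds ratios of this kind are produced, the sketch below shows the general shape of such an analysis in Python. It is illustrative only: the variable names and simulated data are ours, not the study's, and the real model would include the full set of covariates described above.

```python
# Minimal sketch of a logistic regression yielding odds ratios with 95% CIs,
# in the spirit of the analysis described above. Data are simulated and the
# predictor names are hypothetical stand-ins for the survey items.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 169
df = pd.DataFrame({
    "difficulty_hl": rng.integers(0, 2, n),   # 1 = difficulty maintaining a HL
    "unemployed": rng.integers(0, 2, n),
    "rent_hardship": rng.integers(0, 2, n),
    "high_stress": rng.integers(0, 2, n),     # stress >= 5 on a 1-10 scale
})

model = smf.logit(
    "difficulty_hl ~ unemployed + rent_hardship + high_stress", data=df
).fit(disp=False)

# exponentiating coefficients and CI bounds converts log-odds to odds ratios
or_table = np.exp(pd.concat([model.params, model.conf_int()], axis=1))
or_table.columns = ["OR", "2.5%", "97.5%"]
print(or_table)
```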
The Antarctic sea spider Pentanymphon antarcticum is a benthic species in the Southern Ocean, but little is known about its pathogen profile. In this study, we provide a draft genome for a new iridovirus species identified using metagenomic techniques. The draft genome totals 157 260 bp and encodes 188 protein-coding genes. The virus shows greatest protein similarity to a ‘carnivorous sponge-associated iridovirus’ from a deep-sea sponge host. This study represents the first discovery of a pycnogonid iridovirus and the first iridovirus from the Antarctic region.
Vancomycin therapy is associated with an increased risk of acute kidney injury (AKI). Previous studies suggest that area under the curve (AUC) monitoring reduces the risk of AKI, but evidence is lacking for patients receiving longer durations of vancomycin therapy.
Design:
Retrospective cohort study.
Method:
Patients ≥18 years old who were admitted between August 2015 and July 2017 or between October 2017 and September 2019 and who received at least 14 days of intravenous (IV) vancomycin therapy were included in the study. Our primary outcome was the incidence of AKI, defined by Kidney Disease: Improving Global Outcomes criteria, compared between the trough monitoring and AUC monitoring groups. Secondary outcomes included inpatient mortality, median inpatient length of stay, and median intensive care unit length of stay.
Results:
Overall, 582 patients were included in the study, with 318 patients in the trough monitoring group and 264 in the AUC monitoring group. The median duration of vancomycin therapy was 23 days (interquartile range, 16–39). Patients in the trough monitoring group had a higher incidence of AKI than those in the AUC monitoring group (45.6% vs 28.4%, p < 0.001). Furthermore, logistic regression analysis showed that AUC monitoring was associated with 54% lower odds of AKI (OR 0.46, 95% CI [0.31–0.69]). All-cause inpatient mortality was numerically higher in the trough monitoring group (12.9% vs 8.3%, p = 0.078).
Conclusions:
In patients who received at least 14 days of IV vancomycin therapy, AUC monitoring was associated with a lower incidence of AKI.
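The abstract does not specify how AUC was estimated; one widely used bedside approach infers a 24-hour AUC from steady-state peak and trough levels under first-order, one-compartment assumptions. The sketch below implements that textbook method as an illustration, not the study's actual protocol.

```python
import math

def vanco_auc24(c_max, c_min, t_inf, tau):
    """Estimate steady-state vancomycin AUC24 (mg*h/L) from a post-infusion
    peak (c_max) and pre-dose trough (c_min), both in mg/L, assuming
    first-order elimination: a linear trapezoid over the infusion phase and
    a log trapezoid over the elimination phase.
    t_inf = infusion duration (h); tau = dosing interval (h)."""
    ke = math.log(c_max / c_min) / (tau - t_inf)   # elimination rate constant (1/h)
    auc_infusion = (c_min + c_max) / 2 * t_inf     # rise from trough to peak
    auc_elimination = (c_max - c_min) / ke         # exponential decay back to trough
    return (auc_infusion + auc_elimination) * (24 / tau)

# illustrative levels: peak 30 mg/L, trough 12 mg/L, 1-h infusion every 12 h
print(f"AUC24 ~ {vanco_auc24(30, 12, 1, 12):.0f} mg*h/L")  # ~474, within the usual 400-600 target
```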
The objective of this study was to determine antibiotic appropriateness based on Loeb minimum criteria (LMC) in patients with and without altered mental status (AMS).
Design:
Retrospective, quasi-experimental study assessing pooled data from 3 periods pertaining to the implementation of a UTI management guideline.
Setting:
Academic medical center in Lexington, Kentucky.
Patients:
Adult patients aged ≥18 years with a collected urinalysis who received antimicrobial therapy for a UTI indication.
Methods:
Appropriateness of UTI management was assessed across three periods: before the introduction of an institutional UTI guideline, after guideline introduction and education, and after implementation of a prospective audit-and-feedback stewardship intervention, during September–November of 2017, 2018, and 2019, respectively. Patient data were pooled and compared between patients noted to have AMS and those with classic UTI symptoms. Loeb minimum criteria were used to determine whether UTI diagnosis and treatment were warranted.
Results:
In total, 600 patients were included in the study. AMS was one of the most common indications for testing across the 3 periods (19%–30.5%). Among patients with AMS, 25 (16.7%) met LMC, compared with 151 (33.6%) of those without AMS (P < .001).
Conclusions:
Patients with AMS are prescribed antibiotic therapy without symptoms indicative of UTI at a higher rate than those without AMS, according to LMC. Further antimicrobial stewardship efforts should focus on prescriber education and development of clearly defined criteria for patients with and without AMS.
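For readers unfamiliar with the Loeb minimum criteria, the sketch below encodes a simplified paraphrase of the non-catheterized-patient criteria as a rule check. The clinical thresholds are paraphrased approximately and are illustrative only; the published criteria and clinical judgement should govern real decisions.

```python
def meets_loeb_minimum_criteria(acute_dysuria: bool, fever: bool,
                                urgency: bool, frequency: bool,
                                suprapubic_pain: bool, gross_hematuria: bool,
                                cva_tenderness: bool, incontinence: bool) -> bool:
    """Simplified paraphrase of the Loeb minimum criteria for starting
    antibiotics in non-catheterized patients: acute dysuria alone, or
    fever plus at least one new or worsening urinary sign."""
    urinary_signs = any([urgency, frequency, suprapubic_pain,
                         gross_hematuria, cva_tenderness, incontinence])
    return acute_dysuria or (fever and urinary_signs)

# a patient with AMS alone (no dysuria, fever, or urinary signs) does not meet LMC
print(meets_loeb_minimum_criteria(False, False, False, False,
                                  False, False, False, False))  # False
```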
The purpose of this scoping review is two-fold: to assess the literature that quantitatively measures outcomes of mentorship programs designed to support research-focused junior faculty and to identify mentoring strategies that promote diversity within academic medicine mentoring programs.
Methods:
Studies were identified by searching Medline using MeSH terms for mentoring and academic medicine. Eligibility criteria included studies focused on junior faculty in research-focused positions, receiving mentorship, in an academic medical center in the USA, with outcomes collected to measure career success (career trajectory, career satisfaction, quality of life, research productivity, leadership positions). Data were abstracted using a standardized data collection form, and best practices were summarized.
Results:
The search returned 1,842 articles for title and abstract review, with 27 manuscripts meeting inclusion criteria. Two studies focused specifically on women, and four focused on junior faculty from racial/ethnic backgrounds underrepresented in medicine. From the initial search, few studies were designed specifically to increase diversity or to capture outcomes relevant to promotion within academic medicine. Of those that did, most captured the impact on research productivity and career satisfaction. Traditional one-on-one mentorship, structured peer mentorship facilitated by a senior mentor, and peer mentorship in combination with one-on-one mentorship were found to be effective strategies for facilitating research productivity.
Conclusion:
Efforts are needed at the mentee, mentor, and institutional level to provide mentorship to diverse junior faculty on research competencies and career trajectory, create a sense of belonging, and connect junior faculty with institutional resources to support career success.
Higher milk intake has been associated with a lower stroke risk, but not with risk of CHD. Residual confounding or reverse causation cannot be excluded. Therefore, we estimated the causal association of milk consumption with stroke and CHD risk through instrumental variable (IV) and gene-outcome analyses. IV analysis included 29 328 participants (4611 stroke; 9828 CHD) of the European Prospective Investigation into Cancer and Nutrition (EPIC)-CVD (eight European countries) and European Prospective Investigation into Cancer and Nutrition-Netherlands (EPIC-NL) case-cohort studies. rs4988235, a lactase persistence (LP) SNP that enables digestion of lactose in adulthood, was used as the genetic instrument. Intake of milk was first regressed on rs4988235 in a linear regression model. Next, associations of genetically predicted milk consumption with stroke and CHD were estimated using Prentice-weighted Cox regression. Gene-outcome analysis included 777 024 participants (50 804 cases) from MEGASTROKE (including EPIC-CVD), UK Biobank and EPIC-NL for stroke, and 483 966 participants (61 612 cases) from CARDIoGRAM, UK Biobank, EPIC-CVD and EPIC-NL for CHD. In IV analyses, each additional LP allele was associated with a higher intake of milk in EPIC-CVD (β = 13·7 g/d; 95 % CI 8·4, 19·1) and EPIC-NL (36·8 g/d; 95 % CI 20·0, 53·5). Genetically predicted milk intake was not associated with stroke (HR per 25 g/d 1·05; 95 % CI 0·94, 1·16) or CHD (1·02; 95 % CI 0·96, 1·08). In gene-outcome analyses, there was no association of rs4988235 with risk of stroke (OR 1·02; 95 % CI 0·99, 1·05) or CHD (OR 0·99; 95 % CI 0·95, 1·03). The current Mendelian randomisation analysis does not provide evidence for a causal inverse relationship between milk consumption and stroke or CHD risk.
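With a single instrument such as rs4988235, the causal estimate reduces to the Wald ratio: the gene-outcome association divided by the gene-exposure association. The sketch below shows that arithmetic; the gene-exposure coefficient is taken from the abstract (13·7 g/d per LP allele in EPIC-CVD), while the gene-outcome numbers are invented purely for illustration.

```python
import numpy as np

def wald_ratio(beta_gene_exposure, beta_gene_outcome, se_gene_outcome):
    """Single-instrument Mendelian randomisation: the Wald ratio estimate
    of the causal effect of exposure on outcome, with a first-order
    standard error that ignores uncertainty in the denominator."""
    beta_iv = beta_gene_outcome / beta_gene_exposure
    se_iv = se_gene_outcome / abs(beta_gene_exposure)
    return beta_iv, se_iv

# 13.7 g/d of milk per LP allele is from the abstract; the log-odds of
# stroke per allele and its SE below are hypothetical illustration values
beta, se = wald_ratio(13.7, 0.02, 0.015)
print(f"log-OR per g/d of milk: {beta:.4f} (SE {se:.4f})")
print(f"OR per 25 g/d of milk:  {np.exp(25 * beta):.2f}")
```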
Coronavirus disease 2019 (COVID-19) has migrated to regions that were initially spared, and it is likely that different populations are currently at risk for illness. Herein, we present our observations of the change in characteristics and resource use of COVID-19 patients over time in a national system of community hospitals to help inform those managing surge planning, operational management, and future policy decisions.
Few studies have derived data-driven dietary patterns in youth in the USA. This study examined data-driven dietary patterns and their associations with BMI measures in predominantly low-income, racial/ethnic minority US youth. Data were from baseline assessments of the four Childhood Obesity Prevention and Treatment Research (COPTR) Consortium trials: NET-Works (534 2–4-year-olds), GROW (610 3–5-year-olds), GOALS (241 7–11-year-olds) and IMPACT (360 10–13-year-olds). Weight and height were measured. Children/adult proxies completed three 24-h dietary recalls. Dietary patterns were derived for each site from twenty-four food/beverage groups using k-means cluster analysis. Multivariable linear regression models examined associations of dietary patterns with BMI and percentage of the 95th BMI percentile. Healthy (produce and whole grains) and Unhealthy (fried food, savoury snacks and desserts) patterns were found in NET-Works and GROW. GROW additionally had a dairy- and sugar-sweetened beverage-based pattern. GOALS had a similar Healthy pattern and a pattern resembling a traditional Mexican diet. Associations between dietary patterns and BMI were only observed in IMPACT. In IMPACT, youth in the Sandwich (cold cuts, refined grains, cheese and miscellaneous) compared with Mixed (whole grains and desserts) cluster had significantly higher BMI (β = 0·99 (95 % CI 0·01, 1·97)) and percentage of the 95th BMI percentile (β = 4·17 (95 % CI 0·11, 8·24)). Healthy and Unhealthy patterns were the most common dietary patterns in COPTR youth, but diets may differ according to age, race/ethnicity or geographic location. Public health messages focused on healthy dietary substitutions may help youth mimic a dietary pattern associated with lower BMI.
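As an illustration of the k-means step described above, the sketch below clusters a simulated child-by-food-group intake matrix. The data, the choice of three clusters, and the standardization step are our assumptions for the example; the trials each derived their own cluster solutions from twenty-four food/beverage groups.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# rows = children, columns = 24 food/beverage groups (mean servings/day
# over three 24-h recalls); values are simulated for illustration
rng = np.random.default_rng(42)
intakes = rng.gamma(shape=2.0, scale=0.5, size=(500, 24))

# standardize so high-volume food groups do not dominate the distances
X = StandardScaler().fit_transform(intakes)

# the number of clusters is a modelling choice, typically compared across
# candidate solutions for stability and interpretability
km = KMeans(n_clusters=3, n_init=25, random_state=0).fit(X)

# each centroid shows which food groups load highest on a pattern, which
# is how labels such as "Healthy" or "Unhealthy" get assigned
for k, center in enumerate(km.cluster_centers_):
    top = np.argsort(center)[::-1][:3]
    print(f"pattern {k}: highest-loading food-group columns {top.tolist()}")
```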
To determine risk factors for mortality among COVID-19 patients admitted to a system of community hospitals in the United States.
Design:
Retrospective analysis of patient data collected from the routine care of COVID-19 patients.
Setting:
System of >180 acute-care facilities in the United States.
Participants:
All admitted patients with positive identification of COVID-19 and a documented discharge as of May 12, 2020.
Methods:
We determined demographic characteristics, vital signs at admission, patient comorbidities, and recorded discharge disposition in this population to construct a logistic regression estimating the odds of mortality, particularly for those patients characterized as not critically ill at admission.
Results:
In total, 6,180 COVID-19+ patients were identified as of May 12, 2020. Most COVID-19+ patients (4,808, 77.8%) were admitted directly to a medical-surgical unit with no documented critical care or mechanical ventilation within 8 hours of admission. After adjusting for demographic characteristics, comorbidities, and vital signs at admission in this subgroup, the largest driver of the odds of mortality was patient age (OR, 1.07; 95% CI, 1.06–1.08; P < .001). Decreased oxygen saturation at admission was associated with increased odds of mortality (OR, 1.09; 95% CI, 1.06–1.12; P < .001) as was diabetes (OR, 1.57; 95% CI, 1.21–2.03; P < .001).
Conclusions:
The identification of factors observable at admission that are associated with mortality in COVID-19 patients who are initially admitted to non-critical care units may help care providers, hospital epidemiologists, and hospital safety experts better plan for the care of these patients.
Cognitive deficits affect a significant proportion of patients with bipolar disorder (BD). Problems with sustained attention have been found independent of mood state and the causes are unclear. We aimed to investigate whether physical parameters such as activity levels, sleep, and body mass index (BMI) may be contributing factors.
Methods
Forty-six patients with BD and 42 controls completed a battery of neuropsychological tests and wore a triaxial accelerometer for 21 days, which collected information on physical activity, sleep, and circadian rhythm. Ex-Gaussian analyses were used to characterise reaction time distributions. We used hierarchical regression analyses to examine whether physical activity, BMI, circadian rhythm, and sleep predicted variance in performance on the cognitive tasks.
Results
Neither physical activity, BMI, nor circadian rhythm predicted significant variance on any of the cognitive tasks. However, the presence of a sleep abnormality significantly predicted a higher intra-individual variability of the reaction time distributions on the Attention Network Task.
Conclusions
This study suggests that there is an association between sleep abnormalities and cognition in BD, with little or no relationship with physical activity, BMI, and circadian rhythm.
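The ex-Gaussian model referenced in the Methods treats each reaction time as a normal component (mean mu, spread sigma) plus an exponential tail (tau), which is what makes it sensitive to intra-individual variability. A minimal fitting sketch with simulated reaction times, using SciPy's exponnorm parameterisation, is shown below; the parameter values are arbitrary examples.

```python
import numpy as np
from scipy.stats import exponnorm

# simulate reaction times (seconds): Gaussian component plus exponential tail
rng = np.random.default_rng(1)
mu, sigma, tau = 0.45, 0.05, 0.15
rts = rng.normal(mu, sigma, 1000) + rng.exponential(tau, 1000)

# SciPy parameterises the ex-Gaussian as exponnorm with K = tau/sigma,
# loc = mu, scale = sigma; recover the conventional parameters after fitting
K, loc, scale = exponnorm.fit(rts)
print(f"mu ~ {loc:.3f}, sigma ~ {scale:.3f}, tau ~ {K * scale:.3f}")
```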
Proximal environments could facilitate smoking cessation among low-income smokers by making quitting both appealing and tenable.
Aims
We sought to examine how home smoking rules and proximal environmental factors such as other household members' and peers' smoking behaviors and attitudes related to low-income smokers' past quit attempts, readiness, and self-efficacy to quit.
Methods
This analysis used baseline survey data from the Offering Proactive Treatment Intervention (OPT-IN) study, a randomized controlled trial of proactive tobacco cessation outreach, completed by 2,406 participants in 2011/12. We tested the associations between predictors (home smoking rules and proximal environmental factors) and outcomes (past-year quit attempts, readiness to quit, and quitting self-efficacy).
Results
Smokers who lived in homes with more restrictive household smoking rules, and/or who reported having ‘important others’ supportive of their quitting, were more likely to report a quit attempt in the past year and reported greater readiness to quit and greater self-efficacy related to quitting.
Conclusions
Adjustments to proximal environments, including strengthening household smoking rules, might encourage cessation even if other household members are smokers.
Degradation of contaminant hydrocarbons in groundwater by microbially mediated oxidation, linked to the reduction of electron acceptors, is fundamental to the strategy of ‘monitored natural attenuation’ (MNA) for oxidizable hydrocarbons, which is increasingly being adopted at polluted aquifer sites throughout Europe and North America. Commonly, oxygen is depleted and, following the reduction of nitrate, solid-phase Fe oxides become the dominant electron acceptors. Arsenic, associated with Fe and Mn oxides in soils and sediments, may therefore be mobilized to groundwater and pose an additional threat to environmental receptors. In a pilot study of three aquifers in England, we have examined the extent to which arsenic is released to groundwater under Fe(III)-reducing conditions imposed by contaminant hydrocarbons. Results show that arsenic is locally mobilized in the Chalk to <10 μg/l, in Quaternary gravels to 70 μg/l and in the Triassic sandstones to 160 μg/l. At the Chalk and Quaternary gravels sites arsenic mobilization is demonstrably linked to reduction of Fe- and Mn-oxides. This is not so at the Triassic sandstone site, where release of arsenic is related to elevated bicarbonate alkalinity. Redox-driven arsenic mobilization at other Triassic sandstone locations is possible. Further work is required on the solid-phase sources of arsenic in the aquifers, and to relate the hydrochemical observations to groundwater hydraulic conditions.
The Carnmenellis granite and its aureole contain the only recorded thermal groundwaters (up to 52 °C) in British granites. They occur as springs in tin mines at depths between 200 and 700 m and most are saline (maximum mineralization 19 310 mg l⁻¹). Mining activity has disturbed the groundwater circulation pattern developed over a geological time-scale, and levels of bomb-produced tritium (> 4 TU) indicate that a significant component (up to 65 %) of the most saline waters is of recent origin. All components of all the mine waters are of meteoric origin. Radiogenic ⁴He contents, ⁴⁰Ar/³⁶Ar ratios, and uranium series geochemistry suggest that the thermal component has a likely residence time of at least 5 × 10⁴ years and probably of order 10⁶ years.
The thermal waters have molar Na⁺/Cl⁻ ratios considerably less than 1 but they are enriched relative to sea water in all major cations except Mg. The groundwater is also particularly enriched in Li, with contents ranging up to 125 mg l⁻¹. The groundwater salinity, which may reach a maximum of 30 000 mg l⁻¹, is shown to result from weathering reactions of biotite (probably through a chloritization step) and plagioclase feldspar, to kaolinite. On volumetric considerations, fluid inclusions cannot contribute significantly to the groundwater salinity, and stable isotope ratios rule out any contribution from sea water.
Groundwater silica contents and molar Na⁺/K⁺ ratios suggest that the likely equilibration temperature is 54 °C, which would imply a depth of circulation of about 1.2 km.
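For readers unfamiliar with chemical geothermometry, the sketch below shows two standard calibrations of the kind invoked here (quartz solubility and Na/K exchange). The calibration constants are the commonly cited Fournier-type formulas, which may differ from those the paper actually used, and the input concentrations are illustrative values chosen to land near the quoted 54 °C.

```python
import math

def quartz_geothermometer(sio2_mg_per_kg):
    """Quartz (no steam loss) geothermometer, Fournier-type calibration:
    T(degC) = 1309 / (5.19 - log10[SiO2]) - 273.15, SiO2 in mg/kg."""
    return 1309.0 / (5.19 - math.log10(sio2_mg_per_kg)) - 273.15

def na_k_geothermometer(na_mg_per_kg, k_mg_per_kg):
    """Na/K geothermometer, Fournier-type calibration:
    T(degC) = 1217 / (log10(Na/K) + 1.483) - 273.15."""
    return 1217.0 / (math.log10(na_mg_per_kg / k_mg_per_kg) + 1.483) - 273.15

# illustrative inputs only; the paper's measured concentrations are not given here
print(f"quartz: {quartz_geothermometer(16):.0f} degC")      # ~55 degC
print(f"Na/K:   {na_k_geothermometer(1730, 10):.0f} degC")  # ~54 degC
```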
Introduction: BACKGROUND In the modern era of terrorism and senseless violence, it is essential that hospital staff have expertise in implementing a mass casualty incident (MCI) plan. OBJECTIVES 1. To assess current gaps in the implementation of an academic urban hospital's code orange plan using a live simulation and tabletop exercise. 2. To identify and educate front-line staff to champion a hospital-wide MCI plan. INNOVATION Historically, in order to limit resource utilization and impact on patient care, disaster response training of front-line staff involved tabletop exercises only. The tenets of experiential learning suggest that learner engagement through realistic, active practice of skills achieves deeper uptake of new knowledge. We enhanced the traditional tabletop approach through the novel use of live actor patients presenting to an academic, urban emergency department (ED) during a hospital-wide MCI simulation. Methods: To assess the current code orange plan, an interprofessional committee comprising expert leaders in trauma, emergency preparedness, emergency medicine and simulation integrated tabletop and live simulation to stage an MCI based on a mock incident at a new subway station. ED staff, the trauma team and champions from medicine, surgery and critical care participated, along with support departments such as Patient Flow, Patient Transport, Environmental Services and the Hospital Emergency Operations Centre. Ten live actor patients and eight virtual patients presented to the ED. The exercise occurred in situ in the ED. Other participating departments conducted tabletop exercises and received live actor patients. Results: Staff decanted the ED and other participating units using their current knowledge of the hospital's code orange policy. Live and virtual patients were triaged and managed according to the severity of their injuries. Live actor patients were assessed, treated and transported to their designated units. Virtual patients were managed through verbal discussion with the simulation controllers. An ED debrief took place using a plus/delta approach, followed by a hospital-wide debrief. Conclusion: An interprofessional, hospital-wide MCI simulation revealed important challenges in communication, command and control, and patient tracking. The exercise ignited enthusiasm and commitment to longitudinal practice and improvement of identified gaps.
An original cohort study found that over half of the individuals detained under Section 136 (S136) of the Mental Health Act 1983 were discharged home after assessment, and nearly half were intoxicated.
Aims
To investigate whether the cohort was followed up by psychiatric services, characterise those repeatedly detained and assess whether substance use was related to these outcomes.
Method
Data were retrospectively collected from the notes of 242 individuals, who presented after S136 detention to a place of safety over a 6-month period, and were followed up for 1 year.
Results
After 1 year, 48% were in secondary care. Those with psychosis were the most likely to be admitted. Diagnoses of personality disorder or substance use were associated with multiple detentions; however, few were in contact with secondary services.
Conclusions
Crisis and long-term care pathways for these groups need to be developed to reduce repeated and unnecessary police detention.
We present a detailed, complete glacier inventory for Alaska and neighboring Canada using multi-sensor satellite data from 2000 to 2011. For each glacier, we derive outlines and 51 variables, including center-line lengths, outline types and debris cover. We find 86 723 km² of glacier area (27 109 glaciers >0.025 km²), ∼12% of the global glacierized area outside ice sheets. Of this area, 12.0% is drained by 39 marine-terminating glaciers (74 km of tidewater margin), and 19.3% by 148 lake- and river-terminating glaciers (420 km of lake-/river margin). The overall debris cover is 11%, with considerable differences among regions, ranging from 1.4% in the Kenai Mountains to 28% in the Central Alaska Range. Comparison of outlines from different sources on >2500 km² of glacierized area yields a total area difference of ∼10%, emphasizing the difficulties in accurately delineating debris-covered glaciers. Assuming fully correlated (systematic) errors, uncertainties in area reach 6% for all Alaska glaciers, but further analysis is needed to explore adequate error correlation scales. Preliminary analysis of the glacier database yields a new set of well-constrained area/length scaling parameters and shows good agreement between our area–altitude distributions and previously established synthetic hypsometries. The new glacier database will be valuable to further explore relations between glacier variables and glacier behavior.
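Area/length scaling parameters of the kind mentioned above are conventionally obtained by fitting L = c·A^q as a straight line in log-log space. The sketch below demonstrates the fit on simulated inventory data; the constants used to generate the data are placeholders, not the paper's fitted values.

```python
import numpy as np

# simulated glacier inventory: areas (km^2) log-uniform over ~4.5 decades,
# lengths (km) generated from a power law with multiplicative scatter;
# the generating constants 1.8 and 0.55 are arbitrary placeholders
rng = np.random.default_rng(7)
areas = 10 ** rng.uniform(-1.5, 3.0, 500)
lengths = 1.8 * areas**0.55 * rng.lognormal(0.0, 0.2, 500)

# fitting log10(L) = q*log10(A) + log10(c) recovers the scaling parameters
q, log_c = np.polyfit(np.log10(areas), np.log10(lengths), 1)
print(f"fitted scaling: L = {10**log_c:.2f} * A^{q:.2f}")
```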
Three annual plant species, erect plantain (Plantago erecta Morris), common chickweed [Stellaria media (L.) Cyrill.] and silver hairgrass (Aira caryophyllea L.), are commonly found and may dominate a unique flora on areas sprayed with paraquat (1,1′-dimethyl-4,4′-bipyridinium ion) in cismontane rangelands of California. The basis of this phenomenon is shown to be temperature-related germination requirements, novel seed characteristics, and lack of competition.
A technique was developed for seeding rangelands which are too steep or too rocky to seed by current methods. Hardinggrass (Phalaris tuberosa L. var. stenoptera (Hack.) Hitchc.) and subclover (Trifolium subterraneum L.) were established by seeding immediately after spraying the resident vegetation with 1,1′-dimethyl-4,4′-bipyridinium ion (paraquat). Tested for seeding in sod were single-disk, double-disk, and hoe-type drill openers. The double disk was best adapted to the clay soils most common in the area. A heavy-duty rangeland drill was modified with custom-made, heavy, double-disk openers and equipped with a spray system which sprays either bands or full coverage. The resulting planter will kill weeds, plant seeds, and spread fertilizer any place where a crawler tractor can pull it. Weed-free bands of 6 and 12 inches were compared with full spray coverage. No hardinggrass was established without some weed control. In only 5 of 16 tests over a 5-year period was full-spray coverage superior to the 6-inch band. The 12-inch band or full spray may be preferable on shallow soils or soils of low water-holding capacity. Spraying helped establish subclover but, unlike with hardinggrass, was not critical. Grazing or mowing during the establishment period improved stands of both hardinggrass and subclover. Prolonged weed control made paraquat superior to cultivation by giving better weed control and a firmer seedbed.
The phenology of reproduction was highly variable among 23 selections of medusahead (Taeniatherum asperum (Sim.) Nevski). Selections with markedly early and late maturity were observed. The phenology generally was consistent during 4 years of testing at Reno, Nevada, and during 2 years at Davis, California. Individual selections differed greatly in phenology between the two locations. Some selections exhibited phenotypic plasticity in phenology when grown in competition with other weeds.