The 21st Century Cures Act mandated that new medication research include patient-focused drug development initiatives. The act also recognized certified peer support specialists (CPSSs) as integral members of the healthcare team. Inclusion of CPSSs within care teams is associated with reduced hospitalization, increased treatment engagement, and a renewed focus on patient-desired outcomes. CPSSs are people with lived experience in navigating complex mental health systems whose unique perspective helps guide peers on their journey to wellness. In the same manner that CPSS knowledge has improved clinical outcomes, partnering with CPSSs during CNS drug development may yield wellness outcomes in clinical trials that are more meaningful for people with lived experience. To this end, a CPSS Ambassador program was initiated.
Methods
Of 85 peer support specialists identified through internet searches, LinkedIn, and peer support specialist registries, 7 CPSSs met our criteria (i.e., having lived experience of psychosis and being a member of a treatment team) and agreed to be part of our ambassador program. Interactions included 6 monthly virtual meetings and a live roundtable meeting. The objectives of the program were to: 1) understand unmet needs in people with lived experience and identify impediments to effective treatment, 2) learn best practices for discussing medication use to support wellness, 3) identify resources that can help educate people and families with lived experience, and 4) highlight the importance of CPSSs within healthcare teams to optimize treatment outcomes.
Results
This CPSS ambassador program emphasized the need for shared decision-making and partnership to forge a positive treatment team alliance. As such, treatment goals should be tailored to patients’ needs (“nothing about me without me”). A major obstacle to effective treatment is the presence of bias or stigma among health care practitioners. Specifically, certain language used by clinicians has the potential to ostracize patients and negatively impact treatment. Medications should be discussed as one pillar of a larger treatment plan and not as a “fix” for symptoms. Educational resources written in lay terms are needed to explain treatment algorithms and medication side effects. Finally, CPSSs make a significant contribution to person-focused positive outcomes and are an essential part of the treatment team; they are a conduit of lived experience and advocate for the individual.
Conclusions
This work together illuminated the following key outcomes: CPSSs are liaisons who facilitate the intersection between the treatment team and people utilizing mental health systems. CPSSs are critical to successful navigation of the mental health care system and to reaching desired outcomes. Best practices for treatment teams center on effective, person-based, stigma-free partnerships that produce positive, patient-focused outcomes.
Funding
Sumitomo Pharma America (formerly Sunovion Pharmaceuticals Inc)
Background: Central-line–associated bloodstream infections (CLABSIs) are linked with significant morbidity and mortality. An NHSN laboratory-confirmed bloodstream infection (LCBSI) has specific criteria for ascribing an infection to the central line or not, and the criteria used to associate the pathogen with another site are restrictive. The objective of this study was to better classify CLABSIs using enhanced criteria to gain a comprehensive understanding of attribution errors so that appropriate reduction efforts can be applied. Methods: We conducted a retrospective review of medical records with NHSN-identified CLABSI from July 2017 to December 2018 at 2 geographically proximate hospitals. Charts were reviewed by trained infectious diseases personnel from 2 tertiary-care academic medical centers: the University of Virginia Health System, a 600-bed medical center in Charlottesville, Virginia, and the Virginia Commonwealth University Health System, an 865-bed medical center in Richmond, Virginia. We defined “overcaptured” CLABSIs (O-CLABSIs) in different categories: O-CLABSI-1 is bacteremia attributable to a primary infectious source; O-CLABSI-2 is bacteremia attributable to neutropenia with gastrointestinal translocation not meeting mucosal barrier injury criteria; O-CLABSI-3 is a positive blood culture attributable to a contaminant; and O-CLABSI-4 is a patient injecting into the line, though not officially documented. Descriptive analyses were performed using the χ2 and Fisher exact tests. Results: We found a large number of O-CLABSIs on chart review (79 of 192, 41%). Overall, 56 of 192 (29%) LCBSIs were attributable to a primary infectious source not meeting the NHSN definition. O-CLABSI proportions between the 2 hospitals were statistically different; hospital A identified 34 of 59 (58%) of their NHSN-identified CLABSIs as O-CLABSIs, and hospital B identified 45 of 133 (34%) as O-CLABSIs (P = .0020) (Table 1). When comparing O-CLABSI types, hospital B had a higher percentage of O-CLABSI-1 than hospital A: 76% versus 64%. Hospital A had a higher proportion of O-CLABSI-2: 21% versus 7%. Hospitals A and B had similar proportions of O-CLABSI-3: 15% versus 18%. These differences were all statistically significant (P < .0001). Discussion: The results from these 2 geographically proximate systems indicate that O-CLABSIs are common. Attribution can vary significantly between institutions, likely depending on differences in the incidence of true CLABSI, patient populations, protocols, and protocol compliance. These findings have implications for interfacility comparisons of publicly reported data. Most importantly, erroneous attribution can result in missed opportunities to direct patient safety efforts to the root cause of the bacteremia and could lead to inappropriate treatment.
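The between-hospital comparison above reduces to a 2 × 2 table (O-CLABSI versus other LCBSI, by hospital). As a minimal sketch, assuming the counts reported here and no continuity correction (the original analysis settings are not stated), the χ2 and Fisher exact tests could be reproduced as follows; all variable names are illustrative.

```python
# Sketch: chi-square and Fisher exact tests for the hospital A vs. hospital B
# O-CLABSI proportions reported in the abstract (34/59 vs. 45/133).
from scipy.stats import chi2_contingency, fisher_exact

# Rows: hospital A, hospital B; columns: O-CLABSI, other LCBSI
table = [[34, 59 - 34],
         [45, 133 - 45]]

chi2, p_chi2, dof, expected = chi2_contingency(table, correction=False)
odds_ratio, p_fisher = fisher_exact(table)

print(f"chi-square = {chi2:.2f}, p = {p_chi2:.4f}")  # close to the reported P = .0020
print(f"Fisher exact p = {p_fisher:.4f}")
```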
Funding: None
Disclosures: Michelle Doll, Research Grant from Molnlycke Healthcare
An experiment was conducted to test the hypothesis that meat products have digestible indispensable amino acid scores (DIAAS) >100 and that various processing methods will increase standardised ileal digestibility (SID) of amino acids (AA) and DIAAS. Nine ileal-cannulated gilts were randomly allotted to a 9 × 8 Youden square design with nine diets and eight 7-d periods. Values for SID of AA and DIAAS for two reference patterns were calculated for salami, bologna, beef jerky, raw ground beef, cooked ground beef and ribeye roast heated to 56, 64 or 72°C. The SID of most AA was not different among salami, bologna, beef jerky and cooked ground beef, but was less (P < 0·05) than the values for raw ground beef. The SID of AA for 56°C ribeye roast was not different from the values for raw ground beef and 72°C ribeye roast, but greater (P < 0·05) than those for 64°C ribeye roast. For older children, adolescents and adults, the DIAAS for all proteins, except cooked ground beef, were >100 and bologna and 64°C ribeye roast had the greatest (P < 0·05) DIAAS. The limiting AA for this age group were sulphur AA (beef jerky), leucine (bologna, raw ground beef and cooked ground beef) and valine (salami and the three ribeye roasts). In conclusion, meat products generally provide high-quality protein with DIAAS >100 regardless of processing. However, overcooking meat may reduce AA digestibility and DIAAS.
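For context, DIAAS is calculated per indispensable amino acid as the digestible AA content of the test protein divided by the content of the same AA in the age-specific reference pattern, with the lowest ratio (×100) taken as the score and the corresponding AA declared limiting. The sketch below shows only that arithmetic; all amino-acid contents, SID values, and reference-pattern values are illustrative placeholders, not data from this experiment.

```python
# Sketch of a DIAAS calculation. For each indispensable amino acid (AA),
# digestible AA (content x SID) per g of protein is divided by the AA level
# in the reference pattern; the smallest ratio x 100 is the DIAAS and the
# corresponding AA is the limiting one. Numbers are illustrative only.

aa_content = {"Leu": 80.0, "Val": 48.0, "SAA": 35.0}   # mg AA per g crude protein
sid = {"Leu": 0.92, "Val": 0.90, "SAA": 0.88}          # standardised ileal digestibility
reference = {"Leu": 61.0, "Val": 40.0, "SAA": 23.0}    # reference pattern, mg AA per g protein

ratios = {aa: 100 * aa_content[aa] * sid[aa] / reference[aa] for aa in aa_content}
limiting_aa = min(ratios, key=ratios.get)
diaas = round(ratios[limiting_aa])

print(ratios)
print(f"DIAAS = {diaas} (limiting amino acid: {limiting_aa})")
```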
Demoralization is prevalent in patients with life-limiting chronic illnesses, many of whom reside in rural areas. These patients also have an increased risk of disease-related psychosocial burden due to the unique health barriers in this population. However, the factors affecting demoralization in this cohort are currently unknown. This study aimed to examine demoralization amongst the chronically ill in Lithgow, a town in rural New South Wales, Australia, and identify any correlated demographic, physical, and psychosocial factors in this population.
Method
A cross-sectional survey of 73 participants drawn from Lithgow Hospital and the adjoining retirement village and nursing home was conducted, assessing correlated demographic, physical, psychiatric, and psychosocial factors.
Results
The total mean score of the DS-II was 7.8 (SD 26.4), and high demoralization scores were associated with level of education (p = 0.01), comorbid conditions (p = 0.04), severity of symptom burden (p < 0.001), depression (p < 0.001), and psychological distress (p < 0.001). The prevalence of serious demoralization in this population was 27.4% (20 of 73) based on a DS-II cutoff score of ≥11. Of those, 11 (15%) met the criteria for clinical depression, leaving 9 (12.3%) of the cohort demoralized but not depressed.
Significance of results
Prevalence of demoralization was high in this population. In line with the existing literature, demoralization was associated with the level of education, symptom burden, and psychological distress, demonstrating that demoralization is a relevant psychometric factor in rural populations. Further stratification of the unique biopsychosocial factors at play in this population would contribute to better understanding the burdens experienced by people with chronic illness in this population and the nature of demoralization.
Reforestation in the Inland Northwest, including northeastern Oregon, USA, is often limited by a dry climate and soil moisture availability during the summer months. Reduction of competing vegetative cover in forest plantations is a common method for retaining available soil moisture. Several spring and summer site preparation (applied prior to planting) herbicide treatments were evaluated to determine their efficacy in reducing competing cover, and thus retaining soil moisture, on three sites in northeastern Oregon. Results varied by site, year, and season of application. In general, sulfometuron (0.14 kg ai ha–1 alone and in various mixtures), imazapyr (0.42 kg ae ha–1), and hexazinone (1.68 kg ai ha–1) resulted in 3 to 17% cover of forbs and grasses in the first year when applied in spring. Sulfometuron+glyphosate (2.2 kg ha–1) consistently reduced grasses and forbs for the first year when applied in summer, but forbs recovered in the second year on two of three sites. Aminopyralid (0.12 kg ae ha–1)+sulfometuron applied in summer also provided comparable control of forb cover. In the second year after treatment, forb cover in treated plots was similar to levels in nontreated plots, and some forb species had increased relative to nontreated plots. Imazapyr at either rate (0.21 or 0.42 kg ha–1), applied in spring or summer 2007, or at a lower rate (0.14 kg ha–1) with glyphosate in summer, provided the best control of shrubs, of which snowberry was the dominant species. Total vegetative cover was similar across all treatments seven and eight years after application, and differences in vegetation were related to site rather than treatment. In the first year after treatment, rates of soil moisture depletion in the 0- to 23-cm depth were correlated with vegetative cover, particularly late-season soil moisture, suggesting increased water availability for tree seedling growth.
New approaches are needed to safely reduce emergency admissions to hospital by targeting interventions effectively in primary care. A predictive risk stratification tool (PRISM) identifies each registered patient's risk of an emergency admission in the following year, allowing practitioners to identify and manage those at higher risk. We evaluated the introduction of PRISM in primary care in one area of the United Kingdom, assessing its impact on emergency admissions and other service use.
METHODS:
We conducted a randomized stepped wedge trial with cluster-defined control and intervention phases, and participant-level anonymized linked outcomes. PRISM was implemented in eleven primary care practice clusters (total thirty-two practices) over a year from March 2013. We analyzed routine linked data outcomes for 18 months.
RESULTS:
We included outcomes for 230,099 registered patients, assigned to ranked risk groups.
Overall, the rate of emergency admissions was higher in the intervention phase than in the control phase: adjusted difference in number of emergency admissions per participant per year at risk, delta = .011 (95 percent Confidence Interval, CI .010, .013). Patients in the intervention phase spent more days in hospital per year: adjusted delta = .029 (95 percent CI .026, .031). Both effects were consistent across risk groups.
Primary care activity increased in the intervention phase overall: delta = .011 (95 percent CI .007, .014), except for the two highest risk groups, which showed a decrease in the number of days with recorded activity.
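For reference, a crude (unadjusted) rate of emergency admissions per participant per year at risk is simply the number of admissions divided by the person-years of follow-up in each phase, and the crude difference is the gap between those two rates; the sketch below shows only that arithmetic with invented counts, not the trial's adjusted stepped-wedge analysis.

```python
# Sketch: crude admission rates per participant-year at risk and their
# difference between study phases. All counts and person-years below are
# invented for illustration; the reported deltas are model-adjusted.

def rate_per_person_year(admissions: int, person_years: float) -> float:
    return admissions / person_years

control_rate = rate_per_person_year(admissions=18_500, person_years=150_000.0)
intervention_rate = rate_per_person_year(admissions=21_500, person_years=160_000.0)

crude_delta = intervention_rate - control_rate
print(f"crude difference = {crude_delta:.3f} admissions per participant per year at risk")
```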
CONCLUSIONS:
Introduction of a predictive risk model in primary care was associated with increased emergency episodes across the general practice population and at each risk level, in contrast to the intended purpose of the model. Future evaluation work could assess the impact of targeting different services to patients across different levels of risk, rather than the current policy focus on those at highest risk.
Emergency admissions to hospital are a major financial burden on health services. In one area of the United Kingdom (UK), we evaluated a predictive risk stratification tool (PRISM) designed to support primary care practitioners to identify and manage patients at high risk of admission. We assessed the costs of implementing PRISM and its impact on health services costs. At the same time as the study, but independent of it, an incentive payment (‘QOF’) was introduced to encourage primary care practitioners to identify high risk patients and manage their care.
METHODS:
We conducted a randomized stepped wedge trial in thirty-two practices, with cluster-defined control and intervention phases, and participant-level anonymized linked outcomes. We analysed routine linked data on patient outcomes for 18 months (February 2013 – September 2014). We assigned standard unit costs in pound sterling to the resources utilized by each patient. Cost differences between the two study phases were used in conjunction with differences in the primary outcome (emergency admissions) to undertake a cost-effectiveness analysis.
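In general form, and assuming the conventional incremental cost-effectiveness ratio is meant, pairing the cost difference with the difference in the primary outcome amounts to:

\[ \mathrm{ICER} = \frac{C_{\text{intervention}} - C_{\text{control}}}{E_{\text{intervention}} - E_{\text{control}}} \]

where C is the mean cost per patient and E the mean number of emergency admissions per patient in each study phase.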
RESULTS:
We included outcomes for 230,099 registered patients. We estimated a PRISM implementation cost of GBP0.12 per patient per year.
Costs of emergency department attendances, outpatient visits, emergency and elective admissions to hospital, and general practice activity were higher per patient per year in the intervention phase than control phase (adjusted δ = GBP76, 95 percent Confidence Interval, CI GBP46, GBP106), an effect that was consistent and generally increased with risk level.
CONCLUSIONS:
Despite low reported use of PRISM, it was associated with increased healthcare expenditure. This effect was unexpected and in the opposite direction to that intended. We cannot disentangle the effects of introducing the PRISM tool from those of imposing the QOF targets; however, since across the UK predictive risk stratification tools for emergency admissions have been introduced alongside incentives to focus on patients at risk, we believe that our findings are generalizable.
A predictive risk stratification tool (PRISM) to estimate a patient's risk of an emergency hospital admission in the following year was trialled in general practice in an area of the United Kingdom. PRISM's introduction coincided with a new incentive payment (‘QOF’) in the regional contract for family doctors to identify and manage the care of people at high risk of emergency hospital admission.
METHODS:
Alongside the trial, we carried out a complementary qualitative study of processes of change associated with PRISM's implementation. We aimed to describe how PRISM was understood, communicated, adopted, and used by practitioners, managers, local commissioners and policy makers. We gathered data through focus groups, interviews and questionnaires at three time points (baseline, mid-trial and end-trial). We analyzed data thematically, informed by Normalisation Process Theory (1).
RESULTS:
All groups showed high awareness of PRISM, but raised concerns about whether it could identify patients not yet known, and about whether there were sufficient community-based services to respond to care needs identified. All practices reported using PRISM to fulfil their QOF targets, but after the QOF reporting period ended, only two practices continued to use it. Family doctors said PRISM changed their awareness of patients and focused them on targeting the highest-risk patients, though they were uncertain about the potential for positive impact on this group.
CONCLUSIONS:
Though external factors supported its uptake in the short term, with a focus on the highest risk patients, PRISM did not become a sustained part of normal practice for primary care practitioners.
The Atypical Maternal Behavior Instrument for Assessment and Classification (AMBIANCE; Bronfman, Madigan, & Lyons-Ruth, 2009–2014; Bronfman, Parsons, & Lyons-Ruth, 1992–2004) is a widely used and well-validated measure for assessing disrupted forms of caregiver responsiveness within parent–child interactions. However, it requires evaluating approximately 150 behavioral items from videotape and extensive training to code, making its use impractical in most clinical contexts. Accordingly, the primary aim of the current study was to identify a reduced set of behavioral indicators most central to the AMBIANCE coding system using latent-trait item response theory (IRT) models. Observed mother–infant interaction data previously coded with the AMBIANCE were pooled from laboratories in both North America and Europe (N = 343). Using 2-parameter logistic IRT models, a reduced set of 45 AMBIANCE items was identified. Preliminary convergent and discriminant validity was evaluated in relation to classifications of maternal disrupted communication assigned using the full set of AMBIANCE indicators, to infant attachment disorganization, and to maternal sensitivity. The results supported the construct validity of the refined item set, opening the way for development of a brief screening measure for disrupted maternal communication. The use of IRT models in clinical scale refinement and their potential for bridging clinical and research objectives in developmental psychopathology are discussed.
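For context, the 2-parameter logistic (2PL) model referenced above expresses the probability that item i is endorsed (here, that a given disrupted-behavior indicator is observed) as a function of the latent trait θ, an item discrimination parameter a_i, and an item location parameter b_i:

\[ P(X_i = 1 \mid \theta) = \frac{1}{1 + \exp\left[-a_i(\theta - b_i)\right]} \]

Items with low discrimination (small a_i) or redundant locations contribute relatively little unique information about θ, which is the general rationale for retaining a reduced, maximally informative subset of indicators.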
Sea-ice thickness distributions from 12 submarine cruises under the North Pole are used to evaluate and enhance the results of sea-ice model simulations. The sea-ice models include versions with cavitating-fluid and elastic-viscous-plastic rheologies, and versions with a single thickness and with multiple (5–27) thicknesses in each gridcell. A greater portion of the interannual variance of observed mean thickness at the Pole is captured by the multiple-thickness models than by the single-thickness models, although even the highest correlations are only about 0.6. After the observed thickness distributions are used to “tune” the model to capture the primary mode of the distribution, the largest model–data discrepancies are in the thin-ice tail of the distribution. In a 41-year simulation ending in 1998, the model results show a pronounced decrease of mean ice thickness at the Pole around 1990; the minimum simulated thickness occurs in summer 1998. The decrease coincides with a shift of the Arctic Oscillation to its positive phase. The smallest submarine-derived mean thickness occurs in 1990, but no submarine data were available after 1992. The submarine-derived thicknesses for 1991 and 1992 are only slightly smaller than the 12-case mean.
Field studies were conducted at five locations in North Carolina and Virginia during 1996 and 1997 to evaluate weed control, peanut (Arachis hypogaea) response, and peanut yield following diclosulam applied preplant incorporated (PPI) and in systems with commercial herbicide standards. All plots received a PPI treatment of ethalfluralin at 840 g ai/ha. Ethalfluralin plus diclosulam controlled entireleaf morningglory (Ipomoea hederacea var. integriuscula), ivyleaf morningglory (I. hederacea), pitted morningglory (I. lacunosa), common lambsquarters (Chenopodium album), eclipta (Eclipta prostrata), and prickly sida (Sida spinosa) as well as, and frequently better than, ethalfluralin PPI followed by (fb) acifluorfen plus bentazon postemergence (POST), paraquat plus bentazon early postemergence (EPOST) fb imazapic POST, or imazapic POST. Systems with ethalfluralin plus diclosulam PPI at 26 g ai/ha fb acifluorfen plus bentazon POST controlled a broader spectrum of weeds and yielded more than systems of ethalfluralin PPI fb imazapic POST or ethalfluralin PPI fb acifluorfen plus bentazon POST. Peanut exhibited excellent tolerance to diclosulam PPI at 17, 26, or 35 g/ha.
Soybean response to simulated drift of the corn herbicides nicosulfuron and primisulfuron applied POST at 10 to 50% (3.5 to 17.4 and 4.0 to 20.2 g ai ha−1, respectively) of the total rates at the V3 and R1 growth stages was evaluated in field studies in 1991 and 1992. Primisulfuron reduced soybean height and increased leaf chlorosis, cupping, and necrosis more than nicosulfuron with both applications at all five rates. The symptoms of injury caused by both herbicides often increased linearly with increasing rate. At 50% of label rate, primisulfuron reduced height 75% and decreased yield 58%. Nicosulfuron reduced soybean height as much as 27%, but did not reduce seed yield either year. Height reduction, leaf chlorosis, cupping, and necrosis were correlated with yield loss caused by primisulfuron.
Field studies were conducted at five locations in North Carolina and Virginia in 1996 and 1997 to evaluate weed control and peanut (Arachis hypogaea) response to diclosulam applied preemergence (PRE) and in systems with commercial standards. All plots received a preplant incorporated (PPI) treatment of ethalfluralin at 840 g ai/ha. Diclosulam controlled common lambsquarters (Chenopodium album L.), eclipta (Eclipta prostrata L.), entireleaf morningglory (Ipomoea hederacea var. integriuscula Gray), ivyleaf morningglory [Ipomoea hederacea (L.) Jacq.], pitted morningglory (Ipomoea lacunosa L.), and prickly sida (Sida spinosa L.) as well as, and frequently better than, the commercial standards of acifluorfen plus bentazon applied postemergence (POST), paraquat plus bentazon early POST followed by (fb) imazapic POST, or imazapic POST. Systems with ethalfluralin PPI plus diclosulam PRE at 26 g ai/ha fb acifluorfen plus bentazon POST controlled a broader spectrum of weeds and yielded more than systems of ethalfluralin PPI fb imazapic POST or ethalfluralin PPI fb acifluorfen plus bentazon POST. Peanut exhibited excellent tolerance to diclosulam PRE at 17, 26, or 35 g/ha.
Field experiments were conducted in 1991 and 1992 to evaluate corn tolerance to nicosulfuron following in-furrow or surface-band applications of seven soil-applied insecticides. The organophosphate insecticides evaluated included terbufos (15G and 20CR formulations), phorate, chlorpyrifos, fonofos, and DPX-43898, as well as the synthetic pyrethroid tefluthrin. Potential for a nicosulfuron/insecticide interaction to injure corn was ranked as follows: terbufos 15G = phorate > terbufos 20CR > fonofos. Chlorpyrifos, DPX-43898, or tefluthrin followed by nicosulfuron caused no significant corn injury. Corn injury and yield reduction were greater in corn treated in-furrow than in corn treated with a surface band. Soil moisture at nicosulfuron application was more influential in causing corn injury and yield loss than growth stage of corn at nicosulfuron application.
Rural communities face barriers to disaster preparedness and considerable risk of disasters. Emergency preparedness among rural communities has improved with funding from federal programs and implementation of a National Incident Management System. The objective of this project was to design and implement disaster exercises to test decision making by rural response partners and to improve regional planning, collaboration, and readiness. Six functional exercises were developed and conducted among three rural Nebraska (USA) regions by the Center for Preparedness Education (CPE) at the University of Nebraska Medical Center (Omaha, Nebraska USA). A total of 83 command centers participated. The exercises were designed to test regional response and command-level decision making, and each 3-hour exercise was followed by a 3-hour regional after-action conference. Participant feedback, single-agency debriefing feedback, and regional After Action Reports were analyzed. Functional exercises were able to test command-level decision making and operations at multiple agencies simultaneously with limited funding. Observations included emergency management jurisdiction barriers to utilization of unified command and establishment of joint information centers, limited utilization of documentation necessary for reimbursement, and the need to develop coordinated public messaging. Functional exercises are a key tool for testing command-level decision making and response at a higher level than is typically achieved in tabletop or short full-scale exercises. They enable evaluation of command staff, identification of areas for improvement, and advancement of regional collaboration among diverse response partners.
Obaid JM, Bailey G, Wheeler H, Meyers L, Medcalf SJ, Hansen KF, Sanger KK, Lowe JJ. Utilization of Functional Exercises to Build Regional Emergency Preparedness among Rural Health Organizations in the US. Prehosp Disaster Med. 2017;32(2):224–230.
Field experiments were conducted in 1996 and 1997 to evaluate the tolerance of imidazolinone-resistant (IR) and non-IR corn cultivars to preemergence (PRE) and postemergence (POST) treatments of diclosulam. Crop injury was evaluated early- (5 to 6 wk after planting [WAP]), mid- (10 to 11 WAP), and late-season (13 to 15 WAP). Early-season injury of IR corn was no more than 12% in systems that included diclosulam PRE or POST at 18, 27, or 36 g ai/ha. Early-season injury of non-IR corn ranged from 85 to 89% in systems that included diclosulam PRE at any rate. At the mid-season evaluation, crop injury to IR corn was 1% or less. Non-IR corn was injured 73 to 94% in systems that included diclosulam PRE, while systems that included diclosulam POST caused 45 to 58% injury at mid-season. At the late-season evaluation, non-IR corn was injured 56, 88, and 96% with diclosulam PRE at 18, 27, and 36 g/ha, respectively, whereas systems that included diclosulam POST had 11 to 14% injury. Injury to IR corn from diclosulam PRE or POST was not apparent at the late-season evaluation. Weed-free yield of IR corn treated with diclosulam was 6,490 to 6,850 kg/ha and was equivalent to or better than yield from IR corn treated only with atrazine plus metolachlor PRE. Yield from non-IR corn treated with any diclosulam-containing system did not exceed 3,770 kg/ha.