Cophinforma spp. are gall-inducing fungi that can infect the highly invasive Brazilian peppertree (Schinus terebinthifolia Raddi) in its introduced range in southern Florida, USA. A classical biological control agent, the thrips Pseudophilothrips ichini, has been released to mitigate the invasive potential of S. terebinthifolia. We investigated whether thrips feeding damage and gall formation could act synergistically to benefit the S. terebinthifolia management program. A group of potted S. terebinthifolia saplings was inoculated with Cophinforma. Galled, symptomatic and ungalled, asymptomatic saplings were then paired in laboratory cages for a two-choice test with 40 P. ichini adults. Galled and ungalled plants were assessed for dead stem tips and necrotic stem tissue, with and without thrips present. Larval F1 thrips were also counted on each plant. Thrips feeding damage significantly increased the number of dead tips and the extent of stem necrosis. Regardless of thrips presence, stem tip mortality and the extent of necrosis did not differ significantly between galled and ungalled plant pairs. Additionally, the maximum number of F1 larvae counted on stems did not differ between galled and ungalled plants. Gall growth on heavily thrips-damaged plants nearly stopped, while galls continued to grow on plants with little thrips damage. While our results suggest that Cophinforma galls do not affect the damage potential of, or plant preference by, P. ichini, more work is needed to understand other factors that may contribute to at least additive impacts on S. terebinthifolia in the field, such as more advanced stages of the fungal infection on mature plants and prolonged thrips feeding damage.
Stem galls and witch’s broom–like growths are locally abundant on the highly invasive Brazilian peppertree (Schinus terebinthifolia) at field sites in southern Florida where a thrips biological control agent (Pseudophilothrips ichini) is being released to reduce the invasive potential of the plant. Galls have also been observed on potted plants in nursery stock grown to feed laboratory colonies of the agent. Herein, our objective was to isolate and identify the causal agent of the galls and assess its ability to induce galls in naive plants. We obtained stem galls from both field- and nursery-grown plants, aseptically isolated a fungus on acidic potato dextrose agar, and purified fungal colonies. Stems of potted naive saplings were wound-inoculated with hyphal fragments from these purified colonies, which readily induced galls like those observed in the field and nursery. Simultaneous molecular analysis of the fungal DNA obtained from the galls of field and nursery plants, experimentally induced galls, and fungal colony isolates identified this gall-inducing fungus as Cophinforma sp. We demonstrated that this Cophinforma sp. can infect S. terebinthifolia stems via mechanical wounds and induce visibly discernible stem galls in saplings within 3 mo. This system will serve as a model for producing galled plants to assess the impacts of the gall-inducing fungus on S. terebinthifolia, with potential for further study of interactions between the thrips and this naturalized fungus, which may synergistically and/or additively enhance S. terebinthifolia management efficacy.
Patients tested for Clostridioides difficile infection (CDI) using a 2-step algorithm with a nucleic acid amplification test (NAAT) followed by toxin assay are not reported to the National Healthcare Safety Network as a laboratory-identified CDI event if they are NAAT positive (+)/toxin negative (−). We compared NAAT+/toxin− and NAAT+/toxin+ patients and identified factors associated with CDI treatment among NAAT+/toxin− patients.
Design:
Retrospective observational study.
Setting:
The study was conducted across 36 laboratories at 5 Emerging Infections Program sites.
Patients:
We defined a CDI case as a positive test detected by this 2-step algorithm during 2018–2020 in a patient aged ≥1 year with no positive test in the previous 8 weeks.
Methods:
We used multivariable logistic regression to compare CDI-related complications and recurrence between NAAT+/toxin− and NAAT+/toxin+ cases. We used a mixed-effects logistic model to identify factors associated with treatment in NAAT+/toxin− cases.
Results:
Of 1,801 cases, 1,252 were NAAT+/toxin−, and 549 were NAAT+/toxin+. CDI treatment was given to 866 (71.5%) of 1,212 NAAT+/toxin− cases versus 510 (95.9%) of 532 NAAT+/toxin+ cases (P < .0001). NAAT+/toxin− status was protective for recurrence (adjusted odds ratio [aOR], 0.65; 95% CI, 0.55–0.77) but not CDI-related complications (aOR, 1.05; 95% CI, 0.87–1.28). Among NAAT+/toxin− cases, white blood cell count ≥15,000/µL (aOR, 1.87; 95% CI, 1.28–2.74), ≥3 unformed stools for ≥1 day (aOR, 1.90; 95% CI, 1.40–2.59), and diagnosis by a laboratory that provided no or neutral interpretive comments (aOR, 3.23; 95% CI, 2.23–4.68) were predictors of CDI treatment.
Conclusion:
Use of this 2-step algorithm likely results in underreporting of some NAAT+/toxin− cases with clinically relevant CDI. Disease severity and laboratory interpretive comments influence treatment decisions for NAAT+/toxin− cases.
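As a rough, non-authoritative illustration of the multivariable logistic regression described in the Methods above, the sketch below regresses recurrence on NAAT+/toxin− status with a few adjustment covariates and reports adjusted odds ratios. The file name, column names, and covariates are hypothetical placeholders rather than the study's actual variables, and the study's mixed-effects model for treatment predictors would additionally include a random effect for the testing laboratory or site, which is omitted here.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical analysis file: one row per incident CDI case (placeholder name).
df = pd.read_csv("cdi_cases.csv")

# Multivariable logistic regression: recurrence as the outcome,
# NAAT+/toxin- status as the exposure, adjusted for illustrative covariates.
fit = smf.logit(
    "recurrence ~ naat_pos_toxin_neg + age + sex + hospital_onset",
    data=df,
).fit()

# Adjusted odds ratios with 95% confidence intervals.
aor = np.exp(fit.params).rename("aOR")
ci = np.exp(fit.conf_int()).rename(columns={0: "2.5%", 1: "97.5%"})
print(pd.concat([aor, ci], axis=1))
```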
OBJECTIVES/GOALS: The goal of this study was to develop a clinically applicable technique to increase the precision of in vivo dose monitoring during radiation therapy by mapping the dose deposition and resolving the temporal dose accumulation while the treatment is being delivered in real time. METHODS/STUDY POPULATION: Ionizing radiation acoustic imaging (iRAI) is a novel imaging concept with the potential to map the delivered radiation dose on anatomic structures in real time during external beam radiation therapy without interrupting the clinical workflow. The iRAI system consisted of a custom-designed two-dimensional (2D) matrix transducer array with an integrated preamplifier array, driven by a clinic-ready ultrasound imaging platform. The feasibility of iRAI volumetric imaging in mapping dose delivery and real-time monitoring of temporal dose accumulation in a clinical treatment plan was investigated with a phantom, a rabbit model, and a cancer patient. RESULTS/ANTICIPATED RESULTS: The total dose deposition and temporal dose accumulation in 3D space of a clinical C-shape treatment plan in a targeted region were first imaged and optimized in a phantom. Then, semi-quantitative iRAI measurements were achieved in an in vivo rabbit model. Finally, for the first time, real-time visualization of radiation dose delivered deep in a patient with liver metastases was performed with a clinical linear accelerator. These studies demonstrate the potential of iRAI to monitor and quantify the radiation dose deposition during treatment. DISCUSSION/SIGNIFICANCE: Described here is the pioneering role of an iRAI system in mapping the 3D radiation dose deposition of a complex clinical radiotherapy treatment plan. iRAI offers a cost-effective and practical solution for real-time visualization of 3D radiation dose delivery, potentially leading to personalized radiotherapy with optimal efficacy and safety.
Coordinated specialty care (CSC) is widely accepted as an evidence-based treatment for first episode psychosis (FEP). The NAVIGATE intervention from the Recovery After an Initial Schizophrenia Episode Early Treatment Program (RAISE-ETP) study is a CSC intervention that offers a suite of evidence-based treatments shown to improve engagement and clinical outcomes, especially in those with a shorter duration of untreated psychosis (DUP). Coincident with the publication of this study, legislation was passed by the United States Congress in 2014–15 to fund CSC for FEP via a Substance Abuse and Mental Health Services Administration (SAMHSA) block grant set-aside for each state. In Michigan (MI), the management of this grant was delegated to Network180, the community mental health authority in Kent County, with the goal of making CSC more widely available to the 10 million people in MI. Limited research describes the outcomes of implementing CSC in community practices, and no published accounts evaluate the use of the NAVIGATE intervention in a naturalistic setting. We describe the outcomes of NAVIGATE implementation in the state of MI.
Methods
In 2014, 3 centers in MI were selected and trained to provide NAVIGATE CSC for FEP. In 2016, a 4th center was added, and 2 existing centers were expanded to provide additional access to NAVIGATE. Inclusion criteria: age 18–31, served in 1 of the 4 FEP centers in MI. Data collection began in 2015 for basic demographics, global illness (CGI q3 mo), hospital/ED use, and work/school status (SURF q3 mo); was expanded in 2016 to include further demographics, diagnosis, DUP, and vital signs; and was expanded again in 2018 to capture clinical symptoms with the modified Colorado Symptom Inventory (mCSI q6 mo), reported via an online portal. This analysis used data through 12/31/19. Mixed effects models adjusted for age, sex, and race were used to account for correlated data within patients.
Results
N=283 patients had usable demographic information and were included in the analysis. Age at enrollment was 21.6 ± 3.0 yrs; 74.2% were male; 53.4% Caucasian, 34.6% African American; 12.9 ± 1.7 yrs of education (N=195). Retention at 18 mo was 67%, with no difference by sex or race. CGI scores decreased 20% from baseline (BL) to 18 mo (BL=3.5, N=134; 15–18 mo=2.8, N=60). Service utilization via the SURF was measured at BL (N=172) and 18 mo (N=72): psychiatric hospitalizations occurred in 37% at BL and 6% at 18 mo (p<0.01); ER visits occurred in 40% at BL and 13% at 18 mo (p<0.01). 44% were working or in school at BL and 68% at 18 mo (p<0.01). 21% were on antipsychotics (AP) at BL (N=178) and 85% at 18 mo (N=13), with 8% and 54% on long-acting injectable AP at BL and 18 mo, respectively. Limitations include missing data and lack of a control group.
Conclusion
The implementation of the NAVIGATE CSC program for FEP in MI resulted in meaningful clinical improvement for enrollees. Further support could make this evidence-based intervention available to more people with FEP.
Funding
Supported by funds from the SAMHSA Medicaid State Block Grant set-aside awarded to Network180 (Achtyes, Kempema). The funders had no role in the design of the study, the analysis or the decision to publish the results.
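The Methods above mention mixed effects models adjusted for age, sex, and race to account for correlated data within patients. Below is a minimal sketch, assuming a hypothetical long-format dataset with one row per patient per quarterly visit, of how such a model with a per-patient random intercept might be fit; the file and column names are illustrative only and not the program's actual data structure.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical long-format data: one row per patient per quarterly visit.
# Columns (illustrative): patient_id, months_in_program, age, sex, race, cgi.
df = pd.read_csv("navigate_visits.csv")

# Linear mixed model for CGI over time, adjusted for age, sex, and race,
# with a random intercept per patient to handle repeated measures.
model = smf.mixedlm(
    "cgi ~ months_in_program + age + sex + race",
    data=df,
    groups=df["patient_id"],
)
result = model.fit()
print(result.summary())
```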
Stigma against mental illness and the mentally ill is well known. However, stigma against psychiatrists and mental health professionals, although recognized, is not widely discussed. Public attitudes, as well as those of other professionals, affect recruitment into psychiatry and mental health services. The reasons for this discriminatory attitude are many and often not dissimilar to those held against mentally ill individuals. In this Guidance paper we present some of the factors affecting the image of psychiatry and psychiatrists as perceived by the public at large. We look at the portrayal of psychiatry and psychiatrists in the media and literature, which may affect attitudes. We also explore potential causes and explanations and propose some strategies for dealing with negative attitudes. Reducing negative attitudes will improve recruitment and retention in psychiatry. We recommend that national psychiatric societies and other stakeholders, including patients, their families and carers, take a major role in dealing with stigma, discrimination and prejudice against psychiatry and psychiatrists.
Psychiatry is the branch of the medical profession that deals with the origin, diagnosis, prevention, and management of mental disorders and mental illness, and of emotional and behavioural disturbances. Thus, a psychiatrist is a trained doctor who has received further training in diagnosing and managing mental illnesses, mental disorders, and emotional and behavioural disturbances. This EPA Guidance document was developed following consultation and searches of the published and grey literature, and was approved by the EPA Guidance Committee. The role and responsibilities of the psychiatrist include planning and delivering high-quality services within the resources available and advocating for patients and services. The European Psychiatric Association seeks to rise to the challenge of articulating these roles and responsibilities. This EPA Guidance is directed towards psychiatrists and the medical profession as a whole, towards other members of multidisciplinary teams, and towards employers and other stakeholders such as policy makers, patients, and their families.
A large proportion of older adults are affected by impaired glucose metabolism. Previous studies with fish protein have reported improved glucose regulation in healthy adults, but the evidence in older adults is limited. Therefore, we aimed to assess the effect of increasing doses of a cod protein hydrolysate (CPH) on postprandial glucose metabolism in older adults. The study was a double-blind cross-over trial. Participants received four different doses (10, 20, 30 or 40 mg/kg body weight (BW)) of CPH daily for 1 week, with 1-week washout periods in between. The primary outcome was the postprandial response in glucose metabolism, measured in serum glucose and insulin samples taken at 20-min intervals for 120 min. The secondary outcome was the postprandial response in plasma glucagon-like peptide 1 (GLP-1). Thirty-one subjects aged 60–78 years were included in the study. In a mixed-model statistical analysis, no differences in the estimated maximum value of glucose, insulin or GLP-1 were observed when comparing the lowest dose of CPH (10 mg/kg BW) with the higher doses (20, 30 or 40 mg/kg BW). The estimated maximum value of glucose was on average 0·28 mmol/l lower when the participants were given 40 mg/kg BW CPH compared with 10 mg/kg BW (P = 0·13). The estimated maximum value of insulin was on average 5·14 mIU/l lower with 40 mg/kg BW of CPH compared with 10 mg/kg BW (P = 0·20). Our findings suggest that serum glucose and insulin levels tend to decrease with increasing doses of CPH; these preliminary findings require further investigation.
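As a small worked illustration of summarizing the postprandial curves described above (samples taken every 20 min for 120 min), the snippet below computes a peak value and an incremental area under the curve for one hypothetical participant. The numbers are invented, and the study's primary outcome was an estimated maximum value from a mixed-model analysis rather than these raw per-participant summaries.

```python
import numpy as np

# Hypothetical serum glucose curve (mmol/l) for one participant,
# sampled every 20 min for 120 min after the test meal.
time_min = np.array([0, 20, 40, 60, 80, 100, 120])
glucose = np.array([5.4, 6.8, 7.9, 7.3, 6.6, 6.0, 5.7])

peak = glucose.max()                              # maximum postprandial value
iauc = np.trapz(glucose - glucose[0], time_min)   # incremental AUC above baseline

print(f"Peak = {peak:.2f} mmol/l, iAUC = {iauc:.1f} mmol/l x min")
```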
Introduction: In-hospital cardiac arrest (IHCA) most commonly occurs in non-monitored areas, where we observed a 10-min delay before defibrillation (Phase I). Nurses (RNs) and respiratory therapists (RTs) cannot legally use automated external defibrillators (AEDs) during IHCA without a medical directive. We sought to evaluate IHCA outcomes following usual implementation (Phase II) vs. a theory-based educational program (Phase III) allowing RNs and RTs to use AEDs during IHCA. Methods: We completed a pragmatic before-after study of consecutive IHCA. We used ICD-10 codes to identify potentially eligible cases and included IHCA cases for which resuscitation was attempted. We obtained consensus on all data definitions before initiation of standardized, piloted data extraction by trained investigators. Phase I (Jan. 2012–Aug. 2013) consisted of baseline data. We implemented the AED medical directive in Phase II (Sept. 2013–Aug. 2016) using usual implementation strategies. In Phase III (Sept. 2016–Dec. 2017) we added an educational video informed by key constructs from a Theory of Planned Behavior survey. We report univariate comparisons of Utstein IHCA outcomes using 95% confidence intervals (CI). Results: There were 753 IHCA for which resuscitation was attempted, with the following similar characteristics (Phase I n = 195; II n = 372; III n = 186): median age 68, 60.0% male, 79.3% witnessed, 29.7% non-monitored medical ward, 23.9% cardiac cause, 47.9% initial rhythm of pulseless electrical activity and 27.2% ventricular fibrillation/tachycardia (VF/VT). Comparing Phases I, II and III: an AED was used 0 times (0.0%), 21 times (5.6%), and 15 times (8.1%); time to 1st rhythm analysis was 6 min, 3 min, and 1 min; and time to 1st shock was 10 min, 10 min, and 7 min. Comparing Phases I and III: time to 1st shock decreased by 3 min (95% CI −7 to 1), sustained ROSC increased from 29.7% to 33.3% (AD 3.6%; 95% CI −10.8 to 17.8), and survival to discharge increased from 24.6% to 25.8% (AD 1.2%; 95% CI −7.5 to 9.9). In the VF/VT subgroup, time to first shock decreased from 9 to 3 min (AD −6 min; 95% CI −12 to 0) and survival increased from 23.1% to 38.7% (AD 15.6%; 95% CI −4.3 to 35.4). Conclusion: The implementation of a medical directive allowing AED use by RNs and RTs successfully improved key outcomes for IHCA victims, particularly following the theory-based educational video. The expansion of this project to other hospitals and health care professionals could significantly impact survival for VF/VT patients.
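The between-phase comparisons above report absolute differences (AD) with 95% confidence intervals. A minimal sketch of how such a Wald interval for a difference in two proportions can be computed is shown below, using the survival-to-discharge percentages quoted in the abstract; the event counts are reconstructed approximately from those percentages and are illustrative only.

```python
import numpy as np
from scipy.stats import norm

def risk_difference_ci(x1, n1, x2, n2, alpha=0.05):
    """Wald confidence interval for the difference in two proportions (p2 - p1)."""
    p1, p2 = x1 / n1, x2 / n2
    diff = p2 - p1
    se = np.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    z = norm.ppf(1 - alpha / 2)
    return diff, diff - z * se, diff + z * se

# Survival to discharge, Phase I (24.6% of 195) vs Phase III (25.8% of 186),
# with counts reconstructed approximately from the reported percentages.
diff, lo, hi = risk_difference_ci(round(0.246 * 195), 195, round(0.258 * 186), 186)
print(f"AD = {diff:+.1%} (95% CI {lo:+.1%} to {hi:+.1%})")
```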
Common practices for invasive species control and management include physical, chemical, and biological approaches. The first two approaches have clear limitations and may lead to unintended (negative) consequences, unless carefully planned and implemented. For example, physical removal rarely completely eradicates the targeted invasive species and can cause disturbances that facilitate new invasions by nonnative species from nearby habitats. Chemical treatments can harm native, and especially rare, species through unanticipated side effects. Biological methods may be classified as biocontrol and the ecological approach. Similar to physical and chemical methods, biocontrol also has limitations and sometimes leads to unintended consequences. Therefore, a relatively safer and more practical choice may be the ecological approach, which has two major components: (1) restoration of native species and (2) biomass manipulation of the restored community, such as selective grazing or prescribed burning (to achieve and maintain viable population sizes). Restoration requires well-planned and implemented planting designs that consider alpha-, beta-, and gamma-diversity and the abundance of native and invasive component species at local, landscape, and regional levels. Given the extensive destruction or degradation of natural habitats around the world, restoration could be most effective for enhancing ecosystem resilience and resistance to biotic invasions. At the same time, ecosystems in human-dominated landscapes, especially those newly restored, require close monitoring and careful intervention (e.g., through biomass manipulation), especially when successional trajectories are not moving as intended. Biomass management frequently uses prescribed burning, grazing, harvesting, and thinning to maintain overall ecosystem health and sustainability. Thus, the resulting optimal, balanced, and relatively stable ecological conditions could more effectively limit the spread and establishment of invasive species. Here we review the literature (especially within the last decade) on ecological approaches that involve biodiversity, biomass, and productivity, three key community/ecosystem variables that reciprocally influence one another. We focus on the common and most feasible ecological practices that can aid in resisting new invasions and/or suppressing the dominance of existing invasive species. We contend that, because of the strong influences from neighboring areas (i.e., as exotic species pools), local restoration and management efforts in the future need to consider the regional context and projected climate changes.
Significant reductions recently seen in the size of wide-bandgap power electronics have not been accompanied by a corresponding decrease in the size of the magnetic components. To achieve this, a new generation of materials with high magnetic saturation and permeability is needed. Here, we develop gram-scale syntheses of superparamagnetic Fe/FexOy core–shell nanoparticles and incorporate them as the magnetic component in a strongly magnetic nanocomposite. Nanocomposites are typically formed by organizing nanoparticles within a polymeric matrix. However, this approach can lead to high organic fractions and phase separation, reducing the performance of the resulting material. Here, we form aminated nanoparticles that are then cross-linked using epoxy chemistry. The result is a magnetic nanoparticle component that is covalently linked and well separated. By using this ‘matrix-free’ approach, we can substantially increase the magnetic nanoparticle fraction while still maintaining good separation, leading to a superparamagnetic nanocomposite with strong magnetic properties.
Water from several bodies of surface water and sections from several plants were collected from eastern North Carolina and bioassayed for stimulation of witchweed (Striga lutea Lour.) seed germination. Stimulatory activity was detected in water from 22 of 29 different ponds, streams, and lakes and in sections from 118 (57 families) of 163 plant species. Witchweed germination stimulants evidently occur widely in nature.
Chlorimuron applied postemergence at 2.2, 4.4, 8.8, 18, and 35 g ai/ha to cotton at either the 4-leaf, pinhead-square, first-bloom, or full-bloom growth stage was evaluated for potential as a plant growth regulator. Chlorimuron did not reduce bolls per plant at any rate or time of application, but the proportion of open to closed bolls decreased as rate increased. Seed cotton yields decreased with increasing chlorimuron rate and cotton age. The use of chlorimuron as a plant growth regulator for cotton appears limited.
Field experiments were conducted in Alabama from 1992 through 1994 to evaluate the potential of the methyl ester of bensulfuron applied at sublethal rates as a plant growth regulator for reducing plant height and boll rot in cotton. Bensulfuron at 0.017 and 0.034 g ai/ha or mepiquat chloride at 10 g ai/ha was applied POST alone at the pinhead square or early-bloom stage of cotton growth or sequentially at 0.017 followed by (fb) 0.017 g/ha, 0.034 fb 0.034 g/ha of bensulfuron and 5 fb 5, 10 fb 10, 10 fb 20, or 20 fb 20 g/ha of mepiquat chloride. Mepiquat chloride had no effect on yield in 1992 and 1994 but decreased yield when applied sequentially in 1993. Bensulfuron was generally detrimental to first position fruit retention, and it delayed maturity. Treatments that reduced plant height did not reduce boll rot. Bensulfuron treatments that reduced plant height also reduced yield; therefore, the potential for its use as a growth regulator in cotton appears limited.
Several application modes and methods (schemes) of using herbicides are available to control undesirable vegetation on electric transmission line rights-of-way (ROW). Preferential use of a management scheme can be based on its cost effectiveness, i.e., the degree of vegetation control and the treatment cost. A treatment that increases/maintains desirable plants, decreases/maintains undesirable plants, and has relatively low cost can be considered cost effective. Three common herbicides, 2,4-D, picloram and triclopyr, were applied in the field to test the effects of treatment mode (selective and nonselective) and method (cut stump, basal, and stem-foliar) on cost effectiveness during initial clearing and first and second conversion cycles on one electric transmission line ROW in Upstate New York. Clear or selective cutting with no herbicide was most cost effective during initial clearing. Nonselective and selective stem-foliar schemes were most cost effective during the first and second conversion cycles, respectively.
Field trials were conducted from 1991 to 1993 in the northern cotton-producing area of Alabama to evaluate the interaction of various production inputs for pest management and cotton development, maturity, and yield. Two levels of tillage (conventional and no-till), herbicide (1.12 and 2.24 kg ai ha⁻¹ fluometuron, preemergence, with post-directed herbicides), insecticide (0.5 and 0.85 kg ai ha⁻¹ aldicarb, in-furrow), and fungicide (0.9 kg ai ha⁻¹ quintozene plus ethridazole, in-furrow or 0.14 kg ai ha⁻¹ metalaxyl, hopper-box) programs were evaluated. The inputs investigated did not interact significantly to change the overall production strategy. Prickly sida required a higher level of herbicide input than did the entireleaf/ivyleaf morningglory complex. Both species were sufficiently controlled using reduced levels of fluometuron without sacrificing yield; however, cultivation was necessary in conventional tillage treatments to maintain control. A postemergence-directed herbicide treatment was also necessary for weed control, regardless of tillage. Decreasing the levels of each input simultaneously did not interact to affect cotton stand, height, early-season thrips counts, cotton maturity, or yield. Cotton was shorter in no-till than conventional plots.
Field experiments were conducted at one location in Georgia (1994) and at two locations in Alabama (1994 and 1995) to evaluate the effects of MSMA or DSMA plus pyrithiobac applied postemergence (POST) in cotton. Pyrithiobac at 0.07 kg ai/ha was applied POST alone or in combination with MSMA at 1.1 kg ai/ha or DSMA at 1.7 kg ai/ha at the pinhead square stage of cotton growth. Cotton was tolerant to the POST applications of pyrithiobac. Adding MSMA or DSMA to pyrithiobac injured cotton similar to MSMA or DSMA applied alone. Plant mapping data indicated that all treatments had no effect on height : node ratio, reproductive or vegetative node production, or square retention at the first or second fruiting position. Cotton maturity response to MSMA and DSMA ranged from no effect to delayed maturity. Adding DSMA to pyrithiobac increased Florida beggarweed and common cocklebur control over pyrithiobac applied alone in 1995 but did not increase control in 1994. Adding DSMA to pyrithiobac increased sicklepod control over pyrithiobac applied alone in the three site years it was rated in the Alabama tests. Where sicklepod is present, the addition of an arsenical herbicide to pyrithiobac will generally increase control but has the potential to delay maturity and decrease cotton yield equal to the arsenical herbicide applied alone.
In laboratory studies, ethylene and an ethylene-releasing agent, 2-chloroethylphosphonic acid (hereinafter referred to as CEPA), stimulated germination of aged, pretreated but still dormant witchweed (Striga lutea Lour.) seed. Ethylene gas at 10⁻¹ μl/L produced maximal (89 to 98%) seed germination. Witchweed seed also germinated when incubated directly in Eustis loamy sand treated with CEPA. A half-maximal response was obtained with 10 mg of CEPA per kg of soil. Vapors produced by apple (Pyrus malus L. Mill) slices, by an alkaline solution of CEPA, and by soil treated with CEPA also stimulated germination. Vapors from soil contained an inhibitor of the ethylene-induced seed germination. Exposure of the soil vapors to 20% potassium hydroxide removed the inhibitor. Germination studies with 10% carbon dioxide, carbon dioxide-free air, and ethylene indicated that carbon dioxide inhibited the ethylene-induced germination of witchweed seed.