There is evidence of increasing rates of hospital presentations for suicidal crisis, and emergency departments (EDs) are described as an intervention point for suicide prevention. Males account for three in every four suicides in Ireland and are up to twice as likely as females to eventually die by suicide following a hospital presentation for suicidal crisis. This study therefore aimed to profile the characteristics of ED presentations for suicidal ideation and self-harm acts among males in Ireland, using clinical data collected by self-harm nurses within a dedicated national service for crisis presentations to EDs.
Methods:
Using ED data from 2018–2021, variability in the sociodemographic characteristics of male presentations was examined, followed by age-based diversity in the characteristics of presentations and interventions delivered. Finally, likelihood of onward referral to subsequent care was examined according to presentation characteristics.
Results:
Across 45,729 presentations, males more commonly presented with suicidal ideation than females (56% v. 44%) and less often with self-harm (42% v. 58%). Drug- and alcohol-related overdose was the most common method of self-harm observed. A majority of males presenting to ED reported no existing linkage with mental health services.
Conclusions:
Emergency clinicians have an opportunity to ensure subsequent linkage to mental health services for males post-crisis, with the aim of prevention of suicides.
Clinical outcomes of repetitive transcranial magnetic stimulation (rTMS) for treatment-resistant depression (TRD) vary widely, and there is no mood rating scale that is standard for assessing rTMS outcome. It remains unclear whether rTMS is as efficacious in older adults with late-life depression (LLD) as in younger adults with major depressive disorder (MDD). This study examined the effect of age on outcomes of rTMS treatment of adults with TRD. Self-report and observer mood ratings were measured weekly in 687 subjects aged 16–100 years undergoing rTMS treatment, using the Inventory of Depressive Symptomatology 30-item Self-Report (IDS-SR), Patient Health Questionnaire 9-item (PHQ), Profile of Mood States 30-item, and Hamilton Depression Rating Scale 17-item (HDRS). All rating scales detected significant improvement with treatment; response and remission rates varied by scale but not by age (response/remission ≥ 60: 38%–57%/25%–33%; <60: 32%–49%/18%–25%). Proportional hazards models showed early improvement predicted later improvement across ages, though early improvements in PHQ and HDRS were more predictive of remission in those < 60 years (relative to those ≥ 60) and greater baseline IDS burden was more predictive of non-remission in those ≥ 60 years (relative to those < 60). These results indicate there is no significant effect of age on treatment outcomes in rTMS for TRD, though rating instruments may differ in assessment of symptom burden between younger and older adults during treatment.
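The proportional hazards analysis described above can be illustrated with a brief sketch; this is an illustrative simulation assuming the Python lifelines package, with invented column names, effect sizes, and data, not the study's code or results.

import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

# Simulate weekly rTMS courses: remission arrives sooner with larger early improvement.
rng = np.random.default_rng(0)
n = 200
early = rng.uniform(0, 0.6, n)                   # assumed fractional symptom drop by week 2
age_ge_60 = rng.integers(0, 2, n)                # 1 = aged 60 or older
time_to_remission = rng.exponential(scale=10 / (1 + 3 * early))
df = pd.DataFrame({
    "weeks": np.minimum(time_to_remission, 12),  # administrative censoring at 12 weeks
    "remitted": (time_to_remission < 12).astype(int),
    "early_improvement": early,
    "age_ge_60": age_ge_60,
})
df["early_x_age"] = df["early_improvement"] * df["age_ge_60"]

cph = CoxPHFitter()
cph.fit(df, duration_col="weeks", event_col="remitted")
cph.print_summary()  # the interaction term asks whether early gains predict remission
                     # differently in those >= 60 vs. < 60, as reported in the abstract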
Experimental data on structure formation in highly concentrated aqueous dispersions of kaolinite were analyzed using rheological models. The physicochemical properties of the clay mineral surface were studied during heating over a range of temperatures, and a correlation between acid-base properties and the physicomechanical characteristics of the spatial structures formed during heating was established. It was shown that interparticle interactions and plastic yield mechanisms under load are dependent upon interfacial phenomena. A method for estimating optimal structural parameters was developed for semidry dispersions, enabling regulation of the physicochemical and mechanical properties of ceramic mixtures during processing.
In 2016, the National Center for Advancing Translational Science launched the Trial Innovation Network (TIN) to address barriers to efficient and informative multicenter trials. The TIN provides a national platform, working in partnership with 60+ Clinical and Translational Science Award (CTSA) hubs across the country to support the design and conduct of successful multicenter trials. A dedicated Hub Liaison Team (HLT) was established within each CTSA to facilitate connection between the hubs and the newly launched Trial and Recruitment Innovation Centers. Each HLT serves as an expert intermediary, connecting CTSA Hub investigators with TIN support, and connecting TIN research teams with potential multicenter trial site investigators. The cross-consortium Liaison Team network was developed during the first TIN funding cycle, and it is now a mature national network at the cutting edge of team science in clinical and translational research. The CTSA-based HLT structures and the external network structure have been developed in collaborative and iterative ways, with methods for shared learning and continuous process improvement. In this paper, we review the structure, function, and development of the Liaison Team network, discuss lessons learned during the first TIN funding cycle, and outline a path toward further network maturity.
To describe national trends in testing and detection of carbapenemases produced by carbapenem-resistant Enterobacterales (CRE) and associate testing with culture and facility characteristics.
Design:
Retrospective cohort study.
Setting:
Department of Veterans Affairs medical centers (VAMCs).
Participants:
Patients seen at VAMCs between 2013 and 2018 with cultures positive for CRE, defined by national VA guidelines.
Interventions:
Microbiology and clinical data were extracted from national VA data sets. Carbapenemase testing was summarized using descriptive statistics. Characteristics associated with carbapenemase testing were assessed with bivariate analyses.
Results:
Of 5,778 standard cultures that grew CRE, 1,905 (33.0%) had evidence of molecular or phenotypic carbapenemase testing and 1,603 (84.1%) of these had carbapenemases detected. Among these cultures confirmed as carbapenemase-producing CRE, 1,053 (65.7%) had molecular testing for ≥1 gene. Almost all testing included KPC (n = 1,047, 99.4%), with KPC detected in 914 of 1,047 (87.3%) cultures. Testing and detection of other enzymes were less frequent. Carbapenemase testing increased over the study period from 23.5% of CRE cultures in 2013 to 58.9% in 2018. The South US Census region (38.6%) and the Northeast region (37.2%) had the highest proportions of CRE cultures with carbapenemase testing. High-complexity (vs low) and urban (vs rural) facilities were significantly associated with carbapenemase testing (P < .0001).
Conclusions:
Between 2013 and 2018, carbapenemase testing and detection increased in the VA, largely reflecting increased testing and detection of KPC. Surveillance of other carbapenemases is important due to global spread and increasing antibiotic resistance. Efforts supporting the expansion of carbapenemase testing to low-complexity, rural healthcare facilities and standardization of reporting of carbapenemase testing are needed.
The Hierarchical Taxonomy of Psychopathology (HiTOP) has emerged out of the quantitative approach to psychiatric nosology. This approach identifies psychopathology constructs based on patterns of co-variation among signs and symptoms. The initial HiTOP model, which was published in 2017, is based on a large literature that spans decades of research. HiTOP is a living model that undergoes revision as new data become available. Here we discuss advantages and practical considerations of using this system in psychiatric practice and research. We especially highlight limitations of HiTOP and ongoing efforts to address them. We describe differences and similarities between HiTOP and existing diagnostic systems. Next, we review the types of evidence that informed development of HiTOP, including populations in which it has been studied and data on its validity. The paper also describes how HiTOP can facilitate research on genetic and environmental causes of psychopathology as well as the search for neurobiologic mechanisms and novel treatments. Furthermore, we consider implications for public health programs and prevention of mental disorders. We also review data on clinical utility and illustrate clinical application of HiTOP. Importantly, the model is based on measures and practices that are already used widely in clinical settings. HiTOP offers a way to organize and formalize these techniques. This model already can contribute to progress in psychiatry and complement traditional nosologies. Moreover, HiTOP seeks to facilitate research on linkages between phenotypes and biological processes, which may enable construction of a system that encompasses both biomarkers and precise clinical description.
Early in the COVID-19 pandemic, the World Health Organization stressed the importance of daily clinical assessments of infected patients, yet current approaches frequently consider cross-sectional timepoints, cumulative summary measures, or time-to-event analyses. Statistical methods are available that make use of the rich information content of longitudinal assessments. We demonstrate the use of a multistate transition model to assess the dynamic nature of COVID-19-associated critical illness using daily evaluations of COVID-19 patients from 9 academic hospitals. We describe the accessibility and utility of methods that consider the clinical trajectory of critically ill COVID-19 patients.
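A minimal sketch of the kind of multistate transition model described above, assuming illustrative clinical states and toy daily trajectories rather than the authors' actual states or data:

from collections import Counter

STATES = ["ward", "icu", "discharged", "dead"]   # assumed daily clinical states

def transition_matrix(trajectories):
    """Estimate P(tomorrow's state | today's state) from daily state sequences."""
    counts = Counter()
    for seq in trajectories:
        for today, tomorrow in zip(seq, seq[1:]):
            counts[(today, tomorrow)] += 1
    matrix = {}
    for s in STATES:
        row_total = sum(counts[(s, t)] for t in STATES)
        matrix[s] = {t: (counts[(s, t)] / row_total if row_total else 0.0) for t in STATES}
    return matrix

# Toy trajectories: one list per patient, one entry per hospital day.
patients = [
    ["ward", "ward", "icu", "icu", "ward", "discharged"],
    ["icu", "icu", "icu", "dead"],
    ["ward", "ward", "ward", "discharged"],
]
for state, row in transition_matrix(patients).items():
    print(state, {t: round(p, 2) for t, p in row.items()})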
To examine the association between adherence to plant-based diets and mortality.
Design:
Prospective study. We calculated a plant-based diet index (PDI) by assigning positive scores to plant foods and reverse scores to animal foods. We also created a healthful PDI (hPDI) and an unhealthful PDI (uPDI) by further separating the healthy plant foods from less-healthy plant foods; a scoring sketch follows this abstract.
Setting:
The VA Million Veteran Program.
Participants:
315 919 men and women aged 19–104 years who completed an FFQ at baseline.
Results:
We documented 31 136 deaths during the follow-up. A higher PDI was significantly associated with lower total mortality (hazard ratio (HR) comparing extreme deciles = 0·75, 95 % CI: 0·71, 0·79, Ptrend < 0·001). We observed an inverse association between hPDI and total mortality (HR comparing extreme deciles = 0·64, 95 % CI: 0·61, 0·68, Ptrend < 0·001), whereas uPDI was positively associated with total mortality (HR comparing extreme deciles = 1·41, 95 % CI: 1·33, 1·49, Ptrend < 0·001). Similar significant associations of PDI, hPDI and uPDI were also observed for CVD and cancer mortality. The associations between the PDI and total mortality were consistent among African American and European American participants, among participants free from CVD and cancer, and among those who were diagnosed with major chronic disease at baseline.
Conclusions:
A greater adherence to a plant-based diet was associated with substantially lower total mortality in this large population of veterans. These findings support recommending plant-rich dietary patterns for the prevention of major chronic diseases.
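The index construction outlined under "Design" can be sketched as below; the food groups, quintile scoring, and column names are assumptions for illustration, not the study's actual FFQ coding.

import pandas as pd

HEALTHY_PLANT = ["whole_grains", "fruits", "vegetables", "nuts", "legumes"]
LESS_HEALTHY_PLANT = ["refined_grains", "potatoes", "sweets", "fruit_juice"]
ANIMAL = ["meat", "dairy", "fish", "eggs"]

def quintile_scores(series, reverse=False):
    """Score intake quintiles 1-5; reverse=True gives 5 to the lowest intake."""
    q = pd.qcut(series.rank(method="first"), 5, labels=[1, 2, 3, 4, 5]).astype(int)
    return 6 - q if reverse else q

def plant_based_indices(ffq: pd.DataFrame) -> pd.DataFrame:
    out = pd.DataFrame(index=ffq.index)
    # PDI: all plant groups scored positively, animal groups reverse-scored.
    out["PDI"] = (sum(quintile_scores(ffq[g]) for g in HEALTHY_PLANT + LESS_HEALTHY_PLANT)
                  + sum(quintile_scores(ffq[g], reverse=True) for g in ANIMAL))
    # hPDI: healthy plant foods positive; less-healthy plant and animal foods reversed.
    out["hPDI"] = (sum(quintile_scores(ffq[g]) for g in HEALTHY_PLANT)
                   + sum(quintile_scores(ffq[g], reverse=True) for g in LESS_HEALTHY_PLANT + ANIMAL))
    # uPDI: less-healthy plant foods positive; healthy plant and animal foods reversed.
    out["uPDI"] = (sum(quintile_scores(ffq[g]) for g in LESS_HEALTHY_PLANT)
                   + sum(quintile_scores(ffq[g], reverse=True) for g in HEALTHY_PLANT + ANIMAL))
    return out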
Two introduced carnivores, the European red fox Vulpes vulpes and domestic cat Felis catus, have had extensive impacts on Australian biodiversity. In this study, we collate information on consumption of Australian birds by the fox, paralleling a recent study reporting on birds consumed by cats. We found records of consumption by foxes of 128 native bird species (18% of the non-vagrant bird fauna and 25% of those species within the fox’s range), a smaller tally than for cats (343 species, including 297 within the fox’s Australian range, which is a subset of the cat’s range). Most (81%) bird species eaten by foxes are also eaten by cats, suggesting that predation impacts are compounded. As with consumption by cats, birds that nest or forage on the ground are most likely to be consumed by foxes. However, there is also some partitioning, with records of consumption by foxes but not cats for 25 bird species, indicating that the impacts of the two predators may also be complementary. Bird species ≥3.4 kg were more likely to be eaten by foxes, and those <3.4 kg by cats. Our compilation provides an inventory and describes characteristics of Australian bird species known to be consumed by foxes, but we acknowledge that records of predation do not imply population-level impacts. Nonetheless, there is sufficient information from other studies to demonstrate that fox predation has significant impacts on the population viability of some Australian birds, especially larger birds and those that nest or forage on the ground.
Microscopic examination of blood smears remains the gold standard for laboratory inspection and diagnosis of malaria. Smear inspection is, however, time-consuming and dependent on trained microscopists, with results varying in accuracy. We sought to develop an automated image analysis method to improve accuracy and standardization of smear inspection that retains capacity for expert confirmation and image archiving. Here, we present a machine learning method that achieves red blood cell (RBC) detection, differentiation between infected and uninfected cells, and parasite life stage categorization from unprocessed, heterogeneous smear images. Based on a pretrained Faster Region-Based Convolutional Neural Network (Faster R-CNN) model for RBC detection, our model performs accurately, with an average precision of 0.99 at an intersection-over-union threshold of 0.5. Application of a residual neural network-50 (ResNet-50) model to infected cells also performs accurately, with an area under the receiver operating characteristic curve of 0.98. Finally, combining our method with a regression model successfully recapitulates the intraerythrocytic developmental cycle with accurate life cycle stage categorization. Combined with a mobile-friendly web-based interface, called PlasmoCount, our method permits rapid navigation through and review of results for quality assurance. By standardizing assessment of Giemsa smears, our method markedly improves inspection reproducibility and presents a realistic route to both routine lab and future field-based automated malaria diagnosis.
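A compact sketch of the two-stage pipeline described above, assuming PyTorch/torchvision; it is not the authors' PlasmoCount implementation, and the class counts, input size, and weights are placeholders.

import torch
import torchvision
from torchvision.models.detection.faster_rcnn import FastRCNNPredictor

def build_rbc_detector(num_classes: int = 2):
    """Stage 1: Faster R-CNN fine-tuned to detect RBCs (background + RBC)."""
    model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
    in_features = model.roi_heads.box_predictor.cls_score.in_features
    model.roi_heads.box_predictor = FastRCNNPredictor(in_features, num_classes)
    return model

def build_stage_classifier(num_classes: int = 5):
    """Stage 2: ResNet-50 classifying cell crops (uninfected + assumed life stages)."""
    model = torchvision.models.resnet50(weights="DEFAULT")
    model.fc = torch.nn.Linear(model.fc.in_features, num_classes)
    return model

detector, classifier = build_rbc_detector(), build_stage_classifier()
detector.eval(); classifier.eval()
image = torch.rand(3, 800, 800)                   # placeholder Giemsa smear image tensor
with torch.no_grad():
    boxes = detector([image])[0]["boxes"]         # detected RBC bounding boxes
    for x0, y0, x1, y1 in boxes.int().tolist():
        if x1 > x0 and y1 > y0:
            crop = image[:, y0:y1, x0:x1].unsqueeze(0)
            crop = torch.nn.functional.interpolate(crop, size=(224, 224))
            stage_logits = classifier(crop)       # uninfected vs. life-stage classes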
Iron-rich meteorites are significantly underrepresented in collection statistics from Antarctica. This has led to a hypothesis that there is a sparse layer of iron-rich meteorites hidden below the surface of the ice, thereby explaining the apparent shortfall. As standard Antarctic meteorite collecting techniques rely upon a visual surface search approach, the need has thus arisen to develop a system that can detect iron objects under a few tens of centimetres of ice, where the expected number density is of the order of one per square kilometre. To help test this hypothesis, a large-scale pulse induction metal detector array has been constructed for deployment in Antarctica. The metal detector array is 6 m wide, able to travel at 15 km h−1 and can scan 1 km² in ~11 hours. This paper details the construction of the metal detector system with respect to design criteria, notably the ruggedization of the system for Antarctic deployment. Some preliminary results from UK and Antarctic testing are presented. We show that the system performs as specified and should reach the pre-agreed target of detecting a 100 g iron meteorite at a depth of 300 mm when deployed in Antarctica.
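The coverage figures quoted above follow from simple arithmetic, checked here with the values given in the text:

swath_width_m = 6.0                   # detector array width
speed_km_h = 15.0                     # survey speed
coverage_m2_per_h = swath_width_m * speed_km_h * 1000    # 90,000 m^2 per hour
hours_per_km2 = 1_000_000 / coverage_m2_per_h            # ~11.1 hours per km^2
print(f"{coverage_m2_per_h:.0f} m^2/h -> {hours_per_km2:.1f} h per km^2")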
The intent of this study was to determine whether there are differences in disaster preparedness between urban and rural community hospitals across New York State.
Methods
Descriptive and analytical cross-sectional survey study of 207 community hospitals; 35 questions evaluated 6 disaster preparedness elements: disaster plan development, on-site surge capacity, available materials and resources, disaster education and training, disaster preparedness funding levels, and perception of disaster preparedness.
Results
Completed surveys were received from 48 urban hospitals and 32 rural hospitals. There were differences in disaster preparedness between urban and rural hospitals with respect to disaster plan development, on-site surge capacity, available materials and resources, disaster education and training, and perception of disaster preparedness. No difference was identified between these hospitals with respect to disaster preparedness funding levels.
Conclusions
The results of this study provide an assessment of the current state of disaster preparedness in urban and rural community hospitals in New York. Differences in preparedness between the two settings may reflect differing priorities with respect to perceived threats, as well as opportunities for improvement that may require additional advocacy and legislation. (Disaster Med Public Health Preparedness. 2019;13:424-428)
Timing of weed emergence and seed persistence in the soil influence the ability to implement timely and effective control practices. Emergence patterns and seed persistence of kochia populations were monitored in 2010 and 2011 at sites in Kansas, Colorado, Wyoming, Nebraska, and South Dakota. Weekly observations of emergence were initiated in March and continued until no new emergence occurred. Seed was harvested from each site, placed into 100-seed mesh packets, and buried at depths of 0, 2.5, and 10 cm in fall of 2010 and 2011. Packets were exhumed at 6-mo intervals over 2 yr. Viability of exhumed seeds was evaluated. Nonlinear mixed-effects Weibull models were fit to cumulative emergence (%) across growing degree days (GDD) and to viable seed (%) across burial time to describe their fixed and random effects across site-years. Final emergence densities varied among site-years and ranged from as few as 4 to almost 380,000 seedlings m−2. Across 11 site-years in Kansas, cumulative GDD needed for 10% emergence were 168, while across 6 site-years in Wyoming and Nebraska, only 90 GDD were needed; on the calendar, this date shifted from early to late March. The majority (>95%) of kochia seed did not persist for more than 2 yr. Remaining seed viability was generally >80% when seeds were exhumed within 6 mo after burial in March, and declined to <5% by October of the first year after burial. Burial did not appear to increase or decrease seed viability over time but placed seed in a position from which seedling emergence would not be possible. High seedling emergence that occurs very early in the spring emphasizes the need for fall or early spring PRE weed control such as tillage, herbicides, and cover crops, while continued emergence into midsummer emphasizes the need for extended periods of kochia management.
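The Weibull emergence curve described above can be illustrated with a simple fixed-effects fit; the observations below are invented for demonstration, and the study's nonlinear mixed-effects model (with random site-year effects) is not reproduced.

import numpy as np
from scipy.optimize import curve_fit

def weibull_emergence(gdd, asymptote, scale, shape):
    """Cumulative emergence (%) as a Weibull function of accumulated GDD."""
    return asymptote * (1.0 - np.exp(-(gdd / scale) ** shape))

gdd = np.array([50, 100, 150, 200, 300, 400, 600, 800], dtype=float)
emergence = np.array([2, 8, 25, 45, 70, 85, 95, 99], dtype=float)   # toy cumulative %

(asymptote, scale, shape), _ = curve_fit(weibull_emergence, gdd, emergence,
                                         p0=[100, 200, 1.5], bounds=(0, [110, 5000, 10]))
gdd_10pct = scale * (-np.log(1 - 10.0 / asymptote)) ** (1.0 / shape)
print(f"GDD for 10% cumulative emergence ~ {gdd_10pct:.0f}")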
The goal of this research was to develop herbicide programs for controlling acetolactate synthase (ALS)–, propanil-, quinclorac-, and clomazone-resistant barnyardgrass. Two applications of imazethapyr alone at 70 g ai ha−1 failed to control the ALS-resistant biotype more than 36%; however, when imazethapyr at 70 g ha−1 was applied early POST (EPOST) followed by imazethapyr at 70 g ha−1 plus fenoxaprop at 120 g ai ha−1 immediately prior to flooding (PREFLD), barnyardgrass control improved to 78% at 10 wk after planting. When imazethapyr was applied twice following PRE or delayed PRE applications of clomazone at 336 g ai ha−1, quinclorac at 560 g ai ha−1, pendimethalin at 1,120 g ai ha−1, or thiobencarb at 4,480 g ai ha−1, control was 92 to 100%. A single-pass program consisting of a delayed PRE application of clomazone at 336 g ha−1 plus quinclorac at 560 g ha−1 plus pendimethalin at 1,120 g ha−1 plus thiobencarb at 4,480 g ha−1 controlled all herbicide-resistant barnyardgrass biotypes at the same level as a standard multiple-application program.
Nine sites of cogongrass were included in a study of genotypic diversity and spread dynamics at the point of introduction and its adjacent areas in the southern United States. Clones evaluated with two primer pairs yielded a total of 137 amplified fragment length polymorphism (AFLP) loci, of which 102 (74.4%) were polymorphic. Genetic diversity was measured as the percentage of polymorphic loci, Shannon's information index, Nei's gene diversity, and panmictic heterozygosity. Nei's gene diversity (HS) across all nine sites was estimated to be 0.11, and within-site gene diversity ranged from 0.06 to 0.16. The Bayesian estimate of gene diversity and Shannon's information index were higher (0.17 and 0.17, respectively). The samples from the point of introduction (Pi) had the lowest genetic diversity for all types of estimates. Within-site variance accounted for 56% of the total variation and among-site variance for 44% (P < 0.05). Differentiation among sites was assessed using FST. The greatest difference was found between the Pi and the other sites. No relationship was found between genetic and geographic distances. Principal component analysis as well as cluster analysis separated individuals into three main clusters. The Pi formed a separate subcluster. Gene flow (Nm), inferred from Φ-statistics describing the genetic differentiation between pairs of sites, ranged from 0.6 to 5.55. The lack of a significant relationship between gene flow and geographic distance, as well as between genetic and geographic distances, suggests that the invasion dynamics of cogongrass in the southern United States are driven primarily by anthropogenic activities and to a lesser extent by natural forces.
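The band-based diversity summaries reported above can be sketched as follows; this uses a simple band-frequency approximation for dominant AFLP markers and toy data, not the study's Bayesian or AMOVA estimators.

import numpy as np

def diversity_indices(bands):
    """bands: 0/1 matrix of AFLP band presence, rows = clones, columns = loci."""
    p = bands.mean(axis=0)                       # band frequency per locus
    pct_polymorphic = np.mean((p > 0) & (p < 1)) # proportion of polymorphic loci
    nei = np.mean(1 - (p**2 + (1 - p) ** 2))     # Nei's gene diversity, averaged over loci
    with np.errstate(divide="ignore", invalid="ignore"):
        shannon_terms = np.where(p * (1 - p) > 0,
                                 -(p * np.log(p) + (1 - p) * np.log(1 - p)), 0.0)
    return pct_polymorphic, nei, shannon_terms.mean()

rng = np.random.default_rng(1)
toy_bands = rng.integers(0, 2, size=(20, 137))   # 20 clones x 137 AFLP loci (toy data)
print(diversity_indices(toy_bands))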
Field experiments were conducted in Alabama during 1999 and 2000 to test the hypothesis that any glyphosate-induced yield suppression in glyphosate-resistant cotton would be less with irrigation than without irrigation. Yield compensation was monitored by observing alterations in plant growth and fruiting patterns. Glyphosate treatments included a nontreated control, 1.12 kg ai/ha applied POST at the 4-leaf stage, 1.12 kg/ha applied postemergence-directed (DIR) at the prebloom stage, and 1.12 kg/ha applied POST at the 4-leaf stage and DIR at the prebloom stage. The second variable, irrigation treatment, was established by irrigating plots individually with overhead sprinklers or maintaining them under dryland, nonirrigated conditions. Cotton yield and all measured parameters, including lint quality, were positively affected by irrigation. Irrigation increased yield 52% compared to nonirrigated cotton. Yield and fiber quality effects were independent of glyphosate treatments. Neither yield nor any of the measured variables that reflected whole-plant response were influenced by glyphosate treatment or by a glyphosate-by-irrigation interaction.
Research was conducted in 2007 and 2008 to evaluate weed-control options in an imazethapyr-resistant rice production system. Raised beds were formed, and imidazolinone-resistant hybrid rice ‘CL 730’ was drill-seeded on beds. Five herbicide programs applied up to the four- to six-leaf stage of rice were evaluated with and without additional “as-needed” herbicide at later stages. All the herbicide combinations and as-needed herbicides tested in this research were labeled for rice, and only minor transient injury (< 5%) was initially observed. Weeds emerged throughout the growing season, and as-needed herbicides were applied after the four- to six-leaf stage of rice to control these late-emerging weeds and weeds not effectively controlled with earlier applications, primarily Palmer amaranth. Most of the Palmer amaranth at this site was insensitive to imazethapyr (possibly acetolactate synthase resistant). Therefore, as-needed herbicides with different modes of action, such as 2,4-D, were used to improve Palmer amaranth control. Rice yields were often numerically higher in plots that received additional herbicide after the six-leaf stage of rice, but yields were not significantly improved.
In North America, terrestrial records of biodiversity and climate change that span Marine Oxygen Isotope Stage (MIS) 5 are rare. Where found, they provide insight into how the coupling of the ocean–atmosphere system is manifested in biotic and environmental records and how the biosphere responds to climate change. In 2010–2011, construction at Ziegler Reservoir near Snowmass Village, Colorado (USA) revealed a nearly continuous, lacustrine/wetland sedimentary sequence that preserved evidence of past plant communities between ~140 and 55 ka, including all of MIS 5. At an elevation of 2705 m, the Ziegler Reservoir fossil site also contained thousands of well-preserved bones of late Pleistocene megafauna, including mastodons, mammoths, ground sloths, horses, camels, deer, bison, black bear, coyotes, and bighorn sheep. In addition, the site contained more than 26,000 bones from at least 30 species of small animals including salamanders, otters, muskrats, minks, rabbits, beavers, frogs, lizards, snakes, fish, and birds. The combination of macro- and micro-vertebrates, invertebrates, terrestrial and aquatic plant macrofossils, a detailed pollen record, and a robust, directly dated stratigraphic framework shows that high-elevation ecosystems in the Rocky Mountains of Colorado are climatically sensitive and varied dramatically throughout MIS 5.