In this editorial we, as members of the 2022 NICE Guideline Committee, highlight and discuss what, in our view, are the key guideline recommendations (generated through evidence synthesis and consensus) for mental health professionals when caring for people after self-harm, and we consider some of the implementation challenges.
The U.S. Department of Agriculture–Agricultural Research Service (USDA-ARS) has been a leader in weed science research covering topics ranging from the development and use of integrated weed management (IWM) tactics to basic mechanistic studies, including biotic resistance of desirable plant communities and herbicide resistance. ARS weed scientists have worked in agricultural and natural ecosystems, including agronomic and horticultural crops, pastures, forests, wild lands, aquatic habitats, wetlands, and riparian areas. Through strong partnerships with academia, state agencies, private industry, and numerous federal programs, ARS weed scientists have made contributions to discoveries in the newest fields of robotics and genetics, as well as the traditional and fundamental subjects of weed–crop competition and physiology and integration of weed control tactics and practices. Weed science at ARS is often overshadowed by other research topics; thus, few are aware of the long history of ARS weed science and its important contributions. This review is the result of a symposium held at the Weed Science Society of America’s 62nd Annual Meeting in 2022 that included 10 separate presentations in a virtual Weed Science Webinar Series. The overarching themes of management tactics (IWM, biological control, and automation), basic mechanisms (competition, invasive plant genetics, and herbicide resistance), and ecosystem impacts (invasive plant spread, climate change, conservation, and restoration) represent core ARS weed science research that is dynamic and efficacious and has been a significant component of the agency’s national and international efforts. This review highlights current studies and future directions that exemplify the science and collaborative relationships both within and outside ARS. Given the constraints of weeds and invasive plants on all aspects of food, feed, and fiber systems, there is an acknowledged need to face new challenges, including agriculture and natural resources sustainability, economic resilience and reliability, and societal health and well-being.
Evidence-based psychotherapies (EBPs) are underused in health care settings. Aligning implementation of EBPs with the needs of health care leaders (i.e., operational stakeholders) can potentially accelerate their uptake into routine practice. Operational stakeholders (such as hospital leaders, clinical directors, and national program officers) can influence development and oversight of clinical programs as well as policy directives at local, regional, and national levels. Thus, engaging these stakeholders during the implementation and dissemination of EBPs is critical when targeting wider use in health care settings. This article describes how research–operations partnerships were leveraged to increase implementation of an empirically supported psychotherapy – brief Cognitive Behavioral Therapy (brief CBT) – in Veterans Health Administration (VA) primary care settings. The partnered implementation and dissemination efforts were informed by the empirically derived World Health Organization’s ExpandNet framework. A steering committee was formed and included several VA operational stakeholders who helped align the brief CBT program with the implementation needs of VA primary care settings. During the first 18 months of the project, partnerships facilitated rapid implementation of brief CBT at eight VA facilities, including training of 12 providers who saw 120 patients, in addition to expanded program elements to better support sustainability (e.g., train-the-trainer procedures).
This study compared level of education with tests from multiple cognitive domains as proxies for cognitive reserve.
Method:
The participants were educationally, ethnically, and cognitively diverse older adults enrolled in a longitudinal aging study. We examined independent and interactive effects of education, baseline cognitive scores, and MRI measures of cortical gray matter change on longitudinal cognitive change.
Results:
Baseline episodic memory was related to cognitive decline independent of brain and demographic variables and moderated (weakened) the impact of gray matter change. Education moderated (strengthened) the gray matter change effect. Non-memory cognitive measures did not incrementally explain cognitive decline or moderate gray matter change effects.
Conclusions:
Episodic memory showed strong construct validity as a measure of cognitive reserve. Education effects on cognitive decline were dependent upon the rate of atrophy, indicating education effectively measures cognitive reserve only when atrophy rate is low. Results indicate that episodic memory has clinical utility as a predictor of future cognitive decline and better represents the neural basis of cognitive reserve than other cognitive abilities or static proxies like education.
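To make the moderation analysis described above concrete, the sketch below fits a regression in which baseline memory and education are allowed to interact with gray matter change. This is a minimal illustration under invented variable names and simulated data, not the study's actual longitudinal model or covariate set.

```python
# Hypothetical sketch of a moderation (interaction) model for cognitive decline.
# Variable names and data are illustrative only; the study's models, covariates,
# and longitudinal structure are not reproduced here.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 200
df = pd.DataFrame({
    "education_yrs": rng.integers(6, 21, n),    # years of education
    "baseline_memory": rng.normal(0, 1, n),     # z-scored episodic memory
    "gm_change": rng.normal(-0.5, 0.3, n),      # annual % gray matter change
})
# Simulated outcome: decline driven by atrophy, buffered by baseline memory.
df["cog_change"] = (0.8 * df["gm_change"]
                    - 0.3 * df["baseline_memory"] * df["gm_change"]
                    + rng.normal(0, 0.2, n))

# Interaction terms test whether memory and education moderate the atrophy effect.
model = smf.ols(
    "cog_change ~ gm_change * baseline_memory + gm_change * education_yrs",
    data=df,
).fit()
print(model.summary())
```

In this toy setup, a significant gm_change × baseline_memory coefficient of the appropriate sign would correspond to the "weakening" moderation reported in the Results, and the gm_change × education_yrs term to the education effect.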
Postoperative cognitive impairment is among the most common medical complications associated with surgical interventions, particularly in elderly patients. In our aging society, there is an urgent medical need for preoperative individual risk prediction to allow more accurate cost–benefit decisions prior to elective surgeries. So far, risk prediction has been based mainly on clinical parameters. However, these parameters only give a rough estimate of the individual risk. At present, there are no molecular or neuroimaging biomarkers available to improve risk prediction, and little is known about the etiology and pathophysiology of this clinical condition. In this short review, we summarize the current state of knowledge and briefly present the recently started BioCog project (Biomarker Development for Postoperative Cognitive Impairment in the Elderly), which is funded by the European Union. The goal of this research and development (R&D) project, which involves academic and industry partners throughout Europe, is to deliver a multivariate algorithm based on clinical assessments as well as molecular and neuroimaging biomarkers to overcome the currently unsatisfactory situation.
Introduction: Emergency hospital admissions are a growing concern for patients and health systems globally. The objective of this study was to systematically review the evidence for diagnostic, medical, and surgical interventions that reduce emergency hospital admissions. Methods: We conducted a systematic review of systematic reviews by searching MEDLINE, PubMed, the Cochrane Database of Systematic Reviews, Google Scholar, and grey literature. Systematic reviews of any diagnostic, surgical, or medical interventions examining the effect on emergency hospital admissions among adults were included. The quality of reviews was assessed using AMSTAR and the quality of evidence was assessed using GRADE. The subsequent analysis was restricted to interventions with moderate or high-quality evidence only. Results: 13 051 titles and abstracts and 1 791 full-text articles were screened, from which 42 systematic reviews were included. The reviews included an underlying evidence base of 215 randomized controlled trials with 135 282 patients. Of 20 unique diagnostic, medical, and surgical interventions identified, four had moderate (n = 4) or high (n = 0) quality evidence for significant reductions in hospital admissions in five patient populations. These were: cardiac resynchronization therapy for heart failure and atrial fibrillation, percutaneous aspiration for pneumothorax, early/routine coronary angiography for acute coronary syndrome (alone or comorbid with chronic kidney disease), and natriuretic peptide guided therapy for heart failure. Conclusion: We identified four interventions across five populations that, when optimized, may lead to reductions in emergency hospital admissions. These findings can therefore help guide the development of quality indicators, standards, or practice guidelines.
The Crab pulsar was first detected soon after the discovery of pulsars, and has long been studied for its unique traits. One of these traits, giant pulses that can be upwards of 1000 times brighter than the average pulse, was key to the Crab’s initial detection. Giant pulses are only seen in a few pulsars, and their energy distributions distinguish them from normal pulsed emission. There have been many studies over a period of decades to measure the power-law slope of these energy distributions, which provide insight into the possible emission mechanism of these giant pulses.
The 42-foot telescope at Jodrell Bank Observatory monitors the Crab pulsar on a daily basis. We have single-pulse data dating back to 2012, containing roughly 1,000,000 giant pulses, the largest sample of Crab giant pulses to date. This large set of giant pulses allows us to do a range of science, including pulse-width studies and in-depth studies of giant-pulse energy distributions. The latter are particularly interesting, as close inspection of the high-energy tail of the energy distribution allows us to investigate the detectability of extragalactic giant-pulsing pulsars. Also, by calculating rates from these energy distributions, we may be able to shed light on a possible link between Fast Radio Bursts and giant pulses.
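For readers unfamiliar with how a power-law slope is extracted from such energy distributions, the following is a minimal sketch using the standard maximum-likelihood estimator for a continuous power law; the energies are simulated, and the actual Jodrell Bank thresholds and analysis pipeline are not reproduced here.

```python
# Minimal sketch: maximum-likelihood estimate of a power-law slope for
# giant-pulse energies above a chosen threshold. Energies are simulated;
# the real analysis and its thresholds are not reproduced.
import numpy as np

rng = np.random.default_rng(42)
e_min = 1.0                   # threshold energy (arbitrary units)
alpha_true = 2.8              # slope used to simulate energies
# Inverse-transform sampling from p(E) ~ E**(-alpha) for E >= e_min
u = rng.uniform(size=100_000)
energies = e_min * (1 - u) ** (-1.0 / (alpha_true - 1.0))

# MLE for a continuous power law (Clauset, Shalizi & Newman 2009):
#   alpha_hat = 1 + n / sum(ln(E_i / e_min))
tail = energies[energies >= e_min]
alpha_hat = 1.0 + tail.size / np.log(tail / e_min).sum()
alpha_err = (alpha_hat - 1.0) / np.sqrt(tail.size)  # approximate standard error
print(f"alpha_hat = {alpha_hat:.3f} +/- {alpha_err:.3f}")
```

With roughly a million giant pulses, the statistical uncertainty on the slope becomes very small, which is why attention shifts to the shape of the high-energy tail and to systematic effects.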
It should be fairly obvious that, in contemplating the design of a future combat aircraft, it is important at the outset to relate the effort to some forecast definition of what is needed. This may take the form of an official or an unofficial requirement; however, its precise form is immaterial to the central argument. Either way, it is necessary to recognise several limitations to a so-called requirement:
i) the chances of it being ‘right’, some 10-20 years before the realisation of its potential wartime application, are fairly remote. This is particularly true with today’s rapid advancement in, for example, weapons and sensor technology. These developments make it doubly difficult to predict the nature and strength of the threat force;
ii) the postulated scenarios will change from day to day, and indeed during the day, according to the course of the main battle; therefore the military commander's requirements of the forces at his disposal can change by the hour;
iii) the environment, both militarily and climatically, plays a large part in the definition of a design. The variability of the environment is neglected only with grave risk to the eventual utility of the weapon system.
During 1990 we surveyed the southern sky using a multi-beam receiver at frequencies of 4850 and 843 MHz. The half-power beamwidths were 4 and 25 arcmin respectively. The finished surveys cover declinations between +10 and −90 degrees, essentially complete in right ascension, an area of 7.30 steradians. Preliminary analysis of the 4850 MHz data indicates that we will achieve a five sigma flux density limit of about 30 mJy. We estimate that we will find between 80 000 and 90 000 new sources above this limit. This is a revised version of the paper presented at the Regional Meeting by the first four authors; the surveys have now been completed.
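As a quick sanity check on the quoted sky coverage, the solid angle of a declination band with complete right-ascension coverage is Ω = 2π(sin δ_max − sin δ_min); the surveys are only essentially complete in right ascension, which is consistent with the slightly smaller quoted 7.30 sr.

```python
# Solid angle of the surveyed declination band, assuming full
# right-ascension coverage (an idealisation of the actual surveys).
import numpy as np

dec_max, dec_min = np.deg2rad(10.0), np.deg2rad(-90.0)
omega = 2 * np.pi * (np.sin(dec_max) - np.sin(dec_min))
print(f"{omega:.2f} sr")  # ~7.37 sr
```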
Although several cases of acute hepatic failure associated with the administration of valproic acid (VPA) have been reported, the pathogenesis of this problem remains unclear. We report the case of a 40-month-old male with a chronic seizure disorder, treated with VPA and phenytoin for two years, who developed sudden progressive neurological deterioration with evidence of increased intracranial pressure and hepatic failure. The clinical features closely resembled those of Reye’s syndrome, suggesting the possibility of a common pathogenesis.
Reintroductions are used to re-establish populations of species within their indigenous range, but their outcomes are variable. A key decision when developing a reintroduction strategy is whether to include a temporary period of confinement prior to release. Pre-release confinement is primarily used for the purpose of quarantine or as a delayed-release tactic to influence the performance or behaviour of founders post-release. A common difference between these approaches is that quarantine tends to be conducted in ex situ captivity, whereas delayed releases tend to involve in situ confinement at the release site. Although these practices are commonly viewed independently, it may be possible for a single confinement period to be used for both purposes. We tested whether temporarily holding wild eastern bettongs Bettongia gaimardi in ex situ captivity for 95–345 days prior to release (delayed release) influenced their body mass, pouch occupancy or survival during the first 1.5 years post-release, compared to founders released without confinement (immediate release). Our results suggest that exposing founders to captivity did not alter their body mass or performance post-release, despite being heavier and having fewer pouch young when released. We conclude that, for this species, ex situ captivity does not represent a tactical opportunity to improve post-release performance but can be used for quarantine without affecting the probability of establishment.
A Chebyshev set is a subset of a normed linear space that admits unique best approximations. In the first part of this paper we present some basic results concerning Chebyshev sets. In particular, we investigate properties of the metric projection map, sufficient conditions for a subset of a normed linear space to be a Chebyshev set, and sufficient conditions for a Chebyshev set to be convex. In the second half of the paper we present a construction of a nonconvex Chebyshev subset of an inner product space.
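For readers who want the objects named above spelled out, the standard formalisation is given below; the notation is ours and not necessarily that of the paper.

```latex
% Formal statement of the objects discussed above; notation is illustrative.
Let $(X, \|\cdot\|)$ be a normed linear space and $K \subseteq X$ a nonempty subset.
The distance from $x \in X$ to $K$ is
\[
  d(x, K) := \inf_{k \in K} \|x - k\|,
\]
and the metric projection (the set of best approximations to $x$ from $K$) is
\[
  P_K(x) := \{\, k \in K : \|x - k\| = d(x, K) \,\}.
\]
$K$ is a \emph{Chebyshev set} if $P_K(x)$ is a singleton for every $x \in X$,
i.e.\ every point of $X$ has a unique nearest point in $K$.
```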
The Hospital Anxiety and Depression Scale (HADS) has established use with older adult populations in New Zealand, but few studies have evaluated its psychometric properties. Research on the psychometric properties of the HADS in elderly populations has primarily used correlational methods that do not allow the effects of measurement error to be observed. The hypothesized tripartite model of anxiety and depression within the HADS was evaluated using confirmatory factor analysis (CFA) methods.
Methods:
Overall, 203 community-dwelling older adults who were recruited from older adult community groups completed the HADS. Competing two- and three-factor structures were trialled using CFA.
Results:
A three-factor model indicated a lack of differentiation between factors and poor clinical utility and was rejected in favor of a two-factor model. Significant correlations were observed between the anxiety and depression factors of the two-factor model, but that model was nevertheless considered valid for older adult samples. Good internal consistency was found for the HADS.
Conclusions:
A two-factor model of the HADS was favored due to the lack of differentiation between factors on the three-factor model, and the higher clinical utility of a two-factor solution. The validity of the HADS may be limited by over-diagnosing anxiety in non-clinical populations. It is recommended that the HADS be used to measure change over time through treatment and not be used as a diagnostic tool until future research establishes appropriate norms and cut-offs.
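As a concrete illustration of the internal-consistency statistic referred to in the Results, the sketch below computes Cronbach's alpha for a simulated item matrix; the HADS items themselves and the study's actual estimates are not reproduced, and the figures printed here are purely illustrative.

```python
# Illustrative computation of Cronbach's alpha (internal consistency).
# Item data are simulated and do not represent HADS responses.
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: (n_respondents, n_items) matrix of item scores."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

rng = np.random.default_rng(1)
# Simulate 7 correlated 0-3 Likert-type items for 203 respondents (hypothetical).
latent = rng.normal(size=(203, 1))
items = np.clip(np.round(1.5 + latent + rng.normal(0, 0.8, size=(203, 7))), 0, 3)
print(f"alpha = {cronbach_alpha(items):.2f}")
```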
Achieving an understanding of the extent of micronutrient adequacy across Europe is a major challenge. The main objective of the present study was to collect and evaluate the prevalence of low micronutrient intakes in different European countries by comparing recent nationally representative dietary survey data from Belgium, Denmark, France, Germany, The Netherlands, Poland, Spain and the United Kingdom. Dietary intake information was evaluated for intakes of Ca, Cu, I, Fe, Mg, K, Se, Zn and the vitamins A, B1, B2, B6, B12, C, D, E and folate. The mean and 5th percentile of the intake distributions were estimated for these countries, for a number of defined sex and age groups. The percentages of those with intakes below the lower reference nutrient intake and the estimated average requirement were calculated. Reference intakes were derived from the UK and Nordic Nutrition Recommendations. The impact of dietary supplement intake, as well as inclusion of apparently low energy reporters, on the estimates was evaluated. Except for vitamin D, the present study suggests that current intakes of vitamins from foods lead to a low risk of low intakes in all age and sex groups. For mineral intakes, the study suggests that the risk of low intakes is likely to appear more often in specific age groups. In spite of the limitations of the data, the present study provides valuable new information about micronutrient intakes across Europe and the likelihood of inadequacy country by country.
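The prevalence estimates described above amount to asking what share of a usual-intake distribution falls below a reference value (the EAR cut-point approach). The sketch below illustrates the calculation with invented numbers; the reference values shown are placeholders, not the study's actual UK or Nordic figures.

```python
# Sketch of the EAR cut-point style calculation: share of a usual-intake
# distribution below the estimated average requirement (EAR) and below the
# lower reference nutrient intake (LRNI). All numbers are invented.
import numpy as np

rng = np.random.default_rng(7)
# Hypothetical usual magnesium intakes (mg/day) for one sex/age group.
usual_intake = rng.lognormal(mean=np.log(300), sigma=0.3, size=5_000)

ear = 265.0    # illustrative EAR (mg/day), not an official value
lrni = 190.0   # illustrative LRNI (mg/day), not an official value

below_ear = np.mean(usual_intake < ear) * 100
below_lrni = np.mean(usual_intake < lrni) * 100
print(f"% below EAR:  {below_ear:.1f}")
print(f"% below LRNI: {below_lrni:.1f}")
```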
Infants with Spina Bifida (SB) were compared to typically developing (TD) infants using a conjugate reinforcement paradigm at 6 months of age (n = 98) to evaluate learning and retention of a sensory-motor contingency. Analyses evaluated infant arm-waving rates at baseline (wrist not tethered to mobile), during acquisition of the sensory-motor contingency (wrist tethered), and immediately after the acquisition phase and then after a delay (wrist not tethered), controlling for arm reaching ability, gestational age, and socioeconomic status. Although both groups responded to the contingency with increased arm-waving from baseline to acquisition, 15% to 29% fewer infants with SB than TD infants were found to learn the contingency, depending on the criterion used to determine contingency learning. In addition, infants with SB who had learned the contingency had more difficulty retaining the contingency over time when sensory feedback was absent. The findings suggest that infants with SB do not learn motor contingencies as easily or at the same rate as TD infants, and are more likely to decrease motor responses when sensory feedback is absent. Results are discussed with reference to research on contingency learning in infants with and without neurodevelopmental disorders, and with reference to motor learning in school-age children with SB. (JINS, 2013, 19, 1–10)