A stimulus (or stimulus-complex) is pictured as giving rise to a random series of sensory nerve “pulses,” which manifest themselves in contractions of individual muscle fibers. Assuming the expected time-frequency of these pulses to be proportional to the intensity of the stimulus, probability distributions are computed representing the cumulative effect of these pulses on the state of the organism, that is, on its degree of awareness of the stimulus. Preliminary results suggest a modification of the Weber-Fechner formula for intensity discrimination for certain types of stimuli: the psychological scale is to be measured by I^{1/2} instead of log I.
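For illustration only (the proportionality constants k and k′ and the threshold intensity I_0 are assumptions introduced here, not quantities defined in the abstract), the classical Weber-Fechner scale and the proposed square-root modification can be contrasted as

\psi_{\mathrm{WF}}(I) = k \log\!\left(\frac{I}{I_0}\right) \qquad \text{versus} \qquad \psi_{\mathrm{mod}}(I) = k' \, I^{1/2},

so that a just-noticeable difference corresponds to a constant increment in \log I under the first scale and in \sqrt{I} under the second.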
This chapter emphasises the key role that comic revues and music hall acts played in ensuring the British army had a continuous stream of recruits throughout the First World War. Through examining the songs, sketches and characters through which this was achieved, the chapter demonstrates the varied strategies used and the ways they drew on earlier modes of performance such as nineteenth-century melodrama. Particular emphasis is placed on the gendered ideology that was implicit in the performances examined, for example in looking at the dramatization of atrocity stories which were circulating in the press and the treatment of women in these plays and wider narratives. The chapter also focusses on music hall songs and performances by male impersonators such as Marie Lloyd. It encourages us to question the simple alignment of propaganda and popular entertainment and offers a more nuanced understanding of these performances through the lens of satire. By doing so, it demonstrates how satirising and parodying wartime experiences provided a release from anxiety. Stage satire and comedy, it concludes, offer a unique perspective on how modern total war saturated public life.
The National Institutes of Health’s (NIH) K99/R00 Pathway to Independence Award offers promising postdoctoral researchers and clinician-scientists an opportunity to receive research support at both the mentored and the independent levels, with the goal of facilitating a timely transition to a tenure-track faculty position. This transitional program has been generally successful, with most K99/R00 awardees securing R01-equivalent funding by the end of the R00 period. However, highly promising proposals often fail because of poor grantsmanship. This overview provides guidance from the perspective of long-standing members of the National Heart, Lung, and Blood Institute’s Mentored Transition to Independence study section to help mentors and trainees assemble competitive K99/R00 applications.
Nitrogen fixation from pasture legumes is a fundamental process that contributes to the profitability and sustainability of dryland agricultural systems. The aim of this research was to determine whether well-managed pastures, based on aerial-seeding pasture legumes, could partially or wholly meet the nitrogen (N) requirements of subsequent grain crops in an annual rotation. Fifteen experiments were conducted in Western Australia with wheat, barley or canola crops grown in a rotation that included the pasture legume species French serradella (Ornithopus sativus), biserrula (Biserrula pelecinus), bladder clover (Trifolium spumosum), annual medics (Medicago spp.) and the non-aerial-seeded subterranean clover (Trifolium subterraneum). After the pasture phase, five rates of inorganic N fertilizer (urea, applied at 0, 23, 46, 69 and 92 kg/ha) were applied to subsequent cereal and oilseed crops. The yields of wheat grown after serradella, biserrula and bladder clover, without the use of applied N fertilizer, were consistent with the target yields for the growing conditions of the trials (2.3 to 5.4 t/ha). Crop yields after phases of these pasture legume species were similar to or higher than those following subterranean clover or annual medics. The results of this study suggest a single season of a legume-dominant pasture may provide sufficient organic N in the soil to grow at least one crop without the need for inorganic N fertilizer application. This has implications for reducing inorganic N requirements and the carbon footprint of cropping in dryland agricultural systems.
Drug development is a long and arduous process that requires many researchers at different types of institutions, including universities, government agencies, non-profit organizations and the pharmaceutical industry. The pharmaceutical industry itself is heterogeneous, ranging from tiny biotech companies to large multinational organizations. This chapter emphasizes drug development efforts by the pharmaceutical industry but also notes the many collaborations between pharma and researchers at other types of institutions.
Relapse and recurrence of depression are common, contributing to the overall burden of depression globally. Accurate prediction of relapse or recurrence while patients are well would allow the identification of high-risk individuals and may effectively guide the allocation of interventions to prevent relapse and recurrence.
Aims
To review prognostic models developed to predict the risk of relapse, recurrence, sustained remission, or recovery in adults with remitted major depressive disorder.
Method
We searched the Cochrane Library (current issue); Ovid MEDLINE (1946 onwards); Ovid Embase (1980 onwards); Ovid PsycINFO (1806 onwards); and Web of Science (1900 onwards) up to May 2021. We included development and external validation studies of multivariable prognostic models. We assessed the risk of bias of included studies using the Prediction model Risk Of Bias ASsessment Tool (PROBAST).
Results
We identified 12 eligible prognostic model studies (11 unique prognostic models): 8 model development-only studies, 3 model development and external validation studies, and 1 external validation-only study. Multiple estimates of performance measures were not available, and meta-analysis was therefore not possible. Eleven of the 12 included studies were assessed as being at high overall risk of bias, and none examined clinical utility.
Conclusions
Due to the high risk of bias of the included studies, the poor predictive performance and the limited external validation of the models identified, presently available clinical prediction models for relapse and recurrence of depression are not yet sufficiently developed for deployment in clinical settings. There is a need for improved prognosis research in this clinical area, and future studies should conform to best-practice methodological and reporting guidelines.
Angiostrongylus cantonensis is a pathogenic nematode and the cause of neuroangiostrongyliasis, an eosinophilic meningitis more commonly known as rat lungworm disease. Transmission is thought to occur primarily through ingestion of infective third-stage larvae (L3) in gastropods, on produce, or in contaminated water. The gold standard for determining the effects of physical and chemical treatments on the infectivity of A. cantonensis L3 is to infect rodents with treated larvae and monitor for infection, but animal studies are laborious and expensive and raise ethical concerns. This study demonstrates propidium iodide (PI) to be a reliable marker of parasite death and loss of infective potential that does not adversely affect the development and future reproduction of live A. cantonensis larvae. PI staining allows the efficacy of test substances to be evaluated in vitro, an improvement over using lack of motility as an indicator of death. Potential applications of this assay include determining the effectiveness of anthelmintics, vegetable washes, electromagnetic radiation and other treatments intended to kill larvae in the prevention and treatment of neuroangiostrongyliasis.
Several grass and broadleaf weed species around the world have evolved multiple-herbicide resistance at an alarming and increasing rate. Research on the biochemical and molecular resistance mechanisms of multiple-resistant weed populations indicates a prevalence of herbicide metabolism catalyzed by enzyme systems such as cytochrome P450 monooxygenases and glutathione S-transferases and, to a lesser extent, by glucosyl transferases. A symposium was conducted to gain an understanding of the current state of research on metabolic resistance mechanisms in weed species that pose major management problems around the world. These topics, as well as future directions for investigation identified in the symposium, are summarized herein. In addition, the latest information on selected topics such as the role of safeners in inducing crop tolerance to herbicides, selectivity to clomazone, glyphosate metabolism in crops and weeds, and bioactivation of natural molecules is reviewed.
Preplant-applied herbicides were compared for their effect on three sugarbeet varieties planted at six seeding depths from 1987 through 1989. More sugarbeet seedlings emerged, and at a faster rate, as seeding depth decreased from 4.5 to 1.6 cm. Herbicide injury to sugarbeet seedlings increased as seeding depth increased from less than 2.5 cm to greater than 2.5 cm. Herbicide treatments reduced sugarbeet stand and decreased early-season sugarbeet height but had little effect on root yield or sucrose content.
Management systems for direct-seeded and transplanted sugarbeets (Beta vulgaris L. ‘Mono Hy D2’) were compared for weed control and sugarbeet selectivity from 1983 through 1985 in western Nebraska. Broadleaf weed density was similar, but yellow foxtail [Setaria glauca (L.) Beauv. # SETLU] density was lower in transplanted compared to direct-seeded sugarbeets. Preplant soil-incorporated applications of cycloate (S-ethyl cyclohexylethylcarbamothioate) plus trifluralin [2,6-dinitro-N,N-dipropyl-4-(trifluoromethyl)benzenamine] at 3.3 plus 0.6 kg ai/ha or ethofumesate [(±)-2-ethoxy-2,3-dihydro-3,3-dimethyl-5-benzofuranyl methanesulfonate] plus trifluralin at 2.2 plus 0.6 kg/ha was noninjurious to transplanted sugarbeets but caused severe injury to direct-seeded sugarbeets. The combination of cycloate or ethofumesate with trifluralin improved weed control over that obtained when cycloate or ethofumesate was used alone. By combining the improved weed control obtained from cycloate plus trifluralin or ethofumesate plus trifluralin with the transplanting crop establishment technique, a superior sugarbeet weed control program was developed.
Three herbaceous interference regimes were established, using herbicides, to examine the effects of interference on growth and biomass partitioning in loblolly pine (Pinus taeda L.). Trees were sampled near Auburn and Tallassee, AL. Trees at the Auburn site grown with low weed interference (LWI) had 4, 10, 10, 8, and 4 times greater total aboveground biomass than trees with high weed interference (HWI) for ages one through five, respectively. Medium weed interference (MWI, Auburn site only) resulted in three times greater biomass over the first 4 yr and two times greater total biomass by the fifth year compared with trees grown with HWI. Trees growing with LWI were 5, 8, 10, and 6 times larger than those with HWI for ages one through four, respectively, at the Tallassee site. At all levels of interference, the percentage of total biomass in foliage decreased, and the stem and branch components increased, with increasing tree size at both sites. Trees growing with HWI had a lower percentage of total biomass in foliage and a greater percentage in stem than those growing with LWI when compared at a common size. Growth efficiency per tree, expressed as the annual increase in stem biomass per unit leaf area (g m⁻²), was slightly greater for trees growing with LWI than with HWI when leaf area index (LAI, total surface) was less than 0.2. For LAI values greater than 0.2 the relationship was reversed. The latter contradicts the idea that growth efficiency can be used as a measure of vigor for young loblolly pine. Changes in carbon partitioning to the development of leaf area are suggested to be driving the accelerated growth responses associated with a reduction in weed interference.
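As a minimal sketch of the quantity just described (the symbols \Delta W_{\mathrm{stem}} and LA are illustrative and not taken from the study), growth efficiency per tree can be written as

GE = \frac{\Delta W_{\mathrm{stem}}}{LA},

where \Delta W_{\mathrm{stem}} is the annual increase in stem biomass (g) and LA is total-surface leaf area (m²), so that GE is expressed in g m⁻² per year.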
Stands of four-year-old loblolly pines grown with and without herbaceous competition were compared to determine whether early increases in soil moisture and plant water status had been maintained throughout the first four years. Non-weeded stands tended to have greater soil moisture than weeded stands, although these differences were never statistically significant (P > 0.05). Plant water potential was remarkably similar between treatments, as were photosynthesis and stomatal conductance. The increase in foliage production early in stand development due to weed control apparently depleted available soil moisture to levels similar to those of non-weeded stands. Therefore, the direct benefit of increased soil resources with weed control is short-lived.
Field trials were conducted in 1999 and 2000 to determine the influence of weed size and the number of glyphosate or glufosinate applications on weed control and sugarbeet yield. Glyphosate at 840 g/ha or glufosinate at 390 g/ha was applied one, two, or three times, beginning when the average weed height was 3, 10, 15, or 25 cm. Two sequential applications of glyphosate applied to 10-cm weeds or three sequential applications of glufosinate applied to 3-cm weeds provided weed control comparable to three sequential applications of desmedipham plus phenmedipham plus triflusulfuron plus clopyralid. Weed control and sugarbeet root yield were optimal for two postemergence applications of glyphosate and for three applications of glufosinate. Glyphosate provided greater control of redroot pigweed and common lambsquarters than glufosinate. Sugarbeet sucrose yield with both glyphosate and glufosinate weed control programs was nearly 10,000 kg/ha. Compared with two sequential applications of glyphosate, sucrose yield of glyphosate-resistant sugarbeet was reduced 15% by three sequential applications of desmedipham plus phenmedipham plus triflusulfuron plus clopyralid. Sucrose yields were similar between three sequential applications of glufosinate and three applications of desmedipham plus phenmedipham plus triflusulfuron plus clopyralid.
Field trials were conducted at five sites from 2001 through 2003 to determine the influence of repeated broadcast and banded applications of reduced rates of desmedipham plus phenmedipham, triflusulfuron, and clopyralid, in combination with either 1.5 or 3% v/v methylated seed oil (MSO), on sugarbeet and weeds. Desmedipham plus phenmedipham, triflusulfuron, and clopyralid were applied POST three times at 5- to 7-d intervals at 25, 50, 75, or 100% of a 180 plus 180 plus 18 plus 100 g ai/ha dosage (full rate). When averaged over all herbicide rates, crop injury was 6% greater, but common lambsquarters control was 5% higher and crop yield was 15% greater, with broadcast compared with banded herbicide application. In most situations, adding MSO at 3% rather than 1.5% did not improve weed control. Sugarbeet injury was lowest (11%) and average weed control was 86% when herbicide rates (with 1.5% MSO) were 25% of the full rate (microrate). Applying an herbicide rate (with 1.5% MSO) that was 50% of the full rate (half rate) increased crop injury from 11% to 18% and elevated average weed control from 86% to 92%. Common lambsquarters control increased from 81% with the microrate to 89% with the half rate. Sugarbeet root yield was 23 t/ha when no herbicide was used, 48 t/ha with the microrate, and 49 t/ha with the half rate compared with 54 t/ha when the full rate was applied without MSO. Increasing herbicide rates to 75% of the full rate (three-quarter rate) (with 1.5% MSO) increased crop injury to 27% and average weed control to 96%. Adding 1.5% MSO to the full rate increased crop injury to 35% with no improvement in average weed control over that achieved with the full rate without MSO.
Field trials were conducted from 1995 through 2002 to expand the development of chicory by determining the potential of tank mixtures of benefin, trifluralin, or pronamide applied preplant incorporated (PPI) and of triflusulfuron methyl or imazamox applied postemergence (POST) for selective weed control in chicory. Lack of early-season weed control resulted in an 88% reduction in chicory root yield in 1995 to 1996 and an 85% reduction in 2001 to 2002, demonstrating the susceptibility of chicory plants to early-season weed competition. In the first experiment, pronamide at 1.1 kg ai/ha PPI plus benefin at 1.3 kg ai/ha or trifluralin at 0.56 kg/ha was selective for chicory and controlled weed populations 90% on average, with root yields that were 89% of the hand-weeded treatment. Triflusulfuron methyl POST at 17 g/ha caused early-season chicory injury. In the second experiment, trifluralin PPI at 0.56 kg/ha followed by imazamox POST at 36 g/ha controlled weeds 95% on average with a chicory root yield of 74 t/ha, which was 109% of the yield of the hand-weeded treatment.
Various medications and devices are available to facilitate emergent endotracheal intubations (EETIs). The objective of this study was to survey which medications and devices Canadian physicians use for intubation.
Methods
A clinical scenario-based survey was developed to determine which medications physicians would administer to facilitate EETI, their first choice of intubation device, and backup strategy should their first choice fail. The survey was distributed to Canadian emergency medicine (EM) and intensive care unit (ICU) physicians using web-based and postal methods. Physicians were asked questions based on three scenarios (trauma; pneumonia; heart failure) and responded using a 5-point scale ranging from “always” to “never” to capture usual practice.
Results
The survey response rate was 50.2% (882/1,758). Most physicians indicated a Macintosh blade with direct laryngoscopy would “always/often” be their first choice of intubation device in the three scenarios (mean 85% [79%-89%]) followed by video laryngoscopy (mean 37% [30%-49%]). The most common backup device chosen was an extraglottic device (mean 59% [56%-60%]). The medications most physicians would “always/often” administer were fentanyl (mean 45% [42%-51%]) and etomidate (mean 38% [25%-50%]). EM physicians were more likely than ICU physicians to paralyze patients for EETI (adjusted odds ratio 3.40; 95% CI 2.90-4.00).
Conclusions
Most EM and ICU physicians utilize direct laryngoscopy with a Macintosh blade as a primary device for EETI and an extraglottic device as a backup strategy. This survey highlights variation in Canadian practice patterns for some aspects of intubation in critically ill patients.
Fifty patients with myocardial infarction (MI) were recruited from a hospital-based Cardiac Education and Assessment Program (CEAP) in Sydney, Australia. The Exercise Motivation Inventory-2 (EMI-2) and the Depression, Anxiety and Stress Scale (DASS) were administered prior to commencement of the program and re-administered by telephone interview at 5-month follow-up. Four exercise adherence measures were completed: attendance, an exercise stress test, self-report ratings and a 7-day activity recall interview. There was a 46% adherence rate for MI patients during the hospital-based CEAP. Of those individuals who completed CEAP, 91% obtained functional improvement on the exercise stress test. For the 38 patients who were followed up by telephone interview at 5 months, 71% were exercising according to the CEAP prescription. Higher levels of anxiety were associated with lower levels of self-reported exercise adherence. The three strongest motivations for exercise in this group of MI patients were all health-related: wanting to be free from illness, maintaining good health and recovering from the effects of coronary heart disease. The discussion highlights the implications of these findings for cardiac rehabilitation programs and the need for empirically driven guidelines for measuring exercise adherence.