This paper reports an experiment designed to assess the effects of a rotation in the marginal cost curve on convergence in a repeated Cournot triopoly. Increasing the cost curve's slope both reduces the serially undominated set to the Nash prediction and increases the peakedness of earnings. We observe higher rates of Nash equilibrium play in the design with the steeper marginal cost schedule, but only when participants are also rematched after each decision. Examination of response patterns suggests that the treatment with a steeper marginal cost curve and with rematching of participants across periods induces the selection of Nash-consistent responses.
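The claim that a steeper marginal cost schedule shrinks the serially undominated set can be illustrated with a small sketch. Everything below is an illustrative assumption rather than the experiment's design: inverse demand P = 32 − Q, cost d·q²/2 on an integer quantity grid, pure-strategy dominance only, and no floor at a price of zero.

```python
def payoff(q, x, d):
    # Profit from producing q against total rival output x, with inverse
    # demand P = 32 - Q and cost d*q^2/2 (d = slope of marginal cost).
    # Simplification: price may go negative (no zero floor).
    return q * (32 - q - x) - d * q * q / 2.0

def serially_undominated(d, grid=range(16)):
    # Iteratively delete quantities strictly dominated (in pure
    # strategies) against every surviving rival quantity profile.
    s = list(grid)
    while True:
        rival_totals = {q2 + q3 for q2 in s for q3 in s}
        dominated = {
            q for q in s
            if any(all(payoff(alt, x, d) > payoff(q, x, d)
                       for x in rival_totals)
                   for alt in s if alt != q)
        }
        if not dominated:
            return s
        s = [q for q in s if q not in dominated]

flat = serially_undominated(d=0)   # constant marginal cost: large surviving set
steep = serially_undominated(d=4)  # steep marginal cost: collapses to Nash
```

With these numbers, the steep-cost game is dominance solvable (the surviving set is exactly the symmetric Nash quantity of 4), while the flat-cost game leaves almost the whole grid undominated, mirroring the comparative static the abstract describes.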
We investigate cooperation using an incremental investment game in which the first-mover can make small but increasing incremental investments in their counterpart. Our experiment is designed to test whether establishing trust in small increments is more effective than alternatives, including a one-shot investment game; a decrease-only condition, where the amount the first-mover sends to the second-mover must be less than the amount previously sent; and an unrestricted condition, where the first-mover is not restricted by the amount previously sent. Although results were mixed, broadly speaking, iteration affords greater cooperation than one-shot games and, when given the choice, participants seem to prefer to build trust gradually. Implications for institutional design are discussed.
Fifty-three tests designed to measure aspects of creative thinking were administered to 410 air cadets and student officers. The scores were intercorrelated and 16 factors were extracted. Orthogonal rotations resulted in 14 identifiable factors, a doublet, and a residual. Nine previously identified factors were: verbal comprehension, numerical facility, perceptual speed, visualization, general reasoning, word fluency, associational fluency, ideational fluency, and a factor combining Thurstone's closure I and II. Five new factors were identified as originality, redefinition, adaptive flexibility, spontaneous flexibility, and sensitivity to problems.
Item response theory models posit latent variables to account for regularities in students' performances on test items. Wilson's “Saltus” model extends the ideas of IRT to development that occurs in stages, where expected changes can be discontinuous, show different patterns for different types of items, or even exhibit reversals in probabilities of success on certain tasks. Examples include Piagetian stages of psychological development and Siegler's rule-based learning. This paper derives marginal maximum likelihood (MML) estimation equations for the structural parameters of the Saltus model and suggests a computing approximation based on the EM algorithm. For individual examinees, empirical Bayes probabilities of learning-stage membership are given, along with proficiency parameter estimates conditional on stage membership. The MML solution is illustrated with simulated data and an example from the domain of mixed number subtraction.
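As a toy illustration of the empirical Bayes stage-membership probabilities described here, the sketch below classifies one examinee under a two-stage Rasch-type model with stage-specific shift parameters. All numbers (item difficulties, tau shifts, stage priors, proficiency) are invented for the example, and proficiency is taken as known rather than estimated via MML.

```python
import math

def rasch_p(theta, b, shift=0.0):
    # Probability of a correct response under a Rasch-type model with a
    # stage-specific shift (playing the role of the Saltus tau parameter).
    return 1.0 / (1.0 + math.exp(-(theta - b + shift)))

# Hypothetical two-stage setup: items 2 and 3 receive a large tau boost
# once stage 2 is reached, so success on them signals stage-2 membership.
difficulty = [-0.5, 0.0, 1.5, 2.0]
tau = {1: [0.0, 0.0, 0.0, 0.0],   # stage 1: no shift
       2: [0.0, 0.0, 2.5, 2.5]}   # stage 2: later items become easier
prior = {1: 0.6, 2: 0.4}          # population stage proportions
theta = 0.3                       # examinee proficiency (assumed known)
responses = [1, 1, 1, 0]          # observed right/wrong pattern

# Empirical Bayes posterior: p(stage | responses) is proportional to
# prior * likelihood, normalized over stages.
post = {}
for h in (1, 2):
    like = 1.0
    for x, b, t in zip(responses, difficulty, tau[h]):
        p = rasch_p(theta, b, t)
        like *= p if x == 1 else 1.0 - p
    post[h] = prior[h] * like
total = sum(post.values())
post = {h: v / total for h, v in post.items()}
```

Success on item 2 pulls the posterior toward stage 2, but the miss on item 3 and the stage-1 prior keep the classification uncertain, which is exactly the kind of graded membership statement the model produces for individual examinees.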
Transdisciplinary research knits together knowledge from diverse epistemic communities in addressing social-environmental challenges, such as biodiversity loss, climate crises, food insecurity, and public health. This article reflects on the roles of philosophy of science in transdisciplinary research while focusing on Indigenous and other subjugated forms of knowledge. We offer a critical assessment of demarcationist approaches in philosophy of science and outline a constructive alternative of transdisciplinary philosophy of science. While a focus on demarcation obscures the complex relations between epistemic communities, transdisciplinary philosophy of science provides resources for meeting epistemic and political challenges of collaborative knowledge production.
The Hierarchical Taxonomy of Psychopathology (HiTOP) has emerged out of the quantitative approach to psychiatric nosology. This approach identifies psychopathology constructs based on patterns of co-variation among signs and symptoms. The initial HiTOP model, which was published in 2017, is based on a large literature that spans decades of research. HiTOP is a living model that undergoes revision as new data become available. Here we discuss advantages and practical considerations of using this system in psychiatric practice and research. We especially highlight limitations of HiTOP and ongoing efforts to address them. We describe differences and similarities between HiTOP and existing diagnostic systems. Next, we review the types of evidence that informed development of HiTOP, including populations in which it has been studied and data on its validity. The paper also describes how HiTOP can facilitate research on genetic and environmental causes of psychopathology as well as the search for neurobiologic mechanisms and novel treatments. Furthermore, we consider implications for public health programs and prevention of mental disorders. We also review data on clinical utility and illustrate clinical application of HiTOP. Importantly, the model is based on measures and practices that are already used widely in clinical settings. HiTOP offers a way to organize and formalize these techniques. This model already can contribute to progress in psychiatry and complement traditional nosologies. Moreover, HiTOP seeks to facilitate research on linkages between phenotypes and biological processes, which may enable construction of a system that encompasses both biomarkers and precise clinical description.
The Minnesota Center for Twin and Family Research (MCTFR) comprises multiple longitudinal, community-representative investigations of twin and adoptive families that focus on psychological adjustment, personality, cognitive ability and brain function, with a special emphasis on substance use and related psychopathology. The MCTFR includes the Minnesota Twin Registry (MTR), a cohort of twins who have completed assessments in middle and older adulthood; the Minnesota Twin Family Study (MTFS) of twins assessed from childhood and adolescence into middle adulthood; the Enrichment Study (ES) of twins oversampled for high risk for substance-use disorders assessed from childhood into young adulthood; the Adolescent Brain (AdBrain) study, a neuroimaging study of adolescent twins; and the Siblings Interaction and Behavior Study (SIBS), a study of adoptive and nonadoptive families assessed from adolescence into young adulthood. Here we provide a brief overview of key features of these established studies and describe new MCTFR investigations that follow up and expand upon existing studies or recruit and assess new samples, including the MTR Study of Relationships, Personality, and Health (MTR-RPH); the Colorado-Minnesota (COMN) Marijuana Study; the Adolescent Brain Cognitive Development (ABCD) study; the Colorado Online Twins (CoTwins) study and the Children of Twins (CoT) study.
Background:
Medical procedures and patient care activities may facilitate environmental dissemination of healthcare-associated pathogens such as methicillin-resistant Staphylococcus aureus (MRSA).
Design:
Observational cohort study of MRSA-colonized patients to determine the frequency of and risk factors for environmental shedding of MRSA during procedures and care activities in carriers with positive nares and/or wound cultures. Bivariate analyses were performed to identify factors associated with environmental shedding.
Setting:
A Veterans Affairs hospital.
Participants:
This study included 75 patients in contact precautions for MRSA colonization or infection.
Results:
Of 75 patients in contact precautions for MRSA, 55 (73%) had MRSA in nares and/or wounds and 25 (33%) had positive skin cultures. For the 52 patients with MRSA in nares and/or wounds and at least 1 observed procedure, environmental shedding of MRSA occurred more frequently during procedures and care activities than in the absence of a procedure (59 of 138, 43% vs 8 of 83, 10%; P < .001). During procedures, increased shedding occurred ≤0.9 m vs >0.9 m from the patient (52 of 138, 38% vs 25 of 138, 18%; P = .0004). Contamination occurred frequently on surfaces touched by personnel (12 of 38, 32%) and on portable equipment used for procedures (25 of 101, 25%). By bivariate analysis, the presence of a wound with MRSA was associated with shedding (17 of 29, 59% vs 6 of 23, 26%; P = .04).
Conclusions:
Environmental shedding of MRSA occurs frequently during medical procedures and patient care activities. There is a need for effective strategies to disinfect surfaces and equipment after procedures.
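The wound-shedding association in the Results can be checked against the reported counts with a standard 2 × 2 test. The sketch below computes a two-sided Fisher's exact test from first principles; choosing Fisher's test is an assumption for illustration, not the authors' stated method.

```python
from math import comb

def fisher_exact_two_sided(a, b, c, d):
    # Two-sided Fisher's exact test for the 2x2 table [[a, b], [c, d]]:
    # sum the hypergeometric probabilities of all tables (with the same
    # margins) that are no more probable than the observed one.
    r1, r2 = a + b, c + d          # row totals
    c1 = a + c                     # first column total
    n = r1 + r2
    def p(x):
        # Probability of the table whose top-left cell is x.
        return comb(r1, x) * comb(r2, c1 - x) / comb(n, c1)
    p_obs = p(a)
    lo, hi = max(0, c1 - r2), min(r1, c1)
    return sum(p(x) for x in range(lo, hi + 1) if p(x) <= p_obs + 1e-12)

# Counts from the Results: shedding occurred for 17 of 29 patients with
# a MRSA-positive wound vs 6 of 23 without one.
p_value = fisher_exact_two_sided(17, 12, 6, 17)
```

The resulting p-value falls below the conventional 0.05 threshold, consistent with the association the abstract reports (P = .04).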
In “Toward a Theory of Race, Crime, and Urban Inequality,” Sampson and Wilson (1995) argued that racial disparities in violent crime are attributable in large part to the persistent structural disadvantages that are disproportionately concentrated in African American communities. They also argued that the ultimate causes of crime were similar for both Whites and Blacks, leading to what has been labeled the thesis of “racial invariance.” In light of the large-scale social changes of the past two decades and the renewed political salience of race and crime in the United States, this paper reassesses and updates evidence evaluating the theory. In so doing, we clarify key concepts from the original thesis, delineate the proper context of validation, and address new challenges. Overall, we find that the accumulated empirical evidence provides broad but qualified support for the theoretical claims. We conclude by charting a dual path forward: an agenda for future research on the linkages between race and crime, and policy recommendations that align with the theory’s emphasis on neighborhood-level structural forces but with causal space for cultural factors.
Background
Antineuronal antibodies are associated with psychosis, although their clinical significance in first episode of psychosis (FEP) is undetermined.
Aims
To examine all patients admitted for treatment of FEP for antineuronal antibodies and describe clinical presentations and treatment outcomes in those who were antibody positive.
Method
Individuals admitted for FEP to six mental health units in Queensland, Australia, were prospectively tested for serum antineuronal antibodies. Antibody-positive patients were referred for neurological and immunological assessment and therapy.
Results
Of 113 consenting participants, six had antineuronal antibodies (anti-N-methyl-D-aspartate receptor antibodies [n = 4], voltage-gated potassium channel antibodies [n = 1] and antibodies against uncharacterised antigen [n = 1]). Five received immunotherapy, which prompted resolution of psychosis in four.
Conclusions
A small subgroup of patients admitted to hospital with FEP have antineuronal antibodies detectable in serum and are responsive to immunotherapy. Early diagnosis and treatment are critical to optimise recovery.
Timing of weed emergence and seed persistence in the soil influence the ability to implement timely and effective control practices. Emergence patterns and seed persistence of kochia populations were monitored in 2010 and 2011 at sites in Kansas, Colorado, Wyoming, Nebraska, and South Dakota. Weekly observations of emergence were initiated in March and continued until no new emergence occurred. Seed was harvested from each site, placed into 100-seed mesh packets, and buried at depths of 0, 2.5, and 10 cm in fall of 2010 and 2011. Packets were exhumed at 6-mo intervals over 2 yr. Viability of exhumed seeds was evaluated. Nonlinear mixed-effects Weibull models were fit to cumulative emergence (%) across growing degree days (GDD) and to viable seed (%) across burial time to describe their fixed and random effects across site-years. Final emergence densities varied among site-years and ranged from as few as 4 to almost 380,000 seedlings m−2. Across 11 site-years in Kansas, cumulative GDD needed for 10% emergence were 168, while across 6 site-years in Wyoming and Nebraska, only 90 GDD were needed; on the calendar, this date shifted from early to late March. The majority (>95%) of kochia seed did not persist for more than 2 yr. Remaining seed viability was generally >80% when seeds were exhumed within 6 mo after burial in March, and declined to <5% by October of the first year after burial. Burial did not appear to increase or decrease seed viability over time but placed seed in a position from which seedling emergence would not be possible. High seedling emergence that occurs very early in the spring emphasizes the need for fall or early-spring preemergence (PRE) weed control measures such as tillage, herbicides, and cover crops, while continued emergence into midsummer emphasizes the need for extended periods of kochia management.
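The kind of Weibull emergence curve described above can be sketched as follows. The observations are invented for illustration, the asymptote is fixed at 100%, and a coarse grid search stands in for the study's nonlinear mixed-effects fitting.

```python
import math

def weibull(gdd, m, lam, k):
    # Cumulative emergence (%) as a Weibull growth curve: m is the
    # asymptote, lam the scale (GDD at ~63% of m), k the shape.
    return m * (1.0 - math.exp(-((gdd / lam) ** k)))

# Hypothetical cumulative-emergence observations (%) against growing
# degree days; illustrative values, not the study's data.
gdd_obs = [50, 100, 150, 200, 300, 400, 600, 800]
emerged = [2, 8, 22, 40, 68, 85, 97, 99]

def sse(lam, k):
    # Sum of squared errors with the asymptote fixed at 100%.
    return sum((weibull(g, 100.0, lam, k) - e) ** 2
               for g, e in zip(gdd_obs, emerged))

# Coarse grid search over scale and shape (a minimal stand-in for
# proper nonlinear least squares).
lam, k = min(((l, s / 10) for l in range(150, 451, 5)
              for s in range(10, 41)),
             key=lambda p: sse(*p))

# GDD required for 10% emergence, solving the fitted curve for gdd.
gdd10 = lam * (-math.log(1 - 0.10)) ** (1 / k)
```

Inverting the fitted curve for the 10%-emergence point is the same calculation behind the abstract's 168-GDD and 90-GDD figures, though the number obtained here reflects only the made-up data.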
A survey to determine the frequency and weed-control impact of enhanced degradation of butylate or EPTC in field soils receiving repeated applications of these herbicides was conducted in one sugarbeet-growing and three corn-growing areas of Nebraska. All seven of the sugarbeet field soils exhibited enhanced EPTC degradation. In the corn areas, none of the 13 north central and southeast field soils displayed accelerated degradation; however, 10 of the 16 south central field soils did. In south central Nebraska, 60% and 45% of the surveyed growers were dissatisfied with weed control from butylate or EPTC in 1983 and 1984, respectively, compared to 24% and none in other survey areas. Enhanced herbicide degradation and the presence of shattercane were the main reasons for the disparity among areas.
Corn (Zea mays L. ‘Pioneer 3732’) showed little to no injury following the postemergence-directed application of sethoxydim {2-[1-(ethoxyimino)butyl]-5-[2-(ethylthio)propyl]-3-hydroxy-2-cyclohexen-1-one} plus crop oil concentrate (COC) at 56 g/ha plus 1.25% (v/v) at nine locations across the Midwestern U.S. in 1984 and 1985. Little corn injury also occurred for the postemergence-directed application of sethoxydim plus COC at 110 g/ha plus 1.25% (v/v) at most locations in both years. Considerable variation in tolerance was seen across locations for over-the-top applications of sethoxydim at all rates tested and for the directed application at 220 g/ha. Although corn at most locations showed no yield reduction with the over-the-top application of sethoxydim plus COC at 56 g/ha plus 1.25% (v/v), a 70% yield reduction occurred in one location in one year. For an over-the-top application of sethoxydim plus COC at 110 g/ha plus 1.25% (v/v), yields ranged from 3 to 95% of the untreated check in 1984, and from 3 to 88% in 1985. Stand reductions from an over-the-top application of sethoxydim plus COC at 220 g/ha plus 1.25% (v/v) ranged from 0 to 99%. A significant negative correlation was found between yield of corn treated over the top with sethoxydim and precipitation on the day of application and in the week following application. Air temperature on the day of application was positively correlated with corn injury from over-the-top and directed sethoxydim applications, but no correlation existed between percent relative humidity and corn injury. Open pan evaporation and solar radiation before and after application were not correlated with corn injury from sethoxydim.
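The negative yield-precipitation correlation reported above is a plain Pearson product-moment calculation, sketched here; the site values are hypothetical, not the study's data.

```python
from statistics import mean

def pearson_r(xs, ys):
    # Pearson product-moment correlation coefficient.
    mx, my = mean(xs), mean(ys)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / (sxx * syy) ** 0.5

# Hypothetical site data: precipitation (mm) on the day of application
# and corn yield as % of the untreated check; illustrative only.
precip = [0, 2, 5, 10, 20]
yield_pct = [95, 80, 60, 30, 5]
r = pearson_r(precip, yield_pct)
```

A strongly negative r, as here, is the pattern the abstract describes: more rain around application was associated with lower yields under over-the-top sethoxydim.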
Effectiveness of rotary hoeing with cultivation and comparison of an in-row cultivator with a standard row-crop cultivator were determined in dry edible bean. The effectiveness of in-row cultivation conducted at various timings and frequencies was examined. The in-row cultivator was more effective in reducing weed populations than the standard cultivator, although at least two mechanical weeding operations were needed to reduce weed populations to levels of the herbicide check (EPTC [S-ethyl dipropyl carbamothioate] plus ethalfluralin). When the in-row cultivation was delayed until the second trifoliolate stage or later, weed populations were greater than those in the herbicide check. In situations with high weed populations, rotary hoeing prior to cultivation was required to reduce weed populations to levels similar to the herbicide check. An in-row cultivator has potential to improve mechanical weed control options in a crop such as dry edible bean. The types of adjustments made in combination with soil textures, soil moisture, and operator experience affect overall weed control. Thus, it is expected that the level of weed control will vary from year to year and even field to field for the same operator.
The effects of the dimethylamine salt of dicamba (3,6-dichloro-2-methoxybenzoic acid) and the dimethylamine salt of 2,4-D [(2,4-dichlorophenoxy)acetic acid] on fieldbeans (Phaseolus vulgaris L. ‘Great Northern Valley’) were studied in order to assess the potential hazards of using these herbicides in areas adjoining fieldbean production. Dicamba and 2,4-D were applied to fieldbeans at three different rates (1.1, 11.2, and 112.5 g ai/ha) and four different growth stages (preemergence, second trifoliolate leaf, early bloom, and early pod). Application of 2,4-D preemergence or in the second trifoliolate leaf stage of growth did not reduce seed yield, delay maturity, or reduce germination of seed obtained from treated plants. Dicamba or 2,4-D applied at 112.5 g/ha to fieldbeans in the early bloom or early pod stages of growth consistently reduced seed yield, delayed maturity, and reduced germination percentage. Fieldbeans exhibited a greater overall sensitivity to dicamba than to 2,4-D.
Research conducted since 1979 in the north central United States and southern Canada demonstrated that after repeated annual applications of the same thiocarbamate herbicide to the same field, control of some difficult-to-control weed species was reduced. Laboratory studies of herbicide degradation in soils from these fields indicated that these performance failures were due to more rapid or “enhanced” biodegradation of the thiocarbamate herbicides after repeated use, with a shorter period during which effective herbicide levels remained in the soils. Weeds such as wild proso millet [Panicum miliaceum L. spp. ruderale (Kitagawa) Tzevelev. # PANMI] and shattercane [Sorghum bicolor (L.) Moench. # SORVU], which germinate over long time periods, were most likely to escape these herbicides after repeated use. Adding dietholate (O,O-diethyl O-phenyl phosphorothioate) to EPTC (S-ethyl dipropyl carbamothioate) reduced problems caused by enhanced EPTC biodegradation in soils treated previously with EPTC alone but not in soils previously treated with EPTC plus dietholate. While previous use of other thiocarbamate herbicides frequently enhanced biodegradation of EPTC or butylate [S-ethyl bis(2-methylpropyl)carbamothioate], previous use of other classes of herbicides or the insecticide carbofuran (2,3-dihydro-2,2-dimethyl-7-benzofuranyl methylcarbamate) did not. Enhanced biodegradation of herbicides other than the thiocarbamates was not observed.
Field experiments, conducted from 1991 to 1994, generated information on weed seedbank emergence for 22 site-years from Ohio to Colorado and Minnesota to Missouri. Early spring seedbank densities were estimated through direct extraction of viable seeds from soil cores. Emerged seedlings were recorded periodically, as were daily values for air and soil temperature, and precipitation. Percentages of weed seedbanks that emerged as seedlings were calculated from seedbank and seedling data for each species, and relationships between seedbank emergence and microclimatic variables were sought. Fifteen species were found in 3 or more site-years. Average emergence percentages (and coefficients of variation) of these species were as follows: giant foxtail, 31.2 (84%); velvetleaf, 28.2 (66); kochia, 25.7 (79); Pennsylvania smartweed, 25.1 (65); common purslane, 15.4 (135); common ragweed, 15.0 (110); green foxtail, 8.5 (72); wild proso millet, 6.6 (104); hairy nightshade, 5.2 (62); common sunflower, 5.0 (26); yellow foxtail, 3.4 (67); pigweed species, 3.3 (103); common lambsquarters, 2.7 (111); wild buckwheat, 2.5 (63); and prostrate knotweed, 0.6 (79). Variation among site-years, for some species, could be attributed to microclimate variables thought to induce secondary dormancy in spring. For example, total seasonal emergence percentage of giant foxtail was related positively to the first date at which average daily soil temperature at 5 to 10 cm soil depth reached 16 C. Thus, if soil warmed before mid-April, secondary dormancy was induced and few seedlings emerged, whereas many seedlings emerged if soil remained cool until June.
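The emergence percentages and coefficients of variation reported above come from a simple calculation on seedbank and seedling counts, sketched here with hypothetical numbers (not the study's data).

```python
from statistics import mean, stdev

# Hypothetical viable seedbank densities (seeds per m^2) and emerged
# seedlings (per m^2) for one species across four site-years.
seedbank = [1200, 800, 1500, 600]
seedlings = [380, 250, 420, 95]

# Emergence percentage per site-year: seedlings as a share of the
# viable seedbank.
emergence_pct = [100.0 * s / b for s, b in zip(seedlings, seedbank)]

avg = mean(emergence_pct)
cv = 100.0 * stdev(emergence_pct) / avg  # coefficient of variation, %
```

The abstract's paired figures such as "giant foxtail, 31.2 (84%)" are exactly this mean-and-CV summary computed across site-years.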
OBJECTIVE
To determine the impact of an environmental disinfection intervention on the incidence of healthcare-associated Clostridium difficile infection (CDI).
DESIGN
A multicenter randomized trial.
SETTING
In total, 16 acute-care hospitals in northeastern Ohio participated in the study.
INTERVENTION
We conducted a 12-month randomized trial to compare standard cleaning to enhanced cleaning that included monitoring of environmental services (EVS) personnel performance with feedback to EVS and infection control staff. We assessed the thoroughness of cleaning based on fluorescent marker removal from high-touch surfaces and the effectiveness of disinfection based on environmental cultures for C. difficile. A linear mixed model was used to compare CDI rates in the intervention and postintervention periods for control and intervention hospitals. The primary outcome was the incidence of healthcare-associated CDI.
RESULTS
Overall, 7 intervention hospitals and 8 control hospitals completed the study. The intervention resulted in significantly increased fluorescent marker removal in CDI and non-CDI rooms and decreased recovery of C. difficile from high-touch surfaces in CDI rooms. However, no reduction was observed in the incidence of healthcare-associated CDI in the intervention hospitals during the intervention and postintervention periods. Moreover, there was no correlation between the percentage of positive cultures after cleaning of CDI or non-CDI rooms and the incidence of healthcare-associated CDI.
CONCLUSIONS
An environmental disinfection intervention improved the thoroughness and effectiveness of cleaning but did not reduce the incidence of healthcare-associated CDI. Thus, interventions that focus only on improving cleaning may not be sufficient to control healthcare-associated CDI.
Field studies were conducted in 2003 and 2004 near Scottsbluff and Sidney, NE, to identify efficacious chemical weed-control options for irrigated and dryland chickpea production. Weed control had a greater relative effect on chickpea yield in the irrigated system than the dryland system, with yield from the hand-weeded check exceeding the nontreated check by 1,500% in the irrigated system and 87% in the dryland system. Imazethapyr, applied preemergence at the rate of 0.053 kg ai/ha, reduced plant height, delayed plant maturity, and caused leaf chlorosis. At Scottsbluff, preplant-incorporated ethalfluralin caused significant crop injury in 2003, but the ethalfluralin treatment also maintained weed densities 4 wk after crop emergence that were not significantly different from the hand-weeded check at both locations in 2003 and 2004. Treatments containing sulfentrazone provided a similar level of weed control but without any evidence of crop injury. Pendimethalin and pendimethalin + dimethenamid-P applied preemergence provided acceptable weed control in the irrigated system, where water was applied within 4 d after herbicide application, but did not provide acceptable control in the dryland system.