Cover crop adoption is increasing among growers as herbicide-resistant weed species become more common. A field study conducted at three sites in Alabama, from autumn 2021 through the 2022 crop harvest, evaluated the combined effect of cover crop residue and herbicides on weed control and cotton lint yield. The experiment used a split-plot design with main plots consisting of six cover crop treatments: cereal rye, crimson clover, oat, radish, a cover crop mixture, and winter fallow. The subplots included four herbicide treatments: (i) preemergence, pendimethalin + fomesafen; (ii) postemergence, dicamba + glyphosate + S-metolachlor; (iii) preemergence followed by postemergence; and (iv) a nontreated (NT) check. All cover crops except radish, when combined with preemergence, postemergence, or preemergence + postemergence herbicides, reduced weed biomass more than winter fallow receiving the same herbicide treatment, relative to the control (winter fallow, NT check). With the preemergence + postemergence treatment, cereal rye, crimson clover, oat, and the cover crop mixture each reduced weed biomass by >95% compared with the control. Averaged across herbicide treatments, cereal rye provided greater weed biomass reduction than radish relative to the control. The preemergence + postemergence herbicide treatment produced greater lint yield than the other herbicide treatments, and cotton following cereal rye yielded more lint than cotton following winter fallow at one of the three locations. In conclusion, integrating herbicides with high-residue cover crops such as cereal rye is an effective strategy for managing troublesome weeds.
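A split-plot experiment of this kind is commonly analysed with a mixed model in which whole plots (block × cover crop) form one error stratum and subplots (herbicide) another. The following is a minimal sketch of such an analysis in Python with statsmodels; the file and column names (weed_biomass.csv, block, cover_crop, herbicide, biomass) are hypothetical placeholders, not part of the study.

```python
# Sketch of a split-plot analysis for a cover crop x herbicide experiment.
# Assumes a hypothetical long-format file 'weed_biomass.csv' with columns:
#   block, cover_crop, herbicide, biomass
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("weed_biomass.csv")

# Blocks get a random intercept; the variance component for cover crop within
# block approximates the classical whole-plot error term.
model = smf.mixedlm(
    "biomass ~ C(cover_crop) * C(herbicide)",
    data=df,
    groups="block",
    re_formula="1",
    vc_formula={"whole_plot": "0 + C(cover_crop)"},
)
result = model.fit(reml=True)
print(result.summary())
```

With this structure, cover crop effects are effectively judged against whole-plot variability rather than the subplot residual, which is the key feature of a split-plot analysis.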
In mid-southern, southeastern, and northeastern U.S. soybean production regions, the evolution of herbicide-resistant weeds has become a significant management challenge for growers. Rising herbicide costs for managing these weeds are also a growing concern, prompting the use of cover crops as part of an integrated weed management strategy. Field experiments were conducted at two locations in Alabama in 2022 to evaluate winter cover crops, including a cover crop mixture, integrated with herbicide programs in soybean. Treatments included five cover crops: oats, cereal rye, crimson clover, radish, and a cover crop mixture. Cover crops were evaluated for their weed-suppressive characteristics relative to a winter fallow treatment. Additionally, four herbicide treatments were applied: a preemergence (PRE) herbicide, a postemergence (POST) herbicide, PRE plus POST herbicides, and a nontreated (NT) check. The PRE herbicide was S-metolachlor; the POST treatment was a mixture of dicamba and glyphosate; and the PRE plus POST system was the PRE application followed by the POST application. Cereal rye and the cover crop mixture provided greater weed biomass reduction than the other cover crop treatments at both locations. Furthermore, soybean yield was greater following the cereal rye cover crop than following winter fallow at one location. The POST and PRE plus POST herbicide treatments resulted in greater weed biomass reduction and higher soybean yield than the PRE herbicide alone or the NT check at both locations.
Assessment of medication management, an instrumental activity of daily living (IADL), is particularly important among Veterans, who are prescribed an average of 2540 prescriptions per year (Nguyen et al., 2017). The Pillbox Test (PT) is a brief, performance-based measure designed as an ecologically valid measure of executive functioning (EF; Zartman, Hilsabeck, Guarnaccia, & Houtz, 2013), the cognitive domain most predictive of successful medication schedule management (Suchy, Ziemnik, Niermeyer, & Brothers, 2020). However, a validation study by Logue, Marceaux, Balldin, and Hilsabeck (2015) found that EF predicted performance on the PT beyond processing speed (PS), but not beyond the language, attention, visuospatial, and memory domains combined. Thus, this project sought to increase the generalizability of those findings by replicating and extending that investigation with a larger set of neuropsychological tests.
Participants and Methods:
Participants included 176 patients in a mixed clinical sample (5.1% female; 43.2% Black/African American, 55.7% White; mean age = 70.7 years, SD = 9.3; mean education = 12.6 years, SD = 2.6) who completed a comprehensive neuropsychological evaluation at a VA medical center. All participants completed the PT, in which they had five minutes to organize five pill bottles into a seven-day pillbox according to the standardized instructions on the labels. Participants also completed some combination of 26 neuropsychological tests (i.e., participants did not complete every test because evaluations were tailored to different referral questions). Correlations between completed tests and the number of pillbox errors were evaluated. The tests were then combined into six domains: language, visuospatial, working memory (WM), psychomotor/PS, memory, and EF. Hierarchical multiple regression was used to predict pillbox errors from these domains.
Results:
Spearman’s correlation coefficients indicated that 25 tests had a weak to moderate relationship with PT total errors (rs = .23–.51); forward digit span was not significantly related (rs = .13). A forced-entry multiple regression was run to predict PT total errors from the six domains. The model accounted for 29% of the variance in PT performance, F(6, 169) = 11.56, p < .001. Of the domains, psychomotor/PS made the greatest contribution, t(169) = 2.73, p = .007, followed by language, t(169) = 2.41, p = .017, and WM, t(169) = 2.15, p = .033. Visuospatial performance and EF did not make significant contributions (ps > .05). Next, two hierarchical multiple regressions were run. Results indicated that EF predicted performance on the PT beyond measures of PS, ΔR² = .02, p = .044, but not beyond the combination of all cognitive domains, ΔR² = .00, p = .863.
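The hierarchical step reported above (EF entered after PS) amounts to comparing two nested regression models and testing the change in R². A minimal sketch with statsmodels, assuming a hypothetical file scores.csv holding domain composite scores (ps, ef, and so on) and the count of pillbox errors:

```python
# Sketch of the hierarchical regression: does EF add variance beyond PS?
# Assumes a hypothetical file 'scores.csv' with domain composites and the outcome.
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

df = pd.read_csv("scores.csv")

step1 = smf.ols("pillbox_errors ~ ps", data=df).fit()
step2 = smf.ols("pillbox_errors ~ ps + ef", data=df).fit()

delta_r2 = step2.rsquared - step1.rsquared
print(f"Delta R^2 for EF beyond PS: {delta_r2:.3f}")

# F-test comparing the nested models (equivalent to testing Delta R^2).
print(anova_lm(step1, step2))
```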
Conclusions:
Results of this study partially replicated the findings of Logue et al. (2015). Namely, EF predicted PT performance beyond PS, but not other cognitive domains. However, when all predictors were entered into the same model, visuospatial performance did not significantly contribute to the prediction of pillbox errors. These results suggest that providers may benefit from investigating medication management abilities when deficits in PS, WM, and/or language are identified. Further research is needed to better understand which domains best predict PT failure.
Anterior temporal lobectomy is a common surgical approach for medication-resistant temporal lobe epilepsy (TLE). Prior studies have shown inconsistent findings regarding the utility of presurgical intracarotid sodium amobarbital testing (IAT; also known as the Wada test) and neuroimaging in predicting postoperative seizure control. In the present study, we evaluated the predictive utility of IAT, structural magnetic resonance imaging (MRI), and positron emission tomography (PET) for long-term (3-year) seizure outcome following surgery for TLE.
Participants and Methods:
Patients were 107 adults (mean age = 38.6 years, SD = 12.2; mean education = 13.3 years, SD = 2.0; 47.7% female; 100% White) with TLE (mean epilepsy duration = 23.0 years, SD = 15.7; left TLE surgery = 50.5%). We examined whether demographic, clinical (side of resection, resection type [selective vs. non-selective], hemisphere of language dominance, epilepsy duration), and presurgical variables (normal vs. abnormal MRI, normal vs. abnormal PET, correctly vs. incorrectly lateralizing IAT) were associated with absolute (cross-sectional) seizure outcome (i.e., freedom vs. recurrence), using a series of chi-squared and t-tests. Additionally, we determined whether the presurgical evaluations predicted time to seizure recurrence (longitudinal outcome) over a three-year period with univariate Cox regression models, and we compared survival curves with Mantel-Cox (log-rank) tests.
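The longitudinal analyses described here, univariate Cox models and Mantel-Cox (log-rank) comparisons, could be sketched as follows with the lifelines package; the file and column names (tle_outcomes.csv, months_to_recurrence, recurred, pet_normal) are hypothetical stand-ins for the study variables.

```python
# Sketch of the time-to-recurrence analyses (univariate Cox model + log-rank test).
# Assumes a hypothetical file 'tle_outcomes.csv' with columns:
#   months_to_recurrence, recurred (0/1), pet_normal (0/1)
import pandas as pd
from lifelines import CoxPHFitter
from lifelines.statistics import logrank_test

df = pd.read_csv("tle_outcomes.csv")

# Univariate Cox regression for PET status.
cph = CoxPHFitter()
cph.fit(df[["months_to_recurrence", "recurred", "pet_normal"]],
        duration_col="months_to_recurrence", event_col="recurred")
cph.print_summary()

# Mantel-Cox (log-rank) comparison of the two PET groups.
normal = df[df["pet_normal"] == 1]
abnormal = df[df["pet_normal"] == 0]
result = logrank_test(normal["months_to_recurrence"], abnormal["months_to_recurrence"],
                      event_observed_A=normal["recurred"],
                      event_observed_B=abnormal["recurred"])
print(f"log-rank p = {result.p_value:.3f}")
```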
Results:
Demographic and clinical variables (including type [selective vs. whole lobectomy] and side of resection) were not associated with seizure outcome. No associations were found among the presurgical variables. Presurgical MRI was not associated with cross-sectional (OR = 1.5, p = .557, 95% CI = 0.4–5.7) or longitudinal (HR = 1.2, p = .641, 95% CI = 0.4–3.9) seizure outcome. A normal PET scan (OR = 4.8, p = .045, 95% CI = 1.0–24.3) and an IAT that incorrectly lateralized the seizure focus (OR = 3.9, p = .018, 95% CI = 1.2–12.9) were associated with higher odds of seizure recurrence. Furthermore, a normal PET scan (HR = 3.6, p = .028, 95% CI = 1.0–13.5) and incorrectly lateralizing IAT (HR = 2.8, p = .012, 95% CI = 1.2–7.0) predicted earlier seizure recurrence within three years of TLE surgery. Log-rank tests indicated that survival functions differed significantly between patients with normal vs. abnormal PET and incorrectly vs. correctly lateralizing IAT, with seizure relapse occurring five and seven months earlier on average, respectively.
Conclusions:
Presurgical normal PET scan and incorrectly lateralizing IAT were associated with increased risk of post-surgical seizure recurrence and shorter time-to-seizure relapse.
A variety of dimensions of psychopathology are observed in psychosis. However, the validation of clinical assessment scales, and their latent variable structure, is often derived from cross-sectional rather than longitudinal data, limiting our understanding of how variables interact and reinforce one another.
Objectives
Using experience sampling methodology (ESM) and analytic approaches optimised for longitudinal data, we assess potential latent variables of commonly-reported symptoms in psychosis, and explore the temporal relationship between them.
Methods
N=36 participants with a diagnosis of schizophrenia or schizoaffective disorder provided data for up to one year, as part of the Sleepsight study. Using a smartphone app, participants self-reported clinical symptoms once daily for a mean duration of 323 days (SD: 88), with a response rate of 69%. Symptoms were rated using seven-point Likert scale items. Items included symptoms traditionally implicated in psychosis (feeling “cheerful”, “anxious”, “relaxed”, “irritable”, “sad”, “in control”, “stressed”, “suspicious”, “trouble concentrating”, “preoccupied by thoughts”, “others dislike me”, “confused”, “others influence my thoughts” and “unusual sights and sounds”). We used a sparse PCA (SPCA) model to identify latent variables in the longitudinal data. SPCA has previously been applied to longitudinal ESM data, and was developed to achieve a compromise between the explained variance and the interpretability of the principal components. We then used a multistage exploratory and confirmatory differential time-varying effect model (DTVEM) to explore the temporal relationship between the latent variables. DTVEM generates a standardised β coefficient reflecting the strength of relationship between variables across multiple time lags. Only significant lags (p<0.05) are reported here.
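Sparse PCA of this kind can be approximated with scikit-learn's SparsePCA, which adds an L1 penalty so that each component loads on only a subset of items, trading explained variance for interpretability. A minimal sketch, assuming a hypothetical day-by-item matrix esm_ratings.csv with one row per day and one column per Likert item (the DTVEM lag analysis is not shown):

```python
# Sketch of the sparse PCA step on the daily ESM ratings.
# Assumes a hypothetical file 'esm_ratings.csv': one row per day,
# one column per Likert item (e.g. cheerful, anxious, suspicious, ...).
import pandas as pd
from sklearn.decomposition import SparsePCA
from sklearn.preprocessing import StandardScaler

ratings = pd.read_csv("esm_ratings.csv")
X = StandardScaler().fit_transform(ratings.values)

# Five components, with an L1 penalty that drives many loadings to zero.
spca = SparsePCA(n_components=5, alpha=1.0, random_state=0)
scores = spca.fit_transform(X)

# Inspect which items load on each component.
loadings = pd.DataFrame(spca.components_, columns=ratings.columns)
print(loadings.round(2))
```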
Results
The SPCA analysis identified five latent variables, explaining 61.4% of the total variance. Tentative interpretation of the SPCA loadings suggested these latent variables corresponded to i) cognitive symptoms, ii) feeling in-control, iii) thought interference and perceptual disturbance, iv) irritability and stress and v) paranoia. Time lag analysis revealed an effect of feeling in-control on subsequent cognitive symptoms (β=-0.19), and of cognitive symptoms on subsequent thought interference and perceptual disturbance (β=0.14). Irritability and stress was also associated with subsequent cognitive symptoms (β=0.09).
Conclusions
Using longitudinal data, we employ novel methodology to identify potential latent symptoms among commonly reported symptoms in psychosis. We identify five latent symptoms, and elucidate important temporal relationships between them. These findings may inform our understanding of the psychopathology of psychosis, potentially offering data-driven simplification of clinical assessment and novel insights for future research.
A complex system is composed of many elements that interact with each other and with their environment. The term emergence describes how the large-scale features of such a system arise from interactions among its components; these system-level features are called emergent phenomena. This chapter reviews the multidisciplinary study of complex systems in physics, biology, and the social sciences, covering three topics: first, research on how people learn to think about complex systems; second, how learning environments themselves can be analyzed as complex systems; and finally, how the analytic methods of complexity science, such as computer modeling, can be applied to the learning sciences. The chapter closes by summarizing challenges and future opportunities for helping students learn about complex systems and for research in the learning sciences that treats educational systems as complex phenomena.
Objective:
To determine the usefulness of adjusting antibiotic use (AU) by the prevalence of bacterial isolates as an alternative method of risk adjustment beyond hospital characteristics.
Methods:
AU in days of therapy per 1,000 patient days and microbiologic data from 2015 and 2016 were collected from 26 hospitals. The prevalences of Pseudomonas aeruginosa, extended-spectrum β-lactamase (ESBL)–producing bacteria, methicillin-resistant Staphylococcus aureus (MRSA), and vancomycin-resistant enterococci (VRE) were calculated and compared with the average prevalence across all hospitals in the network. This ratio was used to calculate the adjusted AU (a-AU) for various categories of antimicrobials. For example, the a-AU of antipseudomonal β-lactams (APBL) was the AU of APBL divided by (the prevalence of P. aeruginosa at that hospital divided by the network-average prevalence of P. aeruginosa). Hospitals were categorized by bed size and ranked by AU and a-AU, and the rankings were compared.
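In other words, each hospital's use is divided by its organism burden relative to the network average. A minimal sketch of the APBL example with hypothetical values:

```python
# Sketch of the adjusted antibiotic use (a-AU) calculation for one drug class.
# a-AU = AU / (hospital prevalence / network-average prevalence)
# Example values for antipseudomonal beta-lactams (APBL) are hypothetical.

hospital_au_apbl = 120.0          # days of therapy per 1,000 patient days
hospital_prevalence_pa = 0.08     # P. aeruginosa prevalence at this hospital
network_avg_prevalence_pa = 0.05  # average prevalence across the network

relative_burden = hospital_prevalence_pa / network_avg_prevalence_pa
adjusted_au = hospital_au_apbl / relative_burden

print(f"Relative P. aeruginosa burden: {relative_burden:.2f}")
print(f"a-AU of APBL: {adjusted_au:.1f} DOT per 1,000 patient days")
```

A hospital with above-average P. aeruginosa prevalence therefore has its APBL use adjusted downward, and vice versa.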
Results:
Most hospitals in 2015 and 2016, respectively, moved ≥2 positions in the ranking using a-AU of APBL (15 of 24, 63%; 22 of 26, 85%), carbapenems (14 of 23, 61%; 22 of 25, 88%), anti-MRSA agents (13 of 23, 57%; 18 of 26, 69%), and anti-VRE agents (18 of 24, 75%; 15 of 26, 58%). Use of a-AU resulted in a shift in quartile of hospital ranking for 50% of APBL agents, 57% of carbapenems, 35% of anti-MRSA agents, and 75% of anti-VRE agents in 2015 and 50% of APBL agents, 28% of carbapenems, 50% of anti-MRSA agents, and 58% of anti-VRE agents in 2016.
Conclusions:
The a-AU considerably changes how hospitals compare with one another within a network. Adjusting AU by microbiological burden allows a more balanced comparison among hospitals with differing baseline rates of resistant bacteria.
Adherence to practice guidelines for diagnosing and treating attention-deficit/hyperactivity disorder (ADHD) by primary care providers (PCPs) is important for optimizing care for many children and youth. However, adherence is often low. To address this problem, we implemented an intensive intervention in 2009 aimed at improving diagnosis and management of ADHD among PCPs.
Objectives
The study objective is to assess the sustainability of intervention-attributable outcomes.
Aims
The study aims are to assess the sustained effect of the intervention on PCPs' intentions to implement, attitudes toward, and perceived obstacles to implementing ADHD practice guidelines.
Methods
During November 2009, 48 PCPs from 31 clinical practices completed a 3-day training, 6 months of biweekly telephone peer group reinforcement, and baseline questionnaires; follow-up questionnaires were completed at 12 months. To assess sustainability, we tracked PCPs and administered the questionnaire in 2016.
Results
Intentions to implement ADHD guidelines remained stable over seven years, with all mean values ranging from “probably will” to “definitely will” implement guidelines.
Conclusions
Generally, favorable self-reported intentions (see Exhibits 1 & 2), attitudes toward, and obstacles to implementing ADHD guidelines were sustained seven years after the intensive training and follow-up intervention.
Disclosure of interest
The authors have not supplied their declaration of competing interest.
A principal mode of corrosion in combustion or fuel cell environments is the formation of volatile hydroxides and oxyhydroxides from metal or oxide surfaces at high temperatures. It is important to determine the degree of volatility and accurate thermodynamic properties for these hydroxides. Significant gaseous metal hydroxides/oxyhydroxides are discussed, along with available experimental and theoretical methods of characterizing species and determining their thermodynamic properties.
Virtual Engineering (VE), also known as Model-Based Systems Engineering (MBSE), is needed both for current operational engineering qualification and to help reduce the costs of future vertical lift design and analysis. As computational power gives the rotorcraft engineering community ever greater capability to perform simulations both in real time and offline, it is imperative that the community develop verification and validation protocols and processes to certify these methods so that they can be used reliably to reduce engineering cost and schedule. Computational Fluid Dynamics (CFD) has become a major Computational Science and Engineering (CSE) tool in the fixed-wing and vertical lift communities, but it has not yet been developed to the point where it is accepted as a replacement for testing in the certification of new or existing systems or vehicles. Since the rise of modern CFD in the 1980s, the promise of CFD’s capabilities has been met or exceeded, but its role in certification arguably remains less prominent than projected. The ability to implement transformative technologies further drives the need for CFD in design. For CFD to fulfill its role in certification, several goals must be met so that it provides a true “numerical experiment” from which accuracies (error estimates), sensitivities, and consistent application results can be extracted. This paper discusses progress and directions toward developing CFD strategies for certification.
An adverse early life environment can increase the risk of metabolic and other disorders later in life, and genetic variation can modify an individual’s susceptibility to these environmental challenges. These gene-by-environment interactions are important but difficult to dissect. The nucleus is the primary organelle where environmental responses act directly on the genetic variants within the genome, resulting in changes to the biology of the genome and ultimately the phenotype. Understanding genome biology requires the integration of the linear DNA sequence, epigenetic modifications, and the nuclear proteins present within the nucleus. The interactions between these layers of information may be captured in the emergent spatial organization of the genome. As such, genome organization represents a key research area for decoding the role of genetic variation in the Developmental Origins of Health and Disease.
Introduction:
Our tertiary care institution embarked on the Choosing Wisely campaign to reduce unnecessary testing and selected the reduction of ankle x-rays as one of its top five priority initiatives. The Low Risk Ankle Rule (LRAR), an evidence-based decision rule, has been derived and validated to identify ankle injuries that do not require radiography. The LRAR is cost-effective, has 100% sensitivity for clinically important ankle injuries, and reduces ankle imaging rates by 30-60% in both academic and community settings. Our objective was to significantly reduce the proportion of ankle x-rays ordered for acute ankle injuries presenting to our pediatric Emergency Department (ED).
Methods:
Medical records were reviewed for all patients aged 3-18 years presenting to our tertiary care pediatric ED with an isolated acute ankle injury from Jan 1, 2016 to Sept 30, 2016. Children with outside imaging, an injury that occurred >72 hours prior, or a repeat ED visit for the same injury were excluded. Quality improvement (QI) interventions included multidisciplinary staff education about the LRAR, posters placed within the ED highlighting the LRAR, development of a new diagnostic imaging requisition for ankle x-rays requiring use of the LRAR, and collaboration with the Division of Radiology to ensure compliance with the new requisition. The proportion of patients presenting to the ED with acute ankle injuries who received x-rays was measured; ED length of stay (LOS), return visits to the ED, and orthopedic referrals were collected as balancing measures.
Results:
At baseline, 88% of patients with acute ankle injuries received x-rays. Following our interventions, the proportion receiving x-rays decreased significantly to 54% (p<0.001). This decrease was not associated with an increase in ED LOS, ED return visits, or orthopedic referrals. Uptake of the dedicated x-ray requisition increased over time to 71%.
Conclusion:
This QI initiative to increase uptake of the LRAR resulted in a significant reduction in ankle x-ray rates for children presenting with acute ankle injuries to our pediatric ED, without increasing LOS, return visits, or the need for orthopedic referrals for missed injuries. As in the derivation and validation studies, the reduction has been sustained, decreasing unnecessary testing and ionizing radiation exposure.
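The headline comparison (88% of injuries imaged at baseline versus 54% after the interventions) is a two-proportion comparison. A minimal sketch with statsmodels; the patient counts below are hypothetical, since the abstract reports only the proportions:

```python
# Sketch of the baseline vs. post-intervention x-ray rate comparison.
# Counts are hypothetical; the abstract reports only the proportions (88% -> 54%).
from statsmodels.stats.proportion import proportions_ztest

xrayed = [176, 135]   # patients imaged: baseline, post-intervention (hypothetical)
totals = [200, 250]   # patients with acute ankle injuries in each period (hypothetical)

stat, p_value = proportions_ztest(count=xrayed, nobs=totals)
print(f"z = {stat:.2f}, p = {p_value:.4f}")
```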
Background. For irrational fears and their associated phobias, epidemiological studies suggest sex differences in prevalence and twin studies report significant genetic effects. How does sex impact on the familial transmission of liability to fears and phobias?
Methods. In personal interviews with over 3000 complete twin pairs (of which 1058 were opposite-sex dizygotic pairs), ascertained from a population-based registry, we assessed the lifetime prevalence of five phobias and their associated irrational fears, analysed using a multiple threshold model. Twin resemblance was assessed by polychoric correlations and biometrical model-fitting incorporating sex-specific effects.
Results. For agoraphobia, situational, and blood/injury fear/phobia, the best-fit model suggested equal heritability in males and females and genetic correlations between the sexes of less than +0·50. For animal fear/phobias, by contrast, the best-fit model suggested equal heritability in males and females and a genetic correlation of unity. No evidence was found for an impact of family environment on liability to these fears or phobias. For social phobias, twin resemblance in males was explained by genetic factors and in females by familial–environmental factors.
Conclusion. The impact of sex on genetic risk may differ meaningfully across phobia subtypes. Sex-specific genetic risk factors may exist for agoraphobia, social, situational and blood-injury phobias but not for animal fear/phobia. These results should be interpreted in the context of the limited power of twin studies, even with large sample sizes, to resolve sex-specific genetic effects.
In a recent article, I discussed vocative uses of οὗτος in the works of Aeschylus, Sophocles, Euripides and Aristophanes, showing that there are two types of vocatives: ‘calls’, which are utterance-initial and directed at one whose attention is turned elsewhere, and ‘addresses’, which are non-initial, employed by a speaker who is already conversing with a hearer, and typically indicate a speaker's annoyance at the hearer. Menander uses οὗτος as a vocative in the same ways as the other dramatic poets, but there is one instance in Dyscolus that has been routinely misconstrued and merits clarification.
Data on soils from six Neoglacial moraines of the Klutlan Glacier have been compared with those from moraines at the warm, moist coastal site of Glacier Bay, 160 km to the south. Percentage organic matter increases rapidly for the first 100 to 150 yr of soil development and then continues to rise gradually for the next 100 yr. Soil pH falls from 8.0 in recent till to approximately 6.0 in 200-yr-old soils. Nitrogen levels in the mineral soil increase from near zero in recent tills to 0.7% in soils 175–200 yr old; organic horizons of soils associated with spruce forests in later successional stages contain approximately 1% nitrogen. Concentrations of certain inorganic phosphate ions in the different-aged soils increase continually throughout the succession. Data for nine chemical variables were subjected to a principal components analysis; the major pattern in the data reflects the differences between the soils of low organic content and high pH present in early successional stages and the nutrient-rich soils with high organic content and low pH present after succession has progressed toward spruce forest. These trends in soil development with time are strikingly similar to those reported from Glacier Bay, except that the changes in soil properties appear to be delayed by 50–100 yr at the Klutlan terminus. Although numerous signs of nitrogen deficiency have been identified in plants growing on new soils at Glacier Bay, none was observed on the Klutlan moraines, either visually in living plants or in nutrient concentrations measured in foliage samples of three plant taxa (Epilobium latifolium, Salix spp., and Populus balsamifera). Concentrations of nitrogen and other nutrients (Ca, Mg, K, total P) in the foliage samples show no clear trends with increasing soil development. Low temperatures, a short growing season, and very low mean annual precipitation probably limit plant growth and account for the delayed soil development on the Klutlan moraines.
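The ordination step described here, a principal components analysis of the nine soil chemistry variables, could be sketched as follows; the file and column names (moraine_soils.csv, organic_matter, ph, and so on) are hypothetical placeholders for the measured variables.

```python
# Sketch of the PCA on the nine soil chemistry variables.
# Assumes a hypothetical file 'moraine_soils.csv' with one row per sample and
# columns such as organic_matter, ph, nitrogen, phosphate, ... (nine in total).
import pandas as pd
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

soils = pd.read_csv("moraine_soils.csv")
X = StandardScaler().fit_transform(soils.values)

pca = PCA(n_components=2)
scores = pca.fit_transform(X)

print("Variance explained:", pca.explained_variance_ratio_.round(2))
# Loadings on the first component indicate which variables drive the
# main early- vs. late-successional contrast.
print(pd.Series(pca.components_[0], index=soils.columns).round(2))
```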
In this paper, we present novel algorithms for finding small relations and ideal factorizations in the ideal class group of an order in an imaginary quadratic field, where both the norms of the prime ideals and the size of the coefficients involved are bounded. We show how our methods can be used to improve the computation of large-degree isogenies and endomorphism rings of elliptic curves defined over finite fields. For these problems, we obtain improved heuristic complexity results in almost all cases and significantly improved performance in practice. The speed-up is especially high in situations where the ideal class group can be computed in advance.
Trypanosomes and Leishmania are vector-borne parasites associated with high morbidity and mortality. Trypanosoma lewisi, putatively introduced with black rats and fleas, has been implicated in the extinction of two native rodents on Christmas Island (CI), and native trypanosomes are hypothesized to have caused declines in Australian marsupial populations on the mainland. This study investigated the distribution and prevalence of Trypanosoma spp. and Leishmania spp. in two introduced pests (cats and black rats) at three Australian locations. Molecular screening (PCR) of spleen tissue was performed on cats from CI (n = 35), Dirk Hartog Island (DHI; n = 23), and southwest Western Australia (swWA; n = 58), and on black rats from CI only (n = 46). Despite the continued presence of the intermediate and mechanical hosts of T. lewisi, there was no evidence of trypanosome or Leishmania infection in cats or rats from CI. Trypanosomes were not identified in cats from DHI or swWA. These findings suggest that T. lewisi is no longer present on CI and that endemic Trypanosoma spp. do not infect cats or rats in these locations.
Organized as a series of authoritative discussions, this book presents the application of Jewish law - or Halakhah - to contemporary social and political issues. Beginning with the principle of divine revelation, it describes the contents and canons of interpretation of Jewish law. Though divinely received, the law must still be interpreted and 'completed' by human minds, often leading to the conundrum of divergent but equally authentic interpretations. Examining topics from divorce to war and from rabbinic confidentiality to cloning, this book carefully delineates the issues presented in each case, showing the various positions taken by rabbinic scholars, clarifying areas of divergence, and analyzing reasons for disagreement. Written by widely recognized scholars of both Jewish and secular law, this book will be an invaluable source for all who seek authoritative guidance in understanding traditional Jewish law and practice.