Three kinds of opal-cristobalite, differentiated by the sharpness of the 4·1 Å XRD peak, were isolated from the Helms (Texas) bentonite by selective chemical dissolution followed by specific gravity separation. The δ18O value (oxygen isotope abundance) for these cristobalite isolates ranged from approximately 26–30‰ (parts per thousand), increasing with the breadth of the 4·1 Å XRD peak. Opal-cristobalite isolated from the Monterey diatomite had a δ18O value of 34‰. These δ18O values are in the range for Cretaceous cherts (approximately 32‰) and are unlike the values of 9–11‰ obtained for low-cristobalite (XRD peaks at 4·05, 3·13, 2·4, and 2·49 Å) formed hydrothermally or isolated from the vesicles of obsidian. The morphology pseudomorphic after diatoms, observed with the scanning electron microscope, was more apparent in the opal-cristobalite from the Monterey diatomite of Miocene age (approximately 10 million yr old) than in the spongy-textured opal-cristobalite from the Helms bentonite, reflecting the 40 million yr available for crystallization since the Upper Eocene.
The oxygen isotope abundance of Helms montmorillonite (δ18O = 26‰) indicates that it was formed in sea water, while the δ18O values of the associated opal-cristobalite indicate that this SiO2 polymorph probably formed at approximately 25°C in meteoric water. Although both the cristobalite and the montmorillonite in the bentonite were authigenic, the crystallization of the SiO2 phase apparently required a considerably longer period and occurred mainly after tectonic uplift.
In contrast to the results for cristobalite, quartz from the Helms and Upton (Wyoming) bentonites had δ18O values of 15 and 21‰ respectively. Such intermediate values, similar to those of aerosolic dusts of the Northern Hemisphere, loess, and many fluvial sediments and shales of the North Central United States (U.S.A.), preclude either a completely authigenic or a completely igneous origin for the quartz. These values probably result from a mixing of quartz from high and low temperature sources, detritally added to the ash or bentonite bed.
Jesus' response to the Syrophoenician woman in Mark 7:27 is sometimes seen as sexist, racist, or abusive. The force of his response depends in part on the diminutive form κυναριον, which is often dismissed as a faded diminutive that lacks true force. But a statistical, semantic, and contextual analysis of the word indicates that it does, in fact, have diminutive force in Mark 7:27. Because of this, the pejorative force found in direct insults employing the word ‘dog’ is lacking in Jesus' response. In addition to failing to recognise the diminutive force of κυναριον, interpreters sometimes assume a social context in which Jews routinely referred to Gentiles as dogs. Finally, the analogy that Jesus makes is often read allegorically, assuming that ‘children’ and ‘dogs’ have direct counterparts in ‘Jews’ and ‘Gentiles’. These assumptions are found to be dubious. The point of Jesus' analogy is about the proper order of events: children eat before the puppies; Jews receive the benefits of his ministry before Gentiles. The Syrophoenician woman outwits Jesus by arguing that the puppies may eat simultaneously with the children. The interpretive upshot is that Jesus' saying is unlikely to be misogynistic or abusive, but simply asserts Jewish priority, a priority that admits of exceptions and change.
From 2 to 28% opal-cristobalite was isolated from the 2–20 µm fraction of rhyolitic and andesitic tuffaceous pyroclastics from the Island of Honshu, Japan, where it had been formed in hydrothermal springs at temperatures of ∼25–170°C as calculated from the oxygen isotopic ratios (18O/16O). Three of the isolates gave X-ray powder diffractograms with strong peaks at 4.07 Å. Two of these also had very weak peaks at 4.32 Å indicative of the presence of traces of tridymite. The fourth isolate had a strong 4.11 Å cristobalite peak and a very weak 4.32 Å peak. The morphology, determined by the scanning electron microscope, varied with the formation temperature indicated by the oxygen isotopic ratio (δ18O), from spheroidal and spongy for the opal-cristobalite formed at ∼25°C (δ18O = 26.0‰) to angular, irregular plates and prisms for that formed at ∼115°C (11.9‰), ∼135°C (7.9‰) and ∼170°C (6.8‰). The differences in δ18O values are attributed to variation in hydrothermal temperature, but some variability in the oxygen isotopic composition of the water is possible. The field-measured temperatures agreed roughly with the calculated fractionation temperatures except at one site, while the contrast in cristobalite morphology corresponded well to the calculated low and high fractionation temperatures. Low-cristobalite of hydrothermal origin in New Zealand (δ18O = 9‰) had characteristic rounded grains with some evidence of platiness. Co-existing quartz grains (δ18O = 10‰) showed more subhedral and irregular prismatic morphology.
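The formation temperatures quoted above come from oxygen-isotope thermometry: the silica-water fractionation (approximately δ18O of the mineral minus δ18O of the water) decreases with temperature, so a measured δ18O can be inverted for T. A minimal sketch, using one published amorphous-silica-water calibration of the form 1000 ln α = a/T² + b with a = 3.52 × 10⁶ and b = −4.35, and an assumed meteoric-water δ18O of −9‰; both the coefficients and the water value are illustrative assumptions, not figures from this abstract:

```python
import math

def silica_water_temperature_c(delta_mineral, delta_water, a=3.52e6, b=4.35):
    """Invert a silica-water fractionation equation of the form
    1000 ln(alpha) ~ delta_mineral - delta_water = a / T**2 - b  (T in kelvin).
    The default coefficients are one published amorphous-silica-water
    calibration, used here purely for illustration."""
    frac = delta_mineral - delta_water  # approximates 1000 ln(alpha), in per mil
    t_kelvin = math.sqrt(a / (frac + b))
    return t_kelvin - 273.15

# Assumed water delta-18O of -9 per mil (an assumption, not from the abstract):
t = silica_water_temperature_c(26.0, -9.0)
print(round(t, 1))  # close to the ~25 C quoted for the spongy isolate
```

Under these assumptions the δ18O = 26.0‰ isolate maps to roughly 26 °C; hotter spring waters would have different δ18O values, so the same inversion for the high-temperature isolates requires site-specific water compositions.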
To assess the burden of respiratory virus coinfections with severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2), this study reviewed 4,818 SARS-CoV-2-positive specimens that also underwent respiratory virus multiplex testing. Coinfections with SARS-CoV-2 were uncommon (2.8%), with enterovirus or rhinovirus as the most prevalent target (88.1%). Respiratory virus coinfection with SARS-CoV-2 remains low 1 year into the coronavirus disease 2019 (COVID-19) pandemic.
Several patristic authors witness to a tradition of ancient instruction frequently identified as “Teaching of Apostles” (didache apostolōn). While the precise nature of that corpus remained hidden from scholars for centuries, its contents were clearly considered important in various regions of the early Christian Mediterranean world. This renown is demonstrated, for example, by the fourth-century Alexandrian bishop Athanasius, who observed in his annual letter declaring the time for Easter observance in 367 CE that “Teaching of the Apostles” (didache tōn apostolōn) was accepted in his diocese, though he omitted it from any “canon” of works thought worthy for liturgy. The features of the tradition remained vague in patristic sources generally, however, clarified only in part during the ninth century when Patriarch Nicephorus of Constantinople described the length of the text in his Stichometry as 200 “lines” (stichoi). Beyond this, little was known of the tradition prior to 1873, when Metropolitan Philotheos Bryennios of Nicomedia came upon a version of the text within Codex Hierosolymitanus 54 (= H), stored at the Jerusalem Monastery of the Holy Sepulcher in Constantinople (modern Istanbul). That codex, dated by inscription to June 11, 1056, contains a tractate bearing two distinct titles: a brief heading, “Teaching of the Twelve Apostles” (didache tōn dōdeka apostolōn), and a longer one at the beginning of the opening line, “Teaching of the Lord through the Twelve Apostles to the Nations” (didache kuriou dia tōn dōdeka apostolōn tois ethnesin). While neither header was necessarily original to the tradition – the shorter title having possibly served as an incipit based on the longer form – such markers indicate that this manuscript represents some form of the ancient Christian tradition of teaching now known by early church historians as the “Didache” (didache).
Introduction: Atrial fibrillation (AF) is the most common arrhythmia seen in patients presenting to the emergency department (ED). AF increases the risk of ischemic stroke, which can be mitigated by anticoagulant prescription. National guidelines advise that emergency physicians initiate anticoagulation when AF is first diagnosed. We aimed to evaluate the 90-day incidence of stroke and major bleeding among emergency patients discharged home with a new diagnosis of AF. Methods: This was a health records review of patients diagnosed with AF in two EDs. We included patients aged ≥18 years with a new diagnosis of AF who were discharged from the ED between 1st May 2014 and 1st May 2017. Using a structured review we collected data on CHADS65 and CHADS2 scores, contraindications to direct oral anticoagulant (DOAC) prescription and initiation of anticoagulation in the ED. Patient charts were reviewed for the diagnosis of stroke, transient ischemic attack (TIA), ischemic gut, ischemic limb or other systemic embolism within 90 days of the index ED presentation. We extracted data on major bleeding events within 90 days, defined by the International Society on Thrombosis and Haemostasis criteria. All data were extracted in duplicate for validation. Results: We identified 399 patients fulfilling the inclusion criteria, median age 68 (IQR 57-79), 213 (53%) male. 11 patients were already prescribed an anticoagulant for another indication and 19 had a contraindication to prescription of a DOAC. 48/299 (16%) CHADS65-positive patients were initiated on an anticoagulant, 3 of whom had a contraindication to initiation of anticoagulation in the ED (1 dual antiplatelet therapy, 2 liver cirrhosis). 1/100 CHADS65-negative patients was initiated on anticoagulation. The median CHADS2 score was 1 (IQR 0-2). Among the 49 patients initiated on anticoagulation, 3 patients had a stroke/TIA within 90 days, 6.1% (95% CI 2.1-16.5%). There were no bleeding events, 0.0% (95% CI 0.0-7.3%).
Among the 350 patients who were not initiated on anticoagulation in the ED, 4 patients had a stroke/TIA within 90 days, 1.1% (95% CI 1.1-2.9%), and 2 patients had a major bleeding event. Conclusion: Prescription of anticoagulation for new diagnoses of AF was under-utilized in these EDs. The 90-day stroke/TIA rate was high, even among those given an anticoagulant prescription in the ED. No patient had an anticoagulant-associated bleeding event.
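The interval reported for the anticoagulated group (3 strokes among 49 patients, 6.1%, 95% CI 2.1-16.5%) matches a Wilson score interval for a binomial proportion; the use of Wilson here is my inference from the numbers, not something the abstract states. A minimal sketch:

```python
import math

def wilson_ci(successes, n, z=1.96):
    """Wilson score 95% interval for a binomial proportion."""
    p = successes / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return centre - half, centre + half

lo, hi = wilson_ci(3, 49)
print(f"{lo:.1%} to {hi:.1%}")  # 2.1% to 16.5%, matching the quoted interval
```

The same formula also reproduces the bleeding-event interval for 0/49 (upper bound 7.3%), where a naive Wald interval would collapse to zero width.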
Introduction: Participant interviews are often considered the ‘gold standard’ for measuring outcomes in diagnostic and prognostic studies. Participant exposure data are frequently collected during study interviews, but the reliability of this information often remains unknown. The objective of this study was to compare patient-reported medication exposures and outcomes to data extracted from electronic medical records (EMRs) to determine reliability. Methods: This was a secondary data analysis from a prospective observational cohort study enrolling older (≥ 65 years) patients who presented to one of three emergency departments after a fall. After patients had consented to participate in the study, they were asked about their use of antiplatelet and anticoagulation medications (exposures of interest). During follow up, participants were asked if a physician had told them they had bleeding in their head (diagnosis of intracranial hemorrhage). Patient-reported responses were compared to data extracted from a structured EMR review. Trained research assistants extracted medication exposure and outcome data from the hospital EMRs in duplicate for all visits to any hospital within 42 days. Inter-rater agreement was estimated using Cohen's kappa (K) statistics with 95% confidence intervals (CIs). Results: 1275 patients completed study interviews. 1163 (91%) responded to questioning about antiplatelet use and 1159 (91%) to anticoagulant use. Exact agreement between patient reported antiplatelet use compared to EMR review was 77%, with K = 0.50 (95% CI: 0.44 to 0.55). For anticoagulation use, exact agreement was 87%, with K = 0.68 (95% CI: 0.63 to 0.72). 986 (78%) patients had a follow up interview after 42 days. Exact agreement between patient reported intracranial bleeding and EMR review was 95%, with K = 0.30 (95% CI: 0.15 to 0.45). 
Using the EMR review as the reference standard, the sensitivity and specificity of patient reported intracranial bleeding was 34% (95% CI: 20 to 52%) and 97% (95% CI: 96 to 98%), respectively. Conclusion: In this population of older adults who presented to the ED after a fall, patient reported use of antiplatelet and anticoagulant medications was not a reliable method to identify medication use. Patients who were diagnosed with intracranial bleeding were particularly poor at reporting this diagnosis.
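Percent agreement, Cohen's kappa, sensitivity and specificity are all simple functions of a 2x2 patient-report-versus-EMR table. A sketch with hypothetical counts (the abstract does not report the raw cells, and this omits the confidence intervals quoted above):

```python
def two_by_two_stats(a, b, c, d):
    """2x2 agreement statistics with the EMR as the reference standard.
    a = both positive, b = patient-yes / EMR-no,
    c = patient-no / EMR-yes, d = both negative."""
    n = a + b + c + d
    po = (a + d) / n                                      # observed agreement
    pe = ((a + b) * (a + c) + (c + d) * (b + d)) / n**2   # chance agreement
    kappa = (po - pe) / (1 - pe)
    sensitivity = a / (a + c)
    specificity = d / (b + d)
    return po, kappa, sensitivity, specificity

# Hypothetical counts for illustration only (not the study's data):
po, kappa, sens, spec = two_by_two_stats(20, 5, 10, 65)
# for this made-up table: agreement 85%, kappa 0.625
```

As the abstract illustrates, high raw agreement (95%) can coexist with low kappa (0.30) when the outcome is rare, because chance agreement on the common negative cell dominates.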
Background: Atrial fibrillation (AF) is a risk factor for stroke. The Canadian Cardiovascular Society advises that patients who are CHADS65 positive should be started on oral anticoagulation (OAC). Our local emergency department (ED) review showed that only 16% of CHADS65-positive patients were started on OAC and that 2% of our patients were diagnosed with stroke within 90 days. We implemented a new pathway for initiation of OAC in the ED (the SAFE pathway). Aim Statement: We report the effectiveness and safety of the SAFE pathway for initiation of OAC in patients treated for AF in the ED. Measures & Design: A multidisciplinary group of physicians and a pharmacist developed the SAFE pathway for patients who are discharged home from the ED with a diagnosis of AF. Step 1: contraindications to OAC; Step 2: CHADS65 score; Step 3: OAC dosing if indicated. The pathway triggers a referral to the AF clinic, a letter to the family physician and a follow-up call from the ED pharmacist. Patients are followed for 90 days by a structured medical record review and a structured telephone interview. We record persistence with OAC, stroke, TIA, systemic arterial embolism and major bleeding (ISTH criteria). Patient outcomes are fed back to the treating ED physician. Evaluation/Results: The SAFE pathway was introduced in two EDs in June 2018. In total, 177 patients have had the pathway applied. The median age was 70 (interquartile range (IQR) 61-78), 48% were male, and the median CHADS2 score was 2 (IQR 0-2). 19/177 patients (11%) had a contraindication to initiating OAC. 122 patients (69%) had no contraindication to OAC and were CHADS65 positive. Of these 122 patients, 109 were given a prescription for OAC (96 the correct dose, 9 too high a dose and 4 too low a dose). 6 patients declined OAC and the physician did not want to start OAC for 7 patients. 73/122 were contacted by phone at 90 days, 15 could not be reached and 34 have not completed 90 days of follow-up since their ED visit.
Of the 73 who were reached by phone after 90 days, 65 were still taking an anticoagulant. To date, 1 patient who declined OAC (CHADS2 score of 2) had a stroke within 90 days and one patient prescribed OAC had a gastrointestinal bleed. Discussion/Impact: The SAFE pathway appears safe and effective although we continue to evaluate and improve the process.
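Step 2 of the pathway applies the CHADS65 rule from the Canadian Cardiovascular Society algorithm: anticoagulate if the patient is aged 65 or over, or if any CHADS2 risk factor is present. A sketch of that scoring logic (function and parameter names are my own, and the algorithm's antiplatelet arm for coronary or peripheral arterial disease is omitted):

```python
def chads2_score(chf, hypertension, age, diabetes, prior_stroke_tia):
    """CHADS2: 1 point each for congestive heart failure, hypertension,
    age >= 75 and diabetes; 2 points for prior stroke/TIA."""
    return (int(chf) + int(hypertension) + int(age >= 75)
            + int(diabetes) + 2 * int(prior_stroke_tia))

def chads65_positive(chf, hypertension, age, diabetes, prior_stroke_tia):
    """CHADS65: positive if age >= 65 or any CHADS2 component is present."""
    return age >= 65 or chads2_score(chf, hypertension, age, diabetes,
                                     prior_stroke_tia) > 0

# A 70-year-old with hypertension: CHADS2 = 1, CHADS65 positive.
print(chads2_score(False, True, 70, False, False))      # 1
print(chads65_positive(False, True, 70, False, False))  # True
```

Note that CHADS65 is a binary anticoagulation gate, not a score: a 70-year-old with no CHADS2 risk factors (CHADS2 = 0) is still CHADS65 positive on age alone.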
Given the common view that pre-exercise nutrition/breakfast is important for performance, the present study investigated whether breakfast influences resistance exercise performance via a physiological or psychological effect. Twenty-two resistance-trained, breakfast-consuming men completed three experimental trials, consuming water-only (WAT), or semi-solid breakfasts containing 0 g/kg (PLA) or 1·5 g/kg (CHO) maltodextrin. PLA and CHO meals contained xanthan gum and low-energy flavouring (approximately 122 kJ), and subjects were told both ‘contained energy’. At 2 h post-meal, subjects completed four sets of back squat and bench press to failure at 90 % ten repetition maximum. Blood samples were taken pre-meal, 45 min and 105 min post-meal to measure serum/plasma glucose, insulin, ghrelin, glucagon-like peptide-1 and peptide tyrosine-tyrosine concentrations. Subjective hunger/fullness was also measured. Total back squat repetitions were greater in CHO (44 (sd 10) repetitions) and PLA (43 (sd 10) repetitions) than WAT (38 (sd 10) repetitions; P < 0·001). Total bench press repetitions were similar between trials (WAT 37 (sd 7) repetitions; CHO 39 (sd 7) repetitions; PLA 38 (sd 7) repetitions; P = 0·130). Performance was similar between CHO and PLA trials. Hunger was suppressed and fullness increased similarly in PLA and CHO, relative to WAT (P < 0·001). During CHO, plasma glucose was elevated at 45 min (P < 0·05), whilst serum insulin was elevated (P < 0·05) and plasma ghrelin suppressed at 45 and 105 min (P < 0·05). These results suggest that breakfast/pre-exercise nutrition enhances resistance exercise performance via a psychological effect, although a potential mediating role of hunger cannot be discounted.
Introduction: The Canadian population is aging and an increasing proportion of emergency department (ED) patients are seniors. ED visits among seniors are frequently precipitated by a fall at home, and some of these patients develop intracranial hemorrhage (ICH) as a result of the fall. There has been little research on the frequency of ICH in elderly patients who fall, or on which clinical factors are associated with ICH in these patients. The aim of this study was to identify the incidence of ICH, and the clinical features associated with ICH, in seniors who present to the ED having fallen. Methods: This was a prospective cohort study conducted in three EDs. Patients were included if they were aged >65 years and presented to the ED within 48 hours of a fall on level ground, off a bed/chair/toilet or down one step. Patients were excluded if they fell from a height, were knocked over by a vehicle or were assaulted. ED physicians recorded predefined clinical findings (yes/no) before any head imaging was done. Head imaging was done at the ED physician's discretion. All patients were followed for 6 weeks (both by telephone call and chart review at 6 weeks) for evidence of ICH. Associations between baseline clinical findings and the presence of ICH were assessed with multivariable logistic regression. Results: In total, 1753 patients were enrolled. The prevalence of ICH was 5.0% (88 patients), of whom 74 patients had ICH on the ED CT scan and 14 had ICH diagnosed during follow-up. 61% were female and the median age was 82 (interquartile range 75-88). History included hypertension in 76%, diabetes in 29%, dementia in 27%, stroke/TIA in 19%, major bleeding in 11% and chronic kidney disease in 11%. 35% were on antiplatelet therapy and 25% were on an anticoagulant.
Only 4 clinical variables were independently associated with ICH: bruise/laceration on the head (odds ratio (OR): 4.3; 95% CI 2.7-7.0), new abnormalities on neurological examination (OR: 4.4; 2.4-8.1), chronic kidney disease (OR: 2.4; 1.3-4.6) and reduced GCS from baseline (OR: 1.9; 1.0-3.4). Neither anticoagulation (OR: 0.9; 0.5-1.6) nor antiplatelet use (OR: 1.1; 0.6-1.8) appeared to be associated with ICH. Conclusion: This prospective study found a prevalence of ICH of 5.0% in seniors after a fall, and that bruising on the head, abnormal neurological examination, abnormal GCS and chronic kidney disease were predictive of ICH.
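The odds ratios above are adjusted estimates from the multivariable logistic regression; for intuition, the unadjusted version of the same quantity can be read directly off a 2x2 exposure-by-outcome table, with a Wald confidence interval on the log scale. A sketch with hypothetical counts, not the study's data:

```python
import math

def odds_ratio_wald(a, b, c, d, z=1.96):
    """Unadjusted odds ratio with a Wald 95% CI for a 2x2 table:
    a = exposed with ICH, b = exposed without,
    c = unexposed with ICH, d = unexposed without."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# Hypothetical counts for illustration only:
or_, lo, hi = odds_ratio_wald(40, 400, 48, 1265)
```

Adjusted ORs from the regression can differ substantially from this crude calculation when exposures are correlated (for example, anticoagulant users being older and more comorbid), which is exactly why the study modelled the predictors jointly.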
A new mineral species, fluorlamprophyllite (IMA2013-102), ideally Na3(SrNa)Ti3(Si2O7)2O2F2, has been found in the Poços de Caldas alkaline massif, Morro do Serrote, Minas Gerais, Brazil. Alternatively, the idealized chemical formula could be written as (SrNa)[(Na3Ti)F2][Ti2(Si2O7)2O2], setting the large interlayer cations before the cations of the layer. Fluorlamprophyllite is the F-analogue of lamprophyllite. It is associated with aegirine, analcime, natrolite, nepheline and microcline. Fluorlamprophyllite crystals are brownish-orange and bladed. The mineral is transparent with a pale yellow streak and an adamantine lustre. It is brittle and has a Mohs hardness of ~3; cleavage is perfect on {100} and no parting was observed. The calculated density is 3.484 g/cm3. Optically, fluorlamprophyllite is biaxial (+), with α = 1.735(7), β = 1.749(7) and γ = 1.775(9) and 2Vmeas = 72(3)°. An electron microprobe analysis produced an average composition (wt.%) (9 points) of Na2O 10.63(30), K2O 0.47(3), SiO2 30.51(13), SrO 18.30(24), MgO 0.81(17), Al2O3 0.23(2), CaO 1.11(7), MnO 5.03(38), TiO2 27.41(87), Fe2O3 2.45(37), F 2.86(23), plus H2O 1.00 (added to bring the total close to 100%), –O = F –1.20, with the total = 98.61%. The elements Nb and Ba were sought, but contents were below microprobe detection limits. The resultant chemical formula was calculated on the basis of 18 (O + F) atoms per formula unit. The addition of 1.00 wt.% H2O brought [F+(OH)] = 2 pfu, yielding (Na2.63Sr1.35Mn0.54Ca0.15Mg0.15K0.08)Σ4.90(Ti2.63Fe0.24Al0.04)Σ2.91Si3.89O16[F1.15(OH)0.85]Σ2.00. The mineral is monoclinic, with space group C2/m and unit-cell parameters a = 19.255(2), b = 7.0715(7), c = 5.3807(6) Å, β = 96.794(2)° and V = 727.5(1) Å3. The structure is a layered silicate inasmuch as the O atoms are arranged in well-defined, though not necessarily close-packed layers.
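The formula normalisation described above (calculated on the basis of 18 (O + F) atoms per formula unit) can be reproduced from the reported microprobe analysis: convert each oxide wt.% to moles, sum oxygen equivalents, subtract the oxygen displaced by F (the –O = F correction), and rescale so that O + F = 18. A sketch using the reported values (molar masses are approximate):

```python
# (oxide wt.%, molar mass g/mol, cations per oxide formula, O per oxide formula)
OXIDES = {
    "Na2O":  (10.63, 61.979, 2, 1),
    "K2O":   (0.47, 94.196, 2, 1),
    "SiO2":  (30.51, 60.084, 1, 2),
    "SrO":   (18.30, 103.62, 1, 1),
    "MgO":   (0.81, 40.304, 1, 1),
    "Al2O3": (0.23, 101.961, 2, 3),
    "CaO":   (1.11, 56.077, 1, 1),
    "MnO":   (5.03, 70.937, 1, 1),
    "TiO2":  (27.41, 79.866, 1, 2),
    "Fe2O3": (2.45, 159.688, 2, 3),
    "H2O":   (1.00, 18.015, 2, 1),   # H, counted later as OH
}
F_WT, F_MASS = 2.86, 18.998

mol = {ox: wt / m for ox, (wt, m, _, _) in OXIDES.items()}
f_mol = F_WT / F_MASS
# Oxygen from the oxides, minus the O displaced by F (the -O=F correction):
o_mol = sum(mol[ox] * n_o for ox, (_, _, _, n_o) in OXIDES.items()) - f_mol / 2
scale = 18 / (o_mol + f_mol)          # normalise to 18 (O + F) apfu

apfu = {ox: mol[ox] * n_cat * scale for ox, (_, _, n_cat, _) in OXIDES.items()}
apfu["F"] = f_mol * scale
print(round(apfu["SiO2"], 2), round(apfu["TiO2"], 2), round(apfu["F"], 2))
# 3.89 Si, 2.63 Ti and 1.15 F, matching the published formula
```

The same dictionary yields Na = 2.63 and Sr = 1.35 apfu, so the whole published formula falls out of this one normalisation.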
In the more than two hundred years since his death, Cook's reputation has been much discussed, opinion ranging from celebration of his achievement to more subjective assessments of the long-term implications of his voyages in those countries of the Pacific which he visited. The thirteen essays in this book, grouped in four sections, continue the debate. 'The Years in England' covers Cook's Whitby background and the part played by the Royal Society in the Pacific ventures of the period. 'The Pacific Voyages' investigates the clash between the Endeavour's crew and the Aborigines on the banks of the Endeavour River, the process by which Cook and his crews became 'Polynesianised', Cook's visit to the Hawaiian Islands, and his call at Nootka Sound, both on his final voyage. 'Captain Cook and his Contemporaries' views other European explorers in the Pacific, and concludes with an analysis of Russian attitudes towards Cook. 'The Legacy of Captain Cook' compares Cook's death on Hawaii with the later killing of a missionary on Eromanga, examines fluctuations in Cook's reputation, and describes life on board the replica of the Endeavour. GLYNDWR WILLIAMS is Emeritus Professor of History, Queen Mary & Westfield College, University of London. His many books include an edition of Captain Cook's Voyages, 1768-79, from the official accounts derived from Cook's journals.
A 2-yr (2009 to 2010), no-till (direct-seeded) “follow-up” study was conducted at five western Canada sites to determine weed interference impacts and barley and canola yield recovery after 4 yr of variable crop inputs (seed, fertilizer, herbicide). During the initial period of the study (2005 to 2008), applying fertilizer in the absence of herbicides was often worse than applying no optimal inputs at all; in the former case, weed biomass was highest (2,788 to 4,294 kg ha−1), possibly because the weeds utilized the nutrients better than the crops did. After optimal inputs were restored (standard treatment), most barley and canola plots recovered to optimal yield levels after 1 yr. However, 4 yr with all optimal inputs except herbicides led to only 77% yield recovery for both crops. At most sites, when all inputs were restored for 2 yr, all plots yielded similarly to the standard treatment combination. Yield “recovery” occurred despite high weed biomass levels (> 4,000 kg ha−1) prior to the first recovery year and despite high wild oat seedbank levels (> 7,000 seeds m−2) at the end of the second recovery year. In relatively competitive narrow-row crops such as barley and canola, the negative effects of high soil weed seedbanks can be mitigated if growers facilitate healthy crop canopies with appropriate seed and fertilizer rates in combination with judicious herbicide applications to adequately manage recruited weeds.
Integrated weed management (IWM) decision strategies in herbicide-resistant canola-production systems were assessed for net returns and relative risk. Data from two field experiments conducted during 1998 to 2000 at two locations in Alberta, Canada, were evaluated. A herbicide-based experiment included combinations of herbicide system (glufosinate-, glyphosate-, and imazethapyr-resistant canola varieties), herbicide rate (50 and 100% of recommended dose), and time of weed removal (two-, four-, and six-leaf stages of canola). A seed-based experiment included canola variety (hybrid and open-pollinated), seeding rate (100, 150, and 200 seeds m−2), and time of weed removal (two-, four-, and six-leaf stages of canola). For the herbicide-based experiment, strategies with glyphosate were profitable at Lacombe, but both imazethapyr and glyphosate strategies were profitable at Lethbridge. Weed control at the four-leaf stage was at least as profitable as the two-leaf stage at both sites. For the seed-based experiment, the hybrid was more profitable than the open-pollinated cultivar, seed rates of 100 and 150 seeds m−2 were more profitable than 200 seeds m−2, and weed control at the two- and four-leaf stages was more profitable than at the six-leaf stage. When risk of returns and statistical significance were considered, several strategies were included in the risk-efficient set for risk-averse and risk-neutral attitudes at each location. However, the glyphosate-resistant cultivar, the 50% herbicide rate, and weed control at the four-leaf stage were more frequent in the risk-efficient IWM strategy set. The open-pollinated cultivar, the 200 seeds m−2 rate, and weed control at the six-leaf stage were less frequent in the set. The risk-efficient sets of IWM strategies were consistent across a range of canola prices.
Growing crops that exhibit a high level of competition with weeds increases opportunities to practice integrated weed management and reduce herbicide inputs. The recent development and market dominance of hybrid canola cultivars provides an opportunity to reassess the relative competitive ability of canola cultivars with small-grain cereals. Direct-seeded (no-till) experiments were conducted at five western Canada locations from 2006 to 2008 to compare the competitive ability of canola cultivars vs. small-grain cereals. The relative competitive ability of the species and cultivars was determined by assessing monocot and dicot weed biomass at different times throughout the growing season as well as oat (simulated weed) seed production. Under most conditions, but especially under warm and relatively dry environments, barley cultivars had the greatest relative competitive ability. Rye and triticale were also highly competitive species under most environmental conditions. Canada Prairie Spring Red wheat and Canada Western Red Spring wheat cultivars usually were the least competitive cereal crops, but there were exceptions in some environments. Canola hybrids were more competitive than open-pollinated canola cultivars. More importantly, under cool, low growing degree day conditions, canola hybrids were as competitive as barley, especially with dicot weeds. Under most conditions, hybrid canola growers on the Canadian Prairies are well advised to avoid the additional selection pressure inherent with a second in-crop herbicide application. Combining competitive cultivars of any species with optimal agronomic practices that facilitate crop health will enhance cropping system sustainability and allow growers to extend the life of their valuable herbicide tools.
Glyphosate-resistant (GR) crops are produced over large areas in North America. A study was conducted at six western Canada research sites to determine seeding date and tillage system effects on weed populations in GR spring wheat and canola cropping systems from 2000 to 2003. Four-year wheat–canola–wheat–pea rotations were devised with varying levels of GR crops in the rotation. Weed populations were determined at pre– and post–in-crop herbicide application intervals in 2000 and 2003. Early seeding led to higher and more variable in-crop wild oat and wild buckwheat populations. High frequencies of in-crop glyphosate wheat in the rotation usually improved weed management and reduced weed density and variability. Canonical discriminant analysis (CDA) across all locations revealed that by 2003, green foxtail, redroot pigweed, sowthistle spp., wild buckwheat, and wild oat were all associated with the rotation lacking in-crop glyphosate. Similar CDA analyses for individual locations indicated specific weeds were associated with 3 yr of in-crop glyphosate (Canada thistle at Brandon, henbit at Lacombe, and volunteer wheat, volunteer canola, and round-leaved mallow at Lethbridge). However, only henbit at Lacombe and volunteer wheat at Lethbridge occurred at significant densities. Although excellent weed control was attained in rotations containing a high frequency of GR crops, the merits of more integrated approaches to weed management and crop production should also be considered. Overall, rotations including GR spring wheat did not significantly increase short-term weed management risks in conventional tillage or low soil-disturbance direct-seeding systems.
Glyphosate-resistant canola has been widely adopted in western Canada. This has prompted producer interest in the timing of glyphosate application, particularly under zero tillage, where glyphosate is often applied preseeding. Field experiments were conducted at Lacombe, Edmonton, and Beaverlodge in Alberta in 1997, 1998, and 1999 to determine the importance of preseeding glyphosate and the most effective growth stage to apply glyphosate to canola to optimize yield and weed management. Treatments consisted of zero-tillage systems, with and without preseeding glyphosate, and a conventional-tillage system involving preseeding tillage operations. Glyphosate was applied at the one- to two-, three- to four-, or five- to six-leaf stages of canola in each tillage system. Canola yield and weed dry weight were largely unaffected by the tillage system. In most instances, the highest canola yields occurred when glyphosate was applied early to the crop. The opposite occurred at Lacombe and Edmonton in 1999, however, where canola yield increased as glyphosate was applied at later crop growth stages. This yield benefit likely resulted from the control of late-emerging weeds that exerted competitive pressure on canola. Early glyphosate timing in glyphosate-resistant canola may eliminate the need for preseeding glyphosate in zero-tillage systems, and optimize yield and weed control.
As a weed, wheat has recently gained a higher profile. Determining wheat persistence in cropping systems will facilitate the development of effective volunteer wheat management strategies. In October 2000, glyphosate-resistant (GR) spring wheat seeds were scattered on plots at eight western Canada sites. From 2001 to 2003, the plots were seeded to a canola–barley–field-pea rotation or a fallow–barley–fallow rotation, with five seeding systems involving seeding dates and soil disturbance levels, and monitored for wheat plant density. Herbicides and tillage (in fallow systems) were used to ensure that no wheat plants produced seed. Seeding systems with greater levels of soil disturbance usually had greater wheat densities. Volunteer wheat densities at 2 (2002) and 3 (2003) yr after seed dispersal were close to zero but still detectable at most locations. At the end of 2003, viable wheat seeds were not detected in the soil seed bank at any location. The majority of wheat seedlings were recruited in the year following seed dispersal (2001) at the in-crop, prespray (PRES) interval. At the PRES interval in 2001, across all locations and treatments, wheat density averaged 2.6 plants m−2. At the preplanting interval (PREP), overall wheat density averaged only 0.2 plants m−2. Restricting the density data to continuous cropping, low-disturbance direct-seeding (LDS) systems dropped the latter mean below 0.1 plants m−2. Only at one site were preplanting GR wheat densities sufficient (4.2 plants m−2) to justify a preseeding herbicide treatment in addition to glyphosate in LDS systems. Overall volunteer wheat recruitment at all spring and summer intervals in the continuous cropping rotation in 2001 was 1.7% (3.3 plants m−2). Although volunteer wheat has become more common in the central and northern Great Plains, there is little evidence from this study to suggest that its persistence will be a major agronomic problem.
Epidemiologic studies have consistently demonstrated a greater incidence of depressive disorders and anxiety among women, and many women experience these conditions during the reproductive years. The rapidly expanding literature on the use of medications has often failed to give due weight to the potential impact of the disorders themselves. When considering the extant human and laboratory data on mental illness and stress during pregnancy and the postpartum period, it is evident that some degree of exposure (be it treatment or illness) always occurs. The primary goal of the risk-benefit assessment for the treatment of mental illness during these periods is to assist patients and their families in choosing the path of potential exposure that poses the least risk for them. Once this decision is made, the goal is to limit the number of exposures for the fetus/neonate.