The Classic period lowland Maya used iron-ore mosaic mirrors and deposited mirrors in the burials of rulers and other people. Depictions of mirrors suggest that they were used for scrying, as were mirrors in Mesoamerica at the time of the Spanish arrival. Maya mirror users of this kind were conjurors, who used a variety of other divining and conjuring instruments and materials, including plates and shallow bowls. Three rulers at El Peru-Waka' (now called Waka' by researchers at the site), an ancient city in northwestern Peten, Guatemala, were buried with mirrors and associated divining and conjuring materials. Following a brief introduction to the city and its temples, we describe the arrangement of mirrors and associated materials in three royal tombs. We suggest that the mirrors in these tombs were used in conjuring supernatural beings into existence, particularly Akan, a death god and wahy spirit who was a patron of the Waka' realm. We propose that the rulers and mirror conjurors of Waka' were oracles and that Waka' was known for prophecy. References to Sihyaj K'ahk' in text and iconography at Waka', and his association with oracular paraphernalia such as mirrors, lead us to propose a prophetic aspect of the visit of Sihyaj K'ahk' to the site eight days prior to his famous arrival at Tikal in A.D. 378. We suggest that the three rulers we discuss were mirror oracles sustained by the prestige of the prophecy of Sihyaj K'ahk'.
OBJECTIVES/GOALS: Glioblastomas (GBMs) are heterogeneous, treatment-resistant tumors that are driven by populations of cancer stem cells (CSCs). In this study, we perform an epigenetic-focused functional genomics screen in GBM organoids and identify WDR5 as an essential epigenetic regulator in the SOX2-enriched, therapy-resistant cancer stem cell niche. METHODS/STUDY POPULATION: Despite the importance of CSCs for tumor growth, few of the molecular mechanisms critical for maintaining CSC populations have been exploited for therapeutic development. We developed a spatially resolved loss-of-function screen in GBM patient-derived organoids to identify essential epigenetic regulators in the SOX2-enriched, therapy-resistant niche. Our niche-specific screens identified WDR5, an H3K4 histone methyltransferase responsible for activating specific gene expression, as indispensable for GBM CSC growth and survival. RESULTS/ANTICIPATED RESULTS: In GBM CSC models, WDR5 inhibitors blocked WRAD complex assembly and reduced H3K4 trimethylation and expression of genes involved in CSC-relevant oncogenic pathways. H3K4me3 peaks lost with WDR5 inhibitor treatment occurred disproportionately on POU transcription factor motifs, which are required for stem cell maintenance and include the POU5F1(OCT4)::SOX2 motif. We incorporated a SOX2/OCT4 motif-driven GFP reporter system into our CSC models and found that WDR5 inhibitor treatment resulted in dose-dependent silencing of stem cell reporter activity. Further, WDR5 inhibitor treatment altered the stem cell state, disrupting CSC in vitro growth and self-renewal as well as in vivo tumor growth. DISCUSSION/SIGNIFICANCE: Our results unveil the role of WDR5 in maintaining the CSC state in GBM and provide a rationale for therapeutic development of WDR5 inhibitors for GBM and other advanced cancers. This conceptual and experimental framework can be applied to many cancers, where it can unmask unique microenvironmental biology and inform rationally designed combination therapies.
Evidence from the earliest-known crinoids (Tremadocian, Early Ordovician), called protocrinoids, is used to hypothesize the initial steps by which elements of the calyx evolved. Protocrinoid calyces are composed of extraxial primary and surrounding secondary plates (both of which have epispires along their sutures) that are unlike those of more crownward fossil and extant crinoids, in which equivalent calycinal plating is strongly organized. These differences have inspired several schemes for naming the plates in these calyces. However, the primary-secondary systems seen in protocrinoids first appeared among Cambrian stem radial echinoderms, with primaries representing centers around which secondaries were sequentially added during ontogeny. Therefore, the protocrinoid calyx represents an intermediate condition between earliest echinoderms and crownward crinoids. Position and ontogeny indicate that certain primaries remained as loss of secondaries occurred, resulting in abutting of primaries into the conjoined alternating circlets characteristic of crinoids. This transformative event included suppression of secondary plating and modification or, more commonly, elimination of respiratory structures. These data indicate that subradial calyx plate terminology does not correspond with most common usage but rather supports an alternative redefinition of these traditional expressions. Extension and adoral growth of fixed rays during calyx ontogeny preceded conjoined primaries in earliest crinoids. Restriction with modification or elimination of calyx respiratory structures also accompanied this transition. Phylogenetic analyses strongly support crinoid origination from early pentaradiate echinoderms, separate from blastozoans. Accordingly, all Tremadocian crinoids express a distinctive aggregate of plesiomorphic and apomorphic commonalities; all branch early within the crinoid clade, separate from traditional subclass-level clades. Nevertheless, each taxon within this assemblage expresses at least one diagnostic apomorphy of camerate, cladid, or disparid clades.
Lairage staff at 11 abattoirs were asked to rate which producers regularly provided pigs which were ‘easy’ (EH) or ‘difficult’ (DH) to handle, on a scale of one (very DH) to five (very EH). A postal questionnaire, dealing with various aspects of post-weaning farm management, was then given to the four or five producers sending the most EH and the four or five producers sending the most DH pigs to each abattoir. Of 105 questionnaires sent, information on 26 EH and 27 DH systems was returned. The median number of replies per abattoir was two for both EH and DH systems. In most systems (77%) pigs experienced three or four housing stages from weaning to slaughter. In each of the first five housing stages, more EH pigs had access to daylight (mean of 86% ± 11.5 (SD)) than DH pigs (mean of 64% ± 10.1 (SD), P < 0.05, two-sample t test). More EH systems provided straw in the first three housing stages, although over all stages the difference was not significant. During housing stage two, the difference in provision of straw between the systems was most marked, with 58 per cent of EH and 27 per cent of DH systems providing straw. Distance walked between housing stages three to four and four to five was significantly greater for EH compared to DH systems (EH mean of 64m ± 24.1 (SD), versus DH mean of 22m ± 14.0 (SD), and EH mean of 73m ±17.2 (SD), versus DH mean of 23m ± 8.5 (SD), P <0.001 and 0.01 respectively, two-sample t test). At loading for pre-slaughter transport, moving from daylight to daylight conditions occurred in 65 per cent of EH and 25 per cent of DH systems. Overall, the results provide circumstantial evidence that environmental factors can affect ease of handling, and hence pig welfare during pre-slaughter transport and lairage.
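The daylight-access and walking-distance comparisons above rest on two-sample t tests between EH and DH systems. A minimal sketch of that kind of comparison is shown below; the values are illustrative placeholders, not the study's data.

```python
# Sketch of a two-sample t test comparing EH and DH systems.
# The percentages below are hypothetical, not the reported study values.
from scipy import stats

eh_daylight = [88, 92, 75, 95, 81, 85]  # hypothetical % of EH systems with daylight access, per stage
dh_daylight = [60, 72, 55, 68, 64, 66]  # hypothetical % of DH systems with daylight access, per stage

t_stat, p_value = stats.ttest_ind(eh_daylight, dh_daylight)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")  # p < 0.05 would mirror the reported difference
```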
Twelve specimens of Eumorphocystis Branson and Peck, 1940 provide the basis for new findings and a more informed assessment of whether this blastozoan (blastozoans being a group that includes eocrinoids, blastoids, diploporites, and rhombiferans) constitutes the sister taxon to crinoids, as has been recently proposed. Both Eumorphocystis and earliest-known crinoid feeding appendages express longitudinal canals, a demonstrable trait exclusive to these taxa. However, the specimen series studied here shows that Eumorphocystis canals constrict proximally and travel within ambulacrals above the thecal cavity. This relationship is congruent with a documented blastozoan pattern but very unlike earliest crinoid topology. Earliest crinoid arm cavities lie fully beneath floor plates; these expand and merge directly with the main thecal coelomic cavity at thecal shoulders. Other associated anatomical features echo this contrast. Feeding appendages of Eumorphocystis lack two-tiered cover plates, podial basins/pores, and lateral arm plating, all features of earliest crinoid ‘true arms.’ Eumorphocystis feeding appendages are buttressed by solid block-like plates added during ontogeny at a generative zone below floor plates, a pattern with no known parallel among crinoids. Eumorphocystis feeding appendages express brachioles, erect extensions of floor plates, which are also unknown among crinoids. These several distinctions point to nonhomology of most feeding appendage anatomy, including longitudinal canals, removing Eumorphocystis and other blastozoans from exclusive relationship with crinoids. Eumorphocystis further differs from crinoids in that its thecal plates express diplopores, respiratory structures not present among crinoids but ubiquitous among certain groups of blastozoans. Phylogenetic analysis places Eumorphocystis as a crownward blastozoan, far removed from crinoids.
Intermediate morphologies of a new fossil crinoid shed light on the pathway by which crinoids acquired their distinctive arms. Apomorphies originating deep in echinoderm history among early nonblastozoan pentaradiate echinoderms distinguish Tremadocian (earliest Ordovician) crinoid arms from those of later taxa. The brachial series is separated from the ambulacra, part of the axial skeleton, by lateral plate fields. Cover plates were arrayed in two tiers, and floor plates expressed podial basins and pores. Later during the Early Ordovician, floor plates contacted and nestled into brachials, then ceased to be expressed as stereom elements entirely, and cover plates were reduced to a single tier. Incorporation of these events into a parsimony analysis supports crinoid origin deep in echinoderm history, separate from blastozoans (eocrinoids, ‘cystoids’). Arm morphology is exceptionally well preserved in the late Tremadocian to early Floian Athenacrinus broweri new genus new species. Character analysis supports a hypothesis that this taxon originated early within the disparid clade. Athenacrinus n. gen. (in Athenacrinidae new family) is the earliest-known crinoid to express what is commonly referred to as ‘compound’ or ‘biradial’ morphology. This terminology is misleading in that no evidence exists for the implied fusion or fission of radials; rather, it is suggested that this condition arose through disproportionate growth.
Introduction: Access block is a pervasive problem, even during times of minimal boarding in the ED, suggesting that suboptimal use of ED stretchers may contribute to it. A tracking board utility was embedded into the electronic health record in Calgary, AB, allowing MDs and RNs to identify patients who could be relocated from a stretcher to a chair. Objectives of this study were to evaluate the feature's impact on total stretcher time (TST) and ED length of stay (LOS) for patients relocated to a chair. We also sought to identify facilitators and barriers to the tool's use amongst ED MDs and RNs. Methods: A retrospective cohort design was used to compare TST between visits where the tool was and was not used, amongst patients relocated to a chair between September 1, 2017 and August 15, 2018. Each use of the location tool was time-stamped in an administrative database. Median TST and ED LOS were compared between patients where the tool was used and not used using a Mann-Whitney U test. A cross-sectional convenience sample survey was used to determine facilitators and barriers to the tool's use amongst ED staff. Response proportions were used to report Likert scale questions; thematic analysis was used to code themes. Results: 194882 patients met inclusion criteria. The tool was used 4301 times, with “Ok for Chairs” selected 3914 (2%) times and “Not Ok for Chairs” selected 384 (0.2%) times; 54462 (30%) patients were moved to a chair without the tool's use. Mean age, sex, mode of arrival and triage scores were similar between the groups. Median (IQR) TST amongst patients moved to a chair via the prompt was shorter than when the prompt was not used [142.7 (100.5) mins vs 152.3 (112.3) mins, p < 0.001], resulting in 37574 mins of saved stretcher time. LOS was similar between the groups (p = 0.22). 125 questionnaires were completed by 90 ED nurses and 35 ED MDs. 95% of staff were aware of the tool and 70% agreed/strongly agreed it could improve ED flow; however, 38% reported only “sometimes” using the tool. MDs reported that the most common barriers were forgetting to use the tool and a lack of perceived action in relocating patients. Commonly reported nursing barriers were lack of chair space and increased workload. Conclusion: Despite minimal use of the tracking board utility, its triggering was associated with reduced TST amongst ED patients eventually relocated to a chair. To encourage increased use, future versions should prompt staff to select a location.
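Median TST and LOS were compared with a Mann-Whitney U test; a minimal sketch of that comparison is below. The minute values are made up for illustration and are not the study data.

```python
# Sketch of the Mann-Whitney U comparison of total stretcher time (TST)
# between visits where the relocation tool was and was not used.
# The arrays hold hypothetical minute values, not the study data.
import numpy as np
from scipy.stats import mannwhitneyu

tst_tool_used = np.array([120, 135, 142, 150, 160, 98, 140])       # relocation tool used
tst_tool_not_used = np.array([150, 160, 152, 170, 145, 155, 180])  # relocation tool not used

u_stat, p_value = mannwhitneyu(tst_tool_used, tst_tool_not_used, alternative="two-sided")
print(f"median (used) = {np.median(tst_tool_used):.1f} min, "
      f"median (not used) = {np.median(tst_tool_not_used):.1f} min, p = {p_value:.3f}")
```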
Urinary tract infections (UTIs) are common among college-aged women and often recur. Some antibiotics recommended to treat UTIs trigger dysbiosis of the intestinal and vaginal microbiomes, where uropathogens originate, yet few studies have investigated associations between these therapies and recurrent infections. We retrospectively analysed the electronic medical records of 6651 college-aged women diagnosed with a UTI at a US university student health centre between 2006 and 2014. Women were followed for 6 months for incidence of a recurrent infection. In a secondary analysis, women who experienced UTI recurrence within 2 weeks were also considered as potential cases of infection relapse. Logistic regression was used to assess associations between infection recurrence or relapse and the antibiotics prescribed, in addition to baseline patient characteristics including age, race/ethnicity, region of origin, year of encounter, presence of symptomology, pyelonephritis, vaginal coinfection and birth control consultation. There were 1051 instances of infection recurrence among the 6620 patients, indicating a prevalence of 16%. In the analysis of patient characteristics, Asian women were significantly more likely to experience infection recurrence whereas African American women were less likely. No significant associations were identified between the antibiotic administered at the initial infection and the risk of infection recurrence after multivariable adjustment. Treatment with trimethoprim-sulphamethoxazole and being born outside of the USA were significantly associated with increased odds of infection relapse in the multivariable analysis. The results of the analyses suggest that treatment with trimethoprim-sulphamethoxazole may lead to an increased risk of UTI relapse, warranting further study.
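Associations with recurrence were assessed by multivariable logistic regression. A minimal sketch of such a model follows; the synthetic data frame, column names and values are assumptions for illustration, not the study's actual variables.

```python
# Sketch of a multivariable logistic regression for 6-month UTI recurrence.
# The synthetic data and column names below are illustrative only.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "recurrence": rng.integers(0, 2, n),            # 1 = recurrent infection within 6 months
    "antibiotic": rng.choice(["nitrofurantoin", "TMP-SMX", "ciprofloxacin"], n),
    "age": rng.integers(18, 26, n),
    "pyelonephritis": rng.integers(0, 2, n),
    "vaginal_coinfection": rng.integers(0, 2, n),
})

model = smf.logit(
    "recurrence ~ C(antibiotic) + age + pyelonephritis + vaginal_coinfection",
    data=df,
).fit(disp=0)

# Odds ratios with 95% confidence intervals
print(np.exp(pd.concat([model.params, model.conf_int()], axis=1)))
```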
The role that vitamin D plays in pulmonary function remains uncertain. Epidemiological studies have reported mixed findings for the association between serum 25-hydroxyvitamin D (25(OH)D) and pulmonary function. We conducted the largest cross-sectional meta-analysis of the 25(OH)D–pulmonary function association to date, based on nine European ancestry (EA) cohorts (n 22 838) and five African ancestry (AA) cohorts (n 4290) in the Cohorts for Heart and Aging Research in Genomic Epidemiology Consortium. Data were analysed using linear models by cohort and ancestry. Effect modification by smoking status (current/former/never) was tested. Results were combined using fixed-effects meta-analysis. Mean serum 25(OH)D was 68 (sd 29) nmol/l for EA and 49 (sd 21) nmol/l for AA. For each 1 nmol/l higher 25(OH)D, forced expiratory volume in the 1st second (FEV1) was higher by 1·1 ml in EA (95 % CI 0·9, 1·3; P<0·0001) and 1·8 ml (95 % CI 1·1, 2·5; P<0·0001) in AA (P for race difference=0·06), and forced vital capacity (FVC) was higher by 1·3 ml in EA (95 % CI 1·0, 1·6; P<0·0001) and 1·5 ml (95 % CI 0·8, 2·3; P=0·0001) in AA (P for race difference=0·56). Among EA, the 25(OH)D–FVC association was stronger in smokers: per 1 nmol/l higher 25(OH)D, FVC was higher by 1·7 ml (95 % CI 1·1, 2·3) for current smokers and 1·7 ml (95 % CI 1·2, 2·1) for former smokers, compared with 0·8 ml (95 % CI 0·4, 1·2) for never smokers. In summary, the 25(OH)D associations with FEV1 and FVC were positive in both ancestries. In EA, a stronger association was observed for smokers compared with never smokers, which supports the importance of vitamin D in vulnerable populations.
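Cohort-specific linear-model estimates were combined by fixed-effects meta-analysis. A minimal sketch of the inverse-variance pooling step is below; the per-cohort betas and standard errors are made up for illustration, not the consortium estimates.

```python
# Inverse-variance (fixed-effects) pooling of per-cohort estimates of the
# 25(OH)D-FEV1 association (ml per 1 nmol/l). Betas and SEs are illustrative only.
import numpy as np

betas = np.array([1.0, 1.2, 0.9, 1.3, 1.1])  # hypothetical cohort estimates
ses = np.array([0.3, 0.4, 0.2, 0.5, 0.3])    # hypothetical standard errors

weights = 1.0 / ses**2
pooled_beta = np.sum(weights * betas) / np.sum(weights)
pooled_se = np.sqrt(1.0 / np.sum(weights))

print(f"pooled beta = {pooled_beta:.2f} ml per nmol/l "
      f"(95% CI {pooled_beta - 1.96 * pooled_se:.2f}, {pooled_beta + 1.96 * pooled_se:.2f})")
```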
This book is a collection of essays addressing multiple aspects of African maritime history, in an attempt to counter the relative lack of academic research compared with other nations and continents, and to assert the value of African topics to the global study of maritime history. Each essay addresses African maritime history whilst also demonstrating an inextricable link to the global maritime stage. The topics discussed include early human migration to Africa; early European contact with Africa; the role of West African maritime communities in the Atlantic slave trade; New World slaveholders and the exploitation of African maritime skillsets; the construction of Atlantic world racial discourses; the rise and fall of colonial rule; and African immigrant communities in Europe. These essays cover maritime topics such as seafaring labour, navigational technology, swimming, diving and surfing, as well as political subjects including colonisation, decolonisation, immigration and citizenship. The book consists of eight essays and an introduction that evaluates the existing research into African maritime history. It includes case studies from every major geographical part of the continent, bar North Africa, and covers the Early Modern period up to the twentieth century. The purpose is not to provide a comprehensive chronological history, but rather a diverse collection of topics across a range of periods and locations to reflect the wealth of maritime topics in the history of Africa and their global significance. It concludes with a call for further research into non-European maritime activity, to deepen the global historiography.
Introduction: The specialist Emergency Medicine (EM) postgraduate training program at Queen's University implemented a new Competency-Based Medical Education (CBME) model on July 1, 2017. This occurred one year ahead of the national EM cohort, in the model of Competence By Design (CBD) as outlined by the Royal College of Physicians and Surgeons of Canada (RCPSC). This presents an opportunity to identify critical steps, successes, and challenges in the implementation process to inform ongoing national CBME implementation efforts. Methods: A case-study methodology with Rapid Cycle Evaluation was used to explore the lived experience of implementing CBME in EM at Queen's and to capture evidence of behavioural change. Data were collected at 3 and 6 months post-implementation via multiple sources and methods, including field observations, document analysis, and interviews with key stakeholders (residents, faculty, the program director, the CBME lead, academic advisors, and competence committee members). Qualitative findings were triangulated with available quantitative electronic assessment data. Results: The critical processes of implementation have been outlined in 3 domain categories: administrative transition, resident transition, and faculty transition. Multiple themes emerged from stakeholder interviews, including: the need for holistic assessment beyond Entrustable Professional Activity (EPA) assessments; concerns about the utility of milestones in workplace-based assessment by front-line faculty; trepidation that CBME is adding to, rather than replacing, old processes; and a need for effective data visualisation and filtering for assessment decisions by competence committees. We identified a need for administrative direction and faculty development related to new roles and responsibilities and to shared mental models of EPAs and entrustment scoring. Quantitative data indicate that the targeted number of assessments per EPA and stage of training may be too high. Conclusion: Exploring the lived experience of implementing CBME from the perspectives of all stakeholders has provided early insights regarding the successes and challenges of operationalizing CBME on the ground. Our findings will inform ongoing local implementation and higher-level national planning by the Canadian EM Specialty Committee and other programs that will be implementing CBME in the near future.
Introduction: Hospital admission within 72 hours of emergency discharge is a widely accepted measure of emergency department quality of care. Patients returning for unplanned admission may reveal opportunities for improved emergency or follow-up care. Calgary emergency physicians, however, are rarely notified of these readmissions. Aggregate site measures provide a high-level view of readmissions for managers, but don't allow for timely, individual reflection on practice and learning opportunities. These aggregations may also not correctly account for variation in planned readmissions and other workflow nuances. There was a process in place at one facility to compile and communicate readmission details to each physician, but it was manual, provided limited visit detail, and was done weeks or months following discharge. Methods: A new, real-time 72-hour readmission notification recently implemented within the Calgary Zone provides direct and automated email alerts to all emergency physicians and residents involved in the care of a patient who has been readmitted. This alert is sent within hours of a readmission occurring and contains meaningful visit detail (discharge diagnosis, readmit diagnosis, patient name, etc) to help support practice reflection. An average of 15 alerts per day has been generated and sent since implementation in April 2017. Although email is an old technology, it is a central component of the solution because it allows physicians to receive notifications at home and outside the hospital network, where they routinely perform administrative tasks. A secondary notification is sent to personal email accounts (Gmail, Hotmail, etc) to indicate an unplanned admission has occurred, but without visit detail or identifiable information. Using email also allowed implementation with no new hardware or software cost. Results: A simple thumbs up/down rating system is used to adjust the sensitivity of the alert over time. More than 66% of those providing feedback have indicated the alert is helpful for practice reflection (i.e., thumbs up). Of those who indicated it was not helpful, comments often expressed general satisfaction with the alert or suggested improvements; for example, consulted admitting physicians are often responsible for discharge decisions and should be added as recipients of the alert. Conclusion: Many physicians have indicated appreciation for knowing about return patients, and that they will reflect on their care, further review the chart, or contact the admitting physician for further discussion. Most are accepting of some ‘expected’ or ‘false positive’ alerts that aren't helpful for practice reflection. Further tuning and expansion of the alert to specialist and consult services is needed to ensure all physicians involved in a discharge decision are adequately notified.
Introduction: There is a growing interest in providing clinicians with performance reports via audit and feedback (A&F). Although significant evidence exists to support A&F as a tool for self-reflection and for identifying unperceived learning needs, many questions remain, such as the optimal content of A&F reports, the method of dissemination for emergency physicians (EPs) and the perceived benefit. The goals of this project were to 1. evaluate EP perceptions regarding satisfaction with A&F reports and their ability to stimulate physicians to identify opportunities for practice change and 2. identify areas for optimization of the A&F reports. Methods: EPs practicing at any of the four adult hospital sites in Calgary were eligible. We conducted a web survey using a modified Dillman technique eliciting EP perspectives regarding satisfaction, usefulness and suggestions for improvement regarding the A&F reports. Quantitative data were analyzed descriptively and free-text responses were subjected to thematic analysis. Results: From 2015 onwards, EPs could access their clinical performance data via an online dashboard. Despite the online reports being available, few physicians reviewed their reports, citing access and perceived lack of utility as barriers. In October 2016, we began disseminating static performance reports containing a subset of 10 clinical and operational performance metrics to all EPs via encrypted e-mail. These static reports provided clinicians with their performance alongside anonymized peer comparator data, the rationale and evidence for A&F, information on how to use the report, and how to obtain continuing medical education credits for reviewing it. Conclusion: Of 177 EPs in Calgary, we received 49 completed surveys (response rate 28%). 86% of respondents were satisfied or very satisfied with the report. 88% of EPs stated they would take action based on the report, including self-reflection (91%) and modifying specific aspects of their practice (63%). 77% of respondents indicated that receiving static reports made them equally or more likely to visit the online version of the eA&F tool. The vast majority of EPs preferred receiving the A&F reports on a semi-annual basis. Three improvements were made to the eA&F based on survey results: 1) addition of trend-over-time data, 2) new clinical metrics, and 3) optimization of report layout. We also initiated a separate, real-time 72-hour bounceback electronic notification system based on the feedback. EPs value the dissemination of clinical performance indicators in both static report and dashboard format. Eliciting feedback from clinicians allows iterative optimization of eA&F. Based on these results, we plan to continue to provide physicians with A&F reports on a semi-annual basis.
Introduction: Non-variceal upper gastrointestinal bleeding (NVUGIB) is a common presentation to the emergency department (ED), accounting for significant morbidity, mortality and health care resource usage. In Alberta, a provincial care pathway was recently developed to provide an evidence-informed approach to managing patients with UGIB in the ED. Pantoprazole infusions are a commonly used treatment despite evidence suggesting they are generally not indicated prior to endoscopy in the ED. The goal of this project was to optimize management of patients with a NVUGIB, in particular to reduce pre-endoscopy pantoprazole infusions. Methods: In July 2016, we implemented a multi-faceted intervention to optimize management of ED patients with NVUGIB, including 1. de-emphasizing IV pantoprazole infusions in the ED, 2. embedding clinical decision support (CDS) for endoscopy, disposition and transfusions within the order set, and 3. educating clinicians about the care pathway. We used a pre/post-order set design, analyzing 391 days pre and 189 days post-order set changes. Data were extracted from our fully integrated electronic health records system. The primary outcome was the % of patients receiving an IV pantoprazole infusion ordered by an emergency physician (EP) among all patients with NVUGIB. Secondary outcomes included the % transfused with a hgb >70 g/L and whether using the GIB order set impacted management of NVUGIB patients. Results: In the 391 days pre-order set changes, there were 2165 patients included, and in the 189 days post-order set changes, there were 901 patients. For baseline characteristics, patients in the post-order set change group were significantly older (64.4 yrs vs 60.9 yrs, p-value=0.0016) and had a lower hgb (115 vs 118, p-value=0.049), but there were no significant differences in gender or measures of severity of illness (systolic blood pressure, heart rate, CTAS, % admitted). For the primary outcome, in the pre-order set phase, 47.1% received a pantoprazole infusion ordered by an EP, compared to 31.5% in the post-order phase, an absolute reduction of 15.6% (p-value <0.001). For the secondary outcomes, transfusion rates were similar pre/post (22.08% vs 22.75%). Significant inter-site variability existed with respect to the reduction in pantoprazole infusion rates across the four sites (-23.3% to +6.12%). Conclusion: Our interventions resulted in a significant overall reduction in pantoprazole infusions in ED patients with NVUGIB. Reductions in pantoprazole infusions varied significantly across the different sites; future work in our department will explore and address this variability. Keys to the success of this project included engaging clinicians and leveraging both the SCM order sets and the provincial care pathway. Although there were no changes in transfusion rates, it is unclear whether this is a function of the CDS not being effective or whether these transfusions were clinically indicated.
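The primary outcome is a pre/post comparison of infusion proportions. As a minimal sketch of that comparison, the counts below are back-calculated from the reported percentages (47.1% of 2165 vs 31.5% of 901) and are approximations for illustration only.

```python
# Sketch of a two-proportion comparison of pantoprazole infusion rates
# before and after the order set change. Counts are approximations derived
# from the reported percentages, not the exact study counts.
from statsmodels.stats.proportion import proportions_ztest

counts = [round(0.471 * 2165), round(0.315 * 901)]  # approx. patients receiving an infusion (pre, post)
nobs = [2165, 901]                                  # patients included pre and post

z_stat, p_value = proportions_ztest(counts, nobs)
diff = counts[0] / nobs[0] - counts[1] / nobs[1]
print(f"absolute reduction = {diff:.1%}, p = {p_value:.4f}")
```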
Field experiments were conducted at four locations in the Central Pacific region of Costa Rica between 1994 and 1996 to determine suitable tactics for integrated control of propanil-resistant junglerice in rain-fed rice. Stubble incorporation within 3 mo after rice harvest did not affect the density of junglerice that emerged with the crop at the beginning of the rainy season at any location. However, the elimination with glyphosate of the first junglerice seedling population emerging before rice planting consistently reduced the in-crop infestation of junglerice and resulted in increased grain yields. The positive effect of eliminating the first junglerice flush remained even after the in-crop treatments were applied and persisted beyond rice harvest. Substitution of the two customary applications of propanil (3.8 kg ha−1 each) with a single application of pendimethalin (0.75 to 1.5 kg ha−1), preemergence or early postemergence, also reduced junglerice infestation and improved grain yield. Both propanil, in mixture with the synergist piperophos, and quinclorac controlled propanil-resistant junglerice and increased grain yields. Control of the initial junglerice population and improved in-crop weed management can serve as the basis for integrated management of herbicide-resistant junglerice in rain-fed rice.
Rapid detection methods were developed for discriminating between biotypes of Echinochloa colona resistant (R) and susceptible (S) to either propanil or fenoxaprop-P at all growth stages. In the pregerminated seed assay for fenoxaprop-P, seeds were placed on 1.0% agar containing a range of concentrations of herbicides and kept under humid conditions. For propanil, pregerminated seeds were placed on moist filter paper in the lid of a petri dish and, when one leaf had developed, the lid was inverted for 1 min into propanil solutions at a range of concentrations. For both the fenoxaprop-P and propanil tests, seedling length and fresh weight were measured after 1 wk. For juvenile plants at the four-leaf to one-tiller stage, shoots and roots were trimmed and the plants were placed in 20-ml glass tubes containing 0.2% (wt/v) agar and a range of concentrations of herbicides. Shoot extension and weight were recorded after 7 d. Larger plants with several small tillers were also assayed by this method: tillers were removed from larger plants and evaluated as described for juvenile plants. At later growth stages, from ear emergence to flowering, excised stem node segments (8 cm) were soaked in water to stimulate rooting. Rooted nodes were placed in 30-ml glass bottles containing 0.2% agar with a range of concentrations of herbicides, and the test was conducted as described for juvenile plants, but with the final assessment of new shoot extension and weight being recorded after 10 d. Discrimination between R and S biotypes was possible on the basis of GR50 values for shoot length and fresh weight in all testing methods. With few exceptions, GR50 values for the length of new shoots were very similar to those for new-shoot fresh weight. We concluded that all testing methods in our study provided reliable and quick discrimination between biotypes for both propanil and fenoxaprop-P susceptibility, covering growth stages from seed to flowering. Trimming plants before herbicide treatment provides a rapid method of discrimination based on measuring not only new-shoot fresh weight but also shoot length.
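Discrimination rests on GR50 values, the dose reducing growth by 50%. A minimal sketch of estimating GR50 by fitting a log-logistic dose-response curve to shoot-length data is shown below; the doses and responses are hypothetical, not the study's measurements.

```python
# Sketch of GR50 estimation: fit a log-logistic dose-response curve to
# shoot-length responses across a range of herbicide concentrations.
# Doses and shoot lengths below are illustrative values only.
import numpy as np
from scipy.optimize import curve_fit

def log_logistic(dose, upper, gr50, slope):
    """Shoot length as a function of dose; gr50 is the dose giving a 50% reduction."""
    return upper / (1.0 + (dose / gr50) ** slope)

doses = np.array([0.01, 0.1, 1.0, 10.0, 100.0])   # herbicide concentrations (arbitrary units)
shoot_len = np.array([9.8, 9.1, 6.2, 2.4, 0.7])   # mean new-shoot length (cm), illustrative

params, _ = curve_fit(log_logistic, doses, shoot_len, p0=[10.0, 1.0, 1.0])
print(f"estimated GR50 = {params[1]:.2f}")
```

Comparing the GR50 of a putative R biotype with that of a known S biotype (an R/S ratio) is the usual basis for calling resistance in assays of this kind.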
Genetically modified, herbicide-resistant crop (HRC) cultivars, which allow for simplified weed control decisions compared with conventional cultivars, have considerable potential in Latin America. The number of herbicide applications can be reduced in HRCs, and otherwise difficult-to-control species, including red rice in rice or herbicide-resistant weeds, can potentially be managed. The American tropics include the centers of origin of several crops, such as corn and potato, and natural or agrestal floras containing wild near relatives of introduced crops, including rice and cotton, for which HRCs could be used. Potential direct impacts of HRC adoption on biodiversity in Latin America include changes in the genetic diversity of crops, increased volunteer crop problems, and invasion by resistant cultivars of natural areas beyond the farm boundary. Additionally, there is a risk associated with the escape of transgenes from HRCs, involving introgression to weedy relatives on a field scale leading to amplification of existing weed problems or modification of gene pools of crop progenitors in the centers of origin or diversity. Possible indirect effects of HRCs include the potential expansion of agriculture into uncleared wild areas made economically attractive by the more efficient cropping system, and adverse effects on nontarget organisms and ecosystem processes. The occurrence of spontaneous hybrids of crop–weed or near-relative complexes is known in the United States for rice and sunflower, and the development of HRC–weed hybrids would provide a particularly difficult set of weed management problems. For crops in which HRCs are currently in use or under development, the greatest risks in Latin America appear to be with corn, cotton, and potato. However, some of the genetic and geographical barriers reduce the risk of hybridization between these crops and their wild relatives. Furthermore, unlike the case of feral rapeseed on road verges in Europe, there is no known example of a conventionally bred crop or interfertile hybrid with a near relative becoming established outside of cultivation in Latin America. Soybean, the most widely adopted HRC in Latin America to date, is an exotic beyond the range of its wild relatives. With the exception of Oryza glumaepatula, there appears to be little threat to endemic wild species from HRC rice, as strong infertility barriers should prevent transgene flow, but possible transgene movement to the conspecific red rice may become an issue. Beyond the field boundary, a key question concerns the possible persistence of HRC × wild relative hybrids and the manner in which resistance traits affect their overall fitness. Designing practical risk assessment protocols on which to base HRC release approval is a considerable challenge, and it is vital that Latin American states continue to build and maintain functional biosafety regulatory structures.