To analyze antimicrobial prescribing practices in Australian emergency departments (EDs), to identify prescribing areas requiring improvement, and thereby to inform antimicrobial stewardship (AMS) strategies that enhance antimicrobial prescribing quality.
Design
Retrospective analysis of the Hospital National Antimicrobial Prescribing Survey (NAPS) data set.
Setting
EDs in public and private Australian hospitals (n = 652).
Participants
Hospitals (n = 652) that participated in the Hospital NAPS from 2013 to 2022.
Methods
Data were collected by trained auditors from participating hospitals with the use of a standardized auditing tool, the Hospital NAPS. Data from 2013 to 2022 were analyzed descriptively. Variables assessed included guideline compliance and appropriateness by antimicrobial and indication, and reasons for inappropriateness.
Results
There were 3,098 antimicrobial prescriptions from EDs included for analysis. Guideline compliance (63.5%) and appropriateness (70.4%) in EDs were lower than in overall prescribing across all departments. The most commonly prescribed antimicrobial was ceftriaxone (16.9%, n = 523), and the most common indication was empiric prescribing for community-acquired pneumonia (16.0%, n = 497). Amoxicillin-clavulanic acid (53.2%, n = 99) and acute exacerbation of chronic obstructive pulmonary disease (54.3%, n = 57) were the antimicrobial and indication with the lowest rates of appropriateness, respectively. Ceftriaxone prescribing also had a low rate of appropriateness (62.3%, n = 326). Selection of an antimicrobial with too broad a spectrum was the most common reason for inappropriateness (40.2%).
Conclusion
Antimicrobial prescribing quality in EDs warrants improvement. Recommended targets for AMS interventions are the excessive and inappropriate use of broad-spectrum antimicrobials such as ceftriaxone and amoxicillin-clavulanic acid in common respiratory and urinary tract infections.
In many regions of Canada, knowledge of the distribution of insect species is far from complete. This knowledge gap, known as the Wallacean Shortfall, often manifests as species records separated by large, remote areas with no records. Paradoxically, these difficult-to-access areas offer the best opportunity to study unaltered native community assemblages. Such gaps in knowledge are exemplified by ground beetles, a well-known group, yet with record gaps in many unstudied areas of Canada, including Akimiski Island, Nunavut. This postglacial rebound island, located in James Bay, has no permanently occupied human dwellings and almost no human-altered habitat. Using a combination of pitfall-malaise traps, pitfall traps, and hand captures during 2008–2014, we collected 1368 ground beetles (Coleoptera: Carabidae) as part of a larger biodiversity survey. We identified 31 species, 29 of which were first territorial records for Nunavut. Our results almost double the number of Carabidae known from Nunavut and extend the known range of eight other species. Seventeen of the species that we caught cannot fly, evidence for colonists arriving on Akimiski on floating debris. Our study fills substantial range gaps and serves as baseline information to detect future change.
Heating montmorillonites to their dehydroxylation temperatures destroyed their ability to form an aerogel. The breakdown of the aerogel structure coincided with the loss of hydroxyl water from the montmorillonite. Apparently, this loss of water was accompanied by a loss of the layer charge. Particle size and aerogel-forming ability appear to be inversely related properties for at least some montmorillonites. The kaolinite investigated did not form an aerogel in any size fraction. The formation of montmorillonite aerogels from various concentrations of clay was investigated. The texture and physical appearance of these aerogels were examined and are presented herein.
Thawed clay suspensions exhibited a variety of behaviors. The Volclay bentonite, which apparently formed a true sol, was unaffected by freezing. In all other clays, at least some of the fine clay particles agglomerated on freezing, and large clumps were observed dropping out of the melting ice. After stirring the thawed suspensions, less clay was dispersed than in the unfrozen suspension counterparts. Addition of a dispersing agent to these suspensions caused more clay to remain dispersed following freezing, thawing, and stirring.
Over the past 2 decades, several categorizations have been proposed for the abnormalities of the aortic root. These schemes have mostly been devoid of input from specialists of congenital cardiac disease. The aim of this review is to provide a classification, from the perspective of these specialists, based on an understanding of normal and abnormal morphogenesis and anatomy, with emphasis placed on the features of clinical and surgical relevance. We contend that the description of the congenitally malformed aortic root is simplified when approached in a fashion that recognizes the normal root to be made up of 3 leaflets, supported by their own sinuses, with the sinuses themselves separated by the interleaflet triangles. The malformed root, usually found in the setting of 3 sinuses, can also be found with 2 sinuses, and very rarely with 4 sinuses. This permits description of trisinuate, bisinuate, and quadrisinuate variants, respectively. This feature then provides the basis for classification of the anatomical and functional number of leaflets present. By offering standardized terms and definitions, we submit that our classification will be suitable for those working in all cardiac specialties, whether pediatric or adult. It is of equal value in the settings of acquired or congenital cardiac disease. Our recommendations will serve to amend and/or add to the existing International Paediatric and Congenital Cardiac Code, along with the Eleventh iteration of the International Classification of Diseases provided by the World Health Organization.
Considerable progress continues to be made with regard to the value and use of disease-associated polygenic scores (PGS). PGS aim to capture a person’s genetic liability to a condition, disease, or trait, combining information across many risk variants and incorporating their effect sizes. They are already available for clinicians and consumers to order in Australasia. However, debate is ongoing over the readiness of this information for integration into clinical practice and population health. This position statement provides the viewpoint of the Human Genetics Society of Australasia (HGSA) regarding the clinical application of disease-associated PGS in both individual patients and population health. The statement details how PGS are calculated, highlights their breadth of possible application, and examines their current challenges and limitations. We consider fundamental lessons from Mendelian genetics and their continuing relevance to PGS, while also acknowledging the distinct elements of PGS. Use of PGS in practice should be evidence based, and the evidence for the associated benefit, while rapidly emerging, remains limited. Given that clinicians and consumers can already order PGS, their current limitations and key issues warrant consideration. PGS can be developed for most complex conditions and traits and can be used across multiple clinical settings and for population health. The HGSA’s view is that further evaluation, including regulatory, implementation, and health-system evaluation, is required before PGS can be routinely implemented in the Australasian healthcare system.
The taxonomy of species of Bivesicula Yamaguti, 1934 is analysed for samples from holocentrid, muraenid and serranid fishes from Japan, Ningaloo Reef (Western Australia), the Great Barrier Reef (Queensland), New Caledonia and French Polynesia. Analysis of three genetic markers (cox1 mtDNA, ITS2 and 28S rDNA) identifies three strongly supported clades of species and suggests that Bivesicula as presently recognized is not monophyletic. On the basis of combined morphological, molecular and biological data, 10 species are distinguished of which five are proposed as new. Bivesicula Clade 1 comprises seven species of which three are effectively morphologically cryptic relative to each other; all seven infect serranids and four also infect holocentrids. Bivesicula Clade 2 comprises three species of which two are effectively morphologically cryptic relative to each other; all three infect serranids and one also infects a muraenid. Bivesicula Clade 3 comprises two known species from apogonids and a pomacentrid, and forms a clade with species of Paucivitellosus Coil, Reid & Kuntz, 1965 to the exclusion of other Bivesicula species. Taxonomy in this genus is made challenging by the combination of low resolving power of ribosomal markers, the existence of regional cox1 mtDNA populations, exceptional and unpredictable host-specificity and geographical distribution, and significant host-induced morphological variation.
Substantial progress has been made in the standardization of nomenclature for paediatric and congenital cardiac care. In 1936, Maude Abbott published her Atlas of Congenital Cardiac Disease, which was the first formal attempt to classify congenital heart disease. The International Paediatric and Congenital Cardiac Code (IPCCC) is now utilized worldwide and has most recently become the paediatric and congenital cardiac component of the Eleventh Revision of the International Classification of Diseases (ICD-11). The most recent publication of the IPCCC was in 2017. This manuscript provides an updated 2021 version of the IPCCC.
The International Society for Nomenclature of Paediatric and Congenital Heart Disease (ISNPCHD), in collaboration with the World Health Organization (WHO), developed the paediatric and congenital cardiac nomenclature that is now within the eleventh version of the International Classification of Diseases (ICD-11). This unification of IPCCC and ICD-11 is the IPCCC ICD-11 Nomenclature and is the first time that the clinical nomenclature for paediatric and congenital cardiac care and the administrative nomenclature for paediatric and congenital cardiac care are harmonized. The resultant congenital cardiac component of ICD-11 was increased from 29 congenital cardiac codes in ICD-9 and 73 congenital cardiac codes in ICD-10 to 318 codes submitted by ISNPCHD through 2018 for incorporation into ICD-11. After these 318 terms were incorporated into ICD-11 in 2018, the WHO ICD-11 team added an additional 49 terms, some of which are acceptable legacy terms from ICD-10, while others provide greater granularity than the ISNPCHD thought was originally acceptable. Thus, the total number of paediatric and congenital cardiac terms in ICD-11 is 367. In this manuscript, we describe and review the terminology, hierarchy, and definitions of the IPCCC ICD-11 Nomenclature. This article, therefore, presents a global system of nomenclature for paediatric and congenital cardiac care that unifies clinical and administrative nomenclature.
The members of ISNPCHD realize that the nomenclature published in this manuscript will continue to evolve. The version of the IPCCC that was published in 2017 has evolved and changed, and it is now replaced by this 2021 version. In the future, ISNPCHD will again publish updated versions of IPCCC, as IPCCC continues to evolve.
To compare antimicrobial prescribing practices in Australian hematology and oncology patients to noncancer acute inpatients and to identify targets for stewardship interventions.
Design:
Retrospective comparative analysis of a national prospectively collected database.
Methods:
Using data from the 2014–2018 annual Australian point-prevalence surveys of antimicrobial prescribing in hospitalized patients (ie, Hospital National Antimicrobial Prescribing Survey called Hospital NAPS), the most frequently used antimicrobials, their appropriateness, and guideline concordance were compared among hematology/bone marrow transplant (hemBMT), oncology, and noncancer inpatients in the setting of treatment of neutropenic fever and antibacterial and antifungal prophylaxis.
Results:
In 454 facilities, 94,226 antibiotic prescriptions for 62,607 adult inpatients (2,230 hemBMT, 1,824 oncology, and 58,553 noncancer) were analyzed. Appropriateness was high for neutropenic fever management across groups (83.4%–90.4%); however, hemBMT patients had high rates of carbapenem use (111 of 746 prescriptions, 14.9%), and 20.2% of these prescriptions were deemed inappropriate. Logistic regression demonstrated that hemBMT patients were more likely to receive appropriate antifungal prophylaxis compared to oncology and noncancer patients (adjusted OR, 5.3; P < .001 for hemBMT compared to noncancer patients). Oncology had a low rate of antifungal prophylaxis guideline compliance (67.2%), and incorrect dosage and frequency were key factors. Compared to oncology patients, hemBMT patients were more likely to receive appropriate nonsurgical antibacterial prophylaxis (aOR, 8.4; 95% CI, 5.3–13.3; P < .001). HemBMT patients were also more likely to receive appropriate nonsurgical antibacterial prophylaxis compared to noncancer patients (OR, 3.1; 95% CI, 1.9–5.0; P < .001). However, in the Australian context, the hemBMT group had higher than expected use of fluoroquinolone prophylaxis (66 of 831 prescriptions, 8%).
Conclusions:
This study demonstrates why separate analysis of hemBMT and oncology populations is necessary to identify specific opportunities for quality improvement in each patient group.
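The adjusted odds ratios reported above come from logistic regression. As a minimal illustration of how such figures relate to model output, the sketch below exponentiates a coefficient and its Wald interval; the coefficient and standard error here are invented, chosen only so the result is consistent with the reported aOR of 8.4 (95% CI, 5.3–13.3), and are not taken from the study's model.

```python
import math

def odds_ratio_ci(beta, se, z=1.96):
    """Convert a logistic-regression coefficient and its standard error
    into an odds ratio with a Wald 95% confidence interval."""
    return (math.exp(beta),
            math.exp(beta - z * se),
            math.exp(beta + z * se))

# Hypothetical coefficient and SE (illustrative, not from the paper):
or_, lo, hi = odds_ratio_ci(2.128, 0.235)
print(f"aOR {or_:.1f} (95% CI, {lo:.1f}-{hi:.1f})")  # aOR 8.4 (95% CI, 5.3-13.3)
```

The same exponentiation applies to any of the adjusted ORs quoted above; only the coefficient and its standard error differ.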
How do bureaucracies remember? The conventional view is that institutional memory is static and singular, the sum of recorded files and learned procedures. There is a growing body of scholarship that suggests contemporary bureaucracies are failing at this core task. This Element argues that this diagnosis misses that memories are essentially dynamic stories. They reside with people and are thus dispersed across the array of actors that make up the differentiated polity. Drawing on four policy examples from four sectors (housing, energy, family violence and justice) in three countries (the UK, Australia and New Zealand), this Element argues that treating the way institutions remember as storytelling is both empirically salient and normatively desirable. It is concluded that the current conceptualisation of institutional memory needs to be recalibrated to fit the types of policy learning practices required by modern collaborative governance.
Background: An important aspect of antimicrobial stewardship is the qualitative assessment of antimicrobial prescribing. Owing to a lack of standardized tools and the resources required to design, conduct, and analyze qualitative audits, these assessments are rarely performed. Objective: We designed an audit tool that was appropriate for all Australian hospital types, suited to local user requirements, and incorporated an assessment of the appropriateness of antimicrobial prescribing. Methods: In 2011, a pilot survey was conducted in 32 Australian hospitals to assess the usability and generalizability of a qualitative audit tool. The tool was revised to reflect the respondents’ feedback. A second study was performed in 2012 in 85 hospitals. In 2013, following further feedback and refinement, an online auditing tool, the Hospital National Antimicrobial Prescribing Survey (NAPS), was developed. Early audits demonstrated that surgical prophylaxis had the highest rates of inappropriate prescribing. In 2016, the Surgical NAPS was developed to further investigate reasons for this, and the NAPS program was further expanded to audit antimicrobial prescribing practices in Australian aged-care homes (ie, the Aged Care NAPS). Results: Between January 1, 2013, and November 12, 2019, 523 Australian public and private hospitals (53.8%) utilized the Hospital NAPS; 215 (22.1%) utilized the Surgical NAPS; and 774 Australian aged-care homes (29.0%) utilized the Aged Care NAPS. National reporting has identified key target areas for quality improvement initiatives at both local and national levels. The following initiatives have been outlined in 14 public reports: improved documentation; reduction of prolonged antimicrobial prophylaxis; compliance with prescribing guidelines; appropriateness of prescribing; access to evidence-based guidelines; and improved microbiology sampling.
Conclusions: By utilizing the Plan-Do-Study-Act cycle for healthcare improvement and by involving end users in the design and evaluation, we have created a practical and relevant auditing program to assess both quantitative and qualitative aspects of antimicrobial prescribing in a wide range of settings. This voluntary program is now endorsed by the National Strategy for Antimicrobial Resistance Surveillance, partners with the Antimicrobial Use and Resistance in Australian Surveillance System, and is utilized by facilities to meet mandatory national accreditation standard requirements. With the success of the NAPS program in Australia, it has now been implemented in New Zealand, Canada, Malaysia, Fiji, and Bhutan, with plans for other countries to implement the program soon. Current research is being conducted to expand the program to include audits for family physicians, veterinarians, and remote indigenous communities, and for antifungal use.
Background: Orthopedic procedures are performed at high volumes in Australia. Thus, they are a commonly audited procedure group when measuring surgical antimicrobial prophylaxis (SAP) appropriateness and compliance in Australia and internationally. Recent analysis of the Surgical National Antimicrobial Prescribing Survey (Surgical NAPS) revealed high rates of inappropriateness, both procedurally (39.5%) and postprocedurally (53.0%). Inappropriate use can lead to patient harm and further increases the risk of antimicrobial resistance (AMR). Identification of factors associated with inappropriate orthopedic SAP prescribing may support the development of antimicrobial stewardship (AMS) interventions that are tailored to the orthopedic surgical setting to improve SAP. Methods: The Surgical NAPS has been available for all Australian hospitals to complete since 2016; it supports the assessment of SAP appropriateness. Appropriateness is a composite measure based on antibiotic choice, timing of administration, dose, and duration, applying the standardized Surgical NAPS Appropriateness Assessment Guide. Logistic regression was used to identify hospital, patient, and surgical factors associated with appropriateness. Adjusted appropriateness (AA) was calculated by generating marginal means from the multivariable model and averaging across all available covariates. Significance for multivariable analysis was determined as P < .05. Additional subanalyses were conducted on smaller subsets to calculate the AA for specific orthopedic procedures. Results: In total, 140 facilities contributed to orthopedic audits in the Surgical NAPS from January 1, 2016, to April 15, 2019, including 4,032 orthopedic surgical episodes and 6,709 prescribed doses. Overall appropriateness for prescribed procedural doses (n = 3,978) was 64.7% and was lower for prescribed postprocedural doses (n = 2,731, 48.3%).
When antimicrobials were not prescribed, appropriateness was higher procedurally (n = 350, 89.7%) and postprocedurally (n = 1,127, 97.8%). When SAP was indicated, the most common reasons for inappropriateness were timing for procedural doses (50.9%) and duration for postprocedural prescriptions (49.8%). The AA of each orthopedic procedure group was low for procedural SAP, ranging from 54.1% for knee surgery to 74.1% for total knee joint replacement. The AA of postprocedural prescriptions was also low, ranging from 40.7% for hand surgery to 68.7% for closed reduction of fractures. Conclusions: Orthopedic surgical specialties demonstrated differences across procedural and postprocedural appropriateness. The metric of appropriateness is meaningful for both orthopedic surgeons and AMS programs. Targeted quality improvement projects are needed for orthopedic surgical procedures, along with study of the engagement between orthopedic surgeons, AMS programs, and guideline developers, to support optimization of antimicrobial use in the surgical setting.
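The adjusted appropriateness (AA) figures above are marginal means from a multivariable logistic model: predictions for a fixed procedure group, averaged over the observed distribution of the other covariates. A minimal sketch of that averaging step follows; every coefficient and covariate value here is invented for illustration and none comes from the Surgical NAPS model.

```python
import math

def predict(procedure_effect, hospital_effect, intercept=0.2):
    """Predicted probability of an appropriate prescription from a toy
    logistic model (all coefficients illustrative, not the study's)."""
    logit = intercept + procedure_effect + hospital_effect
    return 1.0 / (1.0 + math.exp(-logit))

def adjusted_appropriateness(procedure_effect, hospital_effects):
    """Marginal standardization: hold the procedure group fixed and
    average the predicted probability over the observed covariates."""
    preds = [predict(procedure_effect, h) for h in hospital_effects]
    return sum(preds) / len(preds)

# Hypothetical hospital-level effects observed in the data:
hospitals = [-0.5, 0.0, 0.3, 0.8]
aa = adjusted_appropriateness(0.1, hospitals)
print(f"adjusted appropriateness: {aa:.1%}")
```

In practice the averaging runs over every audited episode's full covariate vector rather than a single hospital effect, but the principle, predict then average, is the same.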
Surgical antimicrobial prophylaxis (SAP) is commonly administered in orthopedic procedures. Research regarding SAP appropriateness for specific orthopedic procedures is limited and is required to facilitate targeted orthopedic prescriber behavior change.
Objectives:
To describe SAP prescribing and appropriateness for orthopedic procedures in Australian hospitals.
Design, setting, and participants:
Multicenter, national, quality improvement study with retrospective analysis of data collected from Australian hospitals via Surgical National Antimicrobial Prescribing Survey (Surgical NAPS) audits from January 1, 2016, to April 15, 2019.
Methods:
Logistic regression identified hospital, patient and surgical factors associated with appropriateness. Adjusted appropriateness was calculated from the multivariable model. Additional subanalyses were conducted on smaller subsets to calculate the adjusted appropriateness for specific orthopedic procedures.
Results:
In total, 140 facilities contributed to orthopedic audits in the Surgical NAPS, including 4,032 orthopedic surgical episodes and 6,709 prescribed doses. Overall appropriateness was low, 58.0% (n = 3,894), and differed between prescribed procedural (n = 3,978, 64.7%) and postprocedural doses (n = 2,731, 48.3%). The most common reasons for inappropriateness, when prophylaxis was required, were timing for procedural doses (50.9%) and duration for postprocedural prescriptions (49.8%). The adjusted appropriateness of each orthopedic procedure group was low for procedural SAP, ranging from 54.1% for knee surgery to 74.1% for total knee joint replacement. The adjusted appropriateness of postprocedural prescriptions was also low, ranging from 40.7% for hand surgery to 68.7% for closed reduction of fractures.
Conclusions:
Orthopedic surgical specialties demonstrated differences across procedural and postprocedural appropriateness. The metric of appropriateness identifies targets for quality improvement and is meaningful for clinicians. Targeted quality improvement projects for orthopedic specialties need to be developed to support optimization of antimicrobial use.
An internationally approved and globally used classification scheme for the diagnosis of CHD has long been sought. The International Paediatric and Congenital Cardiac Code (IPCCC), which was produced and has been maintained by the International Society for Nomenclature of Paediatric and Congenital Heart Disease (the International Nomenclature Society), is used widely, but has spawned many “short list” versions that differ in content depending on the user. Thus, efforts to have a uniform identification of patients with CHD using a single up-to-date and coordinated nomenclature system continue to be thwarted, even if a common nomenclature has been used as a basis for composing various “short lists”. In an attempt to solve this problem, the International Nomenclature Society has linked its efforts with those of the World Health Organization to obtain a globally accepted nomenclature tree for CHD within the 11th iteration of the International Classification of Diseases (ICD-11). The International Nomenclature Society has submitted a hierarchical nomenclature tree for CHD to the World Health Organization that is expected to serve increasingly as the “short list” for all communities interested in coding for congenital cardiology. This article reviews the history of the International Classification of Diseases and of the IPCCC, and outlines the process used in developing the ICD-11 congenital cardiac disease diagnostic list and the definitions for each term on the list. An overview of the content of the congenital heart anomaly section of the Foundation Component of ICD-11, published herein in its entirety, is also included. Future plans for the International Nomenclature Society include linking again with the World Health Organization to tackle procedural nomenclature as it relates to cardiac malformations. 
By doing so, the Society will continue its role in standardising nomenclature for CHD across the globe, thereby promoting research and better outcomes for fetuses, children, and adults with congenital heart anomalies.
Greenhouse experiments in central Texas assessed the relative importance of above- and belowground interactions of semidwarf Mit wheat and Marshall ryegrass during vegetative growth. One experiment used partitions to compare the effects of no interaction (controls), aboveground-only, belowground-only, and full interaction for 75 d after planting (DAP) one wheat and nine ryegrass plants in soil volumes of 90, 950, and 3,800 ml. The results with the different soil volumes were similar. Wheat growth in the aboveground-only interaction did not differ from controls. However, the full or belowground-only interaction of wheat with ryegrass reduced wheat height, leaf number, tillering, leaf area, percent total nonstructural carbohydrates in shoot, and dry weights of leaves, stems, and roots 45 and 75 DAP compared to controls. Wheat in the full and belowground-only interactions did not differ from one another in growth. A replacement series experiment of 56 d also showed that the competitive advantage of ryegrass was relatively greater in root than in shoot growth. No allelopathic response of wheat to ryegrass occurred. While the tallness of the semidwarf wheat minimized aboveground interference by ryegrass, the thinner and more fibrous roots of ryegrass greatly enhanced its belowground competitiveness.
A greenhouse experiment compared the vegetative growth in pure cultures and mixtures of winter Triticum aestivum cultivar ‘Mit’ and Lolium multiflorum cultivar ‘Marshall’ in continuously watered controls and drought treatments. Control L. multiflorum in pure culture 14 wk after planting produced more leaf area, tillers, and dry weights of stem and root than control T. aestivum in pure culture. The greater seed size, larger initial leaf area, and height allowed T. aestivum to produce greater final leaf area and dry stem weight in control mixtures than L. multiflorum. Watering following drought shifted the relative performance of the two species in pure cultures and mixtures compared to controls. The ability of T. aestivum to maintain a greater leaf expansion rate during drought and a greater leaf area afterward than L. multiflorum allowed T. aestivum to attain greater growth than L. multiflorum in pure cultures exposed to temporary drought followed by watering. Conversely, drought and its relief enhanced the relative competitiveness of L. multiflorum compared to controls in mixtures with T. aestivum. During 4 wk of watering following the drought, L. multiflorum in mixtures grew vigorously and was similar to T. aestivum in all measures except height and dry stem weight. Thus, L. multiflorum was similar in root growth to T. aestivum in control and drought mixtures and had its aboveground competitiveness amplified by the cycle of drought and watering in this study. There was no evidence of an allelopathic interaction between the two species.
Live oak (Quercus virginiana Mill. # QUEVM) on the Texas Coastal Prairie was treated with herbicides using ground and aerial application methods. Tebuthiuron {N-[5-(1,1-dimethylethyl)-1,3,4-thiadiazol-2-yl]-N,N′-dimethylurea} and pellets of buthidazole {3-[5-(1,1-dimethylethyl)-1,3,4-thiadiazol-2-yl]-4-hydroxy-1-methyl-2-imidazolidinone} at 2.2 kg ai/ha were the most effective herbicides, killing 60 to 95% of the live oak. Tebuthiuron pellets 3.2 mm in diam were more effective than the wettable powder at 1.1 kg/ha. Bay Met 1486 {N-[5-(ethylsulfonyl)-1,3,4-thiadiazole-2-yl]-N,N′-dimethylurea}, Dowco 290 (3,6-dichloropicolinic acid), hexazinone [3-cyclohexyl-6-(dimethylamino)-1-methyl-1,3,5-triazine-2,4(1H, 3H)-dione], and picloram (4-amino-3,5,6-trichloropicolinic acid) reduced the live oak canopy at 2.2 kg/ha whereas 2,4,5-T [(2,4,5-trichlorophenoxy)acetic acid] and triclopyr {[(3,5,6-trichloro-2-pyridinyl)oxy] acetic acid} were ineffective. Foliage-active herbicides generally were most effective in reducing the live oak canopy during the year of application. The soil-active herbicides generally were most active 1 or 2 yr after herbicide application. All herbicides reduced the live oak cover sufficiently to allow an increase in grass cover 2 to 4 months after treatment. Tebuthiuron at 2.2 kg/ha maintained a high degree of grass cover at least 2 or 3 yr after treatment.
Pelleted tebuthiuron {N-[5-(1,1-dimethylethyl)-1,3,4-thiadiazol-2-yl]-N,N′-dimethylurea} was hand broadcast at 2.2 and 4.5 kg ai/ha every month for 2 yr on an area infested with live oak (Quercus virginiana Mill. ♯ QUEVI), post oak (Q. stellata Wangenh. ♯ QUESL), parsley hawthorn (Crataegus marshallii Egglest. ♯ CSCMS), and yaupon (Ilex vomitoria Ait. ♯ ILEVO) on the Gulf Coast Prairie near Cordele, TX. Live oak, post oak, and parsley hawthorn trees were killed at most rates and dates of tebuthiuron application. Applications of 2.2 kg/ha of tebuthiuron killed 90% or more of the yaupon plants when applied in October and December 1975 and February, March, and June 1976, and less than 90% when applied at other dates. On another site, pelleted tebuthiuron was aerially applied at 2.2 and 4.5 kg/ha every 3 months during 1978 and 1979 in the Post Oak Savannah near Bryan, TX. At 2.2 kg/ha, tebuthiuron killed all post oak and 80% or more of the blackjack oak (Q. marilandica Muechh.), yaupon, winged elm (Ulmus alata Michx. ♯ ULMAL), and mockernut hickory (Carya tomentosa Nutt.) regardless of date treated. Tree huckleberry (Vaccinium arboreum Marsh.) killed by tebuthiuron applied at 2.2 kg/ha ranged from 34% for application in July 1979 to 69% for application in February 1978. Application of 4.5 kg/ha of tebuthiuron killed 83% or more of the tree huckleberry when applied in January and April 1978 and January, April, July, and October 1979. Herbaceous plant cover usually increased by the second season.
Brownspine pricklypear (Opuntia phaecantha Engelm. & Bigel.) was effectively controlled within 2 yr following application of a 1:1 mixture of 2,4,5-T [(2,4,5-trichlorophenoxy)acetic acid] and picloram (4-amino-3,5,6-trichloropicolinic acid) at a rate of 0.6 kg ae/ha. Brownspine pricklypear canopy cover and dry weight declined from approximately 23% and 3800 kg/ha to 8% and 1600 kg/ha, respectively. No significant difference in total herbaceous forage dry weight was found between plants growing inside brownspine pricklypear canopy areas and plants growing outside the canopy areas. Differences between areas in species composition were significant in that cool-season grasses dominated the canopy area of the brownspine pricklypear colonies while warm-season grasses dominated the area outside the canopy. Control of brownspine pricklypear will enhance livestock carrying capacity of rangeland in the Rolling Plains of Texas by increasing forage availability but not forage production.
System dynamics models are typically used to simulate the behaviour of the problem system under discussion, to help understand and solve complex problems. Group model building is a social process for including client groups in the system dynamics modelling process. Recent evidence suggests group model building is useful in supporting durable group decisions by helping the mental models of participants become more aligned. Several mechanisms have been proposed to explain these effects. This paper creates a combined model that links the five best-supported mechanisms. The combined model suggests five core conditions of group model building that contribute to its success: completing a structured task, producing a tangible artefact, representing system complexity, portraying causal links, and allowing easy modification or transformation of the artefact by participants. Practitioners are encouraged to use group decision approaches that integrate these conditions.