Wild oat is a significant weed of cropping systems in the Canadian Prairies. Wild oat resistance to herbicides has increased interest in the use of nonchemical management strategies. Harvest weed seed control techniques such as impact mills or chaff collection have been of interest in Prairie crops, with wild oat identified as a key target. To evaluate the effects of crop rotation maturity, harvest management, and harvest weed seed control on wild oat, a study was conducted from 2016 to 2018 at four locations in the Canadian Prairies. Two-year crop rotations with early-, normal-, or late-maturing crops were implemented before barley was seeded across all rotations in the final year. A second factor, harvest management (swathing or straight cutting), was also included in the study. Chaff collection was used to quantify wild oat seeds that were targetable by harvest weed seed control techniques. The hypothesis was that earlier-maturing crops would result in increased wild oat capture at harvest and, therefore, lower wild oat populations. Wild oat density and biomass were lowest in the early-maturing rotations. In addition, wild oat biomass was lower in swathed crops than in straight-cut crops. Wild oat seedbank levels followed a similar trend: densities were lowest in the early-maturing rotation, intermediate in the normal-maturity rotation, and highest in the late-maturing rotation. Wild oat densities increased in all crop rotations; however, only harvest weed seed control and crop rotation were implemented as control measures. Wild oat numbers in the chaff did not reflect the earliness of harvest. Crop yields suggest that competitive winter wheat stands contributed to the success of the early-maturing rotations compared with the other treatments. Early-maturing rotations reduced wild oat populations, likely through a combination of crop competitiveness and rotational diversity, together with harvest weed seed control effects from earlier-maturing crops.
Objectives: Understanding how the importance of modifiable risk factors for dementia varies by cognitive status and sex is vital for the development of effective approaches to dementia prevention. We aimed to calculate population attributable fractions (PAFs) for incident dementia associated with sets of risk factors while exploring sex differences in individuals who are cognitively normal (CN) or have mild cognitive impairment (MCI).
Methods: Longitudinal data from the Rush University Memory and Aging Project (MAP) were analysed. Included participants were aged over 50 years and were CN or had a diagnosis of MCI at their baseline assessment. Analyses considered fifteen potential dementia risk factors covering cardiometabolic, lifestyle, psychosocial and sensory domains. We used Cox proportional hazards models to estimate hazard ratios for incident dementia associated with these risk factors and calculated weighted PAFs. All analyses were repeated stratified by sex.
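As a rough illustration of the approach, the sketch below computes per-factor PAFs with Levin's formula, taking the Cox hazard ratio as the relative-risk estimate, and combines them multiplicatively. The exposure prevalences, hazard ratios and combination rule are illustrative assumptions; the abstract does not specify the weighting scheme behind the reported weighted PAFs.

```python
# Minimal sketch of a PAF calculation, assuming Levin's formula with the Cox
# hazard ratio used in place of the relative risk. Inputs and the multiplicative
# combination rule are illustrative, not the study's actual weighting scheme.

def levin_paf(prevalence: float, hazard_ratio: float) -> float:
    """Population attributable fraction for a single risk factor."""
    excess = prevalence * (hazard_ratio - 1.0)
    return excess / (1.0 + excess)

def combined_paf(pafs):
    """Combine individual PAFs multiplicatively (assumes independent factors)."""
    remaining = 1.0
    for paf in pafs:
        remaining *= 1.0 - paf
    return 1.0 - remaining

# Hypothetical exposure prevalences and hazard ratios for three risk factors.
factors = {
    "smoking": (0.20, 1.6),
    "physical_inactivity": (0.35, 1.4),
    "depression": (0.15, 1.5),
}

individual = {name: levin_paf(p, hr) for name, (p, hr) in factors.items()}
print({name: round(v, 3) for name, v in individual.items()})
print("combined PAF:", round(combined_paf(individual.values()), 3))
```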
Results: The analytical sample comprised 754 cognitively normal participants (77.2% female) and 242 participants with a diagnosis of MCI (71.9% female), of whom 214 (28.4%) and 120 (49.6%), respectively, were diagnosed with dementia over the follow-up. Although the weighted overall PAF was similar for the CN (24.7%) and MCI (25.2%) subgroups, sex differences were present in both. PAFs were higher in males than in females in both the CN (42.5% vs. 25.1%) and MCI (51.6% vs. 12.3%) subgroups. The profiles of contributing risk factors also varied by sex. In males, the highest PAFs were for smoking (11.1%), vision impairment (6.2%) and stroke (6.0%) in CN and for smoking (13.3%), physical inactivity (12.9%) and heart attack (7.9%) in MCI. In females, the highest PAFs were for unmarried marital status (4.9%), depression (4.1%) and social isolation (3.8%) in CN and for vision impairment (4.4%), increased alcohol intake (3.5%) and depression (2.6%) in MCI.
Conclusions: These findings support the notion that dementia risk is modifiable after the onset of MCI. They also highlight the potential benefits of considering an individual’s cognitive status and sex when formulating dementia prevention strategies.
A green, Lithic Torriorthent soil derived from a celadonite-rich, hydrothermally altered basalt immediately north of the Mojave Desert region in southern California was studied to investigate the fate of the celadonite in a pedogenic weathering environment. Celadonite was found to be disseminated in the highly altered rock matrix with cristobalite, chalcedony, and stilbite. X-ray powder diffraction (XRD) showed the soil material to contain celadonite having a d(060) value of 1.510 Å, indicative of its dioctahedral nature. Very little smectite was detected in the parent material, whereas Fe-rich smectite was found to be abundant in the soil. The Fe-smectite and celadonite were identified as the sole components of the green-colored clay fraction (<2 µm) of all soil horizons. The soil clay showed a single d(060) value of 1.507 Å, indicating that the smectite was also dioctahedral and that its b-dimension was the same as that of the celadonite. Mössbauer spectroscopy showed that the chemical environments of Fe in the rock-matrix celadonite and in the smectite-rich soil clay were also nearly identical. These data strongly suggest a simple transformation of the celadonite to an Fe-rich smectite during soil formation.
Supporting evidence for this transformation was obtained by artificial weathering of celadonite, using sodium tetraphenylboron to extract interlayer K. The intensity of the 001 XRD peak (at 10.1 Å) of celadonite was greatly reduced after the treatment and a peak at 14.4 Å, absent in the pattern of the untreated material, appeared. On glycolation of the sample, this peak expanded to 17.4 Å, similar to the behavior of the soil smectite. The alteration of celadonite to smectite is a simple transformation requiring only the loss of interlayer K. The transformation is apparently possible under present-day conditions, inasmuch as the erosional landscape position, shallow depth, and lack of significant horizonation indicate that the soil is very young.
Modafinil was tested for efficacy in facilitating abstinence in cocaine-dependent patients, compared to placebo.
Methods:
This was a double-blind, placebo-controlled study with 12 weeks of treatment and a 4-week follow-up. 210 treatment-seekers with a DSM-IV diagnosis of cocaine dependence consented and enrolled. 72 participants were randomized to placebo, 69 to modafinil 200 mg, and 69 to modafinil 400 mg, taken once daily on awakening. Participants attended the clinic three times per week for assessments and urine drug screens, and had one hour of individual psychotherapy once per week. The primary outcome was the increase in the weekly percentage of cocaine non-use days. Secondary outcomes included: decrease in the weekly median log of urine benzoylecgonine, subgroup analyses of balancing factors and co-morbid conditions, self-report of alcohol use, addiction severity, craving, and risk behaviors for HIV.
Results:
125 participants (60%) completed 12 weeks of treatment. The GEE regression analysis showed that, for the total sample, the difference between the modafinil groups and placebo in the weekly percentage of cocaine non-use days over the 12-week treatment period was not statistically significant (p=0.95). A post-hoc analysis showed a significant effect of modafinil only in the subgroup of cocaine-dependent patients without alcohol dependence. Modafinil 200 mg also showed significant effects: an increase in the total number of consecutive cocaine non-use days (p=0.02) and a reduction in craving (p=0.04).
Conclusions:
These data suggest that modafinil, in combination with individual behavioral therapy, was effective in increasing cocaine non-use days in participants without co-morbid alcohol dependence, and in reducing craving.
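For readers unfamiliar with the GEE analysis named above, the sketch below shows one way such a model of the weekly percentage of cocaine non-use days might be set up in Python with statsmodels. The data frame, column names, simulated values and exchangeable working correlation are illustrative assumptions, not the trial's actual analysis code.

```python
# Illustrative GEE setup for a repeated-measures outcome (weekly percentage of
# non-use days), clustered by participant. All data here are simulated.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
arms = ["placebo", "modafinil_200", "modafinil_400"]
rows = []
for subject in range(90):
    arm = arms[subject % 3]
    baseline = rng.uniform(30, 60)
    for week in range(1, 13):
        rows.append({
            "subject": subject,
            "week": week,
            "treatment": arm,
            "pct_nonuse": float(np.clip(baseline + 2 * week + rng.normal(0, 10), 0, 100)),
        })
df = pd.DataFrame(rows)

# Exchangeable working correlation accounts for within-subject clustering.
model = smf.gee(
    "pct_nonuse ~ C(treatment) + week",
    groups="subject",
    data=df,
    cov_struct=sm.cov_struct.Exchangeable(),
)
print(model.fit().summary())
```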
The residual closure of a subgroup H of a group G is the intersection of all virtually normal subgroups of G containing H. We show that if G is generated by finitely many cosets of H and if H is commensurated, then the residual closure of H in G is virtually normal. This implies that separable commensurated subgroups of finitely generated groups are virtually normal. A stream of applications to separable subgroups, polycyclic groups, residually finite groups, groups acting on trees, lattices in products of trees and just-infinite groups then flows from this main result.
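In symbols, the definition and main result quoted above can be restated as follows (the notation for the residual closure is ours, chosen only for illustration):

```latex
% Residual closure of H in G: intersection of all virtually normal subgroups containing H.
\[
  \overline{H}^{G} \;=\; \bigcap \,\{\, K \leq G \;:\; H \leq K \text{ and } K \text{ is virtually normal in } G \,\}.
\]
% Main result: if G is generated by finitely many cosets g_1 H, ..., g_n H and
% H is commensurated in G, then the residual closure of H is virtually normal in G.
\[
  G = \langle g_1 H \cup \dots \cup g_n H \rangle \ \text{and}\ H \text{ commensurated in } G
  \;\Longrightarrow\; \overline{H}^{G} \text{ is virtually normal in } G.
\]
```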
Introduction: Identification of latent safety threats (LSTs) in the emergency department is an important aspect of quality improvement that can lead to improved patient care. In situ simulation (ISS) takes place in the real clinical environment, and multidisciplinary teams can participate in diverse high-acuity scenarios to identify LSTs. The purpose of this study is to examine the influence that the profession of the participant (i.e. physician, registered nurse, or respiratory therapist) has on the identification of LSTs during ISS. Methods: Six resuscitation-based adult and pediatric simulated scenarios were developed and delivered to multidisciplinary teams in the Kingston General Hospital ED. Each ISS session consisted of a 10-minute scenario, followed by 3 minutes of individual survey completion and a 7-minute group debrief led by ISS facilitators. An objective assessor recorded LSTs identified during each debrief. Surveys were completed prior to debrief to reduce response bias. Data were collected on participant demographics and perceived LSTs, classified in the following categories: medication; equipment; resources and staffing; teamwork and communication; or other. Two reviewers evaluated survey responses and debrief notes to formulate a list of unique LSTs across scenarios and professions. The overall number and type of LSTs from surveys was identified and stratified by health care provider. Results: Thirteen ISS sessions were conducted with a total of 59 participants. Thirty-four unique LSTs (8 medication, 15 equipment, 5 resource, 4 communication, and 2 miscellaneous issues) were identified from surveys and debrief notes. Overall, based on individual survey data, MDs (n = 12) reported 19 LSTs, RNs (n = 41) reported 77 LSTs, and RTs (n = 6) reported 4 LSTs. The most commonly identified category of LSTs reported by MDs (36.8%) and RTs (75%) was equipment issues, while RNs most commonly identified medication issues (36.4%). Participants with ≤5 years of experience in their profession identified, on average, more LSTs in surveys than participants with >5 years of experience (1.9 vs. 1.5 LSTs, respectively). Conclusion: Nursing staff identified the highest number of LSTs across all categories. There was fairly unanimous identification of major LSTs across professions; however, each profession did identify unique perspectives on LSTs in survey responses. ISS programs with the purpose of LST identification would benefit from multidisciplinary participation.
In sub-Saharan Africa, there are limited data on the burden of non-alcohol substance (NAS) abuse and depressive symptoms (DS), yet potential risk factors such as alcohol use and intimate partner violence (IPV) are common and NAS abuse may be on the rise. The aim of this study was to measure the burden of DS and NAS abuse, and to determine whether alcohol use and IPV are associated with DS and/or NAS abuse. We conducted a cross-sectional study at five sites in four countries: Nigeria (nurses), South Africa (teachers), Tanzania (teachers) and two sites in Uganda (rural and peri-urban residents). Participants were selected by simple random sampling from a sampling frame at each of the study sites. We used a standardized tool to collect data on demographics, alcohol use, NAS use, IPV and DS, and calculated prevalence ratios (PR). We enrolled 1415 respondents, of whom 34.6% were male. DS occurred among 383 (32.3%) and NAS use among 52 (4.3%). In the multivariable analysis, being female (PR = 1.49, p = 0.008), NAS abuse (PR = 2.06, p = 0.02) and IPV (PR = 2.93, p < 0.001) were significantly associated with DS. Older age (odds ratio (OR) = 0.31, p < 0.001) and being female (OR = 0.48, p = 0.036) were protective against NAS use, but current smokers (OR = 2.98, p < 0.001) and those reporting IPV (OR = 2.16, p = 0.024) were more likely to use NAS. Longitudinal studies should be done to establish temporal relationships with these risk factors to provide a basis for interventions.
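As an aside on the prevalence ratios reported above: one common way to estimate a PR from cross-sectional data is a Poisson regression with robust standard errors (a "modified Poisson" model). The abstract does not state which estimator the authors used, so the sketch below, with simulated data and hypothetical column names, is only illustrative.

```python
# Illustrative prevalence-ratio estimation via modified Poisson regression
# (Poisson GLM with robust standard errors) on simulated cross-sectional data.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 1000
df = pd.DataFrame({
    "female": rng.integers(0, 2, n),
    "ipv": rng.integers(0, 2, n),
})
# Simulate depressive symptoms with a higher prevalence when IPV is reported.
prob = 0.20 + 0.20 * df["ipv"] + 0.05 * df["female"]
df["ds"] = (rng.random(n) < prob).astype(int)

model = smf.glm("ds ~ female + ipv", data=df, family=sm.families.Poisson())
result = model.fit(cov_type="HC1")
print(np.exp(result.params))        # prevalence ratios
print(np.exp(result.conf_int()))    # 95% confidence intervals
```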
The 2013 Infection Prevention and Control (IP&C) Guideline for Cystic Fibrosis (CF) was commissioned by the CF Foundation as an update of the 2003 Infection Control Guideline for CF. During the past decade, new knowledge and new challenges provided the following rationale to develop updated IP&C strategies for this unique population:
1. The need to integrate relevant recommendations from evidence-based guidelines published since 2003 into IP&C practices for CF. These included guidelines from the Centers for Disease Control and Prevention (CDC)/Healthcare Infection Control Practices Advisory Committee (HICPAC), the World Health Organization (WHO), and key professional societies, including the Infectious Diseases Society of America (IDSA) and the Society for Healthcare Epidemiology of America (SHEA). During the past decade, new evidence has led to a renewed emphasis on source containment of potential pathogens and the role played by the contaminated healthcare environment in the transmission of infectious agents. Furthermore, applying the principles of implementation science, adherence monitoring, and feedback has been shown to increase the effectiveness of IP&C guideline recommendations.
2. Experience with emerging pathogens in the non-CF population has expanded our understanding of droplet transmission of respiratory pathogens and can inform IP&C strategies for CF. These pathogens include severe acute respiratory syndrome coronavirus and the 2009 influenza A H1N1. Lessons learned about preventing transmission of methicillin-resistant Staphylococcus aureus (MRSA) and multidrug-resistant gram-negative pathogens in non-CF patient populations also can inform IP&C strategies for CF.
Holdawayella juglandis Loan, a new species, and some aspects of the anatomy of the final-instar larva of the only other known species of this genus, H. tingiphaga Loan, are described. Host records and field data are reported for both species for Ontario. Though the adults of the two species are very similar morphologically, H. juglandis lacks parthenogenesis, is restricted to the tingid Corythucha juglandis Fitch that breeds only on species of Juglans L., and has specific phenological characteristics. In both species, the head sclerites of the final-instar larva are typically euphorine, and the abdomen bears 3 unpaired, medial, teat-like appendages on segments 5, 6, and 7 whose function is unknown and which do not seem to have homologues in other insect larvae. Both species are single-brooded, lay their eggs in late-instar nymphs and possibly also teneral adults of Corythucha, overwinter as first-instar larvae in adults of these tingids, and complete their endoparasitic and cocoon development in about 90 days during the following spring and summer so that adults of H. tingiphaga begin to emerge from the soil about mid-July and those of H. juglandis about 8 days later. New host records for H. tingiphaga are C. coryli O. & D., C. heidmanni Drake, and C. ulmi O. & D.
Examining the relationship between glucose intolerance and dietary intake in genetically similar populations with different dietary patterns and rates of type 2 diabetes may provide important insights into the role of diet in the pathogenesis of this disease. The objective of the present study was to assess the relationship between dietary variables and dysglycaemia/type 2 diabetes among three populations of African origin. This was a cross-sectional study of men and women of African descent aged 24–74 years from Cameroon (n 1790), Jamaica (n 857) and Manchester, UK (n 258) who were not known to have diabetes. Each participant had anthropometric measurements and underwent a 2 h 75 g oral glucose tolerance test. Habitual dietary intake was estimated with quantitative FFQ, developed specifically for each country. The age-adjusted prevalence of undiagnosed type 2 diabetes in Cameroon was low (1·1 %), but it was higher in Jamaica (11·6 %) and the UK (12·6 %). Adjusted generalised linear and latent mixed models used to obtain OR indicated that each 1·0 % increment in energy from protein, total fat and saturated fats significantly increased the odds of type 2 diabetes by 9 (95 % CI 1·02, 1·16) %, 5 (95 % CI 1·01, 1·08) % and 16 (95 % CI 1·08, 1·25) %, respectively. A 1 % increase in energy from carbohydrates and a 0·1 unit increment in the PUFA:SFA ratio were associated with significantly reduced odds of type 2 diabetes. The results show independent effects of dietary factors on hyperglycaemia in populations of African origin. Whether modifying the intake of specific macronutrients helps prevent diabetes needs testing in randomised trials.
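The reported percentage increases in odds are simply the odds ratios re-expressed as percentages; for example, for protein the figures above correspond to:

```latex
\[
  \%\,\Delta\text{odds} \;=\; (\mathrm{OR} - 1) \times 100,
  \qquad
  \mathrm{OR}_{\text{protein}} = 1.09 \ (95\%\ \mathrm{CI}\ 1.02,\ 1.16)
  \;\Rightarrow\; 9\% \ (2\%\text{--}16\%) \text{ higher odds per } 1\% \text{ of energy.}
\]
```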
The group G streptococcus has generally not been considered a prominent pathogen. In a 1982 study of the colonization rate by β-haemolytic streptococci in apparently healthy children, aged 5–11 years, 25 of 69 isolates belonged to group G. This surprisingly high rate of group G colonization (14·3%) led to a retrospective study of school surveys in 1967 which showed that the colonization rate with this organism was 2·3% (range 1·3–3·5%). A review of bacitracin-sensitive streptococcal isolates from hospital admissions of patients with acute glomerulonephritis (AGN), rheumatic fever, and their siblings, between January 1967 and July 1980, was conducted. Of 1063 bacitracin-sensitive isolates, 63 were group G, and 52 of these were isolated from AGN patients and their siblings, i.e. 7 from skin lesions of AGN patients, 40 from the throats of siblings and only 5 from the skin of the siblings. The other 11 group G isolates were from rheumatic-fever patients and their siblings. Thus, the group G colonization rate fluctuates in the population. The isolation of only group G streptococci from skin lesions of patients with AGN suggests a possible association between group G streptococcal pyoderma and acute post-streptococcal glomerulonephritis.
Since 1973, epidemiological surveillance of laboratory-confirmed hepatitis B virus infection has been undertaken in Scotland. During the ten-year period 1973–82, 2893 persons with laboratory evidence of infection were reported, and the number increased almost threefold between the beginning and the end of this time. Males accounted for 66% of the patients and intravenous drug abuse was the most commonly encountered risk factor. The low risk to laboratory staff is confirmed, but among National Health Service hospital staff, nurses accounted for 54% of those reported.
We have previously described the characteristics of a relatively non-pathogenic laboratory strain of S. mattheei, attenuation of which was apparently caused by passage in hamsters. We now show that chronic infection with this avirulent strain largely protects sheep from the manifestations of acute schistosomiasis when challenged with a virulent strain of S. mattheei.
Four sheep were each infected with 10 000 cercariae of the avirulent strain and, together with four worm-free sheep, challenged 63 weeks later with 10 000 S. mattheei cercariae of a pathogenic strain. Four more sheep acted as uninfected controls. Following challenge, the animals were weighed and bled weekly for PCV and serum protein determinations, and egg counts were carried out fortnightly on faeces taken from the rectum. Red cell and albumin turnover were monitored for two weeks immediately before challenge and for a similar period before necropsy, when the adult worms were recovered by perfusion and tissues sampled for histopathology and egg counting.
The unvaccinated sheep developed severe disease 6–12 weeks after exposure, characterised by marked anaemia, hypoalbuminaemia and hypergammaglobulinaemia coinciding with the passage of blood-stained faeces and progressive inappetence. In the vaccinated sheep, there was an even earlier rise in gamma globulins, but the other clinico-pathological changes were generally slower to develop and much milder in severity. The parasitological data showed that although this was partly due to a reduction in the establishment of the challenge worm population, the main factor was probably a reduction in the fecundity of these worms.
Studies were made to find evidence of louping-ill virus infection in free-living red grouse and to relate this to their breeding success. In areas where ticks were abundant, 61 (84%) adult grouse had antibody to the virus, compared with 1 (10%) in areas where ticks were relatively scarce. Of 162 chicks tested, 25 were shown to be viraemic. Infected chicks weighed significantly less than comparably aged uninfected birds, and the probability that they died was much greater than that of uninfected birds. It is concluded that the relatively poor breeding success in areas of high tick numbers was principally due to infection with louping-ill virus. The susceptibility of the red grouse to infection is discussed.
The objectives of this study were to determine the prevalence of scabies in an infested village; to educate the residents on self-treatment and prevention using 5% monosulfiram soap; to evaluate the short-term effectiveness of this intervention by determining, 2 weeks later, compliance with self-treatment and prevention; and to determine the prevalence rate on the second visit. In 59 households (96·7% of the village) containing 313 persons, an educational session was held and a leaflet distributed on the use and availability of the soap. Thirteen persons (4·2%) from eight households (13·6%) had scabies. After 2 weeks, 7 persons (2·2%) (2 persisting and 5 new cases) from 5 households (8·5%) were infested. Thus a cure rate of 85% was obtained, though the prevalence rate showed no statistically significant difference. Among those under 15 years of age, the number infected decreased from 10 to 3, while among those over 15 years, the number infected increased from 3 to 4, neither change reaching significance at the 5% level.
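The quoted cure rate follows directly from the case counts above: of the 13 original cases, only 2 persisted at the 2-week visit, so

```latex
\[
  \text{cure rate} = \frac{13 - 2}{13} \approx 0.846 \approx 85\%,
  \qquad
  \text{prevalence: } \frac{13}{313} = 4.2\% \ \text{(baseline)}, \quad
  \frac{7}{313} = 2.2\% \ \text{(at 2 weeks).}
\]
```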
The effects of nutrient enrichment of natural water bodies range from small increases in plant biomass and production to gross deterioration of water quality. The input of nutrients (e.g. nitrogen and phosphorus) to the sea off NW Europe (especially the North Sea) has increased dramatically over the last three or four decades (Folkard & Jones, 1974; Bennekom et al., 1975; Postma, 1978; Cadee, 1986a), but there is uncertainty about the effects on the ecosystem. One possible effect might be to induce changes in the phytoplankton community. Such an effect has been reported for the North Sea, where increases in flagellate algae have been observed (Gieskes & Kraay, 1977; Postma, 1985; Cadee, 1986b; Batje & Michaelis, 1986). Phaeocystis is one such alga, and its purported involvement in the formation of large quantities of foam observed on European beaches (Batje & Michaelis, 1986; Weisse et al., 1986), together with evidence that the alga is a source of atmospheric sulphur compounds (Barnard et al., 1984) (with implications for atmospheric acidity), has attracted particular attention and concern.