Gun culture is properly measured by a population's emotional and symbolic attachment to guns and not by rates of gun ownership. Using data from the Baylor Religion Survey (wave 6), we find that nearly all gun owners feel that guns provide them with a physical sense of security (Gun Security), but a distinct and crucial sub-set of owners express an additional and strong attachment to their weapons (Gun Sanctity). Gun Sanctity measures the extent to which owners think their guns make them more patriotic, respected, in control, and valued by their family and community. We propose that Gun Sanctity is a form of quasi-religious or magical thinking in which an object is imbued with unseen powers. To assess this proposal, we look at the extent to which gun ownership, Gun Security, and Gun Sanctity are related to traditional religion and various forms of magical thinking, namely, (a) conspiratorialism, (b) the belief that prayer can fix financial and health problems, and (c) support for Christian Statism, a form of American theocracy. We find that Gun Sanctity is highly predictive of different forms of magical thinking but is often unrelated to more traditional religious practices and beliefs.
Therapeutics targeting frontotemporal dementia (FTD) are entering clinical trials. There are challenges to conducting these studies, including the relative rarity of the disease. Remote assessment tools could increase access to clinical research and pave the way for decentralized clinical trials. We developed the ALLFTD Mobile App, a smartphone application that includes assessments of cognition, speech/language, and motor functioning. The objectives were to determine the feasibility and acceptability of collecting remote smartphone data in a multicenter FTD research study and evaluate the reliability and validity of the smartphone cognitive and motor measures.
Participants and Methods:
A diagnostically mixed sample of 207 participants with FTD or from familial FTD kindreds (CDR®+NACC-FTLD=0 [n=91]; CDR®+NACC-FTLD=0.5 [n=39]; CDR®+NACC-FTLD≥1 [n=39]; unknown [n=38]) was asked to remotely complete a battery of tests on their smartphones three times over two weeks. Measures included five executive functioning (EF) tests, an adaptive memory test, and participant experience surveys. A subset completed smartphone tests of balance at home (n=31) and a finger tapping test (FTT) in the clinic (n=11). We analyzed adherence (percentage of available measures that were completed) and user experience. We evaluated Spearman-Brown split-half reliability (100 iterations) using the first available assessment for each participant. We assessed test-retest reliability across all available assessments by estimating intraclass correlation coefficients (ICC). To investigate construct validity, we fit regression models testing the association of the smartphone measures with gold-standard neuropsychological outcomes (UDS3-EF composite [Staffaroni et al., 2021], CVLT3-Brief Form [CVLT3-BF] Immediate Recall, mechanical FTT), measures of disease severity (CDR®+NACC-FTLD Box Score & Progressive Supranuclear Palsy Rating Scale [PSPRS]), and regional gray matter volumes (cognitive tests only).
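For readers unfamiliar with the two reliability estimates named above, the following Python snippet is a minimal sketch, not the study's analysis code: it shows one common way to compute a Spearman-Brown-corrected split-half reliability over 100 random item splits and a consistency ICC(3,1), one common ICC formulation (the abstract does not specify which form was used). The data layout and function names are hypothetical.

```python
# Illustrative sketch only -- not the ALLFTD study code.
# `scores` is assumed to be a participants x items matrix for one assessment;
# `sessions` is assumed to be a participants x sessions matrix of summary scores.
import numpy as np

def split_half_reliability(scores: np.ndarray, n_iter: int = 100, seed: int = 0) -> float:
    """Mean Spearman-Brown-corrected correlation over random item splits."""
    rng = np.random.default_rng(seed)
    n_items = scores.shape[1]
    estimates = []
    for _ in range(n_iter):
        perm = rng.permutation(n_items)
        half_a = scores[:, perm[: n_items // 2]].mean(axis=1)
        half_b = scores[:, perm[n_items // 2 :]].mean(axis=1)
        r = np.corrcoef(half_a, half_b)[0, 1]
        estimates.append(2 * r / (1 + r))  # Spearman-Brown prophecy formula
    return float(np.mean(estimates))

def icc_3_1(sessions: np.ndarray) -> float:
    """Two-way mixed-effects, consistency ICC(3,1) for test-retest reliability."""
    n, k = sessions.shape
    grand = sessions.mean()
    ss_rows = k * ((sessions.mean(axis=1) - grand) ** 2).sum()   # between subjects
    ss_cols = n * ((sessions.mean(axis=0) - grand) ** 2).sum()   # between sessions
    ss_total = ((sessions - grand) ** 2).sum()
    ms_rows = ss_rows / (n - 1)
    ms_err = (ss_total - ss_rows - ss_cols) / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err)
```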
Results:
Participants completed 70% of tasks. Most reported that the instructions were understandable (93%), considered the time commitment acceptable (97%), and were willing to complete additional assessments (98%). Split-half reliability was excellent for the executive functioning tests (r’s=0.93-0.99) and good for the memory test (r=0.78). Test-retest reliabilities ranged from acceptable to excellent for cognitive tasks (ICC: 0.70-0.96) and were excellent for the balance test (ICC=0.97) and good for the FTT (ICC=0.89). Smartphone EF measures were strongly associated with the UDS3-EF composite (β's=0.6-0.8, all p<.001), and the memory test was strongly correlated with total immediate recall on the CVLT3-BF (β=0.7, p<.001). Smartphone FTT was associated with mechanical FTT (β=0.9, p=.02), and greater acceleration on the balance test was associated with more motor features (β=0.6, p=.02). Worse performance on all cognitive tests was associated with greater disease severity (β's=0.5-0.7, all p<.001). Poorer performance on the smartphone EF tasks was associated with smaller frontoparietal/subcortical volume (β's=0.4-0.6, all p<.015) and worse memory scores with smaller hippocampal volume (β=0.5, p<.001).
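As a hedged illustration of how standardized associations of this kind are typically estimated (the abstract does not publish the analysis code), the sketch below fits an ordinary least squares model of a gold-standard composite on a z-scored smartphone measure. The CSV path, column names, and covariates are hypothetical placeholders.

```python
# Illustrative construct-validity regression; file and column names are assumed.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("allftd_scores.csv")  # hypothetical data file

# z-score outcome and predictor so the slope is a standardized beta
for col in ["uds3_ef", "smartphone_ef"]:
    df[col] = (df[col] - df[col].mean()) / df[col].std()

# demographic covariates are an assumption, not stated in the abstract
fit = smf.ols("uds3_ef ~ smartphone_ef + age + education", data=df).fit()
print(fit.params["smartphone_ef"], fit.pvalues["smartphone_ef"])
```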
Conclusions:
These results suggest remote digital data collection of cognitive and motor functioning in FTD research is feasible and acceptable. These findings also support the reliability and validity of unsupervised ALLFTD Mobile App cognitive tests and provide preliminary support for the motor measures, although further study in larger samples is required.
Antimicrobial-resistant (AMR) bacteria are a threat to public health as they can resist treatment and pass along genetic material that allows other bacteria to become drug-resistant. To assess foodborne AMR risk, the Codex Guidelines for Risk Analysis of Foodborne AMR provide a framework for risk profiles and risk assessments. Several elements of a risk profile may benefit from a scoping review (ScR). To contribute to a larger risk profile structured according to the Codex Guidelines, our objective was to conduct a ScR of the current state of knowledge on the distribution, frequency and concentrations of extended-spectrum β-lactamase (ESBL)-producing Enterobacteriaceae in salmon and shrimp. Articles were identified via a comprehensive search of five bibliographic databases. Two reviewers screened titles and abstracts for relevance and characterised full-text articles with screening forms developed a priori. Sixteen relevant studies were identified. This review found that there is a lack of Canadian data regarding ESBL-producing Enterobacteriaceae in salmon and shrimp. However, ESBL-producing Escherichia coli, Klebsiella pneumoniae and other Enterobacteriaceae have been isolated in multiple regions with a history of exporting seafood to Canada. The literature described herein will support future decision-making on this issue as research/surveillance and subsequent assessments are currently lacking.
Campylobacter spp. are among the most common causes of bacterial gastroenteritis in Canada and worldwide. Fluoroquinolones are often used to treat complicated human campylobacteriosis, and strains of Campylobacter spp. resistant to these drugs are emerging along the food chain. A scoping review was conducted to summarise how human (fluoro)quinolone-resistant (FQR; quinolones including fluoroquinolones) Campylobacter spp. infections are characterised in the literature, by describing how the burden of illness (BOI) associated with FQR is measured and reported, describing the variability in the reporting of study characteristics, and providing a narrative review of literature comparing BOI measures of FQR Campylobacter spp. infections with those of susceptible infections. The review identified 26 studies; the literature consisted largely of case reports and showed a lack of recent publications and a lack of Canadian data. Studies reported 26 different BOI measures, the most common being hospitalisation, diarrhoea, fever and duration of illness. Results were mixed: BOI measures were inconsistently defined across the literature, and comparisons between resistant and susceptible infections were limited. This presents a challenge when attempting to assess the magnitude of the BOI due to FQR Campylobacter spp., highlighting the need for more research in this area.
Resistance to carbapenems in human pathogens is a growing clinical and public health concern. Carbapenems belong to an antimicrobial class considered last-resort: they are used to treat human infections caused by multidrug-resistant Enterobacterales, and they are classified by the World Health Organization as ‘High Priority Critically Important Antimicrobials’. The presence of carbapenem-resistant Enterobacterales (CREs) of animal origin is of concern because targeted studies of Canadian retail seafood revealed carbapenem resistance in a small number of Enterobacterales isolates. To further investigate this issue, a risk profile was developed examining shrimp and salmon, the two most important seafood commodities consumed by Canadians, and Escherichia coli, a member of the order Enterobacterales. Carbapenem-resistant E. coli (CREc) isolates have been identified in shrimp and other seafood products. Although carbapenem use in aquaculture has not been reported, several classes of antimicrobials are used globally, and co-selection of antimicrobial-resistant microorganisms in aquaculture settings is also of concern. CREs have been identified in retail seafood purchased in Canada but are currently thought to be uncommon. However, data concerning the occurrence and distribution of CRE or CREc in seafood are limited and argue for implementation of ongoing or periodic surveillance.
Case management has been an integral part of psychiatric practice in the United States for over a decade and has generated a large body of literature. The application of case management principles to the care of people suffering from psychiatric disorders is becoming increasingly popular in the United Kingdom and Europe, and a literature is now beginning to be published. However, no definitive statements about the efficacy of case management have been made, owing to a range of conceptual and methodological problems. The present paper is a critical review of the case management outcome literature. Reported outcomes are reviewed in the context of study design and service characteristics. The authors conclude that case management practice can have at least some impact on patients' use of services (including a marked decrease in in-patient bed days), satisfaction with services, engagement with services, and social networks and relationships when it is delivered as a direct, clinical service with high staff-to-patient ratios. A set of recommendations is suggested for the future practice and presentation of research into case management.
A primary barrier to translation of clinical research discoveries into care delivery and population health is the lack of sustainable infrastructure bringing researchers, policymakers, practitioners, and communities together to reduce silos in knowledge and action. As the National Institutes of Health's (NIH) mechanism to advance translational research, Clinical and Translational Science Award (CTSA) awardees are uniquely positioned to bridge this gap. Delivering on this promise requires sustained collaboration and alignment between research institutions and public health and healthcare programs and services. We describe the collaboration of seven CTSA hubs with city, county, and state healthcare and public health organizations striving to realize this vision together. Partnership representatives convened monthly to identify key components, common and unique themes, and barriers in academic–public collaborations. All partnerships aligned the activities of the CTSA programs with the needs of the city/county/state partners by sharing resources, responding to real-time policy questions and training needs, promoting best practices, and advancing community-engaged research and dissemination and implementation science to narrow the knowledge-to-practice gap. Barriers included competing priorities, differing timelines, bureaucratic hurdles, and unstable funding. Academic–public health/health system partnerships represent a unique and underutilized model with the potential to enhance community and population health.
A low finishing weight and poor carcass characteristics are major causes of lower incomes in extensive sheep flocks; the use of terminal sire crossbreeding, however, could improve lamb performance and carcass traits under these conditions. The aim of this study was to evaluate sire breed effects on the performance of lambs born to Corriedale ewes in extensive sheep systems in Western Patagonia. A total of 10 Corriedale, 10 Dorset, nine Suffolk and seven Texel sires, 16 of which were under a genetic recording scheme and 20 selected from flocks not participating in genetic improvement programmes, were used across six commercial farms for 2 successive years. Data were collected from 685 lambs of the four resulting genotypes. Overall, Corriedale lambs were 0.47 kg lighter at birth than crossbred lambs (P<0.001). Suffolk- and Texel-sired lambs required more assistance (P<0.01) at birth than Corriedale- or Dorset-sired lambs, with Suffolk-sired lambs requiring the most assistance (8%). Ewes mated to Suffolk rams had larger (P<0.05) litters than ewes mated to Texel or Corriedale rams. Lamb live weight gain from birth to weaning was higher (P<0.001) in crossbred lambs than in Corriedale lambs; as a result, crossbred lambs averaged 2.9 kg heavier BW (P<0.001) than Corriedale lambs. A significant sire breed × sire source interaction was detected for lamb live weight gain (P<0.05) and lamb live weight at weaning (P<0.01), showing that the heaviest lambs were from recorded sires, except for Suffolk crossbred lambs. Mortality rate to weaning was highest (P<0.05) in Suffolk cross lambs (31%), with Corriedale lambs showing the lowest (17%) mortality. Terminal sire breeds increased (P<0.001) cold carcass weight, with 13.8, 16.0, 15.2 and 14.9 kg for the Corriedale-, Dorset-, Suffolk- and Texel-sired lambs, respectively. Carcass length, kidney knob and channel fat, fat grade, grade rule and fat depth measurements were not affected by sire breed (P>0.05). Carcass conformation was higher in Texel-sired lambs than in Corriedale lambs (P<0.05), with Dorset- and Suffolk-sired lambs being intermediate. Crossbred lambs showed a greater (P<0.001) eye muscle area than Corriedale lambs. Commercial cuts were affected by sire breed, as a result of the Corriedale lambs being smaller and having lighter carcasses than crossbred lambs. Significant improvements in lamb weights at weaning and carcass traits can be expected when using a terminal sire on Corriedale ewes in Western Patagonia. However, no advantages were detected with the use of recorded sires under these production systems.
The emergence of invasive fungal wound infections (IFIs) in combat casualties led to the development of a combat trauma-specific IFI case definition and classification. Prospective data were collected from 1133 US military personnel injured in Afghanistan (June 2009–August 2011). IFI rates were 0·2% among ward admissions and 11·7% among intensive care unit admissions (6·8% overall). Seventy-seven IFI cases were classified as proven/probable (n = 54) or possible/unclassifiable (n = 23) and compared in a case-case analysis. There was no difference in clinical characteristics between the proven/probable and possible/unclassifiable cases. Possible IFI cases had a shorter time to diagnosis (P = 0·02) and to initiation of antifungal therapy (P = 0·05) and fewer operative visits (P = 0·002) than proven/probable cases, but clinical outcomes were similar between the groups. Although the trauma-related IFI classification scheme did not provide prognostic information, it is an effective tool for clinical and epidemiological surveillance and research.
Objective
To assess the prevalence of traumatic stress experienced by secondary responders to disaster events and to determine whether mental health education should be included in HAZWOPER training.
Methods
Preexisting survey tools for assessing posttraumatic stress disorder (PTSD), resiliency, and mental distress were combined to form a web-based survey tool that was distributed to individuals functioning in secondary response roles. Data were analyzed using the Fisher exact test, 1-way ANOVA, and 1-sample t tests.
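As an illustration of the three named tests, a minimal SciPy sketch follows; the input values below are hypothetical placeholders, not the survey data.

```python
# Hedged illustration of the analyses named above; all numbers are made up.
from scipy import stats

# Fisher exact test on a 2x2 table (e.g., PTSD status vs. training status)
odds_ratio, p_fisher = stats.fisher_exact([[20, 15], [10, 30]])

# One-way ANOVA across three independent groups (e.g., scores by responder group)
f_stat, p_anova = stats.f_oneway([72, 75, 80], [65, 70, 68], [82, 85, 88])

# One-sample t test against a published norm (e.g., the CD-RISC mean of 80.4)
t_stat, p_t = stats.ttest_1samp([70.1, 74.3, 81.0, 66.2, 79.5], popmean=80.4)

print(p_fisher, p_anova, p_t)
```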
Results
Respondents reported elevated PTSD levels (32.9%) as compared to the general population. HAZWOPER-trained responders with disaster work experience were more likely to be classified as PTSD positive as compared to untrained, inexperienced responders and those possessing only training or experience. A majority (68.75%) scored below the mean resiliency level of 80.4 on the Connor-Davidson Resilience Scale. Respondents with only training or both training and experience were more likely to exhibit lower resiliency scores than those with no training or experience. PTSD positivity correlated with disaster experience. Among respondents, 91% indicated support for mental health education.
Conclusions
Given the results of the survey, consideration should be given to the inclusion of pre- and postdeployment mental health education in the HAZWOPER training regimen. (Disaster Med Public Health Preparedness. 2013;0:1-9)
Over the past decade, a growing number of deep imaging surveys have started to provide meaningful constraints on the population of extrasolar giant planets at large orbital separations. Primary targets for these surveys have been carefully selected based on their age, distance and spectral type, and often on their membership in young nearby associations whose stars share common kinematic, photometric and spectroscopic properties. The next step is a wider statistical analysis of the frequency and properties of low-mass companions as a function of stellar mass and orbital separation. In late 2009, we initiated a coordinated European Large Program using angular differential imaging in the H band (1.66 μm) with NaCo at the VLT. Our aim is to provide a comprehensive and statistically significant study of the occurrence of extrasolar giant planets and brown dwarfs at large (5-500 AU) orbital separations around ~150 young, nearby stars, a large fraction of which had never been observed at very deep contrast. The survey has now been completed, and we present the data analysis and detection limits for the observed sample, for which we reach the planetary-mass domain at separations of ≳50 AU on average. We also present the results of the statistical analysis performed over the 75 targets newly observed at high contrast. We discuss the details of the statistical analysis and the physical constraints that our survey provides on the frequency and formation scenarios of planetary-mass companions at large separations.
A study was undertaken to investigate the performance of breeding ewes fed a range of forage- and concentrate-based diets in late pregnancy, balanced for supply of metabolizable protein (MP). For the final 6 weeks before lambing, 104 twin-bearing multiparous ewes were offered one of four diets: ad libitum precision-chop grass silage + 0.55 kg/day concentrates (GS); ad libitum maize silage + 0.55 kg/day concentrates (MS); a 1 : 1 mixture (on a dry matter (DM) basis) of grass silage and maize silage fed ad libitum + 0.55 kg/day concentrates (GSMS); or 1.55 kg/day concentrates + 50 g/day chopped barley straw (C). The CP content of the concentrates was varied between treatments (157 to 296 g/kg DM) with the aim of achieving a daily MP intake of 130 g across all treatments. Compared with ewes fed GS, forage DM intake was higher (P < 0.05) in ewes fed MS (+0.21 kg/day) and GSMS (+0.16 kg/day), resulting in higher (P < 0.001) total DM intakes with these treatments. C ewes had the lowest total DM intake of all the treatments examined (P < 0.001). C ewes lost more live weight (LW; P < 0.001) and body condition score (BCS; P < 0.05) during the first 3 weeks of the study, but there were no dietary effects on ewe LW or BCS thereafter. The incidence of dystocia was lower (P < 0.01) in C ewes than in those offered silage-based diets (7.5% v. 37.4% of ewes), and was higher (P < 0.01) in ewes fed MS compared with GS or GSMS (50.7%, 34.7% and 26.9%, respectively). There were no significant dietary effects on the plasma metabolite concentrations of ewes in late pregnancy, pre-weaning lamb mortality, weaned lamb output per ewe or lamb growth rate. The results of this study demonstrate that both maize silage and all-concentrate diets can replace grass silage in pregnant ewe rations without impairing performance, provided the supply of MP is non-limiting. The higher incidence of dystocia in ewes fed maize silage as the sole forage is a concern.
Knowledge of the approximate rate of nymphal development at different altitudes should be most useful in planning control operations against the clearwinged grasshopper, Camnula pellucida (Scudder), on rangelands in the interior of British Columbia. When hatching and development are observed at any one altitude, varying from under 1500 ft. to over 3500 ft., it should be possible on the basis of this knowledge to predict the most advantageous time to commence control operations at other altitudes on the same rangeland. Economic damage can be controlled by a single treatment, applied before the nymphs have developed to the fourth instar.
Sixty-five Holstein–Friesian calves were randomly allocated to one of eight nutritional treatments at 4 days of age. In this factorial design study, the treatments comprised four levels of milk replacer (MR) mixed in 6 l of water (500, 750, 1000 and 1250 g/day) × two crude protein (CP) concentrations (230 and 270 g CP/kg dry matter (DM)). MR was fed via automatic teat feeders and concentrates were offered via automated dispensers during the pre-weaning period. MR and calf starter concentrate intakes were recorded until weaning, with live weight and body measurements recorded throughout the rearing period until heifers entered the dairy herd at a targeted 24 months of age. There was no effect of MR protein concentration on concentrate or MR intake, and no effect on body size or live weight at any stage of development. During the pre-weaning period, for every 100 g increase in MR allowance, concentrate consumption was reduced by 39 g/day, while for every 100 g increase in the amount of MR offered, live weight at days 28 and 270 increased by 0.76 and 2.61 kg, respectively (P < 0.05). Increasing MR feeding level increased (P < 0.05) heart girth and body condition score at recordings during the first year of life, but these effects disappeared thereafter. Increasing MR feeding level tended to reduce both age at first observed oestrus and age at first service, but no significant effect on age at first calving was observed. Neither MR feeding level nor MR CP content affected post-calving live weight or subsequent milk production. Balance measurements conducted using 44 male calves during the pre-weaning period showed that increasing milk allowance increased energy and nitrogen (N) intake, diet DM digestibility, true N digestibility and the biological value of the dietary protein. Increasing the MR protein content had no significant effect on the apparent digestibility of N or DM.
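To put these linear trends in concrete terms: under the reported slopes, raising the MR allowance from the lowest (500 g/day) to the highest (1250 g/day) level, a 750 g increase, would be expected to reduce pre-weaning concentrate intake by roughly 7.5 × 39 ≈ 293 g/day, while adding about 7.5 × 0.76 ≈ 5.7 kg of live weight at day 28 and 7.5 × 2.61 ≈ 19.6 kg at day 270.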
The objectives of this study were to investigate the effects of fish oil supplementation on the performance and muscle fatty acid composition of hill lambs finished on grass-based or concentrate-based diets, and to examine the interaction with selenium (Se) status. In September 2006, 180 entire male lambs of mixed breeds were sourced from six hill farms after weaning and finished on five dietary treatments: grazed grass (GG), grass + 0.4 kg/day cereal-based concentrate (GC), grass + 0.4 kg/day cereal-based concentrate enriched with fish oil (GF), ad libitum cereal-based concentrate (HC) and ad libitum fish oil-enriched concentrate (HF). Within each treatment, half of the lambs were also supplemented with barium selenate by subcutaneous injection. At the start of the trial, the proportions of lambs with a marginal (<0.76 μmol/l) or deficient (<0.38 μmol/l) plasma Se status were 0.84 and 0.39, respectively. Compared with control lambs, GG lambs treated with Se had higher (P < 0.01) plasma Se levels, whereas erythrocyte glutathione peroxidase activity was higher (P < 0.01) for Se-supplemented lambs fed diets GG and GF. However, Se supplementation had no effect on any aspect of animal performance. Fish oil increased (P < 0.05) levels of 22:5n-3 and 22:6n-3 in the Longissimus dorsi of HF lambs but otherwise had no effect on the health attributes of lamb meat. There were no significant effects of fish oil on dry matter intake, animal performance or lamb carcass characteristics. Daily carcass weight gain (CWG; P < 0.001), carcass weight (P < 0.01) and conformation score (P < 0.01) increased with increasing concentrate inputs. Lambs fed concentrate-based diets achieved a higher mean CWG (P < 0.001), dressing proportion (P < 0.001) and carcass weight (P < 0.011), and were slaughtered up to 8.3 days earlier (P < 0.05) and at 1.2 kg lower (P < 0.05) live weight than pasture-fed lambs. However, carcasses from grass-fed lambs contained lower levels of perinephric and retroperitoneal fat (P < 0.05), and had less fat over the Iliocostalis thoracis (P < 0.001) and Obliquus internus abdominis (P < 0.05). Meat from grass-fed lambs also had lower levels of 18:2n-6 and total n-6 fatty acids than meat from lambs finished indoors. The results of this study demonstrate that fish oil supplementation has some benefits for the health attributes of meat from lambs fed concentrate-based diets but not grass-based diets. Supplementing Se-deficient lambs with barium selenate will improve the Se status of lambs fed zero-concentrate diets, but has no additional benefit when lambs already consume their daily Se requirement from concentrates or when fish oil-enriched diets are fed.
Government policies relating to red meat production take account of the carbon footprint, environmental impact, and contributions to human health and nutrition, biodiversity and food security. This paper reviews the impact of grazing on these parameters and their interactions, identifying those practices that best meet governments’ strategic goals. The recent focus of research on livestock grazing and biodiversity has been on reducing grazing intensity on hill and upland areas. Although this produces rapid increases in sward height and herbage mass, changes in structural diversity and plant species are slower, with no appreciable short-term increases in biodiversity so that environmental policies that simply involve reductions in numbers of livestock may not result in increased biodiversity. Furthermore, upland areas rely heavily on nutrient inputs to pastures so that withdrawal of these inputs can threaten food security. Differences in grazing patterns among breeds increase our ability to manage biodiversity if they are matched appropriately to different conservation grazing goals. Lowland grassland systems differ from upland pastures in that additional nutrients in the form of organic and inorganic fertilisers are more frequently applied to lowland pastures. Appropriate management of these nutrient applications is required, to reduce the associated environmental impact. New slurry-spreading techniques and technologies (e.g. the trailing shoe) help reduce nutrient losses but high nitrogen losses from urine deposition remain a key issue for lowland grassland systems. Nitrification inhibitors have the greatest potential to successfully tackle this problem. Greenhouse gas (GHG) emissions are lower from indoor-based systems that use concentrates to shorten finishing periods. The challenge is to achieve the same level of performance from grass-based systems. Research has shown potential solutions through the use of forages containing condensed tannins or establishing swards with a high proportion of clover and high-sugar grasses. Relative to feeding conserved forage or concentrates, grazing fresh grass not only reduces GHG emissions but also enhances the fatty acid composition of meat in terms of consumer health. It is possible to influence biodiversity, nutrient utilisation, GHG emissions and the nutritional quality of meat in grass-based systems, but each of these parameters is intrinsically linked and should not be considered in isolation. Interactions between these parameters must be considered carefully when policies are being developed, in order to ensure that strategies designed to achieve positive gains in one category do not lead to a negative impact in another. Some win–win outcomes are identified.