This article explores the change processes and pedagogical dilemmas ignited by introducing wild pedagogies to pedagogical staff in Danish early childhood institutions. By analysing experiments aimed at developing new play and learning environments, carried out as part of a large design-based research project, we discuss how the existing “roots” of early childhood education in Denmark provide fertile soil for the introduction of wild pedagogies. We identify two “shoots of change” with the potential to push the status quo in relations between children, adults, and more-than-human nature. Centring on altering the place of nature in early childhood education and on carving out time for more open approaches, these shoots are in close dialogue with wild pedagogies. Experimenting with these shoots of change made pedagogical dilemmas more visible, important, and present to the participants. Attending to and exploring such dilemmas is a crucial aspect of keeping socio-cultural change processes in motion.
Attention deficit/hyperactivity disorder (ADHD) prevalence has increased over the last 10 years, most likely due to increased recognition by clinicians. Even so, under-diagnosis may persist. Historically, ADHD has been described as a male-dominant disorder. However, recent evidence shows that ADHD prevalence is similar between the sexes, but that the related impairment or symptomatology may vary. This study estimated the prevalence of undiagnosed ADHD symptoms (pADHD) and explored sex-stratified symptomatology and associations with self-perceived health-related quality of life (HRQL) and experience of depressive symptoms.
Methods
This was done in a unique cohort of 50,937 healthy blood donors – individuals who successfully maintain regular commitments despite potential ADHD symptoms. ADHD symptoms were assessed using the Adult ADHD Self-Report Scale (ASRS); HRQL was measured using mental and physical component scores (MCS/PCS) derived from the 12-item Short-Form Health Survey (SF-12), with higher scores indicating better HRQL; and depressive symptoms were measured using the Major Depression Inventory (MDI), with higher scores indicating more depressive symptoms.
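As a rough illustration of how an ASRS-based screen classifies respondents, the sketch below assumes the six-item ASRS v1.1 Part A screener with its published per-item ("shaded box") thresholds; the study's exact pADHD classification rule may differ.

```python
# Minimal sketch of an ASRS-v1.1 Part A screen, assuming the standard
# six-item screener; the study's exact pADHD rule may differ.
# Responses: 0=Never, 1=Rarely, 2=Sometimes, 3=Often, 4=Very often.

def asrs_screen_positive(responses):
    """Return True if >= 4 of the 6 Part A items fall in the shaded range."""
    assert len(responses) == 6
    # Items 1-3 count as positive at "Sometimes" or more; items 4-6 at "Often" or more.
    thresholds = [2, 2, 2, 3, 3, 3]
    positives = sum(r >= t for r, t in zip(responses, thresholds))
    return positives >= 4

print(asrs_screen_positive([3, 2, 4, 3, 1, 3]))  # True: 5 shaded items
```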
Results
In total, 3% were classified with pADHD (sex ratio 1:1). pADHD was associated with reduced MCS and PCS and with increased MDI scores. Males scored higher on average on inattentive symptoms than females, whereas females scored higher on average on hyperactive-impulsive symptoms. Individuals scoring high on the combined inattentive and hyperactive-impulsive symptom presentation were the most likely to be impaired, in terms of higher MDI scores and lower PCS, when compared with non-ADHD controls.
Conclusions
ADHD symptoms are common in this seemingly healthy and undiagnosed population. Symptom presentation differs between the sexes, and the type of presentation appears to influence the association with depressive symptoms and the degree of reduced HRQL.
Internal and external rotation of the shoulder is often challenging to quantify in the clinic. Existing technologies, such as motion capture, can be expensive or require significant time to set up, collect, and process data. Other methods may rely on surveys or analog tools, which are subject to interpretation. The current study evaluates a novel, engineered, wearable sensor system for improved monitoring of internal and external shoulder rotation, and applies it in healthy individuals. Using the design principles of the Japanese art of kirigami (folding and cutting of paper to design 3D shapes), the sensor platform conforms to the shape of the shoulder, with four on-board strain gauges to measure movement. Our objective was to examine how well this kirigami-inspired shoulder patch could identify differences in shoulder kinematics between internal and external rotation as individuals moved their humerus through movement patterns defined by Codman’s paradox. Seventeen participants donned the sensor while the strain gauges measured skin deformation patterns during movement. One-dimensional statistical parametric mapping explored differences in strain voltage between the two rotations. The sensor detected distinct differences between internal and external shoulder rotation. Three of the four strain gauges detected significant temporal differences between internal and external rotation (all p < .047), particularly those placed distal or posterior to the acromion. These results are clinically significant, as they suggest that a new class of wearable sensors conforming to the shoulder can measure differences in skin-surface deformation corresponding to the underlying rotation of the humerus.
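As a sketch of the analysis step, the snippet below runs a paired one-dimensional statistical parametric mapping comparison with the open-source spm1d package; the array shapes and synthetic voltages are illustrative stand-ins for the study's strain-gauge traces, not its data.

```python
# Paired 1D SPM comparison of two movement conditions, as in the analysis
# described above. YA/YB are (subjects x time-nodes) arrays of
# time-normalized strain-gauge voltage; values here are synthetic.
import numpy as np
import spm1d

rng = np.random.default_rng(0)
YA = rng.normal(0.00, 0.1, (17, 101))   # internal rotation, 17 participants
YB = rng.normal(0.05, 0.1, (17, 101))   # external rotation

t  = spm1d.stats.ttest_paired(YA, YB)   # paired t-statistic at each time node
ti = t.inference(alpha=0.05, two_tailed=True)
print(ti.h0reject)                       # True if any supra-threshold cluster exists
```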
Antibiotic overuse for asymptomatic bacteriuria is common in older adults and can lead to harmful outcomes, including antimicrobial resistance. Our objective was to evaluate the impact of a simple scoring tool on urine culturing and antibiotic prescribing for adults with presumed urinary tract infection (UTI).
Design:
Quasi-experimental study using an interrupted time series with segmented regression to evaluate urine culturing, urinary antibiotic use, length of stay (LOS), acute care transfers, and mortality 18 months before and 16 months after the intervention.
Setting:
134-bed complex continuing care and rehabilitation hospital in Ontario, Canada.
Participants:
Nurses, nurse practitioners, physicians, and other healthcare professionals.
Intervention:
A multifaceted intervention focusing on a 6-item mnemonic scoring tool called the BLADDER score was developed based on existing minimum criteria for prescribing antibiotics in patients with presumed UTI. The BLADDER score was combined with ward- and prescriber-level feedback and education.
Results:
Before the intervention, the mean rate of urine culturing was 12.47 cultures per 1,000 patient days; after the intervention, the rate was 7.92 cultures per 1,000 patient days (IRR, 0.87; 95% CI, 0.67–1.12). Urinary antibiotic use declined after the intervention, from a mean of 40.55 defined daily doses (DDD) per 1,000 patient days to 25.96 DDD per 1,000 patient days (IRR, 0.68; 95% CI, 0.59–0.79). There was no change in mean patient LOS, acute care transfers, or mortality.
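For readers unfamiliar with segmented regression, the sketch below shows the kind of interrupted-time-series Poisson model that yields IRRs like those above: monthly counts with level- and slope-change terms and patient-days as exposure. The column names and synthetic data are hypothetical, not the study's.

```python
# Illustrative interrupted time series with segmented Poisson regression.
# 18 pre-intervention months + 16 post-intervention months = 34 months.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
df = pd.DataFrame({"month": np.arange(34)})
df["post"] = (df["month"] >= 18).astype(int)            # level change at intervention
df["months_post"] = np.maximum(df["month"] - 17, 0)     # slope change after intervention
df["patient_days"] = 3800
rate = 12.5 * np.where(df["post"], 0.7, 1.0)            # synthetic pre/post culture rates
df["cultures"] = rng.poisson(rate * df["patient_days"] / 1000)

model = smf.glm("cultures ~ month + post + months_post", data=df,
                family=sm.families.Poisson(),
                offset=np.log(df["patient_days"])).fit()
print(np.exp(model.params["post"]))  # IRR for the immediate level change
```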
Conclusions:
The BLADDER score may be a safe and effective tool to support improved diagnostic and antimicrobial stewardship to reduce unnecessary treatment for asymptomatic bacteriuria.
Helium or neopentane can be used as a surrogate gas fill for deuterium (D2) or deuterium-tritium (DT) in laser-plasma interaction studies. Surrogates are convenient because they avoid flammability hazards or the integration of cryogenics into an experiment. To test the degree of equivalency between deuterium and helium, experiments were conducted in the Pecos target chamber at Sandia National Laboratories. Observables such as laser propagation and signatures of laser-plasma instabilities (LPI) were recorded for multiple laser and target configurations. It was found that some observables can differ significantly despite the apparent similarity of the gases with respect to molecular charge and weight. While the qualitative behaviour of the interaction may well be studied by finding a suitable compromise of laser absorption, electron density, and LPI cross-sections, a quantitative investigation of expected values for deuterium fills at high laser intensities is not likely to succeed with surrogate gases.
Diets deficient in fibre are reported globally. The health risks associated with insufficient dietary fibre are grave enough to necessitate large-scale interventions to increase population intake levels. The Danish Whole Grain Partnership (DWP) is a public–private enterprise model that successfully augmented whole-grain intake in the Danish population. The potential transferability of the DWP model to Slovenia, Romania and Bosnia-Herzegovina has recently been explored. Here, we outline the feasibility of adopting the approach in the UK. Drawing on the collaborative experience of DWP partners, academics from the Healthy Soil, Healthy Food, Healthy People (H3) project and food industry representatives (Food and Drink Federation), this article examines the transferability of the DWP approach to increasing whole-grain and/or fibre intake in the UK. Specific consideration is given to the UK’s political, regulatory and socio-economic context. We note key political, regulatory, social and cultural challenges to transferring the success of the DWP to the UK, highlighting the particular challenge of increasing fibre consumption among low socio-economic status groups, which were also the most resistant to interventions in Denmark. Wholesale transfer of the DWP model to the UK is considered unlikely given the absence of the key ‘success factors’ present in Denmark. However, the DWP provides a template against which a UK-centric approach can be developed. In the absence of a clear regulatory context for whole grain in the UK, fibre should be prioritised and public–private partnerships supported to increase the availability and acceptability of fibre-rich foods.
The aim of this study was to quantify the time delay between screening and initiation of contact isolation for carriers of extended-spectrum beta-lactamase (ESBL)–producing Enterobacterales (ESBL-E).
Methods:
This study was a secondary analysis of contact isolation periods in a cluster-randomized controlled trial that compared 2 strategies to control ESBL-E (trial no. ISRCTN57648070). Patients admitted to 20 non-ICU wards in Germany, the Netherlands, Spain, and Switzerland were screened for ESBL-E carriage on admission, weekly thereafter, and on discharge. Data collection included the day of sampling, the day the wards were notified of the result, and subsequent ESBL-E isolation days.
Results:
Between January 2014 and August 2016, 19,122 patients, with a length of stay ≥2 days were included. At least 1 culture was collected for 16,091 patients (84%), with a median duration between the admission day and the day of first sample collection of 2 days (interquartile range [IQR], 1–3). Moreover, 854 (41%) of all 2,078 ESBL-E carriers remained without isolation during their hospital stay. In total, 6,040 ESBL-E days (32% of all ESBL-E days) accrued for patients who were not isolated. Of 2,078 ESBL-E-carriers, 1,478 ESBL-E carriers (71%) had no previous history of ESBL-E carriage. Also, 697 (34%) were placed in contact isolation with a delay of 4 days (IQR, 2–5), accounting for 2,723 nonisolation days (15% of ESBL-E days).
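The delay metrics reported above are simple date arithmetic; a minimal pandas sketch of how such medians and IQRs can be computed is shown below. The column names and dates are illustrative, not the trial's actual schema.

```python
# Hypothetical computation of days from admission to first sample,
# summarized as median and IQR, mirroring the metrics reported above.
import pandas as pd

df = pd.DataFrame({
    "admission":    pd.to_datetime(["2014-03-01", "2014-03-02", "2014-03-05"]),
    "first_sample": pd.to_datetime(["2014-03-03", "2014-03-03", "2014-03-08"]),
})
delay_days = (df["first_sample"] - df["admission"]).dt.days
print(delay_days.median(), delay_days.quantile([0.25, 0.75]).tolist())
```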
Conclusions:
Even with extensive surveillance screening, almost one-third of all ESBL-E days were nonisolation days. Limitations in routine culture-based ESBL-E detection impeded timely and exhaustive implementation of targeted contact isolation.
Textbook recommendations for gavaging rats vary between 1 and 5 ml for an adult rat. Rats weighing either 130 g or 250 g were gavaged with varying dosages of barium sulphate (BaSO4). After dosing, radiographs were taken at 0, 15 and 60 min. Animals showing a section of the small intestine totally filled with BaSO4 were scored as displaying spontaneous release. Other rats of the same sizes were gavaged with similar doses and subsequently tested in an open-field arena for behavioural abnormalities that might indicate stress or pain resulting from the procedure. Body temperature before and after treatment was recorded using microchip transponders. None of the 250 g rats in the 1 ml dosage group showed spontaneous release through the pyloric sphincter. In the 2 ml and 4 ml dosage groups, only one out of five animals showed spontaneous release. In the 6 ml dosage group, half of the animals showed spontaneous release. In the 8 ml and 10 ml dosage groups, five out of six and four out of five animals, respectively, showed spontaneous release. At doses higher than 12 ml, no animal was able to keep all of the BaSO4 in its stomach. Among the rats weighing 130 g, only one out of four rats in the 3 ml dosage group showed spontaneous release, whereas in the 5 ml and 7 ml dosage groups, all animals did. After 15 min, all of the rats in both weight groups showed BaSO4 in the duodenum. Ambulation, rearing up onto the hind legs, defecation, and body temperature immediately after dosing correlated very strongly with the dose (ml kg−1): increasing the dose resulted in reduced ambulation, rearing, defecation and body temperature. However, 10 min after performance of the open-field test, neither body temperature, serum corticosterone nor serum glucose showed any correlation with dose. This study indicates that high doses (i.e. doses up to 10 ml for a 250 g rat) might be safe to use; however, if an adverse impact on the rat is to be avoided, use of much lower doses should be considered, for example doses that do not force opening of the pyloric sphincter in any rat. This would be less than 4 ml kg−1 for a 250 g rat.
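For concreteness, the conversion implied by the closing recommendation can be sketched as follows; the < 4 ml kg−1 threshold is the study's, while the helper function itself is only an illustrative assumption.

```python
# Worked arithmetic for the stated per-kg gavage threshold: convert a dose
# in ml/kg to an absolute volume for a given body weight.
def gavage_volume_ml(body_weight_g, dose_ml_per_kg=4.0):
    return dose_ml_per_kg * body_weight_g / 1000.0

print(gavage_volume_ml(250))  # 1.0 ml for a 250 g rat
print(gavage_volume_ml(130))  # 0.52 ml for a 130 g rat
```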
The aim of this study was to investigate whether there were differences in fearfulness between laying hens (Gallus gallus domesticus) housed in aviaries and in cages. The tonic immobility (TI) test was used to assess fearfulness. Norwegian light hybrid White Leghorn hens were housed in battery cages and in three types of aviaries: the Marielund, the Laco-Volétage and the Tiered Wire Floor. Each system housed about 1,500 birds. Tests were performed on 50 birds per housing system at 70 weeks of age in one laying flock and at 30 and 70 weeks of age in the next.
At 30 weeks of age in the second laying flock, the duration of the tonic immobility response was unaffected by type of system. At 70 weeks, however, hens in cages showed tonic immobility of longer duration than hens in aviaries, in the first as well as in the second laying flock. No differences in TI between hens from the three types of aviaries were found. The duration of TI did not correlate with plumage condition or body weight, except for a longer duration of TI with poorer plumage condition in aviaries at 30 weeks. These results indicate that the fearfulness of hens in cages, as measured by the TI test, increased considerably with time. The lower fearfulness shown by hens in aviaries suggests that this important aspect of welfare is better safeguarded in aviaries than in cages.
The effects of cage enrichment and additional space were studied in 60 pairs of mink kits kept in standard cages (STD) and 67 pairs of mink kits kept in enriched cages (ENR). During the period from mid-July to the end of September, both groups had alternate access to one and two connected cages. From October, half of the mink in each group had permanent access to one cage and the other half permanent access to two cages. The enrichment of the cages consisted of extra resting places (tubes made of wire mesh and plastic) and occupational materials in the form of table-tennis balls and ropes to pull and chew. The mink were observed for an experimental period of nine months, from late lactation until the beginning of the following mating season. Welfare was assessed through behavioural traits (use of nest box and enrichments, activity out in the cage, stereotypies and fur-chewing), consumption of food and straw, body weight, and level of faecal corticoid metabolites. The presence of enrichments resulted in less tail-chewing, fewer stereotypies, and a reduced level of faecal corticoid metabolites. In addition, the presence of enrichments led to fewer social interactions and reduced the consumption of straw. Regarding the use of the different occupational materials, the mink did not use the table-tennis balls, but the tubes and pull-ropes were used extensively. Access to one or two cages had no effect on stereotypies, fur-chewing or physiology linked to welfare, but mink with access to double cages used the nest box less and had a lower consumption of straw and pull-ropes than mink with access to only one cage. However, there were no indications of frustration when the mink were deprived of the use of double cages. We conclude that increased environmental complexity in the form of occupational materials improved the welfare of the mink, whereas doubling the cage size had little or no effect on mink welfare.
Most on-farm welfare assessment systems have been developed for use on dairy and pig farms. These production systems are non-synchronous, in the sense that the same processes occur continuously throughout the year. Animal welfare during most or all phases of production may therefore be assessed at any time of the year, except for some effects of season. Many domesticated farm animals, such as sheep, goats, deer and mink, are seasonally synchronised in their production, just as their wild ancestors were. A comprehensive welfare assessment system including animal-based indicators for these species must therefore take an entire production cycle into consideration. This can be illustrated by a welfare assessment protocol developed and tested by the Danish Institute of Agricultural Sciences (DIAS) for mink production. The DIAS concept is based on indicators from four sources: the system, the system's management, animal behaviour, and animal health. An advantage of seasonality is that the measurement of welfare indicators can be optimised and standardised in terms of age/season and sample size, making reliable results relatively cheap to obtain. Furthermore, there is ample time to plan the requisite interventions. A disadvantage of seasonality is that the entire herd may already have been at risk by the time a welfare problem is disclosed by direct animal-based indicators; for example, the entire herd may have been exposed to a social grouping causing bite marks, which can be observed at pelting. Based on observation of the social grouping, this can be corrected before fighting and biting occur; based on observation of the bite marks, corrections are postponed until the next season. Welfare assessment intended for decision support in a synchronous production system should therefore include a higher proportion of early indicators based on the system and its management, in order to prevent the development of potential welfare problems involving the entire herd. The assessment of animal-based indicators may be relatively cheap and more reliable in synchronous production than in non-synchronous production, and these indicators are therefore given high priority, as they reflect the welfare resulting from the corrections made on the basis of indirect system and management indicators.
Animal welfare is a major issue in Europe, and the production of mink, Mustela vison, has also been under debate. One common method of solving animal welfare problems is to adapt the environment to fit the behavioural needs of the animals. In comparison with other forms of husbandry, the mink production environment has remained relatively unchanged over the years and provides for some of the most obvious needs of mink. Whether today's typical housing conditions adequately meet the welfare requirements of mink is currently a topic of discussion. An alternative approach to improving welfare is to modify the animals so that they are better adapted to farming conditions. In large-scale animal production, handling of individual animals can be a sporadic event, making an animal's inherent temperament and adaptability important factors to consider with respect to its resultant welfare.
In this review we present and discuss experiments on behavioural selection for temperament, and against undesirable behaviours such as fur-chewing, in mink. Fur-chewing behaviour can be reduced by selection, apparently without any negative effects, whereas little is known about the nature and consequences of selecting against stereotypic behaviours. Long-term selection experiments have shown that it is possible to reduce fearfulness in farmed mink. Using a relatively simple test, farmers can add behavioural measurements to their normal selection criteria and thereby improve the welfare of farmed mink.
The welfare of transgenic animals is often not considered prior to their generation. However, we demonstrate here how a welfare risk assessment can be carried out before transgenic animals are created. We describe a risk assessment identifying potential welfare problems in transgenic pigs generated for future xeno-donation of organs. This assessment is based on currently available information concerning transgenic animal models in which one or more transgenes relevant to future xeno-donation have been inserted. The welfare risk assessment reveals that future xeno-donor pigs may have an increased tendency toward septicaemias, reduced fertility and/or impaired vision. The transgenic animal models used in generating hypotheses about the welfare of xeno-donor pigs can also assist in the testing of these hypotheses. To ensure high levels of welfare of transgenic animals, analogous risk assessments can be used to identify potential welfare problems during the early stages of the generation of new transgenic animals. Such assessments may form part of the basis on which licenses to generate new transgenic animals are granted to research groups.
The WelFur project aims to develop on-farm welfare assessment protocols for farmed foxes (the blue fox [Vulpes lagopus] and the silver fox [Vulpes vulpes]) and mink (Neovison vison). The WelFur protocols are based on Welfare Quality® (WQ) principles and criteria. Here, we describe the WelFur protocols after two years of developmental work. Reviews for each of the 12 WQ welfare criteria were written for foxes and mink to identify the welfare measures that have been used in scientific studies. The reviews formed the basis for potential measures to be included in the WelFur protocols. All measures were evaluated for their validity, reliability and feasibility. At present, we have identified 15 fox and 9 mink animal-based (or outcome-based) welfare measures, and 11 and 13 input-based (resource-based or management-based) measures, respectively. For both foxes and mink, each of the four WQ principles is judged by at least one criterion, and seven of the 12 criteria include animal-based measures. The protocols will be piloted in 2012. Using the WQ project and protocols as a model has been a fruitful approach in developing the WelFur protocols. The WelFur protocols will provide benchmarks against which the welfare of animals on European fur farms can be assessed.
Heat stress can have severe deleterious effects on embryo development and survival. The present study evaluated whether CSF2 can protect the developmental competence of the bovine embryo following exposure to a heat shock of 41°C at the zygote and morula stages. In the first experiment, putative zygotes and 2-cell embryos were assigned to receive either 10 ng/ml CSF2 or vehicle, cultured for 15 h at either 38.5°C or 41°C, and then cultured at 38.5°C until day 7.5. Heat shock reduced blastocyst development for embryos treated with vehicle but not for embryos cultured with CSF2. In the second experiment, day 5 embryos (morulae) were treated with CSF2 or vehicle, cultured for 15 h at either 38.5°C or 41°C, and then cultured at 38.5°C until day 7.5. At this stage, temperature treatment did not affect development to the blastocyst stage, and there was no effect of CSF2 treatment or of the treatment × temperature interaction. These results indicate that CSF2 can reduce the deleterious effects of heat shock at the zygote or 2-cell stage, when the embryo is transcriptionally inactive.
Recent estimates suggest that 40% of dementia cases could be avoided by treating recognised cardiovascular risk factors such as hypertension, diabetes, smoking and physical inactivity. Whether diet is associated with dementia remains largely unknown. We tested whether low adherence to established dietary guidelines is associated with elevated lipids and lipoproteins and with increased risk of Alzheimer's disease and non-Alzheimer's dementia – a dementia subtype with a high frequency of cardiovascular risk factors.
Methods
We used the prospective Copenhagen General Population Study, including 94,184 individuals with dietary information who were free of dementia at baseline. Mean age at study entry was 58 years; 55% (N = 51,720) were women and 45% (N = 42,464) were men. Adherence to dietary guidelines was grouped into low, intermediate and high adherence based on food frequency questionnaires. Main outcomes were non-Alzheimer's dementia and Alzheimer's disease.
Results
Low-density lipoprotein cholesterol, non-high-density lipoprotein cholesterol and plasma triglyceride levels were higher in individuals with intermediate and low adherence to dietary guidelines than in individuals with high adherence (all p for trend <0.001). Age- and sex-adjusted hazard ratios (HRs) for non-Alzheimer's dementia, relative to individuals with high adherence, were 1.19 (95% confidence interval 0.97–1.46) for intermediate adherence and 1.54 (1.18–2.00) for low adherence. Corresponding HRs in multivariable-adjusted models including APOE genotype were 1.14 (0.92–1.40) and 1.35 (1.03–1.79). These relationships were not observed in individuals on lipid-lowering therapy.
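The hazard ratios above come from Cox proportional hazards regression; the sketch below shows the general form of such a model using the lifelines package. The variable names and synthetic cohort are illustrative, not the Copenhagen General Population Study's data or full covariate set.

```python
# Toy Cox proportional hazards fit: adherence group as exposure, with age
# and sex as covariates. Event times are synthetic exponentials censored
# at 15 years, constructed so the true HR for low adherence is ~exp(0.4).
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(2)
n = 500
df = pd.DataFrame({
    "age": rng.normal(58, 8, n).round(),
    "sex_male": rng.integers(0, 2, n),
    "low_adherence": rng.integers(0, 2, n),   # vs high adherence
})
hazard = 0.01 * np.exp(0.4 * df["low_adherence"])
t_event = rng.exponential(1.0 / hazard)
df["followup_years"] = np.minimum(t_event, 15.0)   # administrative censoring
df["dementia"] = (t_event <= 15.0).astype(int)     # event indicator

cph = CoxPHFitter()
cph.fit(df, duration_col="followup_years", event_col="dementia")
print(cph.hazard_ratios_["low_adherence"])  # ~1.5 by construction
```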
Conclusions
Low adherence to national dietary guidelines is associated with an atherogenic lipid profile and with increased risk of non-Alzheimer's dementia – the subtype of dementia with a high frequency of vascular risk factors. This study suggests that implementation of dietary guidelines associated with an anti-atherogenic lipid profile could be important for prevention of non-Alzheimer's dementia.
Glacier motion responds dynamically to changing meltwater inputs, but the multi-decadal response of basal sliding to climate remains poorly constrained because of its sensitivity across multiple timescales. Observational records of glacier motion provide critical benchmarks for decoding the processes influencing glacier dynamics, but multi-decadal records that precede satellite observation and modern warming are rare. Here we present a record of motion in the ablation zone of Saskatchewan Glacier that spans seven decades. We combine in situ and remote-sensing observations to inform a first-order glacier flow model used to estimate the relative contributions of sliding and internal deformation to dynamics. We find a significant increase in basal sliding rates between melt seasons in the 1950s and those in the 1990s and 2010s, and we explore three process-based explanations for this anomalous behavior: (i) the glacier surface steepened over the seven decades, maintaining flow-driving stresses despite sustained thinning; (ii) the formation of a proglacial lake after 1955 may support elevated basal water pressures; and (iii) subglacial topography may cause dynamic responses specific to Saskatchewan Glacier. Although further constraints are necessary to ascertain which processes are most important for Saskatchewan Glacier's dynamic evolution, this record provides a benchmark for studies of multi-decadal glacier dynamics.
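As a hedged illustration of how a first-order flow model can partition motion, the sketch below subtracts a shallow-ice estimate of internal deformation from an observed surface velocity and attributes the residual to basal sliding. The Glen-law parameters, geometry, and velocities are nominal textbook values, not Saskatchewan Glacier measurements or the authors' model.

```python
# First-order partition of surface velocity into internal deformation and
# basal sliding, assuming shallow-ice deformation with Glen's flow law
# (n = 3). All parameter values are illustrative.
import numpy as np

RHO, G = 917.0, 9.81           # ice density (kg/m^3), gravity (m/s^2)
A, N = 2.4e-24, 3              # flow-rate factor (Pa^-3 s^-1), Glen exponent
SECONDS_PER_YEAR = 3.15576e7

def deformation_velocity(H, slope_rad):
    """Surface velocity from internal deformation (m/yr) for ice thickness H (m)."""
    tau_b = RHO * G * H * np.sin(slope_rad)      # driving stress (Pa)
    u_d = 2.0 * A / (N + 1) * tau_b**N * H       # m/s at the surface
    return u_d * SECONDS_PER_YEAR

u_surface = 120.0                                # observed surface speed (m/yr)
u_d = deformation_velocity(H=400.0, slope_rad=np.deg2rad(3.0))
u_sliding = max(u_surface - u_d, 0.0)            # residual attributed to sliding
print(f"deformation {u_d:.0f} m/yr, sliding {u_sliding:.0f} m/yr")
```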
Maintenance is essential to keeping production facilities running and safe. However, without an overview of how maintenance affects production, its impact is difficult to assess. This paper introduces modularization of maintenance based on the dimensions of maintenance: physical, action, and process. The approach is applied in a case study in which maintenance decisions become better and faster than they were before modularized maintenance was introduced.
The objectives and scope of a construction project are defined in the early design stage, the fuzzy front-end. This stage is crucial for project risk management and success, but traditional risk management tends to focus on operational risk in later design stages. This action research study leverages co-design methodology and the project management actuality perspective to tailor a risk management process for the fuzzy front-end of construction projects in a large client organization. The co-design process helped enhance stakeholders' perception of the value of the designed solution.
One of the most promising trends in healthcare digitalisation is the personalisation and individualisation of therapy based on virtual representations of the human body through Human Digital Twins (HDTs). Despite the growing number of articles on HDTs, to date no consensus exists on how to design such systems. A systematic literature review on designing HDTs for behaviour-changing therapy and rehabilitation yielded eight key design considerations across four themes: regulatory and ethical, transparency and trust, dynamism and flexibility, and behaviour and cognitive mechanisms.