The extensive clearing and modification of forests by anthropogenic activities is a major driver of biodiversity loss. Declines of common species are especially concerning because of the potentially large cascading effects they might have on ecosystems. Regrowth of secondary forests may help reverse population declines by restoring habitats to conditions similar to those prior to land conversion, but the value of these secondary forests to fauna is not well understood. We compared the abundance of a direct-developing terrestrial frog, Craugastor stejnegerianus, in riparian and upland habitats of pasture, secondary forest, and mature forest sites. Mean abundance per transect was lower in upland pasture than in mature forest. Secondary forest had abundance similar to mature forest regardless of age. We show that conversion of forest habitat to pasture represents a conservation threat to this species. However, riparian buffers help mitigate the negative effect of converting forest to pasture, and regrowth of secondary forest is an effective management strategy for restoring the abundance of this common leaf-litter species.
Obesity in adolescents with intellectual and developmental disabilities occurs at twice the frequency seen in their typically developing peers. Typically developing adolescents with obesity have abnormal cardiac function (as measured by strain echocardiography) and cardiac mass, but the effects of obesity on cardiac health in adolescents with Down syndrome or autism spectrum disorder are unknown. The purpose of this study was to evaluate the impact of body mass index on cardiac function in adolescents with Down syndrome or autism.
Methods:
Adolescents (age 12–21 years) with Down syndrome (n = 28), autism (n = 33), and age-/sex-matched typically developing controls (n = 15) received an echocardiogram optimised for strain analysis at a single timepoint. Measures of ventricular function, mass, and size were collected. Regression modelling evaluated the impact of body mass index and intellectual and developmental disabilities diagnosis on these cardiac measures.
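To make the modelling concrete, here is a minimal sketch (not the authors' code) of how one such regression could be specified in Python with statsmodels; the file name and column names (bmi_z, diagnosis, lv_longitudinal_strain) are hypothetical.

```python
# Illustrative sketch only (not the authors' code): one of the regression models
# described above, fit with statsmodels. The file and column names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("strain_echo.csv")  # one row per participant (hypothetical file)

# Left ventricular longitudinal strain modelled on BMI z-score, with diagnosis
# group (Down syndrome / autism / typically developing) as a categorical covariate.
model = smf.ols("lv_longitudinal_strain ~ bmi_z + C(diagnosis)", data=df).fit()
print(model.summary())  # the coefficient on bmi_z is the association of interest
```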
Results:
In regression modelling, an elevated body mass index z-score was associated with diminished systolic biventricular function by global strain (left ventricular longitudinal strain β 0.87, P < 0.001; left ventricular circumferential strain β 0.57, P = 0.003; right ventricular longitudinal strain β 0.63, P < 0.001). Diminished left ventricular diastolic function by early diastolic strain rate was also associated with elevated body mass index (global longitudinal early diastolic strain rate β −0.7, P < 0.001). No association was found between traditional (non-strain) measures of systolic and diastolic ventricular function and body mass index z-score.
Conclusions:
Obesity in adolescents with Down syndrome or autism negatively impacts cardiac function as measured by echocardiographic strain analysis; this effect was not detected by traditional parameters.
OBJECTIVES/GOALS: Individuals with intellectual and developmental disabilities (IDD) have lower levels of moderate-to-vigorous physical activity (MVPA) and a greater risk for sedentary-related comorbidities compared to their typically developing peers. Understanding activity patterns may provide opportunities for targeted physical activity interventions. METHODS/STUDY POPULATION: Secondary analyses were performed on baseline accelerometer data pooled from 2 clinical trials and a pilot study in adolescents (11-17 years) and young adults (18-21 years) with IDD. MVPA was assessed using accelerometers worn on the non-dominant hip during waking hours over 7 consecutive days. Data were collected at 60 Hz and activity counts were aggregated over 60-second epochs. Wear time was determined with the Choi algorithm and MVPA was classified using the Troiano adult or Freedson age-specific child cut-points. Mixed effects linear regressions were used to determine the effects of day of the week, time of day, and season on MVPA. Diagnosis, gender, and age were used as fixed-effect covariates, with random intercepts varying among participants and days of observation within each participant. RESULTS/ANTICIPATED RESULTS: There were 231 individuals (15.6 ± 2.8 years, 51.5% female) with IDD (36.8% autism, 48.1% Down syndrome) contributing 22,498 minutes of MVPA. Individuals with IDD wore the accelerometers an average of 592 ± 254 min/day and completed 13.5 ± 17.9 min/day of MVPA. Average MVPA was lower in individuals with autism (12.6 ± 11.4 min/day) and Down syndrome (13.2 ± 9.3 min/day) than in those with other IDDs (16.8 ± 10.8 min/day). Participation in MVPA was similar in males (13.4 ± 10.7 min/day) and females (13.7 ± 9.9 min/day). Mixed effects linear regressions showed that individuals participated in fewer minutes of MVPA on the weekend (β = -0.75, p < 0.001), and in more minutes before 12 pm (β = 0.87, p < 0.001) and from 3-7 pm (β = 0.66, p = 0.007) than from 12-3 pm (the reference period). No significant seasonal effects were found. DISCUSSION/SIGNIFICANCE: Individuals with IDD were significantly less active on the weekend but participated in more minutes of MVPA in the morning and late afternoon/early evening. Physical activity interventions aiming to increase MVPA on the weekend and during the early afternoon may increase the number of weekly minutes of MVPA in individuals with IDD.
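As a rough illustration of the count-based classification described above, the following Python sketch assumes counts are already aggregated to 60-second epochs with non-wear time removed (e.g., via the Choi algorithm); the 2020 counts/min value is the widely cited Troiano adult threshold, and the exact cut-points applied in the study may differ.

```python
# Minimal sketch of cut-point MVPA classification under the assumptions above.
import pandas as pd

TROIANO_ADULT_MVPA = 2020  # activity counts per minute (common adult threshold)

def mvpa_minutes_per_day(epochs: pd.DataFrame) -> pd.Series:
    """epochs: one row per worn 60 s epoch, with columns ['date', 'counts']."""
    is_mvpa = epochs["counts"] >= TROIANO_ADULT_MVPA
    # Each epoch is one minute, so summing the boolean flags per day
    # yields minutes of MVPA per day.
    return is_mvpa.groupby(epochs["date"]).sum()
```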
The purpose of this study was to assess the impact of different volumes of exercise, as well as cumulative moderate-to-vigorous physical activity (MVPA), on energy intake (EI) and diet quality, as assessed by the Healthy Eating Index-2010 (HEI-2010), across a 12-month weight maintenance intervention. Participants were asked to attend group behavioural sessions, eat a diet designed for weight maintenance and exercise either 150, 225 or 300 min/week. Dietary intake was assessed by 3-d food records, and MVPA was assessed by accelerometry. Two hundred and twenty-four participants (42·5 years of age, 82 % female) provided valid dietary data for at least one time point. There was no evidence of group differences in EI, total HEI-2010 score or any of the HEI-2010 component scores (all P > 0·05). After adjusting for age, sex, time, group and group-by-time interactions, there was an effect of cumulative MVPA on EI (1·08, P = 0·04), total HEI-2010 score (–0·02, P = 0·003), Na (–0·006, P = 0·002) and empty energy scores (–0·007, P = 0·004). There was evidence of a small relationship between cumulative daily EI and weight (β: 0·00187, 95 % CI 0·001, P = 0·003). However, there was no evidence for a relationship between HEI total score (β: −0·006, 95 % CI −0·07, 0·06) or component scores (all P > 0·05) and change in weight across time. The results of this study suggest that increased cumulative MVPA is associated with clinically insignificant increases in EI and decreases in HEI.
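A hedged sketch of the kind of adjusted model described here (a mixed model with a random intercept per participant), using statsmodels; all file and column names are assumptions for illustration, not the authors' code.

```python
# Sketch of an adjusted mixed model under the assumptions stated above.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("weight_maintenance.csv")  # hypothetical long-format data

# Energy intake modelled on cumulative MVPA, adjusting for age, sex, time, group
# and the group-by-time interaction ("time * group" expands to both main effects
# plus their interaction).
m = smf.mixedlm("energy_intake ~ cum_mvpa + age + sex + time * group",
                data=df, groups=df["participant_id"]).fit()
print(m.summary())
```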
Smoking-related diseases (e.g., lung cancer) are the leading cause of mortality among people living with HIV (PLWH). While many PLWH who smoke report a desire to quit, a majority of them have low readiness to quit. This study used logistic and linear regression to examine the relations among two (continuous vs. binary) measures of readiness to quit, smoking cessation self-efficacy (SE), quality of life (QoL), and perceived vulnerability (PV), using baseline data from 100 PLWH who smoke who participated in a clinical trial. Results showed no significant main effects (SE, QoL, and PV) or interaction effects (SE × QoL and SE × PV) on the continuous measure of readiness to quit. However, a follow-up analysis revealed that SE had a curvilinear effect on readiness to quit: self-efficacy was positively associated with readiness to quit except at the highest levels of self-efficacy, where readiness to quit declined. Greater SE significantly increased the likelihood of reporting readiness to quit (yes/no) among those with low QoL or high PV. For PLWH who smoke, improving self-efficacy may increase readiness to quit, especially among those with lower quality of life. Psychoeducation tailored to PLWH, designed to reduce unrealistic perceptions of invulnerability to smoking-related diseases, along with interventions that target self-efficacy, may improve readiness to quit.
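The regression structure described above (main effects, interactions, and a quadratic follow-up) might be specified as follows; this is an illustrative sketch with assumed column names, not the study's analysis code.

```python
# Sketch of the three model forms described above, with hypothetical columns.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("plwh_smokers.csv")  # hypothetical baseline data, n = 100

# "se * qol" expands to se + qol + se:qol, giving main effects and interactions.
linear = smf.ols("readiness ~ se * qol + se * pv", data=df).fit()

# Quadratic term for the curvilinear follow-up analysis.
curvilinear = smf.ols("readiness ~ se + I(se ** 2)", data=df).fit()

# Binary (yes/no) readiness coded 0/1 for the logistic model.
logistic = smf.logit("ready_yes_no ~ se * qol + se * pv", data=df).fit()
print(logistic.summary())
```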
Despite the scientific evidence, most families of people with schizophrenia in Europe never receive a carer education programme. We evaluated whether a carer education course delivered by telepsychiatry was as effective as a carer education course delivered in situ.
Method
We delivered the carer education course for schizophrenia simultaneously to a carers group in rural north west Ireland (remote) via three ISDN lines and live to a carers group in a city (host). We compared knowledge gains using the Knowledge Questionnaire before and after each course.
Results
Fifty-six carers of people with schizophrenia participated in the trial. At baseline, participants at the remote and host centres did not differ in terms of knowledge about schizophrenia. After the course, carers at both centres improved significantly and the knowledge gains of the two groups were equivalent at 6 weeks.
Conclusion
Telepsychiatry can deliver effective carer education programmes about schizophrenia and may provide one solution to bridging the chasm between scientific evidence and clinical reality.
Sepsis – a syndrome of infection complicated by organ dysfunction – is responsible for over 750 000 hospitalisations and 200 000 deaths in the USA annually. Despite potential nutritional benefits, the association of diet and sepsis is unknown. Therefore, we sought to determine the association between adherence to a Mediterranean-style diet (Med-style diet) and long-term risk of sepsis in the REasons for Geographic And Racial Differences in Stroke (REGARDS) cohort. We analysed data from REGARDS, a population-based cohort of 30 239 community-dwelling adults aged ≥45 years. We determined dietary patterns from a baseline FFQ. We defined Med-style diet as high consumption of fruit, vegetables, legumes, fish and cereal and low consumption of meat, dairy products, fat and alcohol, categorising participants into Med-style diet tertiles (low: 0–3, moderate: 4–5, high: 6–9). We defined sepsis events as hospital admission for serious infection plus at least two systemic inflammatory response syndrome criteria. We used Cox proportional hazards models to determine the association between Med-style diet tertiles and first sepsis events, adjusting for socio-demographics, lifestyle factors and co-morbidities. We included 21 256 participants with complete dietary data. Dietary patterns were: low Med-style diet 32·0 %, moderate Med-style diet 42·1 % and high Med-style diet 26·0 %. There were 1109 (5·2 %) first sepsis events. High Med-style diet was independently associated with sepsis risk; with low Med-style diet as referent, moderate Med-style diet adjusted hazard ratio (HR) 0·93 (95 % CI 0·81, 1·08), high Med-style diet adjusted HR 0·74 (95 % CI 0·61, 0·88). High Med-style diet adherence is associated with lower risk of sepsis. Dietary modification may potentially provide an option for reducing sepsis risk.
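For readers who want to see the analytic shape, here is a minimal sketch of the tertile coding and Cox model using the lifelines package; the score construction and column names are assumptions, not the authors' code.

```python
# Sketch of tertile coding plus a Cox proportional hazards fit (lifelines).
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("regards_diet.csv")  # hypothetical analytic file

# Med-style diet score 0-9 grouped as in the text: low 0-3, moderate 4-5, high 6-9.
df["med_tertile"] = pd.cut(df["med_score"], bins=[-1, 3, 5, 9],
                           labels=["low", "moderate", "high"])

# Dummy-code the tertiles with "low" as the referent (drop_first).
model_df = pd.get_dummies(df[["time_to_sepsis", "sepsis_event", "med_tertile"]],
                          columns=["med_tertile"], drop_first=True, dtype=float)

cph = CoxPHFitter()
cph.fit(model_df, duration_col="time_to_sepsis", event_col="sepsis_event")
cph.print_summary()  # hazard ratios for moderate and high vs. the low tertile
```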
Introduction: In the past few years, there has been an increase in awareness of the challenge of managing work-related stress in EMS. Extant research has linked different types of chronic and critical incident stress to stress reactions such as posttraumatic stress. However, there is no tool to capture the transactional stresses associated with the day-to-day provision of service (e.g., dealing with offload delays or mandatory overtime) and with interacting with allied professions (e.g., emergency department staff) or allied agencies (e.g., law enforcement). The purpose of this study was to develop and validate a measure which captures transactional stresses in paramedics. Methods: An online survey was conducted with ten Canadian Paramedic Services with a 40.5% response rate (n = 717). Factor analysis was used to identify variation in responses related to the latent factor of transactional stress. The scale was validated using both exploratory and confirmatory factor analyses. Results: The sample of transactional stress questions was split to allow for multiple analyses (EFA n = 360 / CFA n = 357). In the exploratory factor analysis, principal axis factoring with an oblique rotation revealed a two-factor, twelve-item solution (KMO = .832, χ2 = 1440.19, df = 66, p < .001). Confirmatory factor analysis also endorsed a two-factor, twelve-item solution (χ2 = 130.39, df = 51, p < .001, CFI = .95, TLI = .93, RMSEA = .07, SRMR = .06). Results supported two six-item factors that captured transactional stress in the provision of service. The factors clearly aligned with transactional stress issues internal to the ambulance and transactional stress relationships external to the ambulance. Both subscales demonstrated good internal reliability (α = .843 / α = .768) and were correlated (p < .01) with a convergent validity measure. Conclusion: This study validated a two-factor scale which captures stress associated with the day-to-day provision of EMS and the interaction with allied professions. The development of this measure of transactional stresses further expands the potential for paramedics, Paramedic Services, employers, and prehospital physicians to understand the dynamics that influence provider health and safety. As a result, there may be greater opportunities to intervene holistically to improve paramedic health and well-being.
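A minimal sketch of the exploratory step using the factor_analyzer Python package, mirroring the principal-axis factoring and oblique rotation reported above; the item data and file names are hypothetical.

```python
# Illustrative EFA sketch under the assumptions stated above.
import pandas as pd
from factor_analyzer import FactorAnalyzer
from factor_analyzer.factor_analyzer import calculate_kmo

items = pd.read_csv("transactional_stress_items.csv")  # EFA split (n = 360)

kmo_per_item, kmo_overall = calculate_kmo(items)  # sampling adequacy (KMO)
print(f"Overall KMO: {kmo_overall:.3f}")

# Principal-axis factoring with an oblique (oblimin) rotation, two factors.
efa = FactorAnalyzer(n_factors=2, rotation="oblimin", method="principal")
efa.fit(items)
print(efa.loadings_)  # inspect the two-factor, twelve-item pattern
```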
Sericea lespedeza (Lespedeza cuneata (Dumont) G. Don, var. Serala) was planted in three field experiments at three seeding rates with four rates of S-propyl dipropylthiocarbamate (vernolate). Grass and broadleaf weed populations were reduced, but sericea was unaffected by vernolate incorporated preplant. Use of a herbicide permitted a reduction in seeding rate with no decrease in sericea stand or forage yield. This treatment permitted harvesting a cutting of hay the establishment year with only a minor effect on production the next year. The beneficial effect of vernolate on sericea yield continued the second year, mainly because sericea plants were better established and more vigorous. Sericea seeding rate did not affect weed competition or forage yield. A seeding rate of 11 kg/ha with 3.4 kg/ha vernolate was the most practical treatment.
Long-term sediment and ground-penetrating radar data from Davis Pond, a small lake near the Hudson River valley, reveal past droughts in a historically humid region that presently supplies water to millions of people in and around New York City. A minimum of eleven sandy paleoshoreline deposits in the lake date from 13.4 to 0.6 cal ka BP. The deposits span 1500 to 200 yr between bracketing radiocarbon ages, and intrude into lacustrine silts up to 9.0 m below the modern lake surface in a transect of six sediment cores. Three low stands, ca. 13.4–10.9, 9.2 and 8.2 cal ka BP, indicate low regional moisture balance when low temperatures affected the North Atlantic region. Consistent with insolation trends, water levels rose from ca. 8.0 cal ka BP to the present, but five low stands interrupted the rise and are likely associated with ocean–atmosphere interactions. Similar to evidence from other studies, the data from Davis Pond indicate repeated multi-century periods of prolonged or frequent droughts superimposed on long-term regional trends toward high water levels. The patterns indicate that water supplies in this heavily populated region have continuously varied at multiple time scales and confirm that humid regions such as the northeastern United States are more prone to severe drought than historically expected.
The MIAMI* facility at the University of Huddersfield is one of a number of facilities worldwide that permit the ion irradiation of thin foils in situ in a transmission electron microscope. MIAMI has been developed with a particular focus on enabling the in situ implantation of helium and hydrogen into thin electron-transparent foils, necessitating ion energies in the range 1–10 keV. In addition, however, ions of a variety of species can be provided at energies of up to 100 keV (for singly charged ions), enabling studies to focus on the build-up of radiation damage in the absence or presence of implanted gas.
This paper reports on a number of ongoing studies being carried out at MIAMI, and also at JANNuS (Orsay, France) and the IVEM/Ion Accelerator Facility (Argonne National Laboratory, US). These include recent work on He bubbles in SiC and Cu: the former study concerns the modification of bubble populations by ion and electron beams, and the latter the formation of bubble superlattices in metals.
A study is also presented consisting of experiments aimed at shedding light on the origins of the dimensional changes known to occur in nuclear graphite under irradiation with either neutrons or ions. Single crystal graphite foils have been irradiated with 60 keV Xe ions in order to create a non-uniform damage profile throughout the foil thickness. This gives rise to varying basal-plane contraction throughout the foil resulting in almost macroscopic (micron scale) deformation of the graphite. These observations are presented and discussed with a view to reconciling them with current understanding of point defect behavior in graphite.
*Microscope and Ion Accelerator for Materials Investigations
This study aimed to determine whether there was a difference in skin permeability to methylene blue dye or skin morphology between dairy cows that differed in their susceptibility to digital dermatitis (DD) and to assess the effect of contact with slurry on skin permeability. Twenty-nine dairy cows were monitored for DD during the winter housing period and classed as DD+ (previous DD infection, n = 17), or DD− (no recorded infection, n = 12). The animals were culled and a skin sample was taken from above the heel of each hind foot and frozen. Samples were later defrosted and one sample from each cow was tested for permeability, whereas the other was treated with slurry for 24 h before permeability testing. To test permeability, methylene blue dye was applied to the skin surface in a Franz diffusion cell. After 48 h, the amount of dye that had passed through the skin was estimated. The stratum corneum thickness and the density of hair follicles were determined from additional heel skin samples. Skin permeability to methylene blue dye was significantly greater for samples that had been treated with slurry but did not differ between DD+ and DD− animals. No difference was found in the stratum corneum thickness or density of hair follicles between DD+ and DD− animals. These findings imply that individual differences in general skin permeability are not a major factor in determining DD susceptibility and suggest that contact with slurry could contribute to DD infection by increasing the permeability of the skin, which may facilitate pathogen entry. Further work is required to clarify the role played by slurry in the pathogenesis of DD.
Conversion of natural habitats to anthropogenic land uses is a primary cause of amphibian declines in species-rich tropical regions. However, agricultural lands are frequently used by a subset of forest-associated species, and the habitat value of a given land use is likely modified by the presence and characteristics of remnant trees. Here we used mark–recapture methods to examine abundances and movement probability of the poison frog, Oophaga pumilio, at individual trees in forest-fragment edges and adjacent pastures in north-eastern Costa Rica. One hundred and forty-seven trees were surveyed at three replicate sites that each included a forest fragment and adjacent pasture. Trees were sampled at distances of ≤30 m into forest and ≤150 m into pastures for Oophaga pumilio, and local environmental characteristics were measured at each tree. We also measured indices of physical condition (size and endurance) of frogs captured in forest edges and in nearby pastures. Analyses of 167 marked individuals showed no difference in per-tree abundances or sex ratios between pasture and forest edges. We found significant interactions between habitat type and leaf-litter cover, tree dbh and number of logs, indicating greater influence of local variables on abundances in pastures. Movement among trees was infrequent and not predicted by sex, size, habitat type or environmental variables. While results of endurance tests did not differ for individuals from the two habitats, frogs captured in pastures were, on average, larger than frogs captured in forest edges. These data indicate that remnant trees are important habitat features for O. pumilio in pastures and corroborate research in other systems that suggests that large relictual trees should be retained to maximize the potential for altered landscapes to provide habitat for native species.
From Section 2 - Adaptation, speciation and extinction
By
A. Donnelly, Trinity College Dublin, Ireland,
A. Caffarra, Istituto Agrario San Michele all'Adige, Italy,
E. Diskin, Trinity College Dublin, Ireland,
C. T. Kelleher, National Botanic Gardens, Glasnevin, Dublin, Ireland,
A. Pletsers, Trinity College Dublin, Ireland,
H. Proctor, Trinity College Dublin, Ireland,
R. Stirnemann, Trinity College Dublin, Ireland,
M. B. Jones, Trinity College Dublin, Ireland,
J. O'Halloran, University College Cork, Ireland,
B. F. O'Neill, Trinity College Dublin, Ireland,
J. Peñuelas, Campus Universitat Autònoma de Barcelona, Spain,
T. Sparks, Technische Universität München, Germany, Institute of Zoology, Poznań University of Life Sciences, Poland, and University of Cambridge, UK
The impact of climate change, in particular increasing spring temperatures, on life-cycle events of plants and animals has gained scientific attention in recent years. Leafing of trees, appearance and abundance of insects, and migration of birds, across a range of species and countries, have been cited as phenological trends that are advancing in response to warmer spring temperatures. The ability of organisms to acclimate to variations in environmental conditions is known as phenotypic plasticity. Plasticity allows organisms to time developmental stages to coincide with optimum availability of environmental resources. There may, however, come a time when the limit of this plasticity is reached and the species needs to adapt genetically to survive. Here we discuss evidence of the impact of climate warming on plant, insect and bird phenology through examination of: (1) phenotypic plasticity in (a) bud burst in trees, (b) appearance of insects and (c) migration of birds; and (2) genetic adaptation in (a) gene expression during bud burst in trees, (b) the timing of occurrence of phenological events in insects and (c) arrival and breeding times of migratory birds. Finally, we summarise the potential consequences of future climatic changes for plant, insect and bird phenology.
Introduction
The recent resurgence of interest in phenology (the timing of recurring life-cycle events in plants and animals) has stemmed from research on the impact of climate change, in particular, global warming.
Objective:
To report a rare condition affecting the temporal bone. Immunoglobulin G4 related systemic sclerosing disease is a recently described autoimmune condition with manifestations typically involving the pancreas, biliary system, salivary glands, lungs, kidneys and prostate. Histologically, it is characterised by T-cell infiltration, fibrosis and numerous immunoglobulin G4-positive plasma cells. This condition previously fell under the umbrella diagnosis of inflammatory pseudotumour and inflammatory myofibroblastic tumour.
Case report:
We present the case of a 58-year-old woman with multiple inflammatory masses involving the pharynx, gall bladder, lungs, pelvis, omentum, eyes and left temporal bone, over a seven-year period. We describe this patient's unusual clinical course and pathological features, which resulted in a change of diagnosis from metastatic inflammatory myofibroblastic tumour to immunoglobulin G4 related systemic sclerosing disease. We also review the literature regarding the management of inflammatory pseudotumours of the temporal bone, and how this differs from the management of immunoglobulin G4 related systemic sclerosing disease.
Conclusion:
We would recommend a full review of all histological specimens in patients with a diagnosis of temporal bone inflammatory pseudotumour or inflammatory myofibroblastic tumour. Consideration should be given to immunohistochemical analysis for anaplastic lymphoma kinase and immunoglobulin G4, with measurement of serum levels of the latter. Management of the condition is medical, with corticosteroids and immunosuppression, rather than surgical excision.
The purpose of the present study was to determine the dietary predictors of visceral adipose tissue (VAT) area in overweight young adults. A total of 109 young adults (fifty males and fifty-nine females) ate ad libitum in a university cafeteria for 14 d. All food and beverages consumed in the cafeteria were measured using observer-recorded weighed plate waste. Food consumption outside the cafeteria (i.e. snacks) was assessed by multiple-pass 24 h recall procedures. VAT was determined using computed tomography. Stepwise regression demonstrated that the best predictor of visceral adiposity in women was total dietary fat (P ≤ 0·05). In men, the model for predicting visceral adiposity included Ca and total dietary fat. We concluded that total dietary fat is the best predictor of VAT area in both men and women. While this relationship was independent in women, in men there was a synergistic relationship between dietary fat consumption and Ca consumption in predicting VAT.
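Stepwise regression of the kind used here can be sketched as a simple forward-selection loop; the implementation below is illustrative (p-value based forward selection), and the dietary column names are hypothetical.

```python
# Illustrative forward stepwise selection; real analyses may use different
# entry/exit criteria.
import pandas as pd
import statsmodels.api as sm

def forward_stepwise(df: pd.DataFrame, outcome: str, candidates: list,
                     alpha: float = 0.05) -> list:
    """Add the predictor with the smallest p-value until none clears alpha."""
    selected = []
    while True:
        pvals = {}
        for var in candidates:
            if var in selected:
                continue
            X = sm.add_constant(df[selected + [var]])
            pvals[var] = sm.OLS(df[outcome], X).fit().pvalues[var]
        if not pvals:
            break  # all candidates already selected
        best = min(pvals, key=pvals.get)
        if pvals[best] >= alpha:
            break  # no remaining predictor is significant
        selected.append(best)
    return selected

# e.g. forward_stepwise(df, "vat_area", ["total_fat", "calcium", "protein"])
```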
Various measures of skeletal size were made on five or six occasions on 30 Dorset Horn and 30 Corriedale sheep (10 entire males, 10 females and 10 castrated males) commencing at 1 month of age (live weight 10 kg) and then at increments of 10 kg until 55 kg. After weaning at 6–7 weeks, they were fed ad libitum on a high-quality diet. The data sets for each sheep were analysed separately and, where appropriate, pooled equations for sex and breed were generated. Within both breeds, males had the widest shoulders at any given age and, within the Corriedales, males had the deepest chests. Dorset Horns grew faster than Corriedales and, except for leg length, were larger and heavier at corresponding ages.
At any given live weight, there was no difference between sexes within breeds and the breeds had similar chest depths. The Corriedales had longer legs and smaller shoulders than the Dorset Horns at all weights and, when heavier than 30 kg, were also larger in other body dimensions.