Individuals present in lower Manhattan during the 9/11 World Trade Center (WTC) disaster suffered significant physical and psychological trauma. Studies of longitudinal psychological distress among those exposed to trauma have been limited to relatively short follow-up periods and small samples.
Methods
The current study longitudinally assessed heterogeneity in trajectories of psychological distress among WTC Health Registry enrollees – a prospective cohort health study of responders, students, employees, passersby, and residents in the affected area (N = 30 839) – throughout a 15-year period following the WTC disaster. Rescue/recovery status and exposure to traumatic events of 9/11, as well as sociodemographic factors and health status, were assessed as risk factors for trajectories of psychological distress.
Results
Five psychological distress trajectory groups were found: none-stable, low-stable, moderate-increasing, moderate-decreasing, and high-stable. Of the study sample, 78.2% were classified as belonging to the none-stable or low-stable groups. Female sex, younger age at the time of 9/11, and lower education and income were associated with a higher probability of being in a greater distress trajectory group relative to the none-stable group. Greater exposure to traumatic events of 9/11 was associated with a higher probability of a greater distress trajectory, and community members (passersby, residents, and employees) were more likely to be in greater distress trajectory groups – especially in the moderate-increasing [odds ratio (OR) 2.31 (1.97–2.72)] and high-stable groups [OR 2.37 (1.81–3.09)] – compared to the none-stable group.
Conclusions
The current study illustrated the heterogeneity in psychological distress trajectories following the 9/11 WTC disaster, and identified potential avenues for intervention in future disasters.
The present study aims to measure the association between household food insecurity and psychological distress in adolescents in Inuit communities, both concurrently and over time from childhood to adolescence.
Design:
The study used measures of internalising behaviours (anxiety, withdrawn attitude, somatic complaints and depression) as indicators of psychological distress during adolescence, a concurrent measure of household food insecurity in adolescence and an assessment of longitudinal patterns of household food insecurity from childhood to adolescence. We collected descriptive information at birth, childhood and adolescence on potential confounders.
Setting:
Inuit communities of Nunavik in northern Quebec, Canada.
Participants:
The study consisted of 212 participants from the Nunavik Child Development Study who were assessed at birth, in childhood (mean age = 11 years, range = 9–13 years) and in adolescence (mean age = 18 years, range = 16–21 years).
Results:
Concurrent severe household food insecurity in adolescence was associated with higher measures of psychological distress: depression (βstd = 0·26, P < 0·01) and withdrawn attitude (βstd = 0·20, P = 0·04). Persistent household food insecurity (both at childhood and adolescence) was associated with higher levels of adolescent depression (βstd = 0·18, P = 0·02) and anxiety (βstd = 0·17, P = 0·03).
Conclusions:
Adolescents from Nunavik living with higher food insecurity and those having experienced food insecurity in both childhood and adolescence were more likely to report symptoms of psychological distress. Considering the high level of distress experienced by young Inuit, existing initiatives to reduce food insecurity in Nunavik communities should be targeted to include children and adolescents.
Adherence to practice guidelines for diagnosing and treating attention-deficit/hyperactivity disorder (ADHD) by primary care providers (PCPs) is important for optimizing care for many children and youth. However, adherence is often low. To address this problem, we implemented an intensive intervention in 2009 aimed at improving diagnosis and management of ADHD among PCPs.
Objectives
The study objective is to assess the sustainability of intervention-attributable outcomes.
Aims
The study aims are to assess the sustained effect of the intervention on PCP intentions to implement, attitudes toward, and obstacles to implementing ADHD practice guidelines.
Methods
During November 2009, 48 PCPs from 31 clinical practices completed a 3-day training, 6 months of biweekly telephone peer group reinforcement, and baseline questionnaires; follow-up questionnaires were completed at 12 months. To assess sustainability, we tracked PCPs and administered the questionnaire in 2016.
Results
Intentions to implement ADHD guidelines remained stable over seven years, with all mean values ranging from “probably will” to “definitely will” implement guidelines.
Conclusions
Generally, the favorable self-reported intentions and attitudes toward implementing ADHD guidelines, and the reported obstacles to implementation (see Exhibits 1 & 2), were sustained seven years after the intensive training and follow-up intervention.
Disclosure of interest
The authors have not supplied their declaration of competing interest.
The last shot of Season One of Les Revenants/The Returned (Canal+, 2012–15) finds the hit French television show's nameless mountain town suddenly, mysteriously, catastrophically flooded. Water has rushed into a place that, for the past eight episodes, has been the site of another unruly invasion: of the dead, les revenants, come back to life. They are not violent. They just want to resume the lives they lost months or years earlier. By the beginning of Season Two, the army has come to take control, managing the double catastrophe caused by a dam that cannot hold back water and the town's returned dead.
When the show first aired, it brought the French past itself into the present, for the image of a mountain community in the Haute-Savoie threatened by water, almost vacant, and under state control recalls similar historical events of more than 50 years earlier. In March 1952, the valley containing the mountain village of Tignes was flooded to make way for a new hydroelectric dam on the Isère River. Four hundred CRS were called in to remove resisting residents and dynamite their homes to prevent them from returning. These actions were the culmination of one of the most contentious and visible battles between state technocrats and the opponents – one might say victims – of modernisation that animated the early years of France's Trente Glorieuses (Fourastié 2000; Frost 1985). Although never explicitly identified, this is the same dam that appears throughout Les Revenants. Its return to French screens underpins a broader allegory about state-sponsored modernisation and its legacy, a once-‘glorious’ project returned as a story of failure and unintended consequences.
For critics and industry professionals, the returned dead of the show represented another level of allegory: the return of French dramatic television to a level of quality it had not enjoyed for decades. Les Revenants was one of the first French shows to garner public interest and critical accolades during what critics have dubbed ‘the golden age of television’. Its first season attracted record audiences and critical praise in France before being picked up for broadcast by Swedish, British, Dutch, and American channels and winning a 2013 International Emmy for Best Dramatic Series.
Many observers have argued that the US health care system could be more efficient and achieve better outcomes if providers focused more on improving the community's health, not just the welfare of individual patients. The passage of the Affordable Care Act (ACA) in 2010 seemed to herald the promise of such reforms and greater integration of the health care and public health systems. In this article, we reassess the quest for integration, a quest we call the “integration project.” After examining the modest steps taken so far toward integration, we consider some conceptual barriers to integration as well as some of the risks it might portend for the public health system. Our assessment contributes to wide-ranging debates among public health advocates, practitioners, and policymakers as to the health care system's role in protecting public health, and how the public health system can best be organized to meet expected population health challenges.
Objectives: Attention-deficit/hyperactivity disorder (ADHD) is a common neurological disorder with symptom onset early in childhood. Growing evidence suggests anomalous brain development across multiple brain regions is evident in school-aged children; however, few studies have examined whether such differences are notable in the preschool years, when symptom onset typically occurs. Methods: High-resolution anatomical (MPRAGE) images and cognitive and behavioral measures were analyzed in a total of 90 medication-naïve preschoolers, ages 4–5 years (52 with ADHD, 38 controls; 64.4% boys). Results: Analyses revealed reductions in bilateral frontal, parietal, and temporal lobe gray matter volumes in children with ADHD relative to typically developing children, with the largest effect sizes noted for right frontal and left temporal lobe volumes. Examining frontal lobe sub-regions, the largest between-group effect sizes were evident for left orbitofrontal cortex, left primary motor cortex (M1), and left supplementary motor complex (SMC). ADHD-related reductions in specific sub-regions (left prefrontal, left premotor, left frontal eye field, left M1, and right SMC) were significantly correlated with symptom severity, such that higher ratings of hyperactive/impulsive symptoms were associated with reduced cortical volumes. Conclusions: These findings represent the first comprehensive examination of cortical volume in preschool children with ADHD, providing evidence that anomalous brain structure in ADHD is evident very early in development. Furthermore, the findings set the stage for developing our understanding of the way in which developmental trajectories of anomalous brain development are associated with the unfolding of symptoms in childhood ADHD. (JINS, 2018, 24, 531–539)
An adverse early life environment can increase the risk of metabolic and other disorders later in life. Genetic variation can modify an individual’s susceptibility to these environmental challenges. These gene-by-environment interactions are important, but difficult, to dissect. The nucleus is the primary organelle where environmental responses impact directly on the genetic variants within the genome, resulting in changes to the biology of the genome and ultimately the phenotype. Understanding genome biology requires the integration of the linear DNA sequence, epigenetic modifications and nuclear proteins that are present within the nucleus. The interactions between these layers of information may be captured in the emergent spatial genome organization. As such, genome organization represents a key research area for decoding the role of genetic variation in the Developmental Origins of Health and Disease.
Objectives: Sleep quality affects memory and executive function in older adults, but little is known about its effects in midlife. If it affects cognition in midlife, it may be a modifiable factor for later-life functioning. Methods: We examined the association between sleep quality and cognition in 1220 middle-aged male twins (age 51–60 years) from the Vietnam Era Twin Study of Aging. We interviewed participants with the Pittsburgh Sleep Quality Index and tested them for episodic memory as well as the executive functions of inhibitory and interference control, updating in working memory, and set shifting. Interference control was assessed during episodic memory; inhibitory control during working memory and non-memory conditions; and set shifting during working memory and non-memory conditions. Results: After adjusting for covariates and correcting for multiple comparisons, sleep quality was positively associated with updating in working memory, set shifting in the context of working memory, and better visual-spatial (but not verbal) episodic memory, and, at trend level, with interference control in the context of episodic memory. Conclusions: Sleep quality was associated with visual-spatial recall and possible resistance to proactive/retroactive interference. It was also associated with updating in working memory and with set shifting, but only when working memory demands were relatively high. Thus, effects of sleep quality on midlife cognition appear to be at the intersection of executive function and memory processes. Subtle deficits in these age-susceptible cognitive functions may indicate increased risk for decline in cognitive abilities later in life that might be reduced by improved midlife sleep quality. (JINS, 2018, 24, 67–76)
The composition, magnetic characteristics, and pollen content of sediments from two small kettle lakes near the prairie-forest border in west-central Minnesota were used to infer changes in terrestrial vegetation, shoreline erosion, eolian inputs, carbonate deposition, and aquatic productivity. Several important changes in sediment stratigraphy coincide closely with changes in the terrestrial pollen assemblage. Accumulation rates of organic, inorganic, and carbonate sediment fractions as well as the concentration of magnetic particulates increased abruptly as prairie replaced pine forest, and decreased gradually throughout the subsequent transitions to oak scrub, and later, mixed woodland. Results suggest that by its mediation of wind exposure, local vegetation may affect (1) influx of eolian particulates, (2) erosion of shorelines, (3) water circulation, and (4) carbonate equilibria. In addition, low rates of accumulation of organic matter and increased humification during periods of conifer vegetation suggest that humic matter leached from forest soils may influence lake-water chemistry and reduce productivity. Changes in lake level caused by differential rates of evapotranspiration of the various vegetation types (conifer forest, deciduous hardwood forest, brush scrub, and prairie) could not be detected. Sediment-forming processes can therefore be altered by changes in local terrestrial vegetation through several mechanisms that are independent of changes in lake level.
Each of us has written about the importance of reframing the debate over public health paternalism. Our individual explorations of the many and varied paths forward from libertarian “nanny state” objections to the “new public health” have been intimately informed by collaboration. This article represents a summary of our current thinking — reflecting the ground gained through many fruitful exchanges and charting future collaborative efforts.
Our starting point is that law is a vitally important determinant of population health, and the interplay among law, social norms, cultural beliefs, health behaviors, and healthy living conditions is complex. Anti-paternalists’ efforts to limit the scope of public health law to controlling only the proximal determinants of infectious diseases are utterly unjustifiable in the face of so much preventable death, disability, and disparity. Equally important, the anti-paternalism push is deeply counter-majoritarian and undemocratic, threatening to disable communities from undertaking measures to improve their own well-being.
A total of 207 wild rodents were caught on nine pig farms, five chicken farms and five non-farm locations in Sweden and surveyed for a selection of bacteria, parasites and viruses. Lawsonia intracellularis and pathogenic Yersinia enterocolitica were detected only in rodents on pig farms (9% and 8% prevalence, respectively), which indicates that these agents are more likely to be transmitted to rodents from pigs or the environment on infected farms. Brachyspira hyodysenteriae (1%), Brachyspira intermedia (2%), Campylobacter jejuni (4%), Campylobacter upsaliensis (2%), leptospires (7%) and encephalomyocarditis virus (9%) were also detected in rodents not in contact with farm animals. Giardia and Cryptosporidium spp. were common, although no zoonotic types were verified, and Salmonella enterica was isolated from 1/11 mice on one farm but was not detected by PCR in any of the rodents. Trichinella spp. and Toxoplasma gondii were not detected.
As has been discussed throughout this volume, EF is a broad term describing the range of skills required for purposeful, goal-directed activity, socially appropriate conduct, and independent regulation of action and affect. EF skills can be considered a “domain of neurocognitive competence” that sets the stage for learning, academic achievement, and rule-governed behavioral functioning. In practical terms, EF involves developing and implementing an approach to performing a task that has not been habitually performed. When skills become overlearned through practice and thus automatized, they require less executive or “top-down” control.
In performance-based activities, implementation of EF occurs after perception but before action, thus involving a preparedness to respond. The central components of EF are those that facilitate this “pause” between perception and action, allowing for appropriate response preparation. These components include response inhibition, attention regulation, WM, and planning. Expanded definitions of EF also include problem-solving skills, organization of behavior, mental flexibility, set-shifting, and the capacity to delay gratification. These sub-components are considered separable from the specific cognitive domains and modalities in which they are assessed, but are nevertheless crucial to performance and critical for remediation of learning difficulties of all kinds.
The pale western cutworm, Agrotis orthogonia Morr., a pest of crops in the plains areas, occurs from central Alberta and Saskatchewan in Canada southward to various areas of Oklahoma, Texas, and New Mexico in the United States. It has been suggested that in the prepupal stage this cutworm is able to adapt itself to a wide range of climatic and geographic conditions and to retain a univoltine life cycle. The investigations reported here were made to determine the effects of temperature, moisture, and larval weight on the duration of the prepupal and pupal stages.
Cutworms are excellent insects for investigations of physiology, behaviour, and toxicology in the laboratory. Although they are often plentiful and easily obtained in the field during outbreaks, their use as laboratory animals depends upon satisfactory methods of rearing. Cutworms vary in life history and feeding habits, and many species require distinctive rearing techniques.
Methods of rearing various stages of the pale western cutworm, Agrotis orthogonia Morr., in the laboratory have been described (Hocking, 1952; Jacobson, 1952; King and Atkinson, 1927; Lindsay, 1954; Parker et al., 1921; Seamans and McMillan, 1935). From these, a very satisfactory method has been evolved at the Lethbridge laboratory that allows this insect to be reared for several generations, providing a source of experimental material at any stage of development. The rearing may be done very satisfactorily at room temperature but, if desired, the method is adaptable to rearing at various conditions of constant temperature and relative humidity.
When a crop has been destroyed by the pale western cutworm, Agrotis orthogonia Morr., one to two weeks may elapse before plants of the second seeding emerge. During this time the larvae may be starved or, at least, subjected to a suboptimum food supply. Frequently the second seeding may be of a different crop from the one that was destroyed.
A previous investigation (Jacobson, 1952) showed that mortality of the starved larvae varied directly with temperature and inversely with size of larvae. Seamans and McMillan (1935) reported that, when the larvae were fed various foods, differences were found in the rate of development and survival.
The pale western cutworm, Agrotis orthogonia Morr., has been a major pest of cereals in the prairie region of Western Canada since 1911. Present control measures consist of (a) cultural measures that inhibit oviposition in fields being summer-fallowed and (b) starvation of the young larvae by destroying the volunteer plant growth by means of cultivation at a critical period. Cultural operations to minimize oviposition in fields being summer-fallowed have become accepted procedure throughout the area. Control by starvation is practised during outbreak years in fields that have been “stubbled in” or in those tilled during the moth flight. This procedure has been recommended generally throughout potentially infested areas, but more precise methods of evaluating larval numbers are required so that control may be used only where critical populations are present.
Larvae of the pale western cutworm, Agrotis orthogonia Morr., may be starved in the field by planned control (Seamans and Rock, 1945) or elimination of their food supply as a result of their own depredation. Starvation may occur at any time during the feeding period. Some effects of starvation on mortality, particularly in the early instars, have been reported (Jacobson, 1952). Larvae that were fed only two hours each day had an additional instar, developed more slowly, and were smaller (McGinnis and Kasting, 1959). Starvation during the fourth instar when the larvae were fed on various foods resulted in smaller pupae, and the size and fecundity of females were directly associated with pupal size (Jacobson and Blakeley, 1958).
Larvae of the pale western cutworm, Agrotis orthogonia Morr., feed almost entirely below ground, attacking their food plants just below the soil surface. Larval movement and feeding usually occur at the interface between dry and moist soil. They can absorb moisture from the soil and also from the plants on which they are feeding.
In a previous investigation, Jacobson (1952) found that mortality from starvation varied directly with temperature and inversely with the size of larvae when the relative humidity was kept near 100 per cent. This paper is a report on the role of moisture during starvation.