Domestic abuse – abusive behaviour perpetrated by an adult towards another adult to whom they are personally connected (e.g. partners, ex-partners or family members) – damages mental health, increases mental health service use and challenges clinical management. Training and guidance for mental health professionals on identifying and responding to patients exposed to domestic abuse are available, but there has been less development of resources for mental health professionals in identifying, assessing and responding to perpetrators of domestic abuse. In this article, we describe a framework for responding to domestic abuse perpetration in clinical settings in general adult mental health services, aimed at improving practice. This could support mental health professionals in sensitive enquiry and assessment for domestic abuse perpetration, and guide appropriate responses, as part of routine training and continuing professional development.
Little is known about the early history of the chicken (Gallus gallus domesticus), including the timing and circumstances of its introduction into new cultural environments. To evaluate its spatio-temporal spread across Eurasia and north-west Africa, the authors radiocarbon dated 23 chicken bones from presumed early contexts. Three-quarters returned dates later than those suggested by stratigraphy, indicating the importance of direct dating. The results indicate that chickens did not arrive in Europe until the first millennium BC. Moreover, a consistent time-lag between the introduction of chickens and their consumption by humans suggests that these animals were initially regarded as exotica and only several centuries later recognised as a source of ‘food’.
Effective incident management is essential for coordinating efforts of multiple disciplines and stakeholders when responding to emergencies, including public health disasters such as the ongoing coronavirus disease 2019 (COVID-19) pandemic.
Methods:
Existing research frameworks tend to focus on formal structures and doctrine (eg, ICS-NIMS); however, organizational processes that underlie incident management have not been systematically assessed and synthesized into a coherent conceptual framework.
Results:
The lack of a framework has hindered the development of measures of performance that could be used to further develop the evidence base and facilitate process improvement. To address this gap, we present a conceptual framework of incident management drawn from expert feedback and a review of literature on incident management and related fields. The framework features 23 measurement constructs grouped into 5 domains: (1) situational awareness and information sharing, (2) incident action and implementation planning, (3) resource management and mobilization, (4) coordination and collaboration, and (5) feedback and continuous quality improvement.
Conclusions:
As such, the article provides a first step toward the development of robust measures for assessing the performance and effectiveness of incident management systems.
Growing evidence indicates that vitamin D deficiency is associated with psychotic symptoms. Although evidence suggesting a causal relationship is limited, theories regarding neuro-inflammatory modulation are promising. Alternatively, deficiency may signify chronic illness or poor functioning. Nevertheless, vitamin D levels below 50 nmol/L increase the risk of osteoporosis, muscle weakness, falls and fractures, so identification and treatment are important.
The association between vitamin D levels and symptom severity in patients within the Tameside Early Intervention in Psychosis Team (EIT) was studied, with the hypothesis that the two would be strongly correlated.
Method
The records of all patients in the EIT as of 01/07/2020 who were aged over 16 years (n = 183) were studied. The first vitamin D level taken while under the EIT and the CGI score closest to the date of this level were recorded. Vitamin D levels of 25 nmol/L and under were classified as deficient; levels of 25.1-50 nmol/L as insufficient.
Result
45.90% (n = 84) of patients did not have their levels recorded. Of the 54.10% (n = 99) who had vitamin D levels recorded, 49.50% (n = 49) were insufficient and 22.22% (n = 22) were deficient. Therefore, only 28.28% (n = 28) had either optimal or sufficient vitamin D levels. The majority of vitamin D levels were taken in autumn (36.36%, n = 36).
75.76% (n = 75) of patients had both vitamin D levels and CGI scores recorded, with an average of 35.65 days between the date the level was taken and the date the score was recorded. A weak negative correlation between overall CGI scores and vitamin D levels was found, with a Spearman rank correlation coefficient of -0.15.
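For readers wanting to reproduce the calculation, here is a minimal sketch in Python using invented data; the thresholds are those given in the Method, while scipy's spearmanr stands in for whatever software was actually used.

```python
# Hypothetical sketch: classify vitamin D levels and correlate them with
# CGI scores. All data values below are invented for illustration.
from scipy.stats import spearmanr

def classify_vitamin_d(level_nmol_l: float) -> str:
    """Apply the study's thresholds: <=25 deficient, 25.1-50 insufficient."""
    if level_nmol_l <= 25.0:
        return "deficient"
    if level_nmol_l <= 50.0:
        return "insufficient"
    return "sufficient"

# Invented paired values: vitamin D levels (nmol/L) and CGI severity scores.
vitamin_d = [18.0, 32.5, 47.0, 55.0, 61.0, 24.0, 39.0]
cgi_scores = [5, 4, 4, 3, 2, 5, 3]

print([classify_vitamin_d(v) for v in vitamin_d])
rho, p_value = spearmanr(vitamin_d, cgi_scores)  # rank-based correlation
print(f"Spearman rho = {rho:.2f} (p = {p_value:.3f})")
```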
Conclusion
Almost three-quarters of the studied patients being assessed for psychotic symptoms had either insufficient or deficient levels of vitamin D. However, the correlation between symptom severity and vitamin D level was weak. While we cannot comment on the causality of the relationship, there does appear to be an association between this patient group and vitamin D insufficiency.
The evidence that supplementation can reduce psychotic symptoms is limited; however, supplementation can reduce the risk of osteoporosis and falls, and would therefore improve patient care. Only 54% of the patients within the EIT had their vitamin D levels tested. As a result of this study, the authors recommend that all patients in the EIT have their vitamin D levels tested as part of their psychosis assessment.
The study is limited by the small number of patients studied and by the fact that CGI scores were often recorded at a later date than the vitamin D levels.
Few studies have derived data-driven dietary patterns in youth in the USA. This study examined data-driven dietary patterns and their associations with BMI measures in predominantly low-income, racial/ethnic minority US youth. Data were from baseline assessments of the four Childhood Obesity Prevention and Treatment Research (COPTR) Consortium trials: NET-Works (534 2–4-year-olds), GROW (610 3–5-year-olds), GOALS (241 7–11-year-olds) and IMPACT (360 10–13-year-olds). Weight and height were measured. Children/adult proxies completed three 24-h dietary recalls. Dietary patterns were derived for each site from twenty-four food/beverage groups using k-means cluster analysis. Multivariable linear regression models examined associations of dietary patterns with BMI and percentage of the 95th BMI percentile. Healthy (produce and whole grains) and Unhealthy (fried food, savoury snacks and desserts) patterns were found in NET-Works and GROW. GROW additionally had a dairy- and sugar-sweetened beverage-based pattern. GOALS had a similar Healthy pattern and a pattern resembling a traditional Mexican diet. Associations between dietary patterns and BMI were only observed in IMPACT. In IMPACT, youth in the Sandwich (cold cuts, refined grains, cheese and miscellaneous) compared with Mixed (whole grains and desserts) cluster had significantly higher BMI (β = 0·99 (95 % CI 0·01, 1·97)) and percentage of the 95th BMI percentile (β = 4·17 (95 % CI 0·11, 8·24)). Healthy and Unhealthy patterns were the most common dietary patterns in COPTR youth, but diets may differ according to age, race/ethnicity or geographic location. Public health messages focused on healthy dietary substitutions may help youth mimic a dietary pattern associated with lower BMI.
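As an illustration of the pattern-derivation step described above, the sketch below clusters invented intake data with k-means; the food groups, sample size and k = 2 are placeholders rather than the study's actual inputs (the study used twenty-four food/beverage groups and derived patterns separately per site).

```python
# Hypothetical sketch of deriving dietary patterns via k-means clustering.
# Rows are children; columns are intakes of food/beverage groups
# (4 invented groups shown here for brevity).
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
# Invented intakes: servings/day of [produce, whole grains, fried food, desserts]
intakes = rng.gamma(shape=2.0, scale=1.0, size=(100, 4))

X = StandardScaler().fit_transform(intakes)  # put groups on a common scale
kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)

# Cluster centres indicate which food groups characterise each pattern,
# e.g. a "Healthy" cluster loading on produce and whole grains.
print(kmeans.cluster_centers_)
print(np.bincount(kmeans.labels_))  # number of children in each pattern
```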
As colleges and universities respond to the COVID-19 outbreak, many in the media call it unprecedented. This is not the first time that institutions of higher education have had to respond to an epidemic, however. A historical review of college and university reactions to illnesses such as yellow fever and the 1918 influenza pandemic provides prior examples of institutional responses to epidemic diseases.
The utility of questionnaire-based self-report measures for non-clinical psychotic symptoms is unclear, and there are few reliable data about the nature and prevalence of these phenomena in children. The study aimed to investigate psychosis-like symptoms (PLIKS) in children using both self-report measures and semi-structured observer-rated assessments.
Methods:
This was a cross-sectional study, set in an assessment clinic for members of the ALSPAC birth cohort in Bristol, UK. In total, 6455 respondents were assessed over 21 months (mean age 12.9 years). The main outcome measure was 12 self-report screening questions for psychotic symptoms, followed by semi-structured observer-rated assessments conducted by trained psychology graduates. The assessment instrument used stem questions, glossary definitions, and rating rules adapted from DISC-IV and SCAN items.
Results:
The 6-month period prevalence for one or more PLIKS rated by self-report questions was 38.9% (95% CI = 37.7-40.1). Prevalence using observer-rated assessments was 13.7% (95% CI = 12.8-14.5). Positive predictive values for the screen questions versus observer-rated scores were low, except for auditory hallucinations (PPV = 70%; 95% CI = 67.1-74.2). The most frequent observer-rated symptom was auditory hallucinations (7.3%); in 18.8% of these cases symptoms occurred weekly or more often. The prevalence of DSM-IV ‘core’ schizophrenia symptoms was 3.62%. Rates were significantly higher in children with low socio-economic status.
Conclusions:
With the exception of auditory hallucinations, self-rated questionnaires are likely to substantially over-estimate the frequency of PLIKS in 12-year-old children. However, more reliable observer-rated assessments reveal that PLIKS occur in a significant proportion of children.
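For reference, the positive predictive value reported in the Results is the standard screening statistic; in this setting a true positive is a child whose self-reported symptom was confirmed by the observer-rated assessment.

```latex
\[
  \mathrm{PPV}
    = \frac{\mathrm{TP}}{\mathrm{TP} + \mathrm{FP}}
    = P\!\left(\text{observer-rated symptom} \mid \text{positive screen}\right)
\]
```

A low PPV, as found for most screen questions here, means that most screen-positive children were not confirmed on the more detailed assessment, which is why self-report questionnaires over-estimate prevalence.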
To describe snacking characteristics and patterns in children and examine associations with diet quality and BMI.
Design:
Children’s weight and height were measured. Participants/adult proxies completed multiple 24 h dietary recalls. Snack occasions were self-identified. Snack patterns were derived for each sample using exploratory factor analysis. Associations of snacking characteristics and patterns with Healthy Eating Index-2010 (HEI-2010) score and BMI were examined using multivariable linear regression models.
Setting:
Childhood Obesity Prevention and Treatment Research (COPTR) Consortium, USA: NET-Works, GROW, GOALS and IMPACT studies.
Results:
Two snack patterns were derived for three studies: a meal-like pattern and a beverage pattern. The IMPACT study had a similar meal-like pattern and a dairy/grains pattern. A positive association was observed between meal-like pattern adherence and HEI-2010 score (P for trend < 0·01) and between snack occasion frequency and HEI-2010 score (β coefficient (95 % CI): NET-Works, 0·14 (0·04, 0·23); GROW, 0·12 (0·02, 0·21)) among younger children. A preference for snacking while using a screen was inversely associated with HEI-2010 score in all studies except IMPACT (β coefficient (95 % CI): NET-Works, −3·15 (−5·37, −0·92); GROW, −2·44 (−4·27, −0·61); GOALS, −5·80 (−8·74, −2·86)). Associations with BMI were almost all null.
Conclusions:
Meal-like and beverage patterns described most children’s snack intake, although patterns for non-Hispanic Blacks or adolescents may differ. Diets of 2–5-year-olds may benefit from frequent meal-like pattern snack consumption and diets of all children may benefit from decreasing screen use during eating occasions.
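A minimal sketch of the factor-analytic step described above, using invented snack data; scikit-learn's FactorAnalysis with varimax rotation is one way to run an exploratory factor analysis, though the abstract does not say which software or rotation the authors used.

```python
# Hypothetical sketch of deriving snack patterns via exploratory factor analysis.
# Rows are children; columns are invented snack food/beverage groups.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(1)
# Invented data: daily snack servings of [bread, cold cuts, cheese, juice, soda]
snacks = rng.poisson(lam=1.5, size=(200, 5)).astype(float)

fa = FactorAnalysis(n_components=2, rotation="varimax").fit(snacks)

# Loadings show which snack groups define each pattern, e.g. a "meal-like"
# factor loading on bread/cold cuts/cheese and a "beverage" factor on juice/soda.
print(fa.components_)
scores = fa.transform(snacks)  # per-child adherence to each pattern
print(scores[:5])
```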
The pine bark adelgid, Pineus strobi (Hartig) (Hemiptera: Adelgidae), is an herbivore native to eastern North America that specialises on eastern white pine, Pinus strobus Linnaeus (Pinaceae). Little is known about P. strobi, especially in its southern range in the Appalachian Mountains, United States of America, and the composition of its predator complex has not yet been documented in this region. The current study identifies arthropod predators associated with P. strobi in Appalachian forests of Virginia, based on a two-year survey. Predators were identified using morphology and DNA barcoding. Predator species include the known adelgid specialists Laricobius rubidus LeConte (Coleoptera: Derodontidae), Leucopis piniperda Malloch (Diptera: Chamaemyiidae) and Leucopis argenticollis Zetterstedt (Diptera: Chamaemyiidae). Also found were predators from the families Cecidomyiidae (Diptera), Coccinellidae (Coleoptera), Chrysopidae (Neuroptera), Hemerobiidae (Neuroptera) and Syrphidae (Diptera). The Cecidomyiidae were especially diverse, with 14 different species inferred from their DNA barcodes. Knowledge of this predator complex is particularly valuable for anticipating and detecting potential interactions between native predator species and those being considered for introduction as biological control agents of invasive adelgid pests within the southern Appalachian ecosystem.
A controversy at the 2016 IUCN World Conservation Congress on the topic of closing domestic ivory markets (the 007, or so-called James Bond, motion) has given rise to a debate on IUCN's value proposition. A cross-section of authors who are engaged in IUCN but not employed by the organization, and with diverse perspectives and opinions, here argue for the importance of safeguarding and strengthening the unique technical and convening roles of IUCN, providing examples of what has and has not worked. Recommendations for protecting and enhancing IUCN's contribution to global conservation debates and policy formulation are given.
The challenges faced in the analysis of high-throughput sequencing data are discussed so frequently that the issues have become palpable stereotypes. Phrases such as ‘data deluge’, ‘hockey stick graph’ and ‘bioinformatics bottleneck’ are ubiquitous to the point of spawning an internet bingo card of overused sound bites for audiences to check off during seminars (http://bit.ly/wYNxrF). Yet for all the discussion of these challenges, the dialogue about potential solutions is either neglected or wildly speculative. In the sequencing world, the game is changing and no one knows how to make the next play. Computational pipelines progress slowly compared with the pace of sequencing technology, with each new platform requiring updated iterations of code and new empirical tests of error rates and data formats.
In spite of the myriad challenges left to surmount, high-throughput sequencing has already transformed and accelerated the pace of biodiversity research. Our current bioinformatic capabilities have been hard-won: characterizing and grappling with fundamentally different sequencing chemistries and order-of-magnitude increases in file size have required substantial initial investments. The infancy of high-throughput fields means that the current biological insights are rudimentary compared to the sophisticated, complex analyses that will become available over the next decade. Yet by simply investigating ecosystems from a new perspective (genome-scale and community-level exploration, versus the narrower genetic and taxonomic questions previously necessitated by lower-throughput Sanger sequencing), we have instantly gained a transformative view of biodiversity and ecological processes. These fledgling insights are already unprecedented, and the steadily increasing breadth of computational tools continues to widen our capacity for integrative data analysis.
The birth and death of sequencing technologies
Researchers impact sequencing technology almost as much as sequencing technology drives research. The platform currently in vogue may quickly fall out of fashion when a better (and cheaper) option hits the market. Biomedical applications drive the market and design for sequencers, with many large-scale sequencing centres focusing their resources on clinical applications (BGI@UCDavis, the Broad Institute), or species of agricultural or economic importance (BGI's facilities in China). Although many ‘megasequencing’ projects focused on biodiversity are now underway (Table 7.1), more fundamental and blue-skies research questions are inherently at the mercy of the technology and protocols favoured across biomedical fields. The dominance of BGI and the falling cost of sequencing are also prompting a reshuffling of long-term visions for many core facilities.
Sauropod dinosaurs were the largest terrestrial animals and their growth rates remain a subject of debate. By counting growth lines in histologic sections and relating bone length to body mass, it has been estimated that Apatosaurus attained its adult body mass of about 25,000 kg in as little as 15 years, with a maximum growth rate over 5000 kg/yr. This rate exceeds that projected for a precocial bird or eutherian mammal of comparable estimated body mass. An alternative method of estimating limb length and body mass for each growth line, and fitting the resulting age/mass data to the von Bertalanffy growth equation, yields a revised growth curve suggesting that Apatosaurus adult mass was reached by 70 years with a maximum growth rate of 520 kg/yr. This alternative method for growth rate determination can also be applied to histological studies of other sauropods. At only about half the mass of Apatosaurus, Janenschia took between 20 and 30 years to attain its adult size (over 14,000 kg). This result is supported by independent evidence of estimated bone apposition rates. Despite having an adult body mass greater than Apatosaurus, the titanosaurid Alamosaurus attained a mass over 32,000 kg within 45 years and a maximum growth rate of 1000 kg/yr. Titanosaurids may have been the fastest growing of all sauropods. Even so, sauropod growth rate estimates produced using the von Bertalanffy equation fall between those projected for reptiles and those for precocial birds of equivalent projected body mass. These results are comparable to those found for smaller dinosaurs, and suggest that sauropods grew at rates similar to other dinosaurs in spite of their great size.
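For context, the von Bertalanffy growth equation referred to above is usually written in the following textbook form; the abstract does not give the authors' exact parameterisation, so the symbols here (asymptotic adult mass, growth constant and nominal age at zero mass) are the standard ones rather than theirs.

```latex
% Standard von Bertalanffy growth curve fitted to age/mass data,
% with M_inf the asymptotic adult mass, k the growth-rate constant
% and t_0 the nominal age at which mass is zero:
\[
  M(t) = M_{\infty}\left(1 - e^{-k\,(t - t_{0})}\right)
\]
% Differentiating gives the instantaneous growth rate, which is greatest
% early in life and decays as the animal approaches adult mass:
\[
  \frac{dM}{dt} = k\left(M_{\infty} - M(t)\right)
\]
```

Maximum growth rates such as those quoted above correspond to the peak slope of the fitted curve, which under this form occurs early in growth.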
The impact of oligofructose (OF) intake on stool frequency has not been clearly substantiated, while significant gastrointestinal (GI) symptoms have been reported in some individuals. The aim of the present study was to determine the effects of OF on stool frequency and GI symptoms in healthy adults. In an 8-week, randomised, double-blind, parallel-arm study, ninety-eight participants were provided with 16 g OF in yogurt and snack bars (twenty male and thirty female) or matching control foods (seventeen male and thirty-one female), to incorporate, by replacement, into their usual diets. Participants completed a daily online questionnaire recording stool frequency and rating four symptoms: bloating, flatulence, abdominal cramping and noise, each on a Likert scale from ‘0’ for none (no symptoms) to ‘6’ for very severe, with a maximum symptom intensity score of 24 (sum of severities from all four symptoms). Online 24 h dietary recalls were completed during pre-baseline and weeks 4, 6 and 8 to determine fibre intake. When provided with OF foods, fibre intake increased to 24·3 (sem 0·5) g/d from pre-baseline (12·1 (sem 0·5) g/d; P < 0·001). Stool frequency increased with OF from 1·3 (sem 0·2) to 1·8 (sem 0·2) stools per d in males and 1·0 (sem 0·1) to 1·4 (sem 0·1) stools per d in females during intervention weeks compared with pre-baseline (P < 0·05), but did not change for control participants (males: 1·6 (sem 0·2) to 1·8 (sem 0·2); females: 1·3 (sem 0·1) to 1·4 (sem 0·1)). Flatulence was the most commonly reported symptom. Mean GI symptom intensity score was higher for the OF group (3·2 (sem 0·3)) v. control (1·7 (sem 0·1)) (P < 0·01), with few participants reporting above moderate symptoms. No change in symptom intensity occurred over time. Consuming yogurt and snack bars with 16 g OF improves regularity in young healthy adults. However, GI symptoms resulting from an increase in oligofructose intake may not diminish with time.
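The symptom intensity score described above (four symptoms, each rated 0-6, summed to a maximum of 24) is straightforward to compute; the function below is a hypothetical sketch of that scoring, not the study's actual code.

```python
# Hypothetical sketch of the symptom-intensity scoring described above:
# four GI symptoms each rated 0 (none) to 6 (very severe), summed to max 24.
SYMPTOMS = ("bloating", "flatulence", "abdominal cramping", "abdominal noise")

def intensity_score(ratings: dict) -> int:
    """Sum the four Likert severities; reject ratings outside the 0-6 scale."""
    total = 0
    for symptom in SYMPTOMS:
        rating = ratings[symptom]
        if not 0 <= rating <= 6:
            raise ValueError(f"{symptom} rating {rating} outside 0-6 scale")
        total += rating
    return total

print(intensity_score({"bloating": 1, "flatulence": 2,
                       "abdominal cramping": 0, "abdominal noise": 1}))  # 4
```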
Antimicrobial stewardship programs (ASPs) are a mechanism to ensure the appropriate use of antimicrobials. The extent to which ASPs are formally implemented in freestanding children's hospitals is unknown. The objective of this study was to determine the prevalence and characteristics of ASPs in freestanding children's hospitals.
Methods.
We conducted an electronic survey of 42 freestanding children's hospitals that are members of the Children's Hospital Association to determine the presence and characteristics of their ASPs. For hospitals without an ASP, we determined whether stewardship strategies were in place and whether there were barriers to implementing a formal ASP.
Results.
We received responses from 38 (91%) of 42. Among responding institutions, 16 (38%) had a formal ASP, and 15 (36%) were in the process of implementing a program. Most ASPs (13 [81%] of 16) were started after 2007. The median number of full-time equivalents dedicated to ASPs was 0.63 (range, 0.1–1.8). The most common antimicrobials monitored by ASPs were linezolid, vancomycin, and carbapenems. Many hospitals without a formal ASP were performing stewardship activities, including elements of prospective audit and feedback (9 [41%] of 22), formulary restriction (9 [41%] of 22), and use of clinical guidelines (17 [77%] of 22). Antimicrobial outcomes were more likely to be monitored by hospitals with ASPs (100% vs 68%; P = .01), although only 1 program provided support for a data analyst.
Conclusions.
Most freestanding children's hospitals have implemented or are developing an ASP. These programs differ in structure and function, and more data are needed to identify program characteristics that have the greatest impact.
To examine the use of vitamin D supplements during infancy among the participants in an international infant feeding trial.
Design
Longitudinal study.
Setting
Information about vitamin D supplementation was collected through a validated FFQ at the age of 2 weeks and monthly between the ages of 1 month and 6 months.
Subjects
Infants (n 2159) with a biological family member affected by type 1 diabetes and with increased human leucocyte antigen-conferred susceptibility to type 1 diabetes from twelve European countries, the USA, Canada and Australia.
Results
Daily use of vitamin D supplements was common during the first 6 months of life in Northern and Central Europe (>80 % of the infants), with somewhat lower rates observed in Southern Europe (>60 %). In Canada, vitamin D supplementation was more common among exclusively breast-fed infants than among other infants (e.g. 71 % v. 44 % at 6 months of age). Less than 2 % of infants in the USA and Australia received any vitamin D supplementation. Across the whole study, higher gestational age, older maternal age and longer maternal education were associated with greater use of vitamin D supplements.
Conclusions
In the European countries, most of the infants received vitamin D supplements during the first 6 months of life, whereas in Canada only half did, and in the USA and Australia very few were given supplementation.
In a widely cited 2003 article, DiMasi, Hansen, and Grabowski estimated the cost of pharmaceutical research and development to be $1.1 billion (year 2000 U.S. dollars) per new medicine coming onto the market in 2001. They also estimated that this cost was rising at a real (inflation-adjusted) rate of 7.4% annually. According to these estimates, the innovation cost per new medicine today is about $2.1 billion (year 2000 U.S. dollars) or $2.65 billion (year 2010 U.S. dollars).
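The update from $1.1 billion to roughly $2.1 billion is simple compound growth and can be checked with a short sketch; the nine-year horizon and the 2000-to-2010 inflation factor of about 1.26 below are back-calculated assumptions, since the excerpt does not state them.

```python
# Back-of-the-envelope check of the cost escalation described above.
base_cost_2000_usd = 1.1   # $ billions per new medicine (2001 approvals)
real_growth = 0.074        # 7.4% annual real growth
years = 9                  # assumed gap between 2001 and "today"

cost_today_2000_usd = base_cost_2000_usd * (1 + real_growth) ** years
print(f"~${cost_today_2000_usd:.2f}B in year-2000 dollars")  # ~ $2.09B

cpi_2000_to_2010 = 2.65 / 2.1  # factor implied by the quoted figures, ~1.26
print(f"~${cost_today_2000_usd * cpi_2000_to_2010:.2f}B in year-2010 dollars")
```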
Historically, measurement of gastrointestinal transit time has required collection and X-raying of faecal samples for up to 7 d after swallowing radio-opaque markers, a tedious, labour-intensive technique for both subjects and investigators. Recently, a wireless motility capsule (SmartPill®), which uses gut pH, pressure and temperature to measure transit time, has been developed. This device, however, has not been validated with dietary interventions. Therefore, we conducted a controlled cross-over trial to determine whether the device could detect a significant difference in transit time after ten healthy subjects (five men and five women) consumed 9 g of wheat bran (WB) or an equal-volume, low-fibre control for 3 d. A paired t test was used to determine differences in transit times. Colonic transit time decreased by 10·8 (sd 6·6) h (P = 0·006) on the WB treatment. Whole-gut transit time also decreased, by 8·9 (sd 5·4) h (P = 0·02), after the consumption of WB. Gastric emptying time and small-bowel transit time did not differ between treatments. Despite encouraging results, the present study had several limitations, including short duration, lack of randomisation and unusable data due to delayed gastric emptying of the capsule. With minimal participant burden, the SmartPill technology appears to be a potentially useful tool for assessing transit time after a dietary intervention. This technology could be considered for digestive studies with novel fibres and other ingredients that are promoted for gut health.
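The paired t test mentioned above can be sketched as follows with invented transit times; scipy.stats.ttest_rel implements the paired comparison, and the numbers are illustrative only.

```python
# Hypothetical sketch of the paired t test on crossover transit-time data.
from scipy.stats import ttest_rel

# Invented colonic transit times (hours) for the same 10 subjects
# under the control and wheat-bran (WB) treatments.
control_h = [38.0, 45.5, 29.0, 52.0, 41.0, 36.5, 48.0, 33.0, 44.0, 39.5]
wheat_bran_h = [27.5, 36.0, 20.0, 40.5, 30.0, 25.0, 38.5, 22.0, 33.5, 28.0]

t_stat, p_value = ttest_rel(control_h, wheat_bran_h)  # paired comparison
diffs = [c - w for c, w in zip(control_h, wheat_bran_h)]
print(f"mean decrease = {sum(diffs) / len(diffs):.1f} h, p = {p_value:.3f}")
```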
Palmore, Nowlin, and Wang (1985) posit a theoretical framework with which to predict the health status of older adults over time as a function of social, economic, physical and mental health variables. The current study provides the first independent, empirical test of this model, based upon longitudinal data derived from a representative sample of older British Columbians. The results of this study support the basic structure of the Palmore model; however, various revisions were required to improve the fit of the data. Findings indicate that demographic and socio-economic variables have a direct effect upon functional status at baseline. Results further suggest significant consistency in the functional status of older adults over a two-year interval. Directions for future research, as well as limitations of the current study, are discussed.