This article examines photographs taken by U.S. Marine Joe O'Donnell, who was tasked with documenting bombed urban centers immediately after the Asia-Pacific War. O'Donnell's photos document not just the physical damage wrought by U.S. bombing raids, but the human suffering as well. While most published images at the time projected the U.S. military's destructive potential through mushroom clouds or razed cities, O'Donnell shifted the visual focus to the struggles of Japanese citizens in the ruins. In doing so, O'Donnell disrupted notions of American superiority by giving voice to those rebuilding their lives in the war's aftermath.
The SDMPH 10-year anniversary conference created an opportunity for researchers presenting at a professional association conference to advance their research by seeking consensus on statements using Delphi methodology.
Methods
Conference attendees and SDMPH members who did not attend the conference were identified as Delphi experts. Presenters submitted statements relevant to advancing their research to the authors, who edited them to fit Delphi statement formatting. Experts rated their agreement with each statement on a 7-point linear numeric scale, and consensus among experts was defined as a standard deviation ≤ 1.
Statements attaining consensus in the first round were included in the final report. Those not attaining consensus moved to a second round, in which experts were shown the mean response of the expert panel alongside their own response and given the opportunity to reconsider their rating. Statements whose reconsidered ratings attained consensus were included in the final report. This process was repeated in a third and final round.
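For illustration, the round logic above can be expressed as a short Python sketch; the consensus rule (standard deviation ≤ 1 on the 7-point scale) is taken from the methods, while the function names and example ratings are hypothetical:

```python
import statistics

CONSENSUS_SD = 1.0  # consensus threshold from the methods: SD <= 1

def attains_consensus(ratings):
    """Return True if the 7-point ratings for one statement meet the SD <= 1 rule.

    The sample standard deviation is assumed here; the methods do not
    specify sample vs. population SD.
    """
    return statistics.stdev(ratings) <= CONSENSUS_SD

def run_round(statements):
    """Split statements into those attaining consensus and those carried forward."""
    accepted, carried = [], []
    for name, ratings in statements.items():
        (accepted if attains_consensus(ratings) else carried).append(name)
    return accepted, carried

# Hypothetical ratings: statement -> one rating per expert on the 7-point scale
ratings = {
    "statement_1": [6, 7, 6, 6, 7, 6],  # SD ~ 0.52 -> consensus, goes to report
    "statement_2": [2, 7, 4, 1, 6, 3],  # SD ~ 2.32 -> carried to the next round
}
print(run_round(ratings))
```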
Results
Thirty-seven experts agreed to participate in the first round; 35 completed the second round and 34 completed the third. Thirty-five statements attained consensus; 3 did not.
Conclusions
A Delphi technique was used to establish expert consensus on statements submitted by SDMPH conference presenters to guide their future education, research, and training.
This article focuses on Bramshill, Hampshire, one of the most important country houses of the early modern period. From 1605 Bramshill was the home of Edward la Zouche (1556–1625), 11th Baron Zouche – a major courtier during the reigns of Elizabeth I and James I – and was reworked in phases up to and just beyond his death. Based on rare surviving documents from the 1630s as well as analysis of the house's fabric, this article reconstructs Bramshill's plan and interiors as they existed in the late Jacobean and Caroline years. It reveals the arrangement of two significant state apartments – for king and queen – and also the lodgings of the owner, his family and guests, as well as service areas. The rich source material additionally allows the furnishings to be analysed. By using Bramshill as an example, the article aims to shed light on English country houses of similar size and date. The early modern period is important as representing an apogee in the history of country house building – reflecting the ambition of owners such as Lord Zouche and the popularity of the royal progress. Visitors to Bramshill in the early seventeenth century included James I, Charles I and Queen Henrietta Maria.
OBJECTIVES/GOALS: To examine the individual and combined associations between preoperative sleep disturbance (SD) and depression and 12-month disability, back pain, and leg pain after lumbar spine surgery (LSS). METHODS/STUDY POPULATION: We analyzed prospectively collected multi-center registry data from 700 patients undergoing LSS (mean age = 60.9 years, 37% female, 89% white). Preoperative SD and depression were assessed with PROMIS measures. Established thresholds defined patients with moderate/severe symptoms. Disability (Oswestry Disability Index) and back and leg pain (Numeric Rating Scales) were assessed preoperatively and at 12 months. We conducted separate regressions to examine the influence of SD and depression on each outcome. Regressions examined each factor with and without accounting for the other, and in combination as a 4-level variable. Covariates included age, sex, race, education, insurance, body mass index, smoking status, preoperative opioid use, fusion status, revision status, and preoperative outcome score. RESULTS/ANTICIPATED RESULTS: One hundred thirteen (17%) patients reported moderate/severe SD alone, 70 (10%) reported moderate/severe depression alone, and 57 (8%) reported both moderate/severe SD and depression. In independent models, preoperative SD and depression were each significantly associated with 12-month outcomes (all ps < 0.05). After accounting for depression, preoperative SD was associated only with disability, while preoperative depression adjusting for SD remained associated with all outcomes (all ps < 0.05). Patients reporting both moderate/severe SD and moderate/severe depression had 12.6 points higher disability (95% CI = 7.4 to 17.8) and 1.5 points higher back pain (95% CI = 0.8 to 2.3) and leg pain (95% CI = 0.7 to 2.3) compared to patients with no/mild SD and no/mild depression. DISCUSSION/SIGNIFICANCE: Preoperative SD and depression are independent predictors of 12-month disability and pain when considered in isolation. The combination of SD and depression impacts postoperative outcomes considerably. The high-risk group of patients with moderate/severe SD and depression could benefit from targeted treatment strategies.
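As a sketch of the modelling approach described in the methods above, the 4-level SD/depression variable and one outcome regression might look as follows in Python; the file, column, and variable names are assumptions for illustration, not the registry's actual fields:

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical analysis dataset (file and column names are illustrative).
df = pd.read_csv("lumbar_registry.csv")

# Combine the two moderate/severe flags into the 4-level variable from the
# abstract: neither, SD only, depression only, or both.
def sd_dep_group(row):
    if row["sd_mod_severe"] and row["dep_mod_severe"]:
        return "both"
    if row["sd_mod_severe"]:
        return "sd_only"
    if row["dep_mod_severe"]:
        return "dep_only"
    return "neither"

df["sd_dep"] = df.apply(sd_dep_group, axis=1)

# 12-month disability regressed on the 4-level group plus covariates
# (only a subset of the listed covariates is shown for brevity).
model = smf.ols(
    "odi_12m ~ C(sd_dep, Treatment('neither')) + age + C(sex) + bmi + odi_baseline",
    data=df,
).fit()
print(model.summary())
```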
Studies on vulnerability to interference have shown promise in distinguishing between normal and pathological aging, such as the early stage of Alzheimer’s disease (AD) or amnestic Mild Cognitive Impairment (aMCI). However, these studies did not include a non-semantic condition, which is essential for distinguishing between what is attributable specifically to semantic memory impairment and what reflects a more generalized vulnerability to interference. The present study aimed to determine whether the increased vulnerability to semantic interference previously observed in individuals at increased risk of AD (aMCI) is specifically associated with the semantic nature of the material, or whether it also affects other types of material, suggesting a more generalized executive and inhibitory impairment.
Participants and Methods:
Seventy-two participants (N = 72), divided into two groups (33 aMCI and 39 NC) matched for age and education, were included in the study. They underwent a comprehensive neuropsychological examination and took the adapted French version of the LASSI-L (a semantic interference test), as well as a homologous experimental phonemic test, the TIP-A. Independent-samples t-tests, mixed ANOVAs, and ANCOVAs on memory and vulnerability-to-interference scores, with Group (NC, aMCI) as the between-subjects factor and Type of material (semantic, phonemic) as the within-subjects factor, were conducted to compare memory and interference in both contexts for both groups.
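A minimal sketch of the mixed ANOVA described above, using the pingouin package; the data file and column names are hypothetical:

```python
import pandas as pd
import pingouin as pg

# Hypothetical long-format data: one row per participant x type of material,
# with columns id, group (NC/aMCI), material (semantic/phonemic), and score.
df = pd.read_csv("interference_scores.csv")

# 2 (Group) x 2 (Type of material) mixed ANOVA: Group between subjects,
# Type of material within subjects.
aov = pg.mixed_anova(data=df, dv="score", within="material",
                     between="group", subject="id")
print(aov.round(3))
```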
Results:
For all memory scores, results revealed a significant main effect of group (NC > aMCI), a significant main effect of type of material (semantic > phonemic), and a significant Group × Type interaction (disproportionately poorer performance in the semantic context for aMCI compared to NC). Word recognition was equivalent in both contexts for aMCI participants, whereas NC participants were better in the semantic context. aMCI participants also committed more phonemic false-recognition errors, were disproportionately more vulnerable to retroactive semantic interference, and showed a disproportionately higher percentage of intrusion errors associated with proactive semantic interference than NC.
Conclusions:
To our knowledge, this is the first study to meticulously compare the vulnerability of aMCI patients and elderly controls to inter-list interference, and its impact on memory processes, in two very similarly designed conditions using different types of material (semantic vs. phonemic). Indeed, many studies on interference have focused solely on intra-list buildup of interference or on semantic material. Taken together, our results suggest that aMCI patients present generalized difficulties in source memory and inhibition, but that their inability to benefit normally from the depth of processing of semantic material results in even more semantic intrusion errors during proactive interference. This superficial semantic processing also significantly impairs the ability of aMCI patients to show good recall after exposure to an interference list and the passage of time, resulting in a greater vulnerability to semantic retroactive interference than controls. In summary, our results suggest that impairment of semantic memory, and more precisely the loss of benefit from the depth of semantic processing, represents the cornerstone of their memory and vulnerability-to-interference patterns. The classical levels-of-processing theory therefore constitutes an ideal, simple framework for predicting aMCI patients’ performance when facing interference, a parallel too rarely addressed in the literature.
Semantic memory deficits have been reported in both Alzheimer's disease (AD) and amnestic mild cognitive impairment (aMCI). However, the nature of this decline is still a matter of debate. The aim of this study was to explore the patterns of semantic memory impairment in aMCI by examining performance on naming tasks, and on tests assessing both general and specific semantic knowledge.
Participants and Methods:
Participants were divided into two groups matched for age and education, one comprising 33 aMCI individuals and the other 39 healthy controls. Three experimental tests were administered, assessing naming and semantic knowledge of unique items, namely famous persons (FACE) and places (PLACE); logo recognition (LOGO: brands and pictograms); and non-unique entities (Boston Naming Test: BNT). The performance of the two groups was compared.
Results:
Lower scores were observed on all naming tests (PLACE, FACE, LOGO and BNT) in the aMCI group compared to controls. On the PLACE test, the general knowledge mean score (M = 84.5, SD = 12.9) was significantly higher than the specific knowledge mean score (M = 54.2, SD = 18.5) in aMCI participants (t(31) = 11.9, p < .001), but not in controls (general: M = 92.2, SD = 11.1; specific: M = 73.7, SD = 15.8), and there was a significant Group × Type of knowledge interaction (F(1,1) = 15.13, p < .001, η² = .18). On the FACE test, in addition to significant group and condition (naming, semantic questions) main effects, a significant interaction was found (F(1,1) = 7.19, p = .009, η² = .09). On the LOGO task, controls were significantly better on brand items (M = 94.4, SD = 10.5) than on pictograms (M = 83.3, SD = 12.2), while no significant difference was noted in aMCI participants (brands: M = 81.5, SD = 22.6; pictograms: M = 77.5, SD = 14.1). Lastly, on the BNT, aMCI participants benefited more from phonemic cues than controls (F(1,1) = 16.56, p < .001, η² = .19), suggesting a lexical access deficit in addition to their semantic memory impairment.
Conclusions:
This study adds to the growing evidence confirming the presence of semantic memory deficits in aMCI. Specific semantic knowledge seems to be more affected than general semantic knowledge, a finding reported in previous studies. Lexical access deficits, in addition to semantic decline, were also observed in the aMCI group. These results allow for a better understanding of the pattern of semantic memory deficits in the prodromal stage of AD and could potentially facilitate diagnosis of aMCI.
Despite the critical role that quantitative scientists play in biomedical research, graduate programs in quantitative fields often focus on technical and methodological skills, not on collaborative and leadership skills. In this study, we evaluate the importance of team science skills among collaborative biostatisticians for the purpose of identifying training opportunities to build a skilled workforce of quantitative team scientists.
Methods:
Our workgroup described 16 essential skills for collaborative biostatisticians. Collaborative biostatisticians were surveyed to assess the relative importance of these skills in their current work. The importance of each skill is summarized overall and compared across career stages, highest degrees earned, and job sectors.
Results:
Survey respondents were 343 collaborative biostatisticians spanning career stages (early: 24.2%, mid: 33.8%, late: 42.0%) and job sectors (academia: 69.4%, industry: 22.2%, government: 4.4%, self-employed: 4.1%). All 16 skills were rated as at least somewhat important by > 89.0% of respondents. Significant heterogeneity in importance by career stage and by highest degree earned was identified for several skills. Two skills (“regulatory requirements” and “databases, data sources, and data collection tools”) were more likely to be rated as absolutely essential by those working in industry (36.5%, 65.8%, respectively) than by those in academia (19.6%, 51.3%, respectively). Three additional skills were identified as important by survey respondents, for a total of 19 collaborative skills.
Conclusions:
We identified 19 team science skills that are important to the work of collaborative biostatisticians, laying the groundwork for enhancing graduate programs and establishing effective on-the-job training initiatives to meet workforce needs.
BACKGROUND: Intracranial growing teratoma syndrome (IGTS) is a rare phenomenon of paradoxical germ cell tumor (GCT) growth during or following treatment despite normalization of tumor markers. We sought to evaluate the frequency, clinical characteristics and outcome of IGTS in patients in 21 North American and Australian institutions. METHODS: Patients with IGTS diagnosed from 2000-2017 were retrospectively evaluated. RESULTS: Out of 739 GCT diagnoses, IGTS was identified in 33 patients (4.5%). IGTS occurred in 9/191 (4.7%) mixed-malignant GCTs, 4/22 (18.2%) immature teratomas (ITs), 3/472 (0.6%) germinomas/germinomas with mature teratoma, and in 17 secreting non-biopsied tumors. Median age at GCT diagnosis was 10.9 years (range 1.8-19.4). Male gender (84%) and pineal location (88%) predominated. Of 27 patients with elevated markers, median serum AFP and beta-HCG were 70 ng/mL (range 9.2-932) and 44 IU/L (range 4.2-493), respectively. IGTS occurred at a median of 2 months (range 0.5-32) from diagnosis, during chemotherapy in 85%, during radiation in 3%, and after treatment completion in 12%. Surgical resection was attempted in all, leading to gross total resection in 76%. Most patients (79%) resumed GCT chemotherapy/radiation after surgery. At a median follow-up of 5.3 years (range 0.3-12), all but 2 patients are alive (1 succumbed to progressive disease, 1 to malignant transformation of GCT). CONCLUSION: IGTS occurred in less than 5% of patients with GCT, most commonly after initiation of chemotherapy. IGTS was more common in patients with IT-only on biopsy than in those with mixed-malignant GCT. Surgical resection is a principal treatment modality. Survival outcomes for patients who developed IGTS are favorable.
Firestone & Scholl's (F&S's) techniques to combat task demand by manipulating expectations and offering alternative cover stories are fundamentally flawed because they introduce new forms of demand. We review five superior techniques to mitigate demand used in confirmatory studies of top-down effects. We encourage researchers to apply the same standards when evaluating evidence on both sides of the debate.
This article aims to reconstruct the plan of Theobalds, Hertfordshire, built between 1564 and 1585 by Sir William Cecil, Lord Burghley. Theobalds was perhaps the most significant English country house of the Elizabethan period and in 1607 was taken on as a royal palace. It was visited by all the major court and political figures of the age, while its fame also extended overseas. Theobalds was innovative in various respects, as the article makes clear, and it had a profound impact on the architecture of its generation. Its importance is all the more extraordinary given that Theobalds was so short-lived: the house was taken down shortly after 1650 and few traces of it survive today. The assumption has been that, because the house was demolished so long ago, it could not be well understood. This article contradicts that view by reconstructing in detail the plan of Theobalds, using evidence provided by primary documents.
Governments are increasingly implementing policies that encourage early father-infant bonding. However, to date, research has not systematically examined fathers’ perspectives on and experiences of early bonding. Using a social constructionist embodiment perspective, we argue that paternal bonding is best conceived as a process of repeated, embodied performances that are shaped by gendered parenting discourses. Drawing on 100 semi-structured interviews with a diverse group of Australian fathers of young infants, we find that most men believe they are capable of developing strong early bonds. They assume that bonding is a product of spending sufficient time with a child, irrespective of the parent's gender. In contrast, a sizable minority of fathers assert that physiology makes fathers ‘largely useless’ to very young infants, and these fathers tend to remain distant in the early months. We conclude that social policies promoting early paternal bonding must engage with and challenge gendered/physiological discourses.
The current paper describes Diet In Nutrients Out (DINO), an integrated dietary assessment system incorporating dietary data entry and nutritional analysis within one platform, for use in dietary assessment ranging from small-scale intervention studies to national surveys.
Design
DINO contains >6000 food items, mostly aggregated composites of branded foods, across thirty-one main food groups divided into 151 subsidiary groups for detailed reporting requirements, with fifty-three core nutrient fields.
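As an illustration of this structure, a food item in a DINO-style database could be modelled as below; the class and field names, and the example values, are assumptions for illustration, not DINO's actual schema:

```python
from dataclasses import dataclass, field
from typing import Dict

@dataclass
class FoodItem:
    """One of the >6000 food items, placed in the group hierarchy described above."""
    code: str                # unique food code (format assumed)
    description: str         # e.g. an aggregated composite of branded foods
    main_group: str          # one of the thirty-one main food groups
    subsidiary_group: str    # one of the 151 subsidiary reporting groups
    nutrients: Dict[str, float] = field(default_factory=dict)  # up to 53 core nutrient fields

# Hypothetical example entry (code, groups and nutrient values are illustrative only).
item = FoodItem(
    code="11-123",
    description="Wholemeal bread, average",
    main_group="Cereals and cereal products",
    subsidiary_group="Wholemeal and brown breads",
    nutrients={"energy_kcal": 217.0, "protein_g": 9.4},
)
```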
Setting
MRC Human Nutrition Research (HNR), Cambridge, UK and MRC Keneba, Gambia.
Subjects
DINO is used across dietary assessment projects at HNR and MRC Keneba.
Results
DINO contains macro- and micronutrients as well as additional variables of current research and policy interest, such as caffeine, whole grains, vitamin K and added sugars. Disaggregated data are available for fruit, vegetables, meat, fish and cheese in composite foods, enabling greater accuracy when reporting food consumption or assessing adherence to dietary recommendations. Portion sizes are categorised in metric and imperial weights, with standardised portion sizes for each age group. Regular reviews of portion sizes and food composition are undertaken to ensure contemporary relevance. A training programme and a checking schedule, covering both users and data, are adhered to for quality assurance purposes. Eating context questions are integrated to record where and with whom the respondent is eating, allowing associations between these factors and the foods consumed to be examined.
Conclusions
An up-to-date quality-assured system for dietary assessment is crucial for nutritional surveillance and research, but needs to have the flexibility to be tailored to address specific research questions.
Few studies have considered the combined effects of home-related determinants on children's diet. The present study investigated independent associations between sociodemographic and food practice (SFP) characteristics and fruit and vegetable consumption in UK children, as well as the combined effects of SFP characteristics on consumption using pattern analysis.
Design
Diet was assessed using 4 d food diaries; SFP characteristics were collected using computer-assisted personal interview. Linear regressions were used to test associations, and principal component analysis was used to identify patterns of SFP characteristics. Regressions of fruit (g/d) and vegetable (g/d) consumption on the component scores of each pattern were performed.
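For illustration, the pattern analysis described above could be sketched in Python as follows; the file and column names are hypothetical, and the number of components is an assumption:

```python
import pandas as pd
import statsmodels.api as sm
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Hypothetical wide-format table of numeric SFP characteristics, one row per child.
sfp = pd.read_csv("sfp_characteristics.csv")

# Standardise the SFP variables, then extract principal components as patterns.
scores = PCA(n_components=3).fit_transform(StandardScaler().fit_transform(sfp))
patterns = pd.DataFrame(scores, columns=["pattern1", "pattern2", "pattern3"])

# Regress fruit intake (g/d) on the component scores of each pattern.
fruit = pd.read_csv("fruit_intake.csv")["fruit_g_per_day"]
model = sm.OLS(fruit, sm.add_constant(patterns)).fit()
print(model.summary())
```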
Setting
UK National Diet and Nutrition Survey Rolling Programme (2008–2010).
Subjects
Children aged 1·5–10 years (n 642).
Results
Significant associations were found between fruit and vegetable consumption and household socio-economic status. Pattern 1, which was positively correlated with household structure characteristics, was associated with increased fruit consumption (P < 0·001). Pattern 2, characterised by positive correlations with socio-economic status, fruit availability and organic food purchase, and negative correlations with household size and the number of children per household, was associated with higher fruit and vegetable consumption (both P < 0·001). Pattern 3, characterised by a high frequency of eating out and eating takeaway food, was associated with lower consumption of both fruit (P = 0·012) and vegetables (P = 0·023).
Conclusions
Patterns of SFP determinants may be more informative than individual characteristics in relation to dietary outcomes. The results have public health implications for the healthfulness of meals eaten out of the home and from takeaways, as well as for the need to reduce dietary inequality in larger households with lower socio-economic status.
The National Diet and Nutrition Survey (NDNS) is a cross-sectional survey designed to gather data representative of the UK population on food consumption, nutrient intakes and nutritional status. The objectives of the present paper were to identify and describe food consumption and nutrient intakes in the UK from the first year of the NDNS rolling programme (2008–09) and compare these with the 2000–01 NDNS of adults aged 19–64 years and the 1997 NDNS of young people aged 4–18 years. Differences in median daily food consumption and nutrient intakes between the surveys were compared by sex and age group (4–10 years, 11–18 years and 19–64 years). There were no changes in energy, total fat or carbohydrate intakes between the surveys. Children aged 4–10 years had significantly lower consumption of soft drinks (not low calorie), crisps and savoury snacks and chocolate confectionery in 2008–09 than in 1997 (all P < 0·0001). The percentage contribution of non-milk extrinsic sugars to food energy was also significantly lower than in 1997 in children aged 4–10 years (P < 0·0001), contributing 13·7–14·6 % in 2008–09 compared with 16·8 % in 1997. These changes were not as marked in older children and there were no changes in these foods and nutrients in adults. There was still a substantial proportion of girls aged 11–18 years (46 %) and women aged 19–64 years (21 %) with mean daily Fe intakes below the lower reference nutrient intake. Since previous surveys there have been some positive changes in intakes, especially in younger children. However, further attention is required in other groups, in particular adolescent girls.
Background: In contrast to a wealth of research on the treatment of Obsessive Compulsive Disorder (OCD), there is a relative paucity of work examining how OCD begins. Available data suggest that there is often a slow progression from the onset of symptoms to meeting criteria for a diagnosis of OCD. Aims: The current study sought to add to existing data documenting the slow development of OCD, and to extend previous findings by examining potential moderators of this symptom phase and patients’ explanations for the transition from symptoms to disorder. Method: One hundred and ninety-nine individuals with OCD reported on the start of their symptoms and the disorder via an internet-based survey. Results: Over two-thirds of respondents reported that the development of their OCD was gradual. Further, participants reported having experienced obsessions and/or compulsions for an average of 5 years before experiencing full-blown OCD. This extended symptom phase was observed in individuals with both early- and late-onset OCD, with an even more protracted symptom phase in the late-onset group. Female gender and onset of compulsions prior to obsessions were also associated with slower progression to full-blown OCD. Finally, explanations for the transition from symptoms to disorder suggest that changes in daily routines and general stress may be particularly important in this transition for individuals who develop clinical OCD at age 18 or later. Conclusions: The existence of a protracted symptom phase may present opportunities for elucidating risk factors for OCD disease progression and a window of opportunity for indicated prevention programs.