In this retrospective cohort study of military trainees, symptomatic-only coronavirus disease 2019 (COVID-19) arrival antigen testing decreased isolation requirements without increasing secondary cases compared with universal antigen testing. Symptomatic-only arrival antigen testing is a feasible alternative for individuals entering a congregate setting with a high risk of COVID-19 transmission.
From early on, infants show a preference for infant-directed speech (IDS) over adult-directed speech (ADS), and exposure to IDS has been correlated with language outcome measures such as vocabulary. The present multi-laboratory study explores this issue by investigating whether there is a link between early preference for IDS and later vocabulary size. Infants’ preference for IDS was tested as part of the ManyBabies 1 project, and follow-up CDI data were collected from a subsample of this dataset at 18 and 24 months. A total of 341 (18 months) and 327 (24 months) infants were tested across 21 laboratories. In neither preregistered analyses with North American and UK English, nor exploratory analyses with a larger sample did we find evidence for a relation between IDS preference and later vocabulary. We discuss implications of this finding in light of recent work suggesting that IDS preference measured in the laboratory has low test-retest reliability.
Cancer health research relies on large-scale cohorts to derive generalizable results for different populations. While traditional epidemiological cohorts often use costly random sampling or self-motivated, preselected groups, a shift toward health system-based cohorts has emerged. However, such cohorts depend on participants remaining within a single system. Recent consumer engagement models using smartphone-based communication, driving projects, and social media have begun to upend these paradigms.
Methods:
We initiated the Healthy Oregon Project (HOP) to support basic and clinical cancer research. The HOP study employs a novel, cost-effective remote recruitment approach to establish a large-scale cohort for population-based studies. Recruitment leverages a dedicated study email account, the HOP website, and social media platforms to direct smartphone users to the study app, which facilitates saliva sample collection and survey administration. Monthly newsletters further support engagement and outreach to broader communities.
Results:
By the end of 2022, HOP had enrolled approximately 35,000 participants aged 18–100 years (median = 44.2 years), comprising more than 1% of the Oregon adult population. Among participants with app access, ∼87% provided consent to genetic screening. The HOP monthly email newsletters have an average open rate of 38%. Efforts continue to improve survey response rates.
Conclusion:
This study underscores the efficacy of remote recruitment approaches in establishing large-scale cohorts for population-based cancer studies. The implementation of the study facilitates the collection of extensive survey and biological data into a repository that can be broadly shared and supports collaborative clinical and translational research.
Prior research has reported an association between divorce and suicide attempt. We aimed to clarify this complex relationship, considering sex differences, temporal factors, and underlying etiologic pathways.
Methods
We used Swedish longitudinal national registry data for a cohort born 1960–1990 that was registered as married between 1978 and 2018 (N = 1 601 075). We used Cox proportional hazards models to estimate the association between divorce and suicide attempt. To assess whether observed associations were attributable to familial confounders or potentially causal in nature, we conducted co-relative analyses.
Results
In the overall sample and in sex-stratified analyses, divorce was associated with increased risk of suicide attempt (adjusted hazard ratios [HRs] 1.66–1.77). Risk was highest in the year immediately following divorce (HRs 2.20–2.91) and declined thereafter, but remained elevated 5 or more years later (HRs 1.41–1.51). Divorcees from shorter marriages were at higher risk for suicide attempt than those from longer marriages (HRs 3.33–3.40 and 1.20–1.36, respectively). In general, HRs were higher for divorced females than for divorced males. Co-relative analyses suggested that familial confounders and a causal pathway contribute to the observed associations.
Conclusions
The association between divorce and risk of suicide attempt is complex, varying as a function of sex and time-related variables. Given evidence that the observed association is due in part to a causal pathway from divorce to suicide attempt, intervention or prevention efforts, such as behavioral therapy, could be most effective early in the divorce process, and in particular among females and those whose marriages were of short duration.
Early in the COVID-19 pandemic, the World Health Organization stressed the importance of daily clinical assessments of infected patients, yet current approaches frequently consider cross-sectional timepoints, cumulative summary measures, or time-to-event analyses. Statistical methods are available that make use of the rich information content of longitudinal assessments. We demonstrate the use of a multistate transition model to assess the dynamic nature of COVID-19-associated critical illness using daily evaluations of COVID-19 patients from 9 academic hospitals. We describe the accessibility and utility of methods that consider the clinical trajectory of critically ill COVID-19 patients.
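The multistate idea described above can be illustrated with a toy estimator: count observed day-to-day state transitions across patients and row-normalize the counts into an empirical transition probability matrix. This is a minimal sketch; the state labels and sequences below are invented for illustration and are not the study's actual coding or data.

```python
from collections import Counter

# Hypothetical daily state sequences for three patients (illustrative only).
sequences = [
    ["ward", "ward", "icu", "icu", "ward", "discharged"],
    ["ward", "icu", "icu", "icu", "ward", "ward"],
    ["ward", "ward", "ward", "discharged"],
]

def transition_matrix(sequences):
    """Estimate daily transition probabilities by counting observed
    state-to-state moves and normalizing each row to sum to 1."""
    counts = Counter()
    for seq in sequences:
        for a, b in zip(seq, seq[1:]):
            counts[(a, b)] += 1
    states = sorted({s for seq in sequences for s in seq})
    matrix = {}
    for a in states:
        total = sum(counts[(a, b)] for b in states)
        if total:  # absorbing states (no outgoing moves) are skipped
            matrix[a] = {b: counts[(a, b)] / total for b in states}
    return matrix

P = transition_matrix(sequences)
```

Each row of `P` is an empirical distribution over next-day states; in a full analysis these probabilities would typically be modeled with covariates rather than raw counts.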
In 2018, the Neurodevelopmental and Psychosocial Interventions Working Group of the Cardiac Neurodevelopmental Outcome Collaborative convened through support from an R13 grant from the National Heart, Lung, and Blood Institute to survey the state of neurodevelopmental and psychosocial intervention research in CHD and to propose a slate of critical questions and investigations required to improve outcomes for this growing population of survivors and their families. Prior research, although limited, suggests that individualised developmental care interventions delivered early in life are beneficial for improving a range of outcomes including feeding, motor and cognitive development, and physiological regulation. Interventions to address self-regulatory, cognitive, and social-emotional challenges have shown promise in other medical populations, yet their applicability and effectiveness for use in individuals with CHD have not been examined. To move this field of research forward, we must strive to better understand the impact of neurodevelopmental and psychosocial intervention within the CHD population including adapting existing interventions for individuals with CHD. We must examine the ways in which dedicated cardiac neurodevelopmental follow-up programmes bolster resilience and support children and families through the myriad transitions inherent to the experience of living with CHD. And, we must ensure that interventions are person-/family-centred, inclusive of individuals from diverse cultural backgrounds as well as those with genetic/medical comorbidities, and proactive in their efforts to include individuals who are at highest risk but who may be traditionally less likely to participate in intervention trials.
Background: Hemolysis of blood samples is the leading cause of specimen rejection from hospital laboratories. It contributes to delays in patient care and disposition decisions. Coagulation tests (prothrombin time/international normalized ratio [PT/INR] and activated partial thromboplastin time [aPTT]) are especially problematic for hemolysis in our academic hospital, with at least one sample rejected daily from the emergency department (ED). Aim Statement: We aimed to decrease the monthly rate of hemolyzed coagulation blood samples sent from the ED from a rate of 2.9% (53/1,857) to the best practice benchmark of less than 2% by September 1st, 2019. Measures & Design: Our outcome measure was the rate of hemolyzed coagulation blood samples. Our process measure was the rate of coagulation blood tests sent per 100 ED visits. Our balancing measure was the number of incident reports by clinicians when expected coagulation testing did not occur. We used monthly data for our Statistical Process Control (SPC) charts, as well as Chi square and Mann-Whitney U tests for our before-and-after evaluation. Using the Model for Improvement to develop our project's framework, we used direct observation, broad stakeholder engagement, and process mapping to identify root causes. We enlisted nursing champions to develop our Plan-Do-Study-Act (PDSA) cycles/interventions: 1) educating nurses on hemolysis and coagulation testing; 2) redesigning the peripheral intravenous and blood work supply carts to encourage best practice; and 3) removing PT/INR and aPTT from automatic inclusion in our electronic chest pain bloodwork panel. Evaluation/Results: The average rate of hemolysis remained unchanged from baseline (2.9%, p = 0.83). The average rate of coagulation testing sent per 100 ED visits decreased from 41.5 to 28.8 (absolute decrease 12.7 per 100, p < 0.05), avoiding $4,277 in monthly laboratory costs. 
The SPC chart of our process measure showed special cause variation, with a run of more than eight consecutive points below the centerline. Discussion/Impact: Our project reduced coagulation testing without changing hemolysis rates. Buy-in from frontline nurses was integral to the project's early success, prior to implementing our electronic approach (a solution ranked higher on the hierarchy of intervention effectiveness) to help sustainability. This resource stewardship project will now be spread to a nearby institution using similar approaches.
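The special-cause signal reported above (a sustained run of points below the centerline) is straightforward to check programmatically. A minimal sketch follows; the monthly rates are made up for illustration and are not the project's actual SPC data.

```python
def longest_run_below(values, centerline):
    """Length of the longest consecutive run of points strictly below
    the centerline; a run of 8 or more is a common special-cause rule."""
    longest = current = 0
    for v in values:
        current = current + 1 if v < centerline else 0
        longest = max(longest, current)
    return longest

# Illustrative monthly coagulation-testing rates per 100 ED visits.
rates = [42, 41, 43, 40, 30, 29, 28, 27, 29, 28, 30, 29, 28]
centerline = 41.5
signal = longest_run_below(rates, centerline) >= 8
```

In practice SPC software also checks other run rules (points beyond control limits, trends), but the run-below-centerline rule shown here is the one the abstract invokes.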
This study investigated the attitudes of medical students towards psychiatry, both as a subject on the medical curriculum and as a career choice. Three separate questionnaires previously validated on medical student populations were administered prior to and immediately following an 8-week clinical training programme. The results indicate that students' perception of psychiatry was positive prior to clerkship and became even more so on completion of training. On completion of the clerkship, there was a rise in the proportion of students who indicated that they might choose a career in psychiatry. Attitudes toward psychiatry correlated positively with psychiatry examination results, and those who intended to specialise in psychiatry achieved significantly higher examination scores.
Objective: The human gut microbiota has been shown to be associated with a number of host phenotypes, including obesity and obesity-associated traits. This study aims to further characterize the relationship between the gut microbiota and obesity-associated measurements obtained from human participants. Subjects/Methods: Here, we utilize genetically informative study designs, including a four-corners design (extremes of genetic risk for BMI and of observed BMI; N = 50) and the BMI monozygotic (MZ) discordant twin pair design (N = 30), to help delineate the roles of host genetics and the gut microbiota in the development of obesity. Results: Our results highlight a negative association between BMI and alpha diversity of the gut microbiota. The low genetic risk/high BMI group had lower gut microbiota alpha diversity than the other three groups. Although the difference in alpha diversity between the lean and heavy groups of the BMI-discordant MZ twin design did not achieve significance, it was in the expected direction, with heavier participants having a lower average alpha diversity. We also identified nine OTUs associated with either a leaner or heavier phenotype, with enrichment for OTUs classified to the Ruminococcaceae and Oxalobacteraceae taxonomic families. Conclusion: Our study presents evidence of a relationship between BMI and alpha diversity of the gut microbiota. In addition, a number of OTUs were found to be significantly associated with host BMI. These findings may highlight separate subtypes of obesity, one driven by genetic factors and the other more heavily influenced by environmental factors.
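Alpha diversity is commonly summarized with the Shannon index, although the abstract does not state which index was used, so this is illustrative only. A minimal sketch with toy OTU count vectors shows why an evenly distributed community scores higher than a skewed one:

```python
import math

def shannon_index(counts):
    """Shannon diversity H = -sum(p_i * ln p_i) over nonzero OTU proportions."""
    total = sum(counts)
    props = [c / total for c in counts if c > 0]
    return -sum(p * math.log(p) for p in props)

# Toy OTU count vectors (not study data): four OTUs per sample.
even_sample = [25, 25, 25, 25]    # maximally even community
skewed_sample = [85, 5, 5, 5]     # one OTU dominates
```

For a sample of k equally abundant OTUs the index equals ln(k), its maximum; dominance by a few OTUs pulls it down, which is the pattern the heavier-BMI groups in the study exhibit.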
The present study aimed to examine the correlates of fruit and vegetable intake (FVI) separately among parents and their adolescents.
Design
Cross-sectional surveys.
Setting
Online survey.
Subjects
Parents and adolescents completed the Family Life, Activity, Sun, Health, and Eating (FLASHE) survey through the National Cancer Institute. The survey assessed daily intake frequencies of food/beverage groups as well as psychosocial, parenting, and sociodemographic factors. Generalized linear models were run for both parents and adolescents, for a total of six models (three each): (i) sociodemographic characteristics; (ii) psychosocial factors; (iii) parent/caregiver factors.
Results
Parent participants (n 1542) were predominantly 35–59 years old (86 %), female (73 %), non-Hispanic White (71 %) or non-Hispanic Black (17 %), with household income <$US 100 000 (79 %). Adolescents (n 805) were aged 12–14 years (50 %), non-Hispanic White (66 %) and non-Hispanic Black (15 %). Parents consumed 2·9 cups fruits and vegetables (F&V) daily, while adolescents consumed 2·2 cups daily. Educational attainment (those with more education had greater FVI) and sex (men consumed more than women; all P<0·001) were significant FVI predictors. Parents with greater autonomous and controlled motivation, self-efficacy and preferences for fruit reported higher FVI (all P<0·001). Similarly, adolescents with greater autonomous and controlled motivation, self-efficacy and knowledge reported higher FVI (all P<0·001). Parenting factors of importance were co-deciding how many F&V teens should have, rules, having F&V in the home and cooking meals from scratch (all P<0·05).
Conclusions
Findings identify factors associated with FVI among parents and their adolescents, highlight the importance of parent behaviour, and can inform tailored approaches for increasing FVI in various settings.
Surgical site infections (SSIs) following colorectal surgery (CRS) are among the most common healthcare-associated infections (HAIs). Reduction in colorectal SSI rates is an important goal for surgical quality improvement.
OBJECTIVE
To examine rates of SSI in patients with and without cancer and to identify potential predictors of SSI risk following CRS.
DESIGN
American College of Surgeons National Surgical Quality Improvement Program (ACS NSQIP) data files for 2011–2013 from a sample of 12 National Comprehensive Cancer Network (NCCN) member institutions were combined. Pooled SSI rates for colorectal procedures were calculated and risk was evaluated. The independent importance of potential risk factors was assessed using logistic regression.
SETTING
Multicenter study
PARTICIPANTS
Of 22 invited NCCN centers, 11 participated (50%). Colorectal procedures were selected by principal procedure Current Procedural Terminology (CPT) code. Cancer was defined by International Classification of Diseases, Ninth Revision, Clinical Modification (ICD-9-CM) codes.
MAIN OUTCOME
The primary outcome of interest was 30-day SSI rate.
RESULTS
A total of 652 SSIs (11.06%) were reported among 5,893 colorectal procedures. Risk of SSI was similar for patients with and without cancer. Among CRS patients with underlying cancer, disseminated cancer (SSI rate, 17.5%; odds ratio [OR], 1.66; 95% confidence interval [CI], 1.23–2.26; P=.001), ASA score ≥3 (OR, 1.41; 95% CI, 1.09–1.83; P=.001), chronic obstructive pulmonary disease (COPD; OR, 1.6; 95% CI, 1.06–2.53; P=.02), and longer duration of procedure were associated with development of SSI.
CONCLUSIONS
Patients with disseminated cancer are at higher risk of developing SSI. ASA score ≥3, COPD, and longer duration of surgery also predict SSI risk. Disseminated cancer should be further evaluated by the Centers for Disease Control and Prevention (CDC) as a factor in generating risk-adjusted outcomes.
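The odds ratios above come from logistic regression, but for a single binary risk factor the unadjusted OR and its 95% CI can be computed directly from a 2×2 table with the standard log-based (Woolf) formula. A minimal sketch with hypothetical counts (not the study's data):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Unadjusted odds ratio with Woolf 95% CI from a 2x2 table:
    a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: SSI among patients with vs without disseminated cancer.
or_, lo, hi = odds_ratio_ci(35, 165, 60, 470)
```

A multivariable model adjusts each OR for the other factors, so results from this shortcut and the regression generally differ; the 2×2 form is mainly useful for a quick sanity check.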
Deriving glacier outlines from satellite data has become increasingly popular in the past decade. In particular, when glacier outlines are used as a base for change assessment, it is important to know how accurate they are. Calculating the accuracy correctly is challenging, as appropriate reference data (e.g. from higher-resolution sensors) are seldom available. Moreover, after the required manual correction of the raw outlines (e.g. for debris cover), such a comparison would only reveal the accuracy of the analyst rather than of the algorithm applied. Here we compare outlines for clean and debris-covered glaciers, as derived from single and multiple digitizing by different or the same analysts on very high- (1 m) and medium-resolution (30 m) remote-sensing data, against each other and against glacier outlines derived from automated classification of Landsat Thematic Mapper data. Results show a high variability in the interpretation of debris-covered glacier parts, largely independent of the spatial resolution (area differences were up to 30%), and an overall good agreement for clean ice with sufficient contrast to the surrounding terrain (differences ∼5%). The differences of the automatically derived outlines from a reference value are as small as the standard deviation of the manual digitizations from several analysts. Based on these results, we conclude that automated mapping of clean ice is preferable to manual digitization, and we recommend using manual digitization only for the required corrections of incorrectly mapped glacier parts (e.g. debris cover, shadow).
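The percentage area differences quoted above can be reproduced in principle by comparing the areas of repeat digitizations of the same glacier. A minimal sketch using the shoelace formula on toy outlines (the coordinates are invented, and real workflows use GIS polygon tools in projected map coordinates):

```python
def polygon_area(vertices):
    """Planar polygon area via the shoelace formula (vertices in order)."""
    n = len(vertices)
    s = 0.0
    for i in range(n):
        x1, y1 = vertices[i]
        x2, y2 = vertices[(i + 1) % n]
        s += x1 * y2 - x2 * y1
    return abs(s) / 2.0

# Two analysts' outlines of the same (toy) glacier, in map units.
outline_a = [(0, 0), (4, 0), (4, 3), (0, 3)]        # area 12
outline_b = [(0, 0), (4, 0), (4, 3.6), (0, 3.6)]    # area 14.4
pct_diff = abs(polygon_area(outline_a) - polygon_area(outline_b)) / polygon_area(outline_a) * 100
```

Comparing scalar areas understates disagreement when outlines differ in shape but not total area; the study's analyst-to-analyst comparisons of debris-covered parts are closer in spirit to per-glacier area differences like the one computed here.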